Amazon scrapped ‘sexist AI’ tool


    Image caption (Getty Images): The algorithm repeated a bias towards men that is reflected in the technology industry

    An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, according to a Reuters report.

    The artificial intelligence system was trained on data submitted by applicants over a 10-year period, much of which came from men, the report said.

    Reuters was told by members of the team working on it that the system effectively taught itself that male candidates were preferable.

    Amazon has not responded to the claims.

    Reuters spoke to five members of the team who developed the machine learning tool in 2014, none of whom wanted to be publicly named.

    They told Reuters that the system was intended to review job applications and give candidates a score ranging from one to five stars.

    “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,” said one of the engineers who spoke to Reuters.

    ‘Women’ penalised

    By 2015, it was clear that the system was not rating candidates in a gender-neutral way, because it was built on data accumulated from CVs submitted to the firm mostly by men, Reuters claimed.

    The system started to penalise CVs which included the word “women”. The program was edited to make it neutral to the term, but it became clear that the system could not be relied upon, Reuters was told.
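
    This failure mode is easy to reproduce in miniature. The sketch below is not Amazon’s system; the CVs, labels and choice of library are invented for illustration. It trains a simple scikit-learn text classifier on a synthetic hiring history skewed towards men, then inspects the weight the model learns for the token “women”:

        # A minimal, hypothetical sketch of how skewed training data can
        # produce a penalty for one word. Not Amazon's system; the CVs
        # and hiring labels below are invented for illustration.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression

        cvs = [
            "software engineer java distributed systems",           # hired
            "machine learning engineer python aws",                 # hired
            "backend developer java aws",                           # hired
            "captain women's chess club software engineer python",  # rejected
            "women in tech mentor python developer",                # rejected
            "frontend developer javascript react",                  # hired
        ]
        hired = [1, 1, 1, 0, 0, 1]

        # Bag-of-words features; the default tokeniser lower-cases text
        # and splits "women's" into the token "women".
        vectoriser = CountVectorizer()
        X = vectoriser.fit_transform(cvs)

        model = LogisticRegression().fit(X, hired)

        # The coefficient for "women" comes out negative: the model has
        # learned to score CVs containing the word lower.
        idx = vectoriser.vocabulary_["women"]
        print("learned weight for 'women':", model.coef_[0][idx])

    Because the invented history only rejects CVs containing the word, the learned weight is negative, and any CV mentioning “women” is scored lower. Deleting the token, as Amazon’s team reportedly did, does not remove correlated signals elsewhere in the text, which is consistent with the team’s conclusion that the system could not be relied upon.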

    The project was abandoned, although Reuters said that it was used for a period by recruiters, who looked at the recommendations generated by the tool but never relied on them alone.

    According to Amazon, its current global workforce is split 60:40 in favour of males.

    About 55% of US human resources managers said that AI would play a role in recruitment within the next five years, according to a survey by software firm CareerBuilder.

    It is not the first time doubts have been raised about the reliability of algorithms trained on potentially biased data.

    Image caption (MIT): An MIT AI system, dubbed Norman, had a dark view of the world as a result of the data it was trained on

    An experiment at the Massachusetts Institute of Technology, which trained an AI on images and videos of murder and death, found it interpreted neutral inkblots in a negative way.

    And in May last year, a report claimed that an AI program used by a US court was biased against black people, flagging them as twice as likely to reoffend as white people.

    Predictive policing algorithms have been found to be similarly biased, because the crime data they were trained on recorded more arrests and police stops of black people.

    View the original article: https://www.bbc.co.uk/news/technology-45809919
