Microsoft's news AI misidentifies Little Mix singer Jade Thirlwall, highlighting racial bias problems in tech

  • Microsoft's new AI-based news editor backfired this week, accidentally mixing up images of Little Mix singers Jade Thirlwall and Leigh-Anne Pinnock.

    Microsoft recently announced plans to replace all of the human news editors on its MSN news service with artificial intelligence software. Not long after the announcement, MSN came under fire when it published an article about Little Mix singer Jade Thirlwall's thoughts on racism and mistakenly illustrated it with a photo of her bandmate Leigh-Anne Pinnock.

    Posting on Instagram, Jade wrote "@MSN If you’re going to copy and paste articles from other accurate media outlets, you might want to make sure you’re using an image of the correct mixed race member of the group." It later emerged that the mistake had been made by the AI-based editor system rather than a human editor, sparking renewed outrage at potential racial bias in the tech.

    Guardian media editor Jim Waterson initially broke the story after multiple MSN journalists reported it to him. He tweeted that "Staff at MSN have also been told to await the publication of this Guardian article and try to manually delete it from the website, because there is a high risk the Microsoft robot editor taking their jobs will decide it is of interest to MSN readers."

    The past several years have highlighted serious racial equality issues in the development of emerging technologies such as artificial intelligence and computer vision. Amazon, for example, has faced severe backlash over its AI-based facial recognition platform, which was shown to disproportionately misidentify black people and is actively used by US police authorities.

    This is the latest in a long line of racial problems in artificial intelligence, which came to the forefront several years ago when several AI-based recruitment tools were found to be inadvertently prioritising white men. Because AI systems learn from the data they are given, a model trained on existing historical data will bake the racial biases present in that data into its own decisions.
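The mechanism described above can be sketched in a few lines of Python. This is a toy illustration with entirely hypothetical data, not a real recruitment system: a trivial "model" that memorises each group's majority outcome from skewed historical records will faithfully reproduce that skew as a rule.

```python
# Toy sketch (hypothetical data): a model trained on biased historical
# decisions reproduces that bias in its own predictions.
from collections import defaultdict

# Hypothetical historical hiring records: (group, hired_flag)
# Group "A" was historically hired far more often than group "B".
history = [
    ("A", 1), ("A", 1), ("A", 1), ("A", 0),
    ("B", 1), ("B", 0), ("B", 0), ("B", 0),
]

def train(records):
    """'Train' by memorising each group's historical hire rate."""
    totals = defaultdict(lambda: [0, 0])  # group -> [hires, total seen]
    for group, hired in records:
        totals[group][0] += hired
        totals[group][1] += 1
    # Predict "hire" (1) only if the group's historical rate exceeds 0.5,
    # so the skew in the training data becomes the model's rule.
    return {g: int(h / n > 0.5) for g, (h, n) in totals.items()}

model = train(history)
print(model)  # the historical imbalance is now a learned policy
```

Real recruitment models are far more complex, but the failure mode is the same: without deliberate correction, historical imbalance in the training data becomes the learned policy.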

    Source: The Guardian

    About the author

    Brendan is a Sync NI writer with a special interest in the gaming sector, programming, emerging technology, and physics. To connect with Brendan, feel free to send him an email or follow him on Twitter.

    Got a news-related tip you’d like to see covered on Sync NI? Email the editorial team for our consideration.
