When a machine moderates content, it evaluates text and images as data, using an algorithm that has been trained on existing data sets. The process for selecting that training data has come under fire because the resulting data sets have been shown to carry racial, gender, and other biases.
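To make that mechanism concrete, here is a minimal, hypothetical sketch in Python with scikit-learn (not anything Meta actually uses): a toy text classifier whose removal decisions are determined entirely by the labeled examples it was trained on, which is why skew in the training data shows up directly in moderation outcomes. The example posts and labels are invented for illustration.

```python
# A minimal sketch, assuming a simple supervised text classifier --
# not Meta's actual moderation system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training examples; real systems are trained on millions of
# human-labeled posts, and the labeling and sampling choices are where
# bias can enter.
posts = [
    "I love this community",
    "You people are awful",
    "Great photo, thanks for sharing",
    "Get out of here, nobody wants you",
]
labels = [0, 1, 0, 1]  # 0 = allowed, 1 = removed

# Convert text to numeric features, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# New content is scored purely on patterns learned from the data above;
# if removals are over-represented for one group's speech in training,
# the classifier reproduces that skew at scale.
print(model.predict(["thanks for posting this"]))
```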

  • topinambour_rex@lemmy.world · 1 year ago

    An important point, buried deep in the article:

    She said she has observed differential treatment across the social media platform, depending on the race of the subject in the image in question or who posted it.

    “We’ve seen this time and again, Meta taking down content by and about people of color,” she said. “While similar content by and about white people remains up.”