The Oversight Board, which examines content moderation decisions made by Meta’s social platforms, is to look at three cases linked to posts shared during the summer riots in the UK.
Violence erupted across the country after a knife attack in Southport which killed three girls and injured eight others, fuelled by misinformation spreading rapidly on social media about the attacker’s identity, including false claims that he was an asylum seeker who had arrived in the UK on a small boat.
There have since been calls to tighten online safety laws to better respond to misinformation and disinformation because of the real-world impact they can have.
The Oversight Board has now confirmed it will look at cases involving three posts from that time which were reported to Facebook for violating either its hate speech or violence and incitement policies.
The first post expressed agreement with the riots, calling for mosques to be attacked and for buildings housing migrants to be set on fire.
The third post is another AI-generated image, this one showing four Muslim men running after a crying blond-haired toddler in a Union flag T-shirt in front of the Houses of Parliament, with the caption “wake up”.
All three posts were originally kept on Facebook after being assessed by Meta’s automated tools – none were reviewed by humans – before the users who had reported the posts appealed to the Oversight Board over the decisions.
The board said it had selected these cases to examine Meta’s policy preparedness and crisis response to violent riots targeting migrant and Muslim communities.
It said that as a result of selecting these cases, Meta has now determined that its previous decision to leave the first post on Facebook was an error and has removed it.
The social media giant confirmed to the board that it still believes its decisions to leave the second and third posts on Facebook were correct.
The Oversight Board said it would now accept public comments on the issue, including the role social media played in the UK riots and the spreading of misinformation.
It is expected to issue decisions on the cases in the coming weeks, and can make policy recommendations to Meta which, although not binding, the tech giant must respond to within 60 days.