Fixing mistakes for better content moderation: Facebook

San Francisco: Facing ire over reports that it is protecting far-right activists and under-age accounts, Facebook on Wednesday said it takes the mistakes incredibly seriously and is working to prevent these issues from happening again.

Channel 4's Dispatches, a documentary series that sent an undercover reporter to work as a content moderator at a Dublin-based Facebook contractor, showed that Facebook moderators prevented Pages belonging to far-right activists from being deleted even after they violated the rules.

In a blog post, Monika Bickert, Vice President of Global Policy Management at Facebook, said the TV report on Channel 4 in the UK raised important questions about the company's policies and processes, including guidance given during training sessions in Dublin.

“It’s clear that some of what is in the programme does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again,” Bickert wrote.

The documentary also showed that Facebook moderators have turned a blind eye to under-age accounts.

“Moderators are told they can only take action to close down the account of a child who clearly looks 10 years old if the child actually admits in posts to being under-age,” reports said, citing the documentary.

Facebook said it immediately required all trainers in Dublin to undergo a re-training session and is preparing to do the same globally.

“We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found,” the Facebook executive said.

In a separate letter to Nicole Kleeman, Executive Producer at Glasgow-based Firecrest Films, who raised the issues with Facebook, Bickert said a review of training practices is under way across Facebook's contractor teams, including Dublin-based CPL Resources, the largest moderation centre for UK content.

“In addition, in relation to the content where mistakes were clearly made, we’ve gone back and taken the correct action,” she said.

Facebook had earlier promised to double the number of people working on its safety and security teams this year to 20,000. This includes over 7,500 content reviewers.

The company said it does not allow people under 13 to have a Facebook account.

“If a Facebook user is reported to us as being under 13, a reviewer will look at the content on their profile (text and photos) to try to ascertain their age.

“If they believe the person is under 13, the account will be put on hold. This means they cannot use Facebook until they provide proof of their age. We are investigating why any reviewers or trainers at CPL would have suggested otherwise,” Bickert said.

Facebook said it does have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure it has correctly applied its policies.

“While this process was previously referred to as ‘shield’, or shielded review, we changed the name to ‘Cross Check’ in May to more accurately reflect the process,” the company said.

IANS