Zuckerberg says Facebook's failure to remove militia page an 'operational mistake'

Facebook said it would continue to evolve its policies for identifying potentially dangerous organisations


Reuters August 29, 2020
Facebook CEO Mark Zuckerberg listens to a question from the audience after unveiling a new messaging system during a news conference in San Francisco, California November 15, 2010. PHOTO: REUTERS

Facebook made an “operational mistake” in not acting sooner to remove a page for a militia group that posted a call to arms in Kenosha, Wisconsin, the company’s Chief Executive Mark Zuckerberg said on Friday.

The social media company said on Wednesday it had removed the page for the Kenosha Guard, along with an event listing there titled “Armed Citizens to Protect Our Lives and Property”, because it violated the company’s policy against “militia organisations”.

Facebook’s action came after two people were shot and killed during protests in the town on Tuesday night, part of three nights of civil unrest that followed the shooting of a Black man, Jacob Blake Jr, by a white police officer, which left him paralysed.


Zuckerberg, speaking in a video message published on his Facebook profile, acknowledged the company had received complaints from “a bunch of people” about the Kenosha Guard posting.

“The contractors and reviewers who the initial complaints were funneled to basically didn’t pick this up,” he said. “And on the second review, doing it more sensitively, the team that’s responsible for dangerous organisations recognised that this violated the policies and we took it down.”

Zuckerberg said the company had not found any evidence to show that the person charged with the fatal shooting during Tuesday’s unrest followed the Kenosha Guard page.


News website BuzzFeed, citing an internal Facebook report, said the event associated with the Kenosha Guard had been flagged at least 455 times, and quoted a Facebook worker as saying it accounted for 66% of all event reports that day.

Facebook declined to comment on those findings, BuzzFeed said.

Facebook said it would continue to evolve its policies for identifying potentially dangerous organisations.

“This is a new policy we launched last week and we’re still scaling up our enforcement of it by a team of specialists,” a spokesperson said.
