The drama around the Jan. 6 riot that left one police officer dead in a Washington, D.C., warehouse has shifted from questions about how authorities handled the incident to whether Facebook officials helped incite the violence.
Now, a former employee of a company that helped Amazon and Black Lives Matter prepare disaster-response reports has accused Facebook of fueling the violence, according to a memo obtained by The Intercept.
Antoine Dodson’s sensational account of a rape threat from a police officer prompted a network of Facebook Live users to share edited, sensationalized videos of the violent confrontation. Known for the “Bed Intruder Song,” Dodson also became an internet celebrity and received a $100,000 offer to appear on “Dancing With the Stars.”
Anissa Rivera, the daughter of a reputed MS-13 member, used Facebook Live to broadcast from her bedroom after she was shot. During the broadcast, a large crowd gathered, chanting “FEDS HAVE TO GO!” and setting off smoke canisters at police officers.
Related: How social media helped spark the deadly Washington, D.C., fire
Even as the details of the events were being disputed, Facebook CEO Mark Zuckerberg told Congress that the company could not review every video until it built a team dedicated to monitoring them.
The Intercept report, first covered by the Times, indicated that Facebook had reached out to some of the participants to determine when they arrived at the warehouse before the riot began. The company is alleged to have promised that if the protesters recorded the event, Facebook would create a page to publicize it.
However, after an activist from Facebook declined to participate in the research, four unnamed people unaffiliated with the company told employees that the platform had contributed to the riot. The employees said the videos were removed before they could be flagged and, when retrieved, had been replaced with distorted video overlaid with text.
Facebook did not comment on the contents of the memo, but told The Intercept that it had no evidence to suggest that it had contributed to the violence.
“We have an internal process to review videos that might be graphic, and we work to ensure that content is removed when it violates our standards,” the company said in a statement. “Our goal is always to help people have a safe experience while on Facebook, and a critical part of this is being sure that we respond quickly to reports of violent content.”
Related: Study: Facebook’s new ‘safety’ options don’t really work
Read the full story at The Intercept.