As the nation waits for jurors to decide the fate of Derek Chauvin – the former police officer standing trial for the murder of George Floyd – Facebook says it’s actively working to “limit content that could lead to civil unrest or violence.”

“We are doing what we can to prepare for the verdict. This means preventing online content from being linked to offline harm and doing our part to keep our community safe,” reads a blog post from Monika Bickert, a vice president of content policy at the tech giant.

For years, social media sites like Facebook and Twitter have been used to organize rallies, demonstrations, and protests. Sometimes that has had a positive impact. Twitter was credited with fueling the Arab Spring, a series of anti-government uprisings against oppressive regimes in the Middle East and beyond.

On the other hand, Facebook has often been used as a staging ground for militia groups. A recent report by the Tech Transparency Project found that “Facebook allowed election conspiracies and far-right militias to proliferate on its platform, fueling the radicalization that drove the Jan. 6 insurrection.”

Last summer, 17-year-old Kyle Rittenhouse traveled to Kenosha, Wisconsin, after he saw a Facebook post calling for volunteers to defend the city from rioters. Rittenhouse killed two men, Anthony Huber and Joseph Rosenbaum, and faces murder charges. Facebook faces a lawsuit over its alleged role in the events that led to the shootings.

Authorities are preparing for unrest after the Chauvin verdict. National Guard troops are being deployed throughout the country. Protests – many peaceful, some violent – rocked the nation after Floyd’s death last year.

Congresswoman Maxine Waters, a California Democrat, urged protesters to "get more confrontational" if Chauvin is acquitted. "We got to stay on the street. And we've got to get more active, we've got to get more confrontational. We've got to make sure that they know that we mean business," she said.

In her blog post, Bickert writes:

We will remove content that violates our Community Standards, including our policies against hate speech, bullying and harassment, graphic violence, and violence and incitement. As we have done in emergency situations in the past, we may also limit the spread of content that our systems predict is likely to violate our Community Standards in the areas of hate speech, graphic violence, and violence and incitement. We will remove Pages, groups, Events and Instagram accounts that violate our violence and incitement policy and we will remove events organized in temporary, high-risk locations that contain calls to bring arms.