Hundreds of active and retired police officers and law enforcement personnel are congregating in private Facebook groups where they engage in open racism and Islamophobia, and even lend support to violent anti-government groups, according to an investigation from nonprofit news organization Reveal, which is run by the US-based Center for Investigative Reporting.
After Reveal notified law enforcement agencies, more than 50 departments have reportedly opened internal investigations. In some cases, departments say they’ll be evaluating officers’ online activity to see if it may have influenced past policing conduct. At least one officer has been fired for violating department policies as a result of participating in these groups, some of which bear names like “White Lives Matter” and “Death to Islam Undercover.”
Reveal reports that the groups span a full range of right-wing political ideologies, from standard conservatism to far-right movements centered on outright racism and Islamophobia. Some go even further: several of the Facebook groups Reveal surveyed were associated with anti-government and militia movements, like the Oath Keepers. Reveal says that 150 of the roughly 400 officers it identified as belonging to these groups were part of that more extreme end.
The unifying thread through all of these Facebook groups is that they are frequented, and sometimes founded and operated, by active and retired police officers, and that they actively recruit other officers to join. Reveal reports that participants range from members of small rural departments to officers in the nation's largest, in Los Angeles and New York City.
Reveal’s findings are troubling for Facebook’s ongoing moderation efforts. Like most of Silicon Valley’s large social platforms that host media and speech, Facebook is struggling to deal with its outsize impact on society; the company has neither the resources nor the wherewithal to combat the flood of hate groups, extremism, and misinformation on its platform. In some rare but tragic cases, activity on platforms like Facebook and Google’s YouTube has contributed to the radicalization of certain individuals who go on to commit offline violence. And in some disturbing cases, like the Christchurch shooting earlier this year, that offline violence is then rebroadcast on Facebook and YouTube for maximum effect.
Facebook has leaned on artificial intelligence as a kind of panacea for its moderation woes. But at the F8 developer conference earlier this year, Facebook also announced a shift away from the News Feed and toward private groups as a way to lessen the influence of its algorithms. The shift also, in a way, absolves the company of responsibility for moderation. If public posts and pages wane in favor of private group activity, the logic is that those groups will self-moderate, and that by nature of being private they’ll reduce the reach of potentially harmful activity, too.
But there’s no evidence to suggest Facebook is taking a more active role in moderating these groups’ activities — in fact, the opposite appears to be true. And the notion of active-duty police officers with access to firearms openly participating in bigoted and potentially violent online behavior raises concerns about how that conduct could translate into offline action in the future.
Facebook’s hate speech policies ban content that targets individuals based on their skin color or religion, and the company also has rules against incitement to violence and against groups known to organize and take action offline. It has taken action against groups like far-right figure Gavin McInnes’ Proud Boys and individuals like conspiracy theorist Alex Jones for violations of those policies.
But it’s often difficult for Facebook to take such action against individuals without large followings, or against specific groups that are private and have taken measures to conceal their purpose. As such, some organizations on Facebook use coy in-jokes and other far-right dog-whistle tactics to circumvent Facebook’s algorithmic filters. So a group with the phrase “Ku Klux Klan” in its title will easily get taken down, but one titled “Confederate Brothers & Sisters” will go unnoticed.
Reveal says it identified these officers with a strategy that, ironically enough, involved using data Facebook has since stopped providing to third parties due to developer misuse. Yet it’s this data that allows watchdogs like Reveal to do the investigations Facebook seemingly won’t. As the report explains:
To find cops with connections to extremist groups, we built lists of two different types of Facebook users: members of extremist groups and members of police groups.
We wrote software to download these lists directly from Facebook, something the platform allowed at the time. In mid-2018, in the wake of the Cambridge Analytica scandal and after we already had downloaded our data, Facebook shut down the ability to download membership lists from groups. Then we ran those two datasets against each other to find users who were members of at least one law enforcement group and one far-right group.
We got 14,000 hits.
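The cross-referencing step Reveal describes amounts to a set intersection: flatten each category's group rosters into one set of user IDs, then keep the users who appear in both. A minimal sketch of that logic, with invented group names and user IDs (the actual membership lists are no longer downloadable from Facebook):

```python
# Hypothetical data standing in for the two downloaded membership lists.
extremist_groups = {
    "Far-Right Group A": {"u101", "u102", "u103"},
    "Militia Group B": {"u103", "u104"},
}
police_groups = {
    "Police Group X": {"u103", "u200"},
    "Retired Officers Y": {"u104", "u201"},
}

def all_members(groups):
    """Union of every membership roster across the given groups."""
    ids = set()
    for roster in groups.values():
        ids |= roster
    return ids

# "Hits": users in at least one law enforcement group AND one far-right group.
hits = all_members(extremist_groups) & all_members(police_groups)
print(sorted(hits))  # -> ['u103', 'u104']
```

In Reveal's case this intersection produced 14,000 candidate accounts, which then still required manual verification, as described below.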
Reveal says it could not initially assume that every member in a police Facebook group was an actual officer or even a retired one. They could have been individuals with general affinity and respect for law enforcement, relatives of officers, or those who aspire to join the police. So Reveal says it did research on hundreds of individuals, sometimes calling local departments to confirm active employment or retirement status. Reveal also joined dozens of these groups to verify its findings.
“Ultimately, we confirmed that almost 400 users were indeed either currently employed as police officers, sheriffs or prison guards or had once worked in law enforcement,” the report reads. It is not clear at the moment how Facebook plans to review these groups or under what policies it might take action. Meanwhile, Reveal reports that the law enforcement agencies it contacted are continuing to conduct their own investigations into the officers’ online and offline conduct.
Facebook was not immediately available for comment.