Software holes are common. Try as they might, companies can’t avoid all software flaws. Unfortunately for moderators working at Facebook, one such hole allowed extremist groups to identify individual moderators. Some of those moderators are now living in hiding, afraid for their lives.
The Role of Moderator
Almost every website has moderators, but the role at Facebook is different. Facebook moderators look for extremist posts and groups, for offensive videos, and for content considered violent. They look for ways to remove the people who pollute the platform.
Facebook moderators are responsible for stopping the spread of hate. They work to get rid of terrorist groups and sympathizers and to make it impossible for those people to congregate on the site. They get in the way. So when a programming flaw accidentally made moderator profiles public, those moderators became targets.
The Fatal Flaw
The security hole that Facebook missed affected 1,000 workers across 22 Facebook departments. The flaw, which came to light last spring, exposed the full name of any person moderating an account, making the names of moderators who removed terrorist groups visible to those very groups. Worse, the names appeared to the terrorist groups as notifications, so they were impossible to miss. The bug was not fixed until November 2016; until then, the names of those moderators were visible to terrorist groups for months. Facebook has told the press that the flaw has been fixed and that the affected moderators are unlikely to be targeted.
Facebook’s reasoning was that group notifications are plentiful, and the terrorists likely assumed the moderators were simply members of the group. But that is little consolation to the moderators who are now in hiding.
The Tough Job of a Moderator
Facebook employs moderators to stop violence from spreading on the platform. But those moderators do not work under aliases. They also have to view violent videos and disturbing material on a daily basis. They watch the videos that Facebook doesn’t want you to see.
Facebook is currently developing artificial intelligence software to take over some of the moderators’ tasks. But AI has to be trained, and those who train the computers are humans.
In other words, Facebook will need moderators (and has expanded its team of moderators) for a long time to come. A few moderators have gone into hiding now that news of the leak has spread. Some have also pointed to the pay rate of these employees, arguing that it should be higher given the dangers of the job.
Can these people be protected? That is the question at hand. Facebook has determined that they are probably not in danger, but that does not sit well with those already in hiding, who have a very real fear for their lives. Hopefully, Facebook will develop the AI needed to stop violence on the social network, or at least actively monitor its software for flaws that could be, quite literally, fatal.