Facebook CEO Mark Zuckerberg in 2017.

A Facebook content moderator named Selena Scola has sued the social media giant, claiming that repeated exposure to graphic and disturbing images has given her post-traumatic stress disorder.

"From her cubicle in Facebook's Silicon Valley offices, Ms. Scola witnessed thousands of acts of extreme and graphic violence," her lawsuit says.

The lawsuit quotes another Facebook moderator who told the Guardian last year, "You'd go into work at 9am every morning, turn on your computer and watch someone have their head cut off." Some Facebook moderators are also exposed to child pornography.

"Scola developed and continues to suffer from debilitating PTSD as a result of working as a Public Content Contractor at Facebook," the lawsuit alleges. And, her lawyers argue, the situation runs afoul of California worker safety laws.

"We recognize that this work can often be difficult," Facebook said in an emailed statement. But the company argued that—contrary to the claims in the lawsuit—Facebook provides extensive services to help moderators cope with the disturbing images they see on the job.

It's a problem that extends beyond Facebook. Last year, content moderators sued Microsoft, making arguments similar to those in this week's Facebook lawsuit.

Facebook says it supports its moderators

The uncomfortable reality here is that someone is going to have to look at disturbing images submitted by users to platforms like Facebook. As long as user-generated content platforms exist, some users are going to submit this kind of content, and no mainstream platform is going to want to expose its users to these images.

But there are steps that Facebook can take to make the task less traumatic. The company could warn prospective employees about the nature of the job, reduce the resolution of potentially graphic images, give employees shorter shifts, and allow employees to review less offensive content for a few hours after having to moderate the most extreme, graphic images.

Scola's lawsuit notes that a Facebook-backed organization called the Technology Coalition has published best-practice recommendations for protecting moderators, including limiting the amount of time employees are exposed to disturbing images, offering counseling, and permitting moderators to switch to other tasks after viewing disturbing imagery. According to Scola, Facebook failed to implement those recommendations.

But Facebook says it has taken significant steps to protect moderators.

"We take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources," the company said in an emailed statement.

Scola didn't officially work for Facebook. Instead, she worked for a contractor called Pro Unlimited that managed moderators on Facebook's behalf. But Facebook says that its contract with Pro Unlimited requires the contractor to "provide resources and psychological support, including onsite counseling—at the location where the plaintiff worked."

The problem with outsourcing moderation

Workers like Scola represent the unglamorous underbelly of major social platforms like Facebook and YouTube. The very existence of content moderation teams is awkward for companies like Facebook that like to portray themselves as purely neutral, automated online services, argues Sarah Roberts, an information studies professor at UCLA.

"The existence of third-party human beings between users and platforms who were making decisions about other people's online self-expression was contrary to the premise" of platforms like Facebook, Roberts says.

Companies like Facebook have developed two-tier workforces. On top are Facebook's official employees—engineers, designers, managers, and others who enjoy high pay, lavish perks, and a lot of respect and autonomy.

Content moderators occupy a lower rung of Facebook's labor hierarchy. Many of these workers are not officially Facebook employees at all—they're employed by contracting firms instead, which severely limits their access and visibility to Facebook management.

And that, Roberts argues, is the root problem with Facebook's approach to moderation. Facebook has chosen to make content moderation a low-pay, low-status job, she says. So it's not surprising that the company now faces allegations that it hasn't prioritized the psychological welfare of those workers. The most important thing Facebook could do to help its moderators, she says, would be to make them formal employees so they could enjoy the social standing that comes with full employment.

And that would not only benefit Facebook's thousands of moderators; it could also help Facebook itself, because effective and efficient moderation is important to the company's long-term success.

There are numerous examples of Facebook facing backlash over moderation decisions. Roberts points to the 2016 controversy when Facebook moderators censored a famous Vietnam War photo of a naked girl running from a napalm attack—Facebook eventually reversed its ruling. The same year, Facebook fired a team of contractors that had been curating its "trending topics" sidebar after allegations that the team was biased against conservatives.

In a recent documentary, a British reporter went undercover as a trainee Facebook moderator. "If you start censoring too much, then people lose interest in the platform," one moderator told the reporter. "It's all about making money at the end of the day."

Roberts argues that treating moderators as regular employees—and offering them the better pay, job security, and working conditions that would likely come along with that change in status—would allow Facebook to improve the quality of its moderation efforts. Facebook would likely be able to hire a higher caliber of workers for these jobs. Lower turnover would allow Facebook to invest more in their training. And integrating these workers into Facebook's official workforce would improve communication between moderators and the company's senior management, which might allow managers to learn about brewing moderation issues more quickly.

Still, much of the challenge of moderating Facebook comes from the sheer scale and complexity of the task. With millions of pieces of content coming in every day, some mistakes are inevitable. Facebook has worked to make sure its policies are enforced consistently, but it's inherently difficult to get thousands of people to interpret the same complex rulebook in exactly the same way. Raising the status of moderators seems likely to improve the quality of Facebook's moderation efforts, but it probably won't be a silver bullet.
