Facebook moderators have PTSD-like symptoms from horrifying and violent images, fringe content

Facebook's low-paid army of content moderators, who are often subjected to poor working conditions, suffer PTSD-like symptoms as they are exposed daily to some of the vilest and most fringe content posted to the social network, according to a scathing new investigative report by The Verge.

The tech publication opens by describing Chloe, a content moderator in training at Cognizant's Phoenix, Ariz., site – where 1,000 people make rapid decisions under intense pressure about whether flagged content violates Facebook's rules – who on that day must moderate posts in front of her fellow soon-to-be-moderators.

"The video depicts a man being murdered. Someone is stabbing him, dozens of times, while he screams and begs for his life. Chloe’s job is to tell the room whether this post should be removed. She knows that section 13 of the Facebook community standards prohibits videos that depict the murder of one or more people. When Chloe explains this to the class, she hears her voice shaking," The Verge reports, adding that she later leaves the room and cries so hard she can barely breathe.


Facebook, which has faced criticism from all corners for its content moderation mistakes and for the massive rulebook that guides its moderators, had more than 30,000 employees working on safety and security by the end of last year. Of those, about half are content moderators, and the tech giant relies on contract labor for most of that work. In the face of a never-ending firehose of content, moderators are expected to maintain a 95 percent accuracy rate while reviewing more than 1,000 posts per week to see if they violate Facebook's community standards.

The Verge's report, which is based on interviews with a dozen former and current Cognizant employees, depicts a soul-crushing, morbid environment where workers joke about self-harm, do drugs on the job, develop severe anxiety or have panic attacks because of the horrifying content they're forced to view. Most of the moderators interviewed quit after one year.

The Phoenix moderators, according to the report, make about $28,000 per year, while the average Facebook full-time employee earns $240,000. In contrast to the perk-filled life at Facebook's Frank Gehry-designed Menlo Park, Calif. headquarters, moderators in Phoenix are closely surveilled by managers and allotted very short breaks for using the bathroom or so-called wellness time.

In addition, moderators told the tech news site that some colleagues have even embraced the fringe, conspiracy-laden views of the memes and posts they're forced to view each day.


Mark Zuckerberg, chief executive officer and founder of Facebook Inc. attends the Viva Tech start-up and technology gathering at Parc des Expositions Porte de Versailles on May 24, 2018, in Paris. (Getty Images)

Both Cognizant and Facebook pushed back on some aspects of The Verge's reporting.

Bob Duncan, who oversees Cognizant’s content moderation operations in North America, told The Verge that recruiters carefully explain the graphic nature of the job to applicants. “The intention of all that is to ensure people understand it. And if they don’t feel that work is potentially suited for them based on their situation, they can make those decisions as appropriate.”


At a later stage of the reporting, Facebook allowed The Verge's reporter to visit the Phoenix site after telling her that the moderators' experiences described above don't reflect those of most contractors, either in Phoenix or worldwide. New positive-message posters had been put up, and several content moderators who spoke to The Verge expressed satisfaction with their jobs and how they're treated, saying that the truly awful, violent content is only a small fraction of what they view.

When the reporter asks one of the on-site counselors about the potential for workers to develop PTSD, he tells the reporter about something called "post-traumatic growth."

The Verge concludes that the "call center model of content moderation is taking an ugly toll on many of its workers. As first responders on platforms with billions of users, they are performing a critical function of modern civil society, while being paid less than half as much as many others who work on the front lines. They do the work as long as they can — and when they leave, a nondisclosure agreement ensures that they retreat even further into the shadows."


A former contract content moderator sued Facebook in September, claiming that her work for the tech giant left her with PTSD.
