Ex-Facebook Content Moderator Says She Got PTSD From Exposure To Graphic Content

Even though Facebook is heavily reliant on artificial intelligence to detect graphic or disturbing content, the company still employs a large team of human content moderators to review and flag these posts, too. Of course, these content moderators see a great deal of disturbing content in the course of their everyday work, and now one former employee is alleging she developed PTSD from exposure to so much graphic material.

The former content moderator, Selena Scola, filed a complaint in court last week, and if the court certifies it, the case could become a class action suit. Scola alleges that her experience was “typical” for moderators at the company. In other words, Facebook could soon face a big legal problem from some of its former employees.

According to Scola, she has experienced symptoms consistent with post-traumatic stress disorder.

“Ms. Scola’s PTSD symptoms may be triggered when she touches a computer mouse, enters a cold building, watches violence on television, hears loud noises, or is startled,” the complaint reads. “Her symptoms are also triggered when she recalls or describes graphic imagery she was exposed to as a content moderator.”

While this case has a long way to go before it makes a real impact on Facebook, its substance is notable. After all, if a platform contains so much disturbing content that reviewing it can inflict this kind of stress, perhaps larger changes need to be made to its guidelines.