Facebook Adding 3,000 Employees To Monitor, Remove Violent Content

Facebook has come under near-constant criticism in recent weeks for not doing enough to curb violent content. Mark Zuckerberg himself even said the site needed to improve. It now appears the social media giant has put its money where its mouth is. This week, Facebook announced a plan to hire 3,000 new employees responsible for screening and removing posts that depict violence, crime, hate speech and more.

According to Facebook, about 4,500 employees currently handle these tasks, so the hiring effort will increase that workforce by roughly 70 percent. Facebook still wants to lean on technology to detect illegal content, but putting more human eyes on the problem will help ease some bottlenecks in the process and ensure problems are dealt with faster.

“If we’re going to build a safe community, we need to respond quickly,” Zuckerberg wrote in a blog post announcing the hiring spree. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Zuckerberg went on to say that the new staff will also help Facebook improve its processes for removing hate speech and child exploitation. The Facebook CEO also noted that the company will continue to rely on local officials and law enforcement to help out if a Facebook user is in real trouble.

For as much as the site wants to rely completely on artificial intelligence, sometimes a little old-fashioned human judgment is best.