Facebook catches a lot of heat for intruding into users’ private lives. Sometimes, however, the social media giant uses its all-knowing, all-seeing power for good. For instance, the company recently began using artificial intelligence to predict whether a user may be suicidal, then reporting them to local authorities.
The feature was first introduced about a year ago, and according to Facebook, it has become dramatically more effective since. The company now says it’s flagging 20 times more instances of users displaying suicidal thoughts. The technology is remarkably precise: it flags the words most commonly used by suicidal people and even reads posts for context (if a user writes “I’m gonna kill myself” in a joking way, for instance, it isn’t flagged). The reporting mechanism also moves quickly to connect with law enforcement. One sheriff said his county’s 911 department in New York received a call from a Facebook dispatcher in Ireland regarding a potentially suicidal person in the area.
“This is helping us in public safety,” Sheriff Joseph Gerace told CNBC. “We’re not intruding on people’s personal lives. We’re trying to intervene when there’s a crisis.”
There’s no doubt that Facebook’s “big brother” tendencies are concerning — but at least the company is putting its massive capacity for surveillance to positive use.