A gruesome and unfathomably evil video may have made its way onto one of your social media feeds Monday morning.
On Easter Sunday, a Cleveland grandfather was slain, seemingly at random, in a video posted to Facebook.
The victim, Robert Godwin Sr., was walking home from a family dinner when a stranger approached him. “Here’s somebody I’m about to kill. I’m about to kill this guy right here. An old dude,” a man can be heard saying on camera before fatally shooting Godwin.
It took more than two hours for Facebook to disable the profile page of the alleged killer. By that time, the video had been shared widely, tormenting Godwin’s family.
The alleged murderer later killed himself in Pennsylvania after a police chase.
The tragedy paints a bleak picture of the future of our online social circles if Facebook doesn’t do more to curb such detestable, attention-seeking behavior.
“This is a horrific crime, and we do not allow this kind of content on Facebook,” a spokesman said Monday. Later that day, the company’s vice president for global operations, Justin Osofsky, said the social network was working to “be sure people can report videos and other material that violates our standards as easily and quickly as possible.”
The problem with Facebook’s response, which came just 24 hours before the start of the company’s annual developer conference, F8, is that the solutions it suggests already exist, and have for some time.
Tuesday, during day one of the conference, Mark Zuckerberg acknowledged the slaying and reiterated that the company has work to do.
It sure does. Facebook needs to do more to ensure that users aren’t accosted by violent videos auto-playing in their feeds.
Recordings of killings, deaths and suicides are not a new phenomenon; nor, unfortunately, is their being posted on the web. However, as Facebook increases its push for more live-streaming video via Facebook Live, the problem has mushroomed.
High-profile tragedies recently streamed on Facebook include the torture of a mentally disabled teen, the sexual assault of a 15-year-old and the suicide of a Texas actor.
Facebook’s policy against the posting of such violent content is clear. The company pays thousands of contractors to review posts, including a small, specialized team that handles live video. But that has not been sufficient to enforce its own policies. The vast amount of content posted to Facebook every hour makes quick detection of sadistic content a challenge, but if anyone has the resources — and responsibility — it is Facebook.
The social media giant has created a platform that can share the joyful, even mundane, details of everyday life in real time with millions of people. But it must take seriously its role as a gatekeeper on this mammoth publishing platform. These acts will continue to proliferate on its site until new technology prevents them.