Companies are social, but not responsible
How much “sharing” should people be allowed to do on social media? The misguided premise of social media is that just about anything and everything can, and should, be uploaded, shared, and spread virally.
Social media companies have convinced us that every sneeze and every ham sandwich deserves a photo and a “like.”
There are limits to what can be posted, but users almost always find a way around them, meaning content that is supposedly banned is there for all to see. Friday’s horrific shooting in New Zealand was no different.
The shooter reportedly livestreamed the massacre on Facebook, which claimed it did not know the video existed until police alerted the company. The video was up for 17 minutes, CNN reported.
“New Zealand Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Mia Garlick, Facebook’s director of policy for Australia and New Zealand, said in a statement.
The world’s most powerful social media company was not aware of the video for 17 minutes? That’s hard to believe, especially considering the company’s algorithms seem to know every website we visit, every link we click, every nuance of our online lives.
Facebook was not alone. Copies of the livestream video appeared on YouTube and Twitter hours after the attack, CNN reported. Google, which owns YouTube, said in a statement that it removes “shocking, violent and graphic content,” but it’s unclear how long it took the company to remove this video.
Again, it’s hard to fathom that Google could not figure out how to keep this video off its platforms.
Facebook and Google have promised us a utopian experience online, complete with content and advertisements tailored specifically to our wants and needs. The companies track our movements through smartphone apps, know our favorite restaurants, our kids’ names, and secrets we wouldn’t tell a best friend. We implicitly trust them every time we use one of their apps or features.
But should we? Should we trust companies with our online lives when they seem to place such little value on the offline versions of us? Should we trust companies that allow murders to be livestreamed and shared?
Friday’s mass murder was not the first instance of this happening, and it won’t be the last. Either social media companies simply choose to allow these kinds of videos to circulate, or they have not made detecting and removing them a priority. Neither is acceptable.
There is a limit to what we should be able to share on social media, and there is room for debate on what exactly those limits should be. But mass murder is a no-brainer. No company should allow users to share that kind of violence, and no user should trust a company that does.
Email publisher Luke Horton at email@example.com