Beanbag brainstorms and fridges stocked with coconut water are the sort of benefits we associate with working at Facebook. Watching live footage of torture unsurprisingly isn't one of the perks that earned them second place on Glassdoor's 'Best places to work' list, but that's what their newest recruits will be tasked with doing.


This week Facebook announced its plan to hire an extra 3,000 people to help stem the tidal wave of distressing videos unleashed by their new live stream function. The announcement comes too late for the 11-month-old girl whose murder was broadcast by her father last week, and for those scarred by the video, which amassed over 370,000 views in the day it remained on the social network.

According to Facebook, the new roles, like the 4,500 people already reviewing posts that violate its terms of service, will not exclusively police live video content, but the timing is notable. The company has recently come under fire for hosting content such as the gang rape of a 15-year-old girl in Chicago, the torture of a disabled man in the same state and the live streaming of a murder in Cleveland. Given the decision to almost double the task force, these high-profile incidents are presumably just the tip of a growing iceberg of violent and distressing content being broadcast on Facebook every day.


The new positions - which are yet to be advertised on the tech giant's shiny Facebook Careers page - will require significant training, support and financial compensation for the draining task of sifting through potentially scarring material every day. The concern, based on troubling reports from people in similar roles at Facebook and other tech giants, is that the new recruits will receive none of these.

A 2014 Wired article profiled contractors who moderate content on social media and found that many were made to sign strict nondisclosure agreements preventing them from speaking to anyone, even other employees, about their work - and that many quit after a very short time.

Similarly, a 2012 BuzzFeed interview with an anonymous Google employee revealed the psychological trauma he had suffered from looking at images of child abuse. "Google covered one session with a government-appointed therapist — and encouraged me to go out and get my own therapy after I left," he says, before revealing that the company then declined to hire him full-time, without disclosing why.

In January of this year, two former Microsoft employees announced they were suing the company after developing PTSD from monitoring Microsoft users' communications for child pornography and other crimes. The two say they were not warned about the full nature of the job, were not given psychological support and claim to have suffered permanent psychological injuries.


When Slate writer Will Oremus asked whether the 3,000 additional workers would be employees or contractors, Facebook declined to comment - a worrying early sign that these mistakes could be repeated, and one that conjures a Black Mirror-like image of digital sweatshop workers given quotas for eliminating gruesome videos.

Chairman Mark Zuckerberg responded to the chilling Thailand murder by calling the footage "heartbreaking", adding that Facebook are trying to make reporting content easier and to get it taken down faster. "Just last week, we got a report that someone on [Facebook] Live was considering suicide," he said. "We immediately reached out to law enforcement, and they were able to prevent him from hurting himself."

The CEO added, "We're also building better tools to keep our community safe. We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards, and easier for them to contact law enforcement if someone needs help."


But it is debatable whether computer technology will prove any better at fighting the growing expanse of troublesome content being dumped in Zuckerberg's online world. Quartz writes that, "Perhaps in the future, Facebook will develop automated tools able to detect what's going on in a video, and what's being said, potentially even remove an offensive post before a human user or moderator even has to see it." Yet it also points out the problems Facebook faced last year when they fired all the human editors of the Trending news section and replaced them with an AI system: "Within days of the AI taking over, fake news had found its way into the Trending section." Would AI-curated videos be another disaster showing that machine can't replace man?

The upshot is that Facebook are finally taking responsibility for the content they host - a step forward from their petulant response to the fake news scandal that many felt helped sway the US election. Then, Zuckerberg simply claimed Facebook "is not a media company" in an attempt to absolve the company of any culpability. Clearly it takes the worst acts of humanity to instil a little humility into what is arguably the world's most powerful company.

Initially, Facebook's live function was restricted to verified accounts such as brands and media outlets - a restriction which, admittedly, does sound duller and less democratic. When it was rolled out to all users, there was only one moderator for every half a million people - a Sisyphean task as well as a grim one.

Perhaps, then, Facebook's big mistake was to rush ahead with the deployment of its latest shiny new toy before it had the infrastructure in place to deal with the ghoulish end of the consequences - something that taking simple stock of the web's history, rather than obsessing over its future, could have prevented.

Even as this small step is taken to try to rectify the problem after the tragedies have occurred, it's worth noting what Zuckerberg said as recently as April 2016 about Facebook's latest fad: "Because it's live, there is no way it can be curated. It's live; it can't possibly be perfectly planned out ahead of time."

Cool idea, bro. But not something anyone should die for.