Facebook moderator claims watching graphic videos gave him PTSD

Facebook moderator sues Mark Zuckerberg’s firm over claims that watching child porn, rape and videos of mass shootings gave him PTSD

  • Clifford Jeudy, 47, claims the social media giant failed to give him proper support
  • He had to repeatedly watch footage of the Christchurch mosque terror attack 
  • Mr Jeudy says the job has left him with severe emotional and psychological scars

A Facebook content moderator is suing the social media giant, claiming that watching hours of graphic footage of child pornography, violent rapes and mass shootings left him with PTSD.

Clifford Jeudy, 47, has filed a class-action lawsuit against the social network and the Florida subcontractor that employs him, claiming the two companies do not provide enough support to moderators viewing traumatic content.

He said watching white supremacist Brenton Tarrant live-stream the Christchurch mosque terror attack, which left 51 dead in New Zealand last year, was particularly horrific.

Mr Jeudy, who works for Cognizant Technology Solutions in Tampa, said he took the job because it sounded interesting but wasn’t fully aware of how horrific some of the content of Facebook Live videos would be.

Clifford Jeudy says he was traumatised by graphic videos including footage of Brenton Tarrant, 28, (pictured) killing Muslim worshippers in Christchurch last year

Mr Jeudy claims Facebook doesn’t provide enough support for moderators who are forced to watch hours of horrific footage of murders, rapes and child pornography on a daily basis

He said: ‘I was like, Why not? This is going to be exciting.’

But the footage took its toll on his mental health.

He said: ‘We watch that for a living, all day – people getting murdered, killed, raped, child pornography, snuff films. 

‘You’re policing the internet, but the platform has drugs, guns, terrorist recruitment, human trafficking, prostitution.’ 

Mr Jeudy said he had to watch the Christchurch terror attack live from his cubicle at Cognizant and then view it several more times.

‘It’s just horrific to see it over and over and over again,’ he said. ‘As the video goes viral, it’s not like they delete it one time and it’s gone. People are sharing it.’

Each time it was shared, he was forced to watch it again.

‘It’s tough to watch that many people get killed,’ he said.

Mr Jeudy and several co-workers claim that although Facebook has safety standards to help moderators, Cognizant did not follow those procedures and Facebook did not check to ensure they were being followed.

Mr Jeudy says he suffered severe emotional and psychological trauma, forcing him to take time off, and that he even suffered a stroke.

Mourners lay flowers on a wall at the Botanical Gardens in Christchurch, New Zealand

A victim arriving at a hospital following the mosque shooting in Christchurch last March

‘When you watch somebody die literally in front of you, isn’t that the definition of PTSD?’ he asked.

The lawsuit also accuses Facebook of negligence and Cognizant of ‘deliberately concealing or misrepresenting known dangers’ of the job. 

Mr Jeudy said he is still working for Cognizant because he’s not ready to try to find a new line of work.

Cognizant’s Tampa facility is preparing to shut down and lay off more than 500 employees.

Jay Lechner, the attorney of the employees taking action, told the Tampa Bay Times: ‘Workers in Tampa will basically be left with no recourse and will be suffering mental impairments for a long time.

‘The company will leave town without having to be responsible for it.’

Facebook moderator Keith Utley died from a heart attack as he sifted through gruesome videos on the social media platform.

Mr Utley, 42, died at his desk, according to a 2019 investigation by The Verge.

Facebook came under fire last August when it emerged sick footage of the Christchurch atrocity could still be found on the platform. 