Casey Newton: The Advocate for Facebook Moderators

October 22, 2019


(Source: Eston Bond)
PHILADELPHIA, Penn. – Imagine if social media were raw, obscene, and absolutely unfiltered. Anyone in the world could post disturbing images of violence or hateful, offensive language without having the post flagged and deleted.

Casey Newton’s February piece for The Verge, “The Trauma Floor: The secret lives of Facebook moderators in America,” highlights the importance of moderating disturbing content for the social media giant Facebook, while also exposing the jarring working conditions moderators endure day to day.

Newton, Silicon Valley editor for Vox Media’s The Verge, is an advocate for the Facebook moderators detailed in his piece, unearthing the ugly, harsh truths these individuals live with while monitoring the largest social media network in the world. His viewpoint is not strictly objective, but his reporting on these moderators shows a brand of advocacy journalism that still takes all sides into account.

Facebook users constantly flag posts they believe violate the platform’s community standards, often content deemed offensive or disturbing. Moderators then do the dirty work of keeping the platform clean: they review the flagged content, from videos of people being stabbed and murdered to 9/11 conspiracy theories to posts filled with racist language, confirm that it violates Facebook’s policies, and ensure it is deleted.

Moderators work long hours keeping Facebook free of these flagged posts while technically not even being employees of the company. Instead, they are contract workers provided by Cognizant, a vendor Facebook pays to moderate content. The average Cognizant moderator’s salary ($28,800) is significantly less than what an average Facebook employee makes ($240,000 including salary, stock, and bonuses), according to Newton’s piece.

Newton uncovers how moderators viewing such traumatic content have been diagnosed with symptoms resembling post-traumatic stress disorder (PTSD) or abuse drugs at work to cope with what they encounter online. The dozen current and former Cognizant employees Newton interviewed for the story asked not to be identified because each had signed a non-disclosure agreement (NDA) barring them from talking about their work for Facebook.

“One of the things I’ve tried to highlight in my stories,” Newton explained to Terry Gross on NPR’s Fresh Air, discussing his original article and a June follow-up on one contractor’s death, “is that while we pay these folks as if the work is low-skill labor, in many cases, in my opinion, it is very high-skilled labor because they’re making these very nuanced judgments about the boundaries of speech on the Internet. So if you accept that Facebook and Instagram, these big services that these folks are moderating, represent an essential component of political speech in this day and age, my guess is you might want them to be paid more than $28,000 a year.”

Advocating for better treatment of Facebook moderators is Newton’s intention, since the working conditions are dangerous to their mental and physical health and well-being. By hiding the identities of his sources, he is not directly championing their struggles, but because of the NDAs he was left with no choice. In Newton’s follow-up piece, it was confirmed that three moderators broke their agreements in talking with him about the moderation sites. His investigative writing is fact-based, coming straight from his sources, even though their identities were not revealed in his original reporting.

Newton advocates for Facebook to rethink how it goes about moderating its content without damaging individuals the company does not even recognize as fellow employees. His findings certainly struck a nerve with the social network by spotlighting moderators and all the hard work they do; if they were not monitoring the platform, its content would be much darker.

As a result of Newton’s reporting, Facebook announced three big changes in an internal blog post, according to his NPR Fresh Air interview: raising contractors’ wages by $3 an hour, doing a better job of screening candidates, and covering counseling services for contractors who are fired or quit.

Newton’s piece does not violate journalistic ethics, because he relies on reliable sourcing, factual evidence, and a fair accounting of all sides. His piece is honest in showing how the moderators have it worse than full-time Facebook employees, who are more focused on expanding the company’s artificial intelligence and machine learning technology, according to Newton.

Newton’s brand of advocacy journalism is more important than ever in 2019, as tech giants like Facebook continue to grow larger every day while the work happening at the ground level goes unnoticed.
