
The Hidden Sacrifice: Those Who Keep Us Safe on YouTube

Facebook acknowledged the horrors their content moderators face. It’s time for Google to do the same.

[I’m posting on behalf of myself and two fellow undergraduates at Stanford University.]

When you type YouTube.com into your browser, hoping to watch the latest news, sports highlights, or influencer vlogs, do you ever worry about encountering horrifically graphic videos recommended on YouTube’s homepage? We never have, and we are thankful for it, but our ability to safely browse billions of videos each day comes at a price. Every minute, content moderators contracted by Google must view and remove YouTube videos so traumatizing that even censored descriptions of them would likely violate most publications’ submission guidelines. We are therefore calling on Google to take the following steps to better protect its content moderators: upgrade them to full-time employee status; work toward fully automating content moderation while moving current moderators into different, less traumatizing roles; and compensate moderators appropriately for the lasting health issues they suffer because of the nature of their work.

As you might expect, Google is not the only tech giant to contract out such work. Facebook recently settled a $52 million class-action lawsuit with its content moderators, acknowledging their unconscionable working conditions. To give a sense of what they experience, content moderators have previously spoken on the record describing symptoms that resemble PTSD. As author and journalist David Cohen describes, some content moderators suffer anxiety attacks when “entering a cold building, seeing violence on television, hearing loud noises, or being startled.”[2] In a video published by The Verge, a content moderator contracted by Facebook through the company Cognizant broke his non-disclosure agreement and spoke with journalist Casey Newton about his official diagnosis of PTSD, which manifested in frequent night terrors and anxiety attacks.

When we corresponded with a prominent journalist who has reported extensively on content moderation, the journalist spoke of the moral obligation that technology companies owe to the employees they put at risk. Might Facebook’s settlement mark the beginning of a long-overdue reckoning for Silicon Valley companies that have consistently shirked moral responsibility for the devastating psychological toll of content moderation?

In its settlement, Facebook acknowledged many of the horrifying aspects of content moderation, but the problem does not end there. One Washington Post journalist who has written on the issue told us that, as a consequence of their contractor status and locations outside the United States, many moderators lack the legal protections and unionization rights that American employees enjoy. Encouragingly, Facebook has agreed to assign content moderation responsibilities to full-time employees, whose employee status leaves them better positioned to protect themselves in the workplace.

We are pleased Facebook acknowledged the challenges many of its content moderators face and has taken concrete steps to improve their working conditions. Unfortunately, the same cannot be said for Google. Numerous investigations and reports show that content moderators for Google, specifically those who moderate YouTube, face similarly dire working conditions and lasting mental health trauma through their work. Further, Google contracts its moderators through Accenture, so these workers likewise face fewer guaranteed rights than their peers with employee status.[6]

As a result, we are calling on Google to do the right thing and, at minimum, match Facebook’s efforts to better the lives of its content moderators. Facebook’s settlement demonstrates a willingness to confront how it has treated its content moderators up to this point and establishes a precedent for Google to follow. Clearly, it is time for Google to take Facebook’s lead and help the content moderators who keep us safe on the internet every day. As frequent users of YouTube, we all share a duty to stand up for moderators’ livelihoods and working conditions as they continue to protect us. In addition to compensating moderators for the lasting health effects of their work and assigning these responsibilities to full-time employees, we call on Google to ultimately automate this line of work and move these people into departments where they can make a difference without suffering lifelong consequences we cannot begin to fathom. By making these changes, Google can continue to protect users from exposure to graphic content uploaded to its platforms, while also protecting those who sacrifice their health to ensure we have positive experiences online.

