
October 8, 2019 (updated March 3rd, 2020) | For Home
Reading Time: 4 minutes
[Image: Person managing a business Facebook page on a laptop]

The Ugly Business of Moderating Facebook Content

Doing the worst job in technology

While working in social media for a technology company may be a cool career path, there is one awful but necessary job: Facebook Content Moderator. Actual humans must watch gruesome content to decide whether to ban it from Facebook or Instagram before more members can see it.

Moderators scour posts for hate speech and other forbidden content, including explicit images and videos.

Posts are initially flagged by Facebook’s own artificial intelligence tools or by other Facebook members. After flagging comes a human review, intended to protect Facebook’s 2.3 billion users from exposure to people’s darkest thoughts and urges. The bulk of these reviews are outsourced to contractors.

According to interviews with current and former employees by The Verge, some workers turn to drugs or sex at work to cope with the barrage of violent posts, and are plagued by PTSD even after leaving the job. Some acknowledge adopting fringe viewpoints, such as believing 9/11 was staged or the Holocaust never happened, the same dangerous and factually disproven claims they are trained to weed out.

Constant exposure to the world’s worst content is ravaging the mental health of an estimated 7,500 content moderators. Disturbing images have a unique power to inflict traumatic injury on the human brain, according to the suicide prevention website Alive to Thrive. Repeated exposure to negative, violent or pornographic images and text can create an impact beyond what the brain can absorb and process, which may cause a person to lose their psychological balance.

[Image: Three people using cell phones and laptops at a table]

One former worker in this role has already filed a lawsuit against Facebook for psychological trauma caused by repeated exposure to violent images. “Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” the lawsuit filing reads. The complaint made by Selena Scola alleges Facebook failed “to provide a safe workplace for the thousands of contractors who are entrusted to provide the safest environment possible for Facebook users.” Other former content moderators have joined the lawsuit, which is now seeking class action status.

In our oversharing universe, Content Moderator has become the worst job in technology.

Facebook content moderators are asked to review more than 10 million posts per week that may violate the social network’s rules. Facebook acknowledged the work can be stressful when the suit was filed, but said it requires the companies it works with for content moderation to provide support such as counseling and relaxation areas.

One of several companies Facebook outsources content moderation to is Cognizant, a multinational provider of services such as app development, consulting, IT and digital strategy. A quick review of an open content moderation position with Cognizant on Indeed.com lists few qualifications beyond a high school diploma, though empathy is mentioned several times.

Workers at Cognizant’s Phoenix office providing content moderation services for Facebook get two 15-minute breaks, a 30-minute lunch and nine minutes of “wellness time” per day to deal with the stress and emotional toll of the job. They have access to a counselor and a hotline, according to The Verge report, but a counselor is not on site for all three shifts of the operation.

In follow-up interviews, Cognizant’s Tampa, Fla., employees described filthy workstations and bathrooms and compared the Tampa office to a high school. Loud altercations, often over workplace romances, regularly take place between co-workers, and verbal and physical fights break out on a monthly basis.

Facebook recently claimed it works with its contractors “to provide a level of support and compensation that leads the industry.” However, these outsourced workers do not enjoy the lavish salary or perks of a Facebook employee. The average contracted content moderator in the U.S. makes about 12 percent of what the average Facebook employee makes. It is a high-pressure, highly monitored position where performance is pushed hard and bathroom breaks are counted. And if an employee fails to match the appropriate action to flagged content a handful of times, they can be fired.

Facebook came under heavy criticism in 2016 for failing to curb content that violated its Community Standards. The company responded by expanding its workforce, including adding outsourced content moderators around the world. Clearly, moderation of vile content is necessary. But Facebook’s arm’s-length relationship with the companies providing these services leaves a serious gap, with an urgent need for improvement and accountability.
