Mar 1, 2022

Best Practices for Being Safe, Secure, and Resilient in the Content Moderation Workplace

We must do all we can to keep these digital first responders safe.
By Akash Pugalia


To keep society running safely, we rely on heroes like police, firefighters, and paramedics to intervene proactively, shielding people from harm and trauma before they are exposed to it. But what if the trauma is online? In those scenarios, the digital first responders are content moderators: professionals who identify and remove heinous content before it can spread and go viral. While the general public benefits greatly from their work, what people may not realize is the toll it takes on the individuals who are exposed to graphic depictions of abuse, violence, and sexual content on a daily basis.

Exposure to trauma leaves many first responders battling PTSD and other mental health issues. Unfortunately, scholarship on the mental health effects of content moderation is still lacking, but industry experience strongly suggests that content moderators face similar mental and emotional perils. Until we know more, the following best practices can help those in the industry navigate this growing problem.

Best Practices for Employees: Ask Questions Early

Work can be inherently risky. Before we take any job, we should assess the impact it will have on our mental health. Though some may need to avoid these jobs completely due to personal circumstances, others can be well supported by programs put in place to protect them. If you’re pursuing a career in content moderation, look for companies that take measures to guard you against these perils.

Ask yourself what the company would do to protect you if you were hired. Do they have programs and processes in place to safeguard employees’ mental well-being? Leaders and hiring managers should welcome questions like these and proactively respond to them.

Historically, the response to mental health concerns in the workplace has been more reactive than proactive. Fortunately, this is changing. The pandemic has brought these concerns to the forefront, and many employers are stepping up to help alleviate them.


Best Practices for Employers: Take Care of Your People

Employers, you simply can’t sleep on this. Human beings are tough, but they are emotional, too. Making these changes involves more than implementing new initiatives or benefits. The most critical part of an employee’s well-being is the culture in which they work. Benefits don’t speak for themselves; human interaction, compassionate leadership, and attention to the well-being of others in a way that transcends policy are what make or break a mental health program.

1. 24/7 counseling 

Those who are exposed to trauma should have round-the-clock access to independent counselors. Your content moderators need a contact who can guarantee absolute confidentiality. These counselors should be on-site and ready to support workers at any time. But beyond making these resources available, your organization should proactively check in with employees on a regular basis to ensure nothing is flying under the radar. There will be times when people need help before they realize it, and you’ll be glad you chose to intervene early.

2. Workflow flexibility 

Content moderation should be divided into workflows around different content policies: violent extremism, child safety, harassment, and so on. Employees should feel comfortable asking to opt out of a workflow. Content moderation is a team sport; nobody should have to do it all. This flexibility rests on a foundation of strong workplace relationships: managers should be approachable, and a sense of camaraderie should be prioritized.

3. Resiliency training 

Content moderation, like any field that exposes someone to potential trauma on this level, demands resiliency. Fortunately, resilience is a skill that can be learned. At Teleperformance, we run a four-day resiliency course upfront to teach our employees strategies for the challenges they’ll face on the job. Moderators should receive resiliency training before they even begin to moderate content, and it should continue periodically, with frequency determined by the workflow and the egregiousness of its content.

4. Mandate breaks 

Heinous content wears people down, and they need opportunities to recharge. We must re-envision what a workday looks like for someone in such a unique position. Employees shouldn’t be looking at any content for more than two hours at a time, and they shouldn’t be looking at egregious content for more than five and a half hours a day. Take some time to determine what the ideal workday looks like for your team, and be ready to pivot when things aren’t working.

5. No highly egregious content moderation from home 

This one isn’t for your employees so much as it’s for their families. No child needs to walk into their mom’s office to discover a living nightmare on her computer screen. Avoiding secondary exposure like this is paramount.

6. Build a community 

Employees will be more resilient when they feel connected. Host group sessions where employees can come together to talk about anything, work-related or not. Office game nights are a great way for employees to have fun and relax, and off-site events like going bowling can lead to bonding among employees outside of a working context. Implement a buddy system so nobody falls through the cracks.

7. Cultivate a sense of purpose 

Employees are better off when they feel a sense of purpose at work. Everybody wants to know why society needs their labor. You can help employees find their purpose by reminding them why what they do is important. You may also try something less direct: at a group session, ask employees to discuss why they think their work matters rather than simply telling them why you think it matters. One way we foster this in our business is to help our moderators understand their role as a reviewer rather than a viewer, which reminds them that they’re “on the job” and helps them avoid getting pulled into the content.

Where Is This Industry Headed?

Scholars still haven’t produced much formal research on mental health or burnout in the content moderation workplace, but they are making some progress.  

Technical advancements are another hopeful avenue for progress. Computers can already moderate the majority of content, and they can pre-moderate content as well, preventing flagged content from being posted publicly until a moderator gives it the green light. Moreover, automated tools help moderators process potentially harmful images and videos much more efficiently.

But for now, online content moderation remains a largely human endeavor. It’s on us to ensure it’s also a humane endeavor.

Akash Pugalia

Global President of Trust & Safety, Teleperformance

Akash Pugalia is the Global President of Trust & Safety at Teleperformance. He focuses on designing and implementing T&S solutions for leading online platforms to make the internet a safe place.
