An online platform that hosts illegal or problematic content and does nothing about it will take a big reputational hit. And its finances could take an even bigger hit if it falls foul of regulations protecting users of the platform from harmful content.

You have options: moderating your content yourself, outsourcing it to a partner that specializes in this process, or a hybrid model.

This page explains the pros and cons of in-housing and outsourcing, how to find the right outsourcing partner, and how to bring key stakeholders along with your decision.

If you’re interested in getting a comprehensive overview of the importance of and challenges involved in moderating user-generated content, check out our article: The ultimate guide to content moderation.

Contents:

  1. Does your platform need outsourced moderation?
  2. The pros and cons of outsourcing content moderation
  3. Automated content moderation: In-house or outsourced?
  4. How to outsource your content moderation: The key steps

Does your platform need outsourced moderation?

Content moderation can be demanding work, typically requiring a team of carefully selected, highly trained human content moderators working in concert with sophisticated machine learning algorithms.

It takes a lot of resources to do all that yourself—but outsourcing has its own complexities.

Before looking at the pros and cons of both approaches, you should assess your needs, carefully considering:

  • Your moderation needs: Think about the variety and volume of user-generated content being submitted to your platform—and how this content can be expected to scale in the future. Assess the geographic spread, local cultural contexts and languages spoken by your user base—if you want to insource, you’ll need moderators who understand all three.
  • Your organization’s resources: You’ll need to moderate within your means, and insourcing tends to be the more expensive option—you miss out on the efficiencies of shared resources and expertise, as well as cheaper labor pools. Does your business have the budget to support hiring a large content moderation team? And what about funding the development and deployment of automated tools?
  • Your approach to moderation: Your needs and resources will help determine the approach your platform should take to content moderation: what mix of human moderation and automation you need, whether you pre- or post-moderate content, and to what extent you rely on your platform users to self-moderate (which can make moderation less labor-intensive and/or less reliant on AI tools).

With needs, resources and methodology in mind, you’ll be in a better position to weigh up the strengths and limitations of in-house and outsourced content moderation.

The pros and cons of outsourcing content moderation

If you build your own moderation team you’ll have more control over the moderation process. You’ll have more of a say when it comes to recruitment and training and will enjoy a direct line of communication with your team members.

But building your own team could cost you considerably more than outsourcing.

You’ll have to foot the bill for starting up a new function: recruitment, salaries (which can be higher depending on local labor rates), overhead, employee benefits, training and wellbeing support. All of these costs multiply if you’re covering multiple countries and time zones.

Recruitment may be tricky, too, as you’ll ideally be looking to employ a mix of moderators from multiple locations, reflecting the local culture of your users.

Plus you won’t have the deep process engineering expertise BPOs develop to keep things efficient.

If this cost and complexity isn’t dissuading you or your exec team from building in-house, and it’s a function you think is better to own, read our article How to build your own content moderation team. It covers everything from recruiting and training your team to looking after their mental health.

A world-class BPO can get your content moderation up and running faster and will allow you to rapidly scale your content moderation capabilities. More sophisticated BPOs will be capable of covering every country, region and language your platform and its users interact in.

Outsourcing can therefore take a lot of the cost and complexity out of hiring, training and employing human content moderators.

Automated content moderation: In-house or outsourced?

Considering the sheer volume of content that even relatively small platforms have to screen, there’s a good chance your human content moderators will need to be supported by artificial intelligence (AI) and machine learning (ML) algorithms.

In many content moderation programs, AI screens user-submitted content, blocking unambiguously unacceptable content and flagging potentially unacceptable content for your human moderators to review.
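This triage pattern can be sketched in a few lines. The sketch below is purely illustrative: the `triage` function, the score values and the thresholds are assumptions for the example, not recommended settings—in practice, thresholds are tuned per content type and policy.

```python
# A minimal sketch of threshold-based moderation triage, assuming a model
# that returns a violation probability for each piece of content.
# The thresholds below are illustrative, not recommendations.

BLOCK_THRESHOLD = 0.95   # at or above this, content is blocked automatically
REVIEW_THRESHOLD = 0.60  # between the thresholds, a human moderator reviews it

def triage(violation_score: float) -> str:
    """Route content based on the model's violation probability."""
    if violation_score >= BLOCK_THRESHOLD:
        return "block"          # unambiguously unacceptable: removed automatically
    if violation_score >= REVIEW_THRESHOLD:
        return "human_review"   # potentially unacceptable: queued for a moderator
    return "allow"              # published (and still subject to user reports)

incoming_scores = [0.99, 0.72, 0.10]
decisions = [triage(score) for score in incoming_scores]
print(decisions)  # ['block', 'human_review', 'allow']
```

The key design choice is the gap between the two thresholds: widening it sends more content to human review (safer, but more labor-intensive), while narrowing it leans harder on the model.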

We dig deeper into the pros, cons, methods and metrics of automated content moderation in How to get automated content moderation right.

In terms of outsourcing, you have three options for using automated content moderation tools:

Option 1: Implement a third-party tool

There’s certainly no shortage of powerful automated content moderation tools out there.

Of course, you’ll have to pay for these tools to be implemented (and managed), and then train your in-house team to use them.

The downside of third-party tools is that they offer little transparency into how their algorithms work. More importantly, you won’t have much input into those algorithms, so it could be difficult to resolve any issues with how the tool is categorizing your content.

Option 2: Develop your own AI/ML algorithms

This has the benefit of giving you direct insight into how your automation tools are functioning (or failing). Your engineers and human content moderators can work closely together to ensure that the algorithms are supporting the human moderators’ work—and are being constantly updated with new data from human-moderated content.

This is also the costliest option on the table. It can be hugely expensive and time-consuming to develop your own algorithms, and it means that you’ll have to fund a content moderation team that includes not only moderators and ops personnel but also data scientists and engineers. (And they don’t come cheap.)

Option 3: Outsource your automated content moderation

BPOs that have data scientists and other specialists on staff who know how to build, or string together, moderation tool sets offer an in-between option.

Sometimes they’ll use your in-house tools, but they’re also well placed to recommend the right AI moderation vendor based on experience in your industry and its various use cases.

Look for BPOs that can adjust their AI and ML models for different customers, have significant content annotation expertise to train models, and possess process engineering experience to effectively pair AI and human teams.

How to outsource your content moderation: The key steps

So you’ve decided (hypothetically, at least) that your organization should outsource its content moderation. What are your next steps?

Get buy-in from the C-suite

Content moderation might seem a niche area to some, but protecting the safety of your platform’s users and your brand’s reputation affects every part of the business.

This message needs to be endorsed from the top down, so securing buy-in from your executive team is essential.

The C-suite needs to understand how vital content moderation is to your business’s reputation and revenue. You might not have to educate them here: many exec teams are already concerned with content moderation, especially with the arrival of regulations that carry heavy fines, such as the EU’s Digital Services Act.

But if you want the C-suite to back your plan, they need to know not only the stakes involved in getting content moderation right, but also the scale and complexity of the challenge.

Gather as much evidence as you can regarding the volume and variety of problematic content that your moderation team will have to deal with—and make sure you can get across the value of outsourcing this work. (We hope this article will help you out there!)

How to choose the right BPO partner for content moderation

Some things to look for here include:

  • How experienced are they in providing content moderation services? How many clients have they worked with—and who are those clients?
  • What countries, languages and time zones do their content moderation services cover?
  • What’s their approach to recruiting—do they diligently test to ensure they’re hiring people with the right mentality to deal with frequently upsetting material?
  • Do they hire full-time moderators, or do they farm the work out to freelancers and low-paid, untrained gig workers?
  • Can they deliver the work from a secure environment, whether centralized offices, remote workers or a mix of both?
  • What about recruiting for specific projects—will they find the right people, with the right cultural awareness, for each project?
  • Do they look after the mental wellbeing of their moderators?
  • Do they have their own AI/ML models, and are they working on continually improving those models so they can reduce your need for human content moderators?
  • How do they approach augmenting their human moderators with AI?

Measure their success—and sell it back to the business

If you’ve picked your BPO wisely, you won’t have to chase them for metrics and KPIs that prove their success—they’ll be as transparent as possible about their operations and the impact of their content moderation services on your platform users.

Some metrics you may want to ask your outsourcing partner to track include:

  • Proportion of your platform users exposed to violations
  • Proportion of your users who are violating your content guidelines
  • Percentage of content flagged as inappropriate
  • How accurately their automated content moderation systems are categorizing user-generated content—and how much this accuracy improves over time as the systems are fed more of your platform’s data
  • The response times of their human content moderators (between activity/content being flagged by automated systems and the ticket being closed)
  • Tickets responded to by individual moderators per hour
  • Average time to mitigation of security threats
  • Turnover rate of content moderators—this can indicate how well your partner is taking care of their moderators’ wellbeing, which directly impacts decision-making and performance
  • Impact on customer satisfaction, tracked via metrics such as NPS and average customer review scores

Ironically, when content moderation goes smoothly, neither your platform users nor your stakeholders will notice it. When things go wrong, though, you’re certain to hear about it.

This can make it tough to prove the value of your outsourced content moderation services to the business, which is why tying it to positive business outcomes, such as an uptick in NPS, is so valuable.

To outsource or not to outsource?

If that was your question coming to this page, we hope we’ve answered it for you.

When it comes to outsourcing content moderation (and other content and community services), we have deep experience in optimizing the human and technical elements involved, and we’re always excited to talk about content moderation.

Webhelp provides outsourced moderation services for over 250 clients worldwide, serving billions of users in over 20 languages. We’ve applied our process engineering, data science and behavioral science to build offerings for clients that accrue value over time. Our approach isn’t just to fill seats and scale up resources as your content scales up. Instead, we build for future value.

That means shifting as much work away from moderators as possible over time by developing processes and refining AI to take on more of the work.

We’re also a certified Facebook Marketing Partner and the first BPO to partner with the Trust & Safety Professional Association (TSPA). The TSPA supports the global community of trust and safety professionals enforcing principles of acceptable behavior and content online.

Best practices are changing quickly, and we’re happy to support each of our moderators with a TSPA membership so they can connect with peers and attend workshops and other educational events.

If you’d like to know more about trust and safety, you can check out our article Trust and safety: Why it matters and how to get it right.

If you’d like to learn more about how Webhelp can provide content moderation services that positively impact your customer experiences, drop us a line.

Get in touch