What are the main issues with content moderation today?

A recent report published by NYU shows that in the first quarter of 2020 there were over 3 billion pieces of content on Facebook that content moderators were responsible for checking: removing it, or applying a warning 'cover' to disturbing content before viewing.

Facebook founder and CEO Mark Zuckerberg reported in a 2018 whitepaper that Facebook's review teams "make the wrong call in 1 out of 10 cases", which can result from relying on AI to identify harmful content, or from the pressure on moderators and their lack of training.

With this type of role comes a great deal of pressure and the responsibility to ensure the safety of the community, 24/7 (2.6 billion active users daily).

One of the main issues content moderators face today is the hundreds of items they are required to moderate within a six-to-eight-hour shift. Expertise is therefore essential, as it is up to content moderators to act with governance and uphold high standards. The platform is not responsible for the content itself, as users have the freedom of 'free speech', but the onus is on moderators to control the obscenity shown to them.

The second issue is the pressure of meeting this volume of items to moderate. Setting high targets and efficiency rates can prove unattainable, with the consequence of diminished performance, mental health, and wellbeing.

Recommendations from NYU

The NYU report discusses steps major social media platforms can take to improve their content moderation.

While the main theme of the report is built around a call to end outsourcing, we can conversely demonstrate that outsourcing is instrumental to content moderation, and show how we align with the recommendations outlined in the report.

A human-first approach when outsourcing content moderation

At Webhelp, we know many mistakes have been made in content moderation services, so when we entered this 'community service' we decided to adopt a completely different approach: 74% of our operators recommend Webhelp as an employer (NPS).

Investing in people

A human-first approach to content moderation means understanding that people's mental health and wellbeing cannot be disregarded when they manage distressing content.

Wellness is our differentiator, enabled through our Webhealth Wellness Programme:

  • Mental health awareness training for managers, so they can recognise symptoms of stress and apply coping mechanisms to support colleagues
  • A safe working environment that gives colleagues a sense of security, trust, and reliability
  • Access to certified psychologists, counsellors, and trained coaches who support content moderators with mental, physical, financial, and nutritional health

Wellbeing Analytics to take proactive action

As part of our approach to content moderators and their mental health, we monitor their wellbeing and performance using Wellbeing Analytics.

This tool enables us to identify issues by combining observation of colleagues with data analytics and machine learning, so we can take proactive action.

Team leaders and coaches receive daily updates on each colleague's MTI score, which indicates how they are performing; this allows supervisors to take appropriate action to support them, for example by reworking a shift or allowing longer breaks. 100% of our operators moderating sensitive content have shorter shifts, which achieves up to 4 points of attrition reduction.

Improving content moderation

Managing content moderation is not to be taken lightly. It requires expertise and knowledge of the area, and an understanding of the balance between the impact it has on individuals' wellbeing and the value it adds to first and third parties.

Outsourcing content moderation is a way for social media companies to employ experts in the field to deliver outcomes and improve performance.

The NYU report argues that content moderation should not be outsourced because outsourcing neglects moderators' health and wellbeing.

As we have demonstrated above, this is a strong focus for us. Not all outsourcing is conducted by 'customer service centres' that exploit their teams without support; quite the contrary.

Taking a human-first approach with our Webhealth programme and Wellbeing Analytics tool helps colleagues develop their understanding of mental health, and is essential to providing a safe, healthy environment for moderators.