As online engagement becomes the next boomtown for customer experience, Andrew Hall, Managing Director of Customer Solutions for the Webhelp UK region, looks at the future of this new frontier and at why content moderation will be critical to protecting brands and users.


Back in 1996, when the internet and social media as we know them today were pure science fiction, Bill Gates wrote a pivotal essay entitled ‘Content is King’, saying:

“Content is where I expect much of the real money will be made on the Internet, just as it was in broadcasting.” And he recognised the new freedom for self-expression this would provide, adding: “One of the most exciting things about the Internet is that anyone with a PC and a modem can publish whatever content they can create.”[1]

Fast forward to today, and across every conceivable social platform, the internet is now heaving with content marketing – from thought leadership and brand videos to sponsorships, influencer tie-ins and stories promoting everything from consumer goods to dating services.

And, accelerated by the physical limitations introduced by COVID-19, this new digital frontier is still growing. In the UK, for example, Ofcom reported in June 2020 a substantial rise in the share of people with accounts on platforms like WhatsApp (70%, up from 61% in 2018), Instagram (43%, up from 38% in 2018), and YouTube (42%, up from 35% in 2018).[2]

And although Facebook (forecast to hit 1.7 billion users worldwide by the close of 2020) continues to cast the longest shadow over this landscape, it now shares engagement time with multiple platforms.

So, it’s clear that navigating this expanding territory could be a rough ride for many companies, with this gold rush of new users bringing fresh disruptions and challenges.

We know that it is vitally important to reach your customers where they are most active, and as McKinsey reports, that is now online:

“Demand patterns have shifted. Overall online penetration in China increased by 15–20 percent. In Italy, e-commerce sales for consumer products rose by 81 percent in a single week, creating significant supply-chain bottlenecks. Customers need digital, at-home, and low-touch options. Digital-led experiences will continue to grow in popularity once the coronavirus is quelled, and companies that act quickly and innovate in their delivery model to help consumers navigate the pandemic safely and effectively will establish a strong advantage.”[3]

It’s clear that any new delivery model must include content moderation, and as digital experiences assume more importance in our lives, user-generated content (UGC) will undoubtedly continue to grow in impact and variety.

In simple terms, content moderation helps companies monitor, analyse and respond to UGC including comments, reviews, videos, social media posts or forum discussions, using predefined criteria and legal boundaries to establish suitability for publication.
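To make that first pass of screening against predefined criteria more concrete, here is a minimal, purely illustrative sketch in Python. The deny-list terms, thresholds and status labels are hypothetical placeholders, not a description of any real platform’s rules:

```python
# Minimal sketch of a rules-based first pass over user-generated content.
# Deny-list terms, thresholds and statuses are illustrative placeholders only.
from dataclasses import dataclass

BLOCKED_TERMS = {"scam-link.example", "buy followers"}  # hypothetical deny-list
MAX_CAPS_RATIO = 0.7  # crude "shouting"/spam heuristic

@dataclass
class ModerationResult:
    status: str   # "approved", "rejected" or "needs_review"
    reason: str

def moderate(text: str) -> ModerationResult:
    lowered = text.lower()
    if any(term in lowered for term in BLOCKED_TERMS):
        return ModerationResult("rejected", "matched deny-list term")
    letters = [c for c in text if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > MAX_CAPS_RATIO:
        return ModerationResult("needs_review", "possible spam or abuse (all caps)")
    return ModerationResult("approved", "passed predefined criteria")

print(moderate("GREAT DEAL!!! BUY FOLLOWERS NOW"))  # -> rejected
```

In practice these predefined criteria sit alongside legal requirements and platform policies, and anything the rules cannot decide cleanly is escalated to a human moderator rather than published by default.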

As Webhelp Group Senior Director of Content Management and Moderation Solutions, Chloé de Mont-Serrat, explains:

“Leveraging user-generated content is fast becoming a powerful and flexible tool to raise brand recognition and enhance customer trust, especially in the booming e-commerce industry. Consumer content is instrumental in influencing both purchase decision making and in the uptake, visibility and popularity of brands online.”

“However, despite these benefits, utilising externally produced content is not without risk, especially for companies that are unaware of the detrimental impact this can have on the user perspective of the brand if not properly managed.”

Source: Content Moderation for Dating Applications

And the danger is that, when left unmanaged, UGC can permanently damage brand reputation and revenues, leaving the barn door open for harmful content like flame wars, online abuse, mounting customer complaints, unsuitable imagery, fake news, fraud and cyberbullying. Not to mention clearing the field for automated spam content, troll farms and false reviews.

Controlling and making the most of this vitally important audience is where content moderation, much like a local sheriff looking after the townsfolk and the wellbeing of the community, becomes key to creating healthy, responsive two-way engagement that benefits the brand and protects all users.

The research article ‘Re-humanizing the platform: Content moderators and the logic of care’ describes content moderators as:

“The hidden custodians of platforms, the unseen and silent guardians who maintain order and safety by overseeing visual and textual user-generated content.”[4]

The report highlights, as we believe at Webhelp, that thinking human and maintaining empathy and insight should be a critical and creative factor in current and future platform arrangements.

Webhelp’s recent research paper, Reimagining Service for the New World, a joint publication with Gobeyond Partners, part of the Webhelp group, spotlights the tension between the need to be simultaneously more digital and more human, as companies are increasingly tasked with delivering seamless, technology-enabled, experience-led service across multiple channels while demonstrating transparency and creating genuine, deep emotional connections with customers.

And, with 78% of leaders agreeing that customers will pay much closer attention to their business practices, maintaining a human face online, especially when reacting to confrontational or illegal content, will become more important than ever.

At Webhelp we are passionate about supporting our clients through their content moderation challenges, and have guided them through a range of topics, such as identifying under-age members, flagging inappropriate images, tackling online harassment and preventing accounts from being used as platforms for illegal activity such as scams and fraud.

We protect brand reputation and enhance user experience by mobilising effective, skilled teams with specific sector experience. They utilise both their human judgement and cutting-edge analytical services to effectively police and nurture online communities, providing growth for the brand and safety for the user.
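One common way to organise that blend of automation and human judgement is a human-in-the-loop workflow, where only high-confidence automated decisions are actioned and everything else is routed to a trained moderator. The sketch below is purely illustrative; the classifier, confidence threshold and queue are hypothetical stand-ins, not a description of Webhelp’s tooling:

```python
# Illustrative human-in-the-loop routing: confident automated decisions are
# actioned, everything else is queued for a trained human moderator.
from typing import Callable, List, Tuple

AUTO_THRESHOLD = 0.95  # hypothetical confidence cut-off

def route(content: str,
          classifier: Callable[[str], Tuple[str, float]],
          review_queue: List[Tuple[str, str, float]]) -> str:
    label, confidence = classifier(content)   # e.g. ("harassment", 0.62)
    if confidence >= AUTO_THRESHOLD:
        return f"auto-{label}"                # acted on automatically
    review_queue.append((content, label, confidence))
    return "queued for human review"

# Toy stand-in for a real model, for demonstration only
def toy_classifier(text: str) -> Tuple[str, float]:
    return ("abusive", 0.99) if "idiot" in text.lower() else ("ok", 0.60)

queue: List[Tuple[str, str, float]] = []
print(route("You idiot!", toy_classifier, queue))      # -> auto-abusive
print(route("Lovely product", toy_classifier, queue))  # -> queued for human review
```

The design choice matters: keeping ambiguous cases with people preserves the empathy and context that automated systems lack, while automation absorbs the sheer volume.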

Later blogs will focus on specific industry moderation pain points and the best ways to address them, but for now we leave you with this thought:

“There’s a new sheriff in town – and they’re called Webhelp!”


To discover more about customer experience models post COVID-19, read our new whitepaper, Reimagining Service for the New World, a joint publication with Gobeyond Partners, part of the Webhelp group, which draws on our unique industry perspective alongside new research into the operating models of the future. Or read our new paper exploring content moderation pain points in the dating application sector and the path towards a more comprehensive, game-changing solution.