
    BUSTED! 4 COMMON CONTENT MODERATION MYTHS THAT JUST AREN’T TRUE
    July 1, 2020

    Without content moderation, your online marketplace or classifieds site could end up a little like the Wild West: lawless. While user-generated content (UGC) is – for the most part – contributed by genuine customers, a system that’s open to the public is open to abuse, which is why content moderation is a must.

    However, despite its importance, there are a number of massive misconceptions about content moderation.

    Let’s bust some of those myths right now.

    Myth #1: Content moderation is censorship

    Some people see content moderation as a form of censorship: a way for organizations to exercise control and block comments, posts, reviews, and other types of undesirable content.

    The truth is, content moderation is about providing a healthy and safe environment where users can comfortably engage with others and upload their own products, posts or comments.

    Flags and report buttons allow users to notify site owners when something’s out of place; human moderators ensure that all users comply with community standards; and well-trained AI moderation solutions use filters to screen for inappropriate words, phrases, and images. Together, these measures help weed out the trolls, bullies, and spammers, keeping your online space a great place to be.

    In short, content moderation isn’t censorship. It’s a tool to improve user experience, ensure that you adhere to local and global laws, and let your users interact through your services without fear of being scammed.
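    To make the filtering idea above concrete, here is a minimal sketch of the kind of keyword-and-pattern screening a moderation pipeline might run before escalating edge cases to human reviewers. The blocklist, pattern, and labels below are hypothetical, and real solutions (including AI-based ones) go far beyond simple rules like these.

        import re

        # Hypothetical blocklist and contact-info pattern; real moderation systems
        # maintain far richer, continuously updated rule sets and ML models.
        BLOCKED_TERMS = {"free money", "wire transfer"}
        PHONE_PATTERN = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

        def screen_listing(text: str) -> str:
            """Return 'reject', 'review', or 'approve' for a piece of user content."""
            lowered = text.lower()
            if any(term in lowered for term in BLOCKED_TERMS):
                return "reject"          # clear policy violation
            if PHONE_PATTERN.search(text):
                return "review"          # possible off-platform contact: let a human decide
            return "approve"

        print(screen_listing("Brand new bike, pick up downtown"))      # approve
        print(screen_listing("Free money!! wire transfer only"))       # reject
        print(screen_listing("Call me at 555-123-4567 to negotiate"))  # review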

    Myth #2: Content moderation gives no ROI

    Hmm… where to start? The notion that content moderation is simply another time-consuming and resource-heavy task that provides little ROI is a common one.

    Marketers need to remember that their hard work on SEO, branding, and marketing is all for nothing if a damaging image gets uploaded, someone is bullied, or links to spam content or NSFW material get posted.

    Speaking of SEO, content moderation helps here too. By removing duplicates, re-categorizing misplaced items, and rejecting low-quality ads, you can improve your Google ranking.

    Content moderation also helps build and maintain trust. A qualitative study we conducted showed that 75% of users who saw a scam on a site would not return, and in a quantitative study we found that 50% of participants had encountered something they thought was a scam.

    Put those figures together (75% of the 50% who encounter a scam) and you could potentially experience around 38% user churn from scams alone.

    And finally, a great content moderation strategy not only protects your brand value but also helps increase engagement. Besedo customers have seen significant improvements in bounce rate, for example: in some cases a drop from 35% to 14%, increasing the chance of conversion and return visits.

    So, while you may not be able to draw a straight line from money in to money out when it comes to content moderation, you will definitely feel it on your bottom line if you neglect to set up a strong strategy for content review and management.

    Myth #3: AI is not accurate enough

    People just don’t seem to trust robots (thanks, Terminator!). And while Skynet is still a way off (waaay off, we hope!), current forms of AI are tailor-made for content moderation and, in many cases, are as accurate as human moderators.

    Case in point: our own experience working with Meetic, an online dating platform with 6.5 million monthly users. We were able to automate 74.8% of their content moderation from day one with 99% accuracy, and over time we have raised the automation level to 90% without negatively impacting accuracy.
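    As an illustration of how automation can grow without accuracy suffering, here is a simple sketch of confidence-based routing, one common approach: the model decides only when it is confident, and everything else goes to human moderators. This is not Besedo’s actual pipeline; the classifier, thresholds, and labels below are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class ModerationResult:
            decision: str      # "approve", "reject", or "human_review"
            confidence: float  # confidence behind the decision, in [0, 1]

        def route(violation_score: float,
                  reject_threshold: float = 0.95,
                  approve_threshold: float = 0.95) -> ModerationResult:
            """violation_score is a hypothetical model's probability that the item breaks policy."""
            if violation_score >= reject_threshold:
                return ModerationResult("reject", violation_score)
            if (1.0 - violation_score) >= approve_threshold:
                return ModerationResult("approve", 1.0 - violation_score)
            # Uncertain items fall through to humans. Relaxing the thresholds raises
            # the automation rate but risks more automated mistakes.
            return ModerationResult("human_review", max(violation_score, 1.0 - violation_score))

        for score in (0.99, 0.02, 0.60):
            print(score, route(score))

    In a setup like this, raising the automation level over time is a matter of improving the model and carefully relaxing the thresholds while monitoring accuracy on the human-reviewed portion.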

    Myth #4: Building your own content moderation solution is cheaper than using a SaaS solution

    Building your own content moderation platform is a huge task, especially when you want machine learning to support your human team. Setting your developers to work on building something will cost a lot of money and take a lot of time. That’s time that could be spent creating the unique features that will give you a competitive edge.

    Segment CEO Peter Reinhardt states: “You should only be building your own tool if you’ve tried lots of others on the market and find that none of them will solve your problem. Only build if you’re left without a choice. And still take care: I’d estimate that 50% of the startups that I see build tools can’t maintain them.”

    So if you haven’t yet tested our content moderation tool Implio, put your in-house development on hold. Implio is a proven SaaS solution with built-in AI, customizable filters, and an efficient manual review interface, developed specifically for online marketplaces, sharing-economy sites, and dating apps. And it’s free for up to 10,000 items a month.

    Most importantly, it’s religiously maintained, and new features are added regularly, without impacting your product roadmap.

    Consider those myths busted!

    There’s a lot of misinformation regarding content moderation. But fail to get it right and spam will harm your SEO, trolls will harass your customers, and irrelevant content will ruin your site’s user experience.

    Originally posted on besedo.com
