ach5

Why Online Moderation Needs New Solutions (2021)


Key Takeaways

TikTok will now prompt users before they send a comment that might break the app's Community Guidelines.
While useful, many see this as a tiny step toward stopping online bullying and hate.
Ultimately, TikTok and other social media platforms need to find solutions beyond automated moderation to truly move forward.

Robert Alexander / Getty Images

Online moderation is one of the most challenging problems that social media faces right now, but experts say the solution isn't simply adding more rules.

TikTok recently added a new feature that will prompt users before allowing them to send what it deems a hateful or rule-breaking comment. The move is an attempt to help curb the online hate and bullying that has spread across various social media networks, including the popular video-sharing app.
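TikTok hasn't published how the prompt works. As a rough illustration only, a pre-send check might compare a draft comment against a list of flagged phrases and ask the user to reconsider before posting; every name and the phrase list below are hypothetical, not TikTok's actual implementation:

```python
# Hypothetical sketch of a pre-send comment prompt.
# The flagged-phrase list and function names are illustrative,
# not TikTok's real system.

FLAGGED_PHRASES = {"you're so dumb", "nobody likes you"}

def needs_prompt(comment: str) -> bool:
    """Return True if the comment matches a flagged phrase."""
    text = comment.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

def submit_comment(comment: str, confirmed: bool = False) -> str:
    """Post the comment, or ask the user to reconsider first."""
    if needs_prompt(comment) and not confirmed:
        return "prompt"  # client would show "Would you like to reconsider?"
    return "posted"
```

Note that the user can still post after the prompt (`confirmed=True`), which is why critics see the feature as a nudge rather than a fix.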

Unfortunately, while sites may mean well with these features, they don't address the more significant issue that lies beneath the surface.

"The main issue with so much [online moderation] is that there is no one size fits all. There is no good solution that is going to work for everybody," Catie Osborn, a TikToker who recently found herself dealing with a permanent ban, told Ach5 on a call. 

Finding Clarity

Osborn, who goes by "catieosaurus" on TikTok, has over 400,000 followers on the video-sharing site. In her videos, she focuses on sexual wellness, what it's like to live with ADHD, and other neurodivergent topics.

Over the weekend, however, she found all that work in jeopardy when TikTok banned her account for "breaking community guidelines," without any additional context about which rules she might have broken.

It's this lack of clarification that has become so upsetting for many users. Because social media sites like TikTok and Twitter bring in so many reports, much of the process is automated.


This means systems are put in place to trigger temporary bans, depending on the number of reports a piece of content generates. For example, Osborn told us that if multiple people report a TikToker's live video, the platform automatically blocks that user from going live for at least 24 hours. 
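Osborn's description suggests a simple report-count threshold with no human review in the loop. A minimal sketch of that kind of automated rule (the threshold, ban duration, and names here are assumptions, not TikTok's actual values) might look like:

```python
from collections import Counter

# Hypothetical report-threshold moderation rule; the threshold and
# ban duration are illustrative, not TikTok's real parameters.
REPORT_THRESHOLD = 5
LIVE_BAN_HOURS = 24

reports: Counter = Counter()  # user_id -> reports against their live video

def report_live(user_id: str):
    """Record one report; return a ban notice once the threshold is hit."""
    reports[user_id] += 1
    if reports[user_id] >= REPORT_THRESHOLD:
        # Triggered automatically by report volume, with no human review
        return f"live ban: {LIVE_BAN_HOURS}h"
    return None
```

A rule like this is exactly what makes mass reporting exploitable: enough coordinated reports trip the ban regardless of whether the content actually broke any guideline.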

"There's a lack of clarity for what works and what doesn't," Osborn explained. 

According to Osborn, the app has seen a considerable increase in users mass reporting creators because they don't like the color of their skin, their sexuality, and more.

This possibility, combined with TikTok's lack of clarification about what a user did wrong, is a big part of the frustration, she says. 

"How are we supposed to know what we did wrong if you don't tell us?" she asked. "I'm more than willing to say that I messed up. But, if you don't tell me how I messed up, I can't fix it."

Osborn isn't the only one who has found herself confused by a ban, either. Many users have turned to TikTok's Twitter feed to find answers about their bans, with many tweets receiving the same boilerplate response: appeal the ban from within the app. 

Without understanding why they were banned, users can find themselves even more frustrated when trying to figure out what to do next.

Fostering New Solutions

While features like comment prompts can positively affect the community, some don't see them as long-term solutions. 

"This feature will likely only influence individuals who actually want to avoid unintentionally sounding mean," Cody Nault, a software engineer who shares his coding on TikTok, told Ach5 via email.

"Sadly, it seems much of the hate spreading on the platform is very much intended to be harsh."


Nault explained how people continue to use TikTok's Stitch feature—allowing you to stitch together parts of another video with your own—to call out and ridicule creators. He attributes much of this to how successful hateful content can be on social media and said that he'd love to see TikTok instead pushing more positive creators.

For others like Osborn, the problem isn't a lack of reporting features; it's how the sites handle those reports. A lack of communication and easily exploited reporting systems are big problems that need work, but she isn't naive about the scale of the challenge. 

“When you're talking about hundreds of millions of users, there is no perfect solution,” Osborn said. She added that while her account was reinstated, many creators aren't as lucky.

"I don't think there is a one-size-fits-all solution. But, when the pattern is becoming hundreds of creators are getting their accounts banned—and banned repeatedly—for doing nothing wrong, something has to change."