Gamers Are Fleeing Twitter for Hive. Can It Handle the Swarm?

As it struggles to deal with an influx of new users, the platform—run by a staff of three—faces an uphill battle to stay on top of moderation. 

Twitter, both as a company and as a functioning service, is hurtling toward the unknown. In the weeks since Elon Musk took control, users have been fleeing to whatever platforms they can regroup on: Instagram, Mastodon, Cohost, Substack. (It turns out no one wants to go back to Facebook.) Rising rapidly among those ranks is Hive Social, a three-year-old company run by an excited, if slightly overwhelmed, trio now faced with a massive influx of new users.

Hive’s functionality echoes much of Twitter’s, from profile pages and header images to long stream-of-consciousness threads. But it’s also more customizable, allowing users to change profile colors, add pronouns, post songs to their pages, and hold Q&As with followers. For anyone who wants Twitter without Musk, Hive offers a respectable replica. Even before Musk bought Twitter, Hive was gaining traction. But as users have migrated from the bird app, many—particularly those in the gaming community—have landed on Hive. So much so that the app surpassed 1 million users on November 21, more than doubling its user base, and it continues to grow.

But with great expansion comes great responsibility, and for Hive, an upstart entering the social media arena, the road ahead is difficult. The app itself is a bit, well, busted, crashing often and crawling when it does move. The team, according to founder Raluca Pop, has been working around the clock to squash bugs, improve performance, and keep up with demand. They're also busy answering user questions and moderating the platform’s content. Pop has been sleeping about two hours a night recently. “Honestly, none of us have really gotten that much sleep,” she says. “But the app is something we're passionate about, so it's fun right now.”

Fun, according to Pop, is what Hive wants to bring back to social media. A self-taught coder who launched Hive in 2019, she wants to return social media to its so-called golden era by “making it a happier place for people … a safe place for [users] to express how they feel, a safe place for them to post content.” 

With a team of just three—though Pop says they’re looking to bring on a fourth person “just because we're scaling really fast”—this won’t be an easy task.

New-platform growing pains aside, if Hive is to succeed it needs to be more than just stable. It’ll need stronger protections for users and more vigorous moderation than its current setup allows. Left unchecked, platforms can easily become overwhelmed by harassment, hate campaigns, disinformation, child sexual abuse material, or violent images. To avoid that, a platform like Hive needs to be building in those safeguards now.

For proof, look no further than the pitfalls experienced by other social media platforms. Consider Clubhouse, the chat-driven app that gained popularity in the early days of the pandemic. As it grew, so did its problems with harassment, racism, anti-Semitism, and more. “There is a cycle,” says Daniel Kelley, director of strategy and operations at the ADL Center for Technology and Society. “This idea of building to scale growth and stapling on safety as an afterthought once [founders realize] ‘Oh, this is a problem.’”

Kelley says the smartest way for social media startups to avoid this is to ask, “How do we center the communities that will be most harmed by harassment online in the building out of our platform?” In doing so, “you’ll end up with a platform that is safer for everyone,” he says.

Hive offers options like choosing who can comment on your posts or hiding specific words and NSFW content. The platform uses an algorithm within the app to moderate content and “just be more helpful” as it scales, Pop says. “We've basically been filtering out the content before it even hits the Discover page.” The team is also looking into third-party services to make moderation a little beefier.
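
Pop hasn't said how that filter works under the hood. As a rough sketch only, a pre-Discover screen might combine a muted-word list with an NSFW flag before a post becomes eligible for the feed; every name, word, and flag below is hypothetical, not Hive's actual code.

```python
# Hypothetical sketch of pre-feed filtering (not Hive's actual code):
# a post must clear every screen before it is eligible for a
# Discover-style page.

from dataclasses import dataclass

MUTED_WORDS = {"giveaway", "spoilers"}  # placeholder mute list

@dataclass
class Post:
    author: str
    text: str
    nsfw: bool  # flag set by the author or an upstream classifier

def eligible_for_discover(post: Post, show_nsfw: bool = False) -> bool:
    """Return True only if the post passes every screen."""
    if post.nsfw and not show_nsfw:
        return False
    # crude word match; real systems handle casing, punctuation, obfuscation
    return not MUTED_WORDS.intersection(post.text.lower().split())

posts = [
    Post("ana", "new single drops friday", nsfw=False),
    Post("bot", "huge giveaway click here", nsfw=False),
]
feed = [p for p in posts if eligible_for_discover(p)]  # keeps only ana's post
```

The appeal of screening before the Discover page, rather than after a report comes in, is that a bad post never gets amplified in the first place; the cost is that exact-match word lists are trivially evaded, which is where the third-party services Pop mentions would come in.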

Moderation is fickle. How well automated systems carry that burden depends on contextual factors like how text, video, and voice are used on a platform. It’s also difficult to account for cultural differences and the nuances of language. Human moderators “matter a great deal. This is one of the things that's really hard to [address] as a smaller platform,” says Kelley. “In the long term, there are limits to what automated moderation can do.” Major social media companies have entire teams in place, actual eyeballs to identify harmful content and keep bad actors out. At Twitter, where Musk has gutted the child safety and moderation teams, remaining employees are failing to contain an explosion of hate speech and toxicity.

This is particularly vital now that Hive has become one of the go-to destinations for gamers absconding from Twitter. The video game community has long been a case study for harmful online behavior, and it's all but impossible to talk about harassment without looking at Gamergate, which codified the playbook for spreading disinformation and coordinating attacks. Companies no longer have the luxury of time when it comes to stress-testing a platform’s ability to protect its user base.

Companies like Twitter (pre-Musk) and Facebook employ entire teams, sometimes in the thousands, and still fail to make the platform safe for all of their users. Asked if she worries about trying to protect Hive users against coordinated harassment with a team a fraction of the size of others, Pop again points to the potential for automated systems to help. “It's not that we are not concerned about that. I think we would want to do an algorithm to detect the rate of the comments coming in,” she says, suggesting that algorithm could work alongside a third-party moderation service.
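
Pop doesn't spell out what that algorithm would look like. One common approach (an assumption here, not a description of Hive's plans) is a sliding-window counter that flags a post for review when comments arrive unusually fast, a crude signal of a possible pile-on. A minimal sketch, with the window and threshold invented for illustration:

```python
# Hypothetical sliding-window rate detector, not Hive's design:
# flag a post when comments arrive faster than a set threshold.

from collections import deque
import time

WINDOW_SECONDS = 60
MAX_COMMENTS_PER_WINDOW = 50  # invented threshold; would need tuning

class PileOnDetector:
    def __init__(self) -> None:
        self.timestamps: deque = deque()

    def record_comment(self, now: float = None) -> bool:
        """Record one incoming comment; return True if the post should
        be flagged for review (e.g., routed to a third-party service)."""
        now = time.time() if now is None else now
        self.timestamps.append(now)
        # evict comments that have fallen out of the sliding window
        while self.timestamps and now - self.timestamps[0] > WINDOW_SECONDS:
            self.timestamps.popleft()
        return len(self.timestamps) > MAX_COMMENTS_PER_WINDOW
```

Detection like this only surfaces candidates; whether a burst is coordinated harassment or a post going pleasantly viral is exactly the judgment that human or third-party moderators would still have to make.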

Even as Hive struggles to keep up with the demands of its current user base, it is also looking to expand, adding features like collapsible stories and more substantial tools like streaming, which are important to content creators, especially those in the gaming community. 

How firmly Hive grasps the problems that arise during this expansion will dictate its future. Kelley says that for any social platform, there are two key things to consider. The first is how people experience it and who is being harmed. The other is whether the company follows through on its plans. How well a company handles these things determines how users perceive it. A platform will gain a reputation, Kelley says, for either ignoring ongoing issues or saying it will “address hate and harassment and extremism, and not being able to execute.”

Hive’s legacy, then, won’t be defined by the community it creates, but by the ones it manages to protect.