Two Hat: Stop tweaking your game and start fixing your community – GamesIndustry.biz


With live service games increasingly becoming the norm, the industry has entered a community-driven age. Whether that's fan communities on third-party sites, or communities baked into the fabric of the experience, developers are placing increased emphasis on their importance.

While questions around who is responsible for user safety online often don't have very neat answers, there is a growing consensus that platforms should be held accountable for hosting harmful content. Starting in September 2020, under the EU's Audiovisual Media Services Directive, UK communications regulator Ofcom will be authorised to fine social media companies and online platforms up to 5% of revenue, or suspend their operations, for failing to protect vulnerable users from harmful content.

Earlier this year, the UK's Department for Digital, Culture, Media and Sport published its Online Harms White Paper, which also suggested that social media platforms be held accountable for hosted content, and outlined plans for an independent regulator to manage compliance.

Although the tides have been slowly turning, this is a problem Canadian tech firm Two Hat Security has a long history of tackling. Following a recent partnership with image and video classification software firm Image Analyzer, the two companies will work together to facilitate automatic moderation of live streamed video content "at unprecedented levels."

From left to right: Cris Pikes (Image Analyzer), Chris Priebe (Two Hat), and Carlos Figueiredo (Two Hat)

Two Hat and Image Analyzer are automating the moderation process, allowing individual communities and platforms to set the parameters for acceptable behavior or content, and let the system do the rest. For example, the tech can apparently identify content such as the Christchurch mass shooting, which was streamed over Facebook Live in March this year, and shut it down within seconds.
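
Neither company has published the internals of that system, but the general idea it describes — shared classifiers with per-community thresholds deciding when to intervene — can be sketched in a few lines. Everything in the snippet below (the ModerationPolicy class, the category names, the scores) is a hypothetical illustration, not Two Hat's or Image Analyzer's actual API.

```python
# Hypothetical sketch of per-community, threshold-based moderation.
# Class, category names and scores are illustrative only; they do not
# reflect Two Hat's or Image Analyzer's actual products or APIs.
from dataclasses import dataclass, field


@dataclass
class ModerationPolicy:
    # Each community sets its own tolerance per category: scores above
    # the threshold trigger an intervention (mute, removal, stream cut).
    thresholds: dict = field(default_factory=lambda: {
        "graphic_violence": 0.2,   # very strict
        "hate_speech": 0.3,
        "profanity": 0.8,          # lenient, e.g. for an adult community
    })


def classify(content) -> dict:
    """Stand-in for trained text/image models returning per-category risk scores."""
    # A real system would run classifiers here; fixed values for illustration.
    return {"graphic_violence": 0.95, "hate_speech": 0.10, "profanity": 0.05}


def moderate(content, policy: ModerationPolicy) -> str:
    scores = classify(content)
    for category, score in scores.items():
        if score > policy.thresholds.get(category, 1.0):
            return f"blocked: {category} ({score:.2f})"
    return "allowed"


print(moderate("<live video frame>", ModerationPolicy()))  # -> blocked: graphic_violence (0.95)
```

The point of such a design is that the detection models are shared while the thresholds remain each community's own parameters, which is roughly what "set the parameters for acceptable behaviour and let the system do the rest" implies.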

Speaking with GamesIndustry.biz, Two Hat CEO Chris Priebe says his work tackling harmful online content is deeply personal.

"I was bullied in high school, because I wanted to be different and do my own thing, and I didn't want to fit into the whole crowd, so they bullied me quite extensively, to the point where I had death threats and I had to leave town," he says. "So that gave me a passion for stopping bullying on the internet. I think everyone should be free to share without any harassment or abuse, which is our mission for our company."

The pressure to respond to toxic online content has intensified: earlier this month another shooting, this time in Germany, was streamed on Twitch, leaving two people dead. Online communities elsewhere face similar problems, as social media becomes infected with extreme content and fringe elements of gaming communities spread vitriol and hate.

This frequently boils over into industry workers' personal lives, as developers find themselves the target of online abuse, or even cybermobbing. The human cost of online toxicity is immeasurable to both the communities where it proliferates, and individuals actively targeted by it.

"How many billions of dollars are being lost because people quit playing, because they didn't feel welcome?"

Cris Pikes, Image Analyzer

Moderating toxic content isn't casualty-free either; earlier this year an investigation into working conditions at Cognizant-operated Facebook content moderation sites in America revealed staff developing symptoms of post-traumatic stress disorder after extended exposure to extreme graphic content, such as animal cruelty, violence, or child exploitation.

Even setting aside the human cost, the cynical business case alone is reason enough to encourage healthy, positive communities. According to figures from Two Hat, people are 300% more likely to never return to games or platforms where they experienced toxic behaviour. Conversely, players are three times more likely to return after positive interactions.

"How many billions of dollars are being lost because people quit playing, because they didn't feel welcome?" says Image Analyzer CEO Cris Pikes. "The longer you stay, potentially the more you're going to pay because you're more invested in the game, so that long tail of building up your love for that game."

It's a position Priebe supports, saying that developers are "putting their time into the wrong thing" when there is a "giant looming sign" that points to problems with the community, rather than the game, limiting developers' potential.

"Stop tweaking your game. I hate your community, go fix your freaking community," says Priebe. "That's what [players] want. And that, I think, will move us from a $100 billion to a $200 billion industry."

"You can't assume that players and users know what is expected of them. So even in that sense, the industry needs to do a better job"

Carlos Figueiredo, Two Hat Security

Last week, Two Hat Security and Image Analyzer hosted a content moderation symposium in London to define and classify online harms, with the aim of tackling online toxicity. While it's a problem that won't be going away anytime soon, there are workable solutions, and plenty of things game developers can be doing in the meantime to help manage their communities.

Carlos Figueiredo is a co-founder of the Fair Play Alliance and director of community trust and safety at Two Hat. He says one obvious thing that even large companies have failed to do is establish well-defined community guidelines.

"These serve as the baseline, the fundamental approach for everything else that you do in terms of player behaviour and understanding that player behaviour," he tells us. "So no kidding that we are completely, as an industry, unprepared to deal with threats coming from players on Twitter."

Figueiredo says that developers need to be intentional about how they build their communities. He mentions recent A/B testing carried out by Twitch, which found that making people manually accept the community guidelines resulted in a notable, positive change in general behaviour, simply through the act of establishing expectations.
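
Twitch hasn't published the mechanics of that experiment, so the following is only an assumed shape of such a test: deterministic assignment of users to a control arm or a "must accept the guidelines" arm, then a comparison of how often users in each arm end up reported. The function names and the report-rate metric are illustrative assumptions, not Twitch's actual methodology.

```python
# Hypothetical sketch of an A/B test on a mandatory "accept the community
# guidelines" step. Names and the report-rate metric are assumptions for
# illustration; Twitch's actual experiment design has not been published.
import hashlib


def assign_variant(user_id: str) -> str:
    """Deterministically bucket users so assignment is stable across sessions."""
    digest = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    return "accept_guidelines" if digest % 2 == 0 else "control"


def report_rate(events: list, variant: str) -> float:
    """Share of logged users in a variant who were later reported for abuse."""
    users = [e for e in events if assign_variant(e["user_id"]) == variant]
    return sum(e["reported"] for e in users) / len(users) if users else 0.0


# Usage (events would come from real moderation logs, e.g.
# [{"user_id": "...", "reported": 0 or 1}, ...]):
# print(report_rate(events, "control"), report_rate(events, "accept_guidelines"))
```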

"You can't assume that players and users know what is expected of them," says Figuieredo. "So even in that sense, the industry needs to do a better job of really showing what is the expected behaviour, what is the unwanted behaviour, and they do need to enforce something."

With the rapid pace of technological advancement, basic steps like these can be profoundly impactful. As Pikes says, it's a "bit of an arms race," as companies like Facebook have built mammoth platforms that quickly grow beyond their control.

"We haven't put the same effort into balancing the communities as we have balancing the games"

Chris Priebe, Two Hat Security

"They built this absolute machine," says Pikes, "and they had no idea how big it was going to be... They haven't thought about the securities or the educational pieces that should now be put back in as part of that design. So it's almost a retrospective thought... For us it's about enabling those tools and taking them to a market that has evolved too quickly."

There is a priority gap, however, according to Figueiredo, who says there is a "lack of understanding" when it comes to the harm caused by toxic game communities.

"People don't necessarily have good stats or understanding that it affects their business, affects their employees as well," he continues. "How is it affecting their community? What is the user churn? There is a lack of understanding, a lack of white papers and good studies on this."

Priebe adds that companies are failing to adopt viable solutions to these challenges. Part of this, he believes, comes down to an engineering backlog, as developers obsess over in-game balance while inadvertently de-prioritising the existential threat of a toxic community, and the way it could significantly shorten a game's lifespan.

Community features like chat and audio are often ill-conceived, says Priebe, and the inclusion of poorly implemented communication tools effectively puts a "powerful weapon" in the hands of bad actors.

"They can use it to drive everyone out of playing because they've made it miserable for everyone else," he says. "We haven't put the same effort into balancing the communities as we have balancing the games.... There is [the] technology that is actually available; let's participate in the solution, and let everyone get involved."

With the rise of extremist hate groups using gaming communities as recruitment grounds, there is a pressing need to address the threat and ensure that vulnerable people in these spaces are protected. As Pikes says, it's about changing community attitudes.

"Way back when, it was like people felt they had a safe haven where they could go and play a game, and that's what it was all about," he continues. "Whereas now all these far right groups, for example, have found this a potential grooming ground for radicalisation. But if we give [platforms the] tools and enable those companies to understand that the technology is available, those people will move on to another format, they will find another medium, but we've made this one safe and locked that one down."

As the largest entertainment sector in the world, and one which actively encourages online communities, the games industry is under an incredible amount of public scrutiny. The presence of toxic communities is one thing, but a failure to address the problem is arguably much, much worse.
