‘League of Legends’ testing automated system to combat abusive behaviour

“League of Legends” – an online multiplayer battle arena game – is incredibly popular and reported over 67 million monthly players in 2014. AP Photo/M. Spencer Green

TORONTO – The company behind one of the world’s most popular online games, “League of Legends,” is taking a stand against abusive behaviour and language from its users.

Riot Games announced in a blog post it will be testing a “behaviour reform” system with its North American users aimed at delivering feedback and punishments to gamers who are verbally harassing other players.

“League of Legends” – an online multiplayer battle arena game – is incredibly popular and reported over 67 million monthly players in 2014. But, the game has had many problems with trolls and abusive players in the past.

Currently, players who experience verbal harassment from opponents are able to report the abuse at the end of a match.

The new system automatically analyzes in-game data, such as chat logs, to detect abusive behaviour and take the appropriate course of action.
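Riot has not published how the detection works under the hood. As a very rough, hypothetical illustration of the general idea – scan a match's chat log against a list of flagged phrases, then pair the evidence with a punishment on a "reform card" – a minimal Python sketch might look like the following. Every name, threshold and phrase list below is a placeholder, not Riot's actual code.

    # Hypothetical sketch: flag abusive chat messages and, if enough are
    # found, return a punishment together with the evidence behind it.
    ABUSIVE_PHRASES = {"uninstall", "you are trash"}   # placeholder list
    HATE_SPEECH = {"<slur>"}                           # placeholder list

    def review_chat_log(messages):
        """Return (punishment, evidence) for one player's chat messages."""
        evidence = [m for m in messages
                    if any(p in m.lower() for p in ABUSIVE_PHRASES | HATE_SPEECH)]
        # Excessive hate speech can escalate straight to a permanent ban.
        if any(p in m.lower() for m in evidence for p in HATE_SPEECH):
            return "permanent ban", evidence
        # Repeated verbal abuse draws a shorter, temporary ban.
        if len(evidence) >= 3:
            return "two-week ban", evidence
        return None, evidence

    # Example: feedback delivered shortly after a match ends.
    punishment, evidence = review_chat_log(
        ["gg", "you are trash", "uninstall", "you are trash lol"])
    if punishment:
        print(f"Reform card: {punishment}; evidence: {evidence}")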

“The system delivers reform cards (notifications that link evidence of negative behavior with the appropriate punishment) that help players address their negative behavior,” explained the blog.

“Your reports help the instant feedback system understand and punish the kind of verbal harassment the community actively rejects.”

Users must refrain from remarks that include homophobia, racism, sexism, death threats, and “other forms of excessive abuse,” according to the game’s website.

Players caught engaging in any of these behaviours will be punished with either a two-week or a permanent ban from playing, issued within 15 minutes of the incident.

“If a player shows excessive hate speech (homophobia, sexism, racism, death threats, so on) the system might hand out a permanent ban to the player,” said Jeffrey Lin, lead game designer of social systems at Riot Games, in a comment on the blog.

The system is also capable of learning new phrases, said Lin, who warned users that common gamer insults will be caught as well.
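Riot has not explained how that learning happens. One simple, purely hypothetical way a system could pick up new phrases from the player reports mentioned above is sketched below; the threshold and function names are invented for illustration only.

    # Hypothetical sketch: phrases reported often enough across matches
    # are added to the list the detector checks against.
    from collections import Counter

    REPORT_THRESHOLD = 50          # placeholder: reports needed before a phrase is learned
    learned_phrases = set()
    report_counts = Counter()

    def record_report(reported_phrase):
        """Count a phrase from a player report; learn it once it is common enough."""
        phrase = reported_phrase.lower().strip()
        report_counts[phrase] += 1
        if report_counts[phrase] >= REPORT_THRESHOLD:
            learned_phrases.add(phrase)

    # Example: after enough reports, a common gamer insult is caught too.
    for _ in range(50):
        record_report("go back to bronze")
    print("go back to bronze" in learned_phrases)   # True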

The blog post noted that moving forward the automatic system could also be used to reward players for positive play.

Riot Games has been praised for leading the charge against trolls and abusive gamers.

In 2012, the company put together a team of staffers, dubbed “Player Behaviour,” aimed at finding new ways to make gameplay less toxic. One of the team’s solutions was to turn off the game’s automatic chat function and let users opt in to chat instead.

Riot Games said it saw negative chat decrease by 32.7 per cent in the week after users were given the option to opt in to the chat function. The company also observed a drop in offensive language and verbal abuse.
