bad behaviour online - a solution?
Posted by Mavourneen (Topic Author), 12 years 1 month ago
Seems to me that much of the opposition to "censorship" or "speech control" arises because it is imposed from outside, by someone in authority. However, if the community itself stamps on bad online behaviour, that may have more effect. Whether you would still consider this an interference with your freedom of speech, I don't know.
This is from the New Scientist of 12 April 2013, describing how some sites are dealing with bad behaviour and swearing online.
Online juries help curb bad gamer behaviour
12 April 2013 by Hal Hodson
Magazine issue 2912.
In the heat of the moment, online players often hurl abuse at one another – crowd-based justice can help bring them back into line
THERE is a place where 75 people die every second, hacked to bits with giant flaming swords or blasted into the air with powerful magic. Welcome to League of Legends (LoL), an online world in which 3 million gamers are playing online at any given time.
Games are competitive and tempers often run high, so abusive messages are commonplace. But a new system has shown that not only can such bad behaviour be dealt with by the crowd – it is also easy to modify.
"We can create behavioural profiles for every player in the game," says Jeff Lin, lead designer of social systems at Riot Games, which manages LoL. The profiles measure how many times users curse or insult their teammates and opponents during a game. It is not just about filleting out the handful of regularly abusive players among LoL's 30 million subscribers: most bad behaviour consists of outbursts from players who are normally well behaved.
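To make the idea of a behavioural profile concrete, here is a minimal sketch of the counting step the article describes. This is purely illustrative: the article gives no implementation details, and the FLAGGED word list, function name, and scoring are all invented placeholders, not Riot's actual system.

```python
# Illustrative sketch only -- the article gives no implementation details.
# FLAGGED is an invented placeholder list; a real profile would be far richer.
FLAGGED = {"noob", "idiot", "trash"}

def flagged_count(messages):
    """Count how many flagged words appear across a player's chat messages."""
    return sum(1 for msg in messages
                 for word in msg.lower().split()
                 if word in FLAGGED)

print(flagged_count(["gg well played", "you idiot", "idiot noob"]))  # 3
```

A per-game count like this could then be accumulated over many games to distinguish the rare habitual offender from the normally well-behaved player having a one-off outburst, which is the distinction the article emphasises.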
"The question is how do we stop the spread of bad behaviour?" Lin says. A system called Tribunal, demonstrated at the Massachusetts Institute of Technology's Game Lab last month, could be the answer. "Tribunal aggregates all the negative behaviour cases, including chat logs, and bubbles them to the top," Lin explains.
These cases are presented back to the community in the game's forums, where other players vote on whether the behaviour was acceptable or not. Particularly egregious cases, judged by votes, can lead to the offending player being banned.
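The voting-and-banning flow above can be sketched as a simple tally. Again, this is a hypothetical illustration: the thresholds, the minimum juror count, and the verdict labels are all assumptions, since the article does not say how Tribunal actually weighs votes.

```python
# Hypothetical sketch of crowd verdicts: each case collects punish/pardon
# votes, and cases above an invented threshold of punish share lead to a ban.
def verdict(punish_votes, pardon_votes, ban_share=0.8, min_votes=20):
    """Return 'pending', 'ban', 'punish', or 'pardon' from community votes."""
    total = punish_votes + pardon_votes
    if total < min_votes:
        return "pending"              # not enough jurors have voted yet
    share = punish_votes / total
    if share >= ban_share:
        return "ban"                  # particularly egregious case
    return "punish" if share >= 0.5 else "pardon"

print(verdict(45, 5))   # ban (90% punish share)
print(verdict(12, 18))  # pardon
```

The design choice here mirrors the article's point: the severity of the outcome scales with how strongly the community condemns the behaviour, rather than being decided by a single moderator.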
"If players say 'fag' or the N-word, those cases are the most highly punished," says Lin. The system has also led to new standards – swearing is now allowed in LoL, but not if it is directed at another player.
The Riot team has also tested other ways of nudging player behaviour. They found that simple messages, displayed during load screens, can have a big effect on player behaviour in the subsequent game. For example, advising players that their teammates would perform worse if they harassed them after a mistake resulted in an 11 per cent reduction in offensive language in the subsequent game, compared with when no tips were shown.
Lin says that systems like Tribunal could be useful if applied to other online systems. Web communities like Reddit already rely on the users themselves to shape the community, and to down-vote offensive posts from view. These mechanisms allow societal norms to emerge in online communities where none were before, just as juries have for hundreds of years, says Cliff Lampe at the University of Michigan in Ann Arbor. "This really helps to shape sites," he says. "It used to be that sites would rise and fall quickly, like MySpace, but these social structures lead to more sustainable sites."