Kleros as a Dispute Resolution Mechanism in MMOs: The Case of Bots in EVE Online
Disputes in massively multiplayer online games often go beyond a simple reading of the terms and conditions and require a nuanced understanding of the community's social norms. Kleros, as a general dispute resolution mechanism, may be an excellent tool for bridging this gap and empowering players to tackle abuse in open-world MMOs.
EVE Online, a science-fiction MMO launched in 2003, boasts an almost cult-like following. The entire universe of EVE is a brimming social experiment on a grand scale, with players controlling most of what happens within it.
From player-owned corporations vying for power in certain sectors to grand scale space battles making the Battle of Endor look like a border clash, EVE Online is a unique universe where players take the reins of the military, economic and social aspects within the game.
This kind of environment thrives on a laissez-faire regulatory approach to players and their actions - scamming other players, for example, is a completely valid play style (a personal favorite is the case of the EVE Investment Bank, where “Cally” promised high returns to players who deposited their money and then walked away with over 700 billion ISK, the in-game currency).
While this approach gives players a lot of freedom in game, the game's creators police all attempts to abuse the system from outside it. One of the most prominent ways of cheating the game by illegitimate means is using bots to increase revenue.
So, what is botting in EVE Online? The concept is quite simple - players can automate the actions of the ships they control to perform trade manipulation, resource mining and other time-consuming tasks, known in some games as “grinding”. This gives botting players an unfair advantage over others, and it also affects the in-game economy.
An EVE Online mining fleet in action.
As with most aspects of the game, production and market trading are player-controlled. A sudden increase in the revenue of certain players or groups inflates the economy and drives the prices of ships and other materiel upward, effectively distorting the market.
The use of bots in EVE Online has plagued the game's creator, CCP Games, and in turn the players for years. For example, the unofficial EVE Online subreddit r/eve contains some quite interesting testimonies by bot makers, as well as a near-steady flow of outrage from players faced with these issues. The problem has gone so far that it has been covered by some of the most prominent gaming magazines.
A legitimate approach to resolving disputes in online environments must include a serious understanding of the community's social norms, which in many cases supersede the basic terms and conditions. This challenge often overwhelms community moderators, who need to spend significant amounts of time understanding each individual claim of botting before coming to a just decision.
But what if we let the community filter claims of botting in order to make the moderators' job easier?
The Kleros platform was created to allow for the formation of decentralized curated blacklists, which help separate justified from unjustified claims of abuse in online interactions. We have already proposed a similar system for fake news detection, and our T2CR platform has shown that this approach does indeed work.
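To make the idea of a deposit-backed curated blacklist concrete, here is a minimal Python sketch. The names (`CuratedBlacklist`, `Claim`, `submit`, `resolve`) are illustrative assumptions for this post, not the actual Kleros API: a claim enters the list with a deposit attached, and a later ruling either accepts it (refunding the deposit) or rejects it (forfeiting the deposit).

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """A claim of abuse submitted to a hypothetical curated blacklist."""
    accused: str
    evidence: str
    deposit: float
    status: str = "pending"  # pending -> accepted / rejected

class CuratedBlacklist:
    """Sketch of a deposit-backed curated list (names are illustrative)."""

    def __init__(self, min_deposit):
        self.min_deposit = min_deposit
        self.claims = []

    def submit(self, accused, evidence, deposit):
        # A deposit deters frivolous submissions.
        if deposit < self.min_deposit:
            raise ValueError("deposit below the required minimum")
        claim = Claim(accused, evidence, deposit)
        self.claims.append(claim)
        return claim

    def resolve(self, claim, upheld):
        # Apply a ruling: refund the deposit if upheld, forfeit it otherwise.
        claim.status = "accepted" if upheld else "rejected"
        return claim.deposit if upheld else 0.0

    def blacklist(self):
        # Only claims upheld by a ruling make it onto the list.
        return [c.accused for c in self.claims if c.status == "accepted"]
```

The deposit is what makes the list self-curating: submitters are only willing to stake it on claims they believe the community will uphold.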
Let’s take the example of a player who is flying through a wormhole and discovers a Nyx Gallente carrier which, upon detection, withdraws all fighters and warps to a safe location. Even though this carrier could in fact blast him to kingdom come, it retreats. The pattern repeats several times in surrounding sectors with several different carriers of the same type.
The player records these flight patterns and determines that what he has uncovered is indeed a botting ring. He sends the information to the in-game Kleros subcourt and posts a deposit alongside the gathered evidence for other players to see. Randomly selected jurors, who are players themselves, analyze this data and can do further checking in-game to find more proof of this kind of breach.
The potential flow of a dispute.
After reviewing all the evidence, they decide beyond a shadow of a doubt that the Nyx carrier botting ring is indeed real and pass their judgement. The case is closed, and the moderators receive the judgement along with all the evidence needed to ban the offending players.
If the evidence is inconclusive, or the claim is found to be frivolous, the original poster of the dispute loses his deposit and no action is taken against the accused players.
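The dispute flow described above can be sketched in a few lines of Python. This is a simplification under stated assumptions: real Kleros juror selection is stake-weighted rather than uniform, and the function names (`select_jurors`, `rule`) are hypothetical, chosen just to mirror the narrative.

```python
import random

def select_jurors(players, n, seed=None):
    """Draw n jurors at random from the player pool.

    (In Kleros proper, selection is weighted by staked tokens;
    uniform sampling is a simplification for this sketch.)
    """
    rng = random.Random(seed)
    return rng.sample(players, n)

def rule(votes, deposit):
    """Tally juror votes (juror -> bool) on a botting claim.

    Returns the outcome and the amount refunded to the submitter:
    the full deposit if the claim is upheld by a strict majority,
    nothing if the claim is rejected as frivolous.
    """
    upheld = sum(votes.values()) * 2 > len(votes)
    if upheld:
        return "ban forwarded to moderators", deposit
    return "claim rejected", 0.0
```

For instance, `rule({"jurorA": True, "jurorB": True, "jurorC": False}, 50.0)` upholds the claim and refunds the 50.0 deposit, while a majority of "no" votes rejects it and the deposit is lost.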
The Kleros approach derives from the Justice as a Service concept: dedicated subcourts can be created for games such as EVE Online, automating dispute resolution. This can give players a stronger sense of ownership over their digital assets and further bolster the game economy.
There is no one who better understands the nuances in social norms of the community than the players themselves.
With the Kleros approach, players become part of the decision-making process rather than mere reporters of malfeasance. At the same time, power is distributed among dispute submitters, jurors and moderators.
This kind of private governance mechanism, based on cryptoeconomics, gives players a heightened sense of ownership over the universe they participate in. Instead of being governed by the unseen entities of game creators or an oligarchy of powerful players, they are given an instrument to govern themselves.
By creating a transparent, blockchain-based court, community norms can be studied in much greater detail through the court's public history, which would organically set the guidelines for inappropriate behaviour and shape the environment through precedent. It would also become easier to track down botters and botting rings and put a stop to this kind of abuse.
Online communities evolve rapidly, and social norms develop with them. The classic, centralized, top-down approach works only in systems with significant resources to manage and control the entire environment - and even then, abuse simply becomes more sophisticated and increasingly difficult to control. Just look at Facebook, for example.
It is exactly for this reason that it is important to put power in the hands of the users and, through a curated and managed process, allow them to track down the bot menace for the good of the universe.