CHEATING IN ONLINE VIDEO GAMES
Savu-Adrian Tolbaru
University of Copenhagen, Datalogisk Institut
Thesis committee:
Head of Studies:
Andrzej Filinski
Supervisors:
Dr. Klaus Hansen (Copenhagen University)
Dr. Michael Hafner (Innsbruck University)
Submitted: 15 August 2011
CONTENTS:

CHAPTER 1 – INTRODUCTION 7
1.1 Why video games? 7
1.2 Why is cheating detrimental to a game? 8
1.3 Motivation for cheaters 10
1.4 Understanding the player 12
1.5 Human-related cheats 12
1.6 Thesis overview 13

CHAPTER 2 – GOALS 14
2.1 Hypothesis 14
2.2 Research objectives 14
2.3 Main contributions 15
2.4 Success criteria 15

CHAPTER 3 – GAME TYPES AND VIDEO GAME ARCHITECTURE 16
3.1 Game types 16
3.2 Qualitative risk analysis 18
3.3 Game genres 18
3.4 General architecture of MMO and MO games 21

CHAPTER 4 – DEFINING AND CLASSIFYING CHEATING 23
4.1 Cheating definition 23
4.1.1 Proposed definition 24
4.2 Cheating classification 24
4.2.1 Cheating by exploiting misplaced trust 26
4.2.2 Cheating by collusion 28
4.2.3 Cheating by abusing game procedure 28
4.2.4 Cheating related to virtual assets 29
4.2.5 Cheating by exploiting machine intelligence 30
4.2.6 Cheating by exploiting client infrastructure 30
4.2.7 Cheating by denying service to peers 31
4.2.8 Timing cheating 32
4.2.9 Cheating by compromising passwords 33
4.2.10 Cheating by exploiting lack of secrecy 33
4.2.11 Cheating by exploiting lack of authentication 34
4.2.12 Cheating by compromising game servers 35
4.2.13 Cheating related to internal misuse 35
4.2.14 Cheating using social engineering 36
4.2.15 Game specific cheats 36
4.3 Proposed taxonomy 37

CHAPTER 5 – CHEATING BY COLLUSION 40
5.1 What is collusion 40
5.2 Classification of collusion 40
5.2.1 Collusion aimed at gaining experience or rating 41
5.2.2 Collusion aimed at trading information 41
5.2.3 Collusion to deadlock a system 42
5.2.4 Enemy team manipulation 42
5.2.5 Insider collusion 43
5.2.6 Collusion by impersonation - Boosting 43
5.3 Collusion detection 43
5.3.1 Rank trading 44
5.3.2 An AI-based detection approach 44
5.4 Collusion prevention 44
5.4.1 Collusion aimed at gaining experience or rating 44
5.4.2 Collusion aimed at trading information 45
5.4.3 Collusion to deadlock a system 45
5.4.4 Opposing team manipulation 45

CHAPTER 6 – VIRTUAL PROPERTY 46
6.1 What is virtual property 46
6.2 Ownership of virtual property 47
6.3 Tertiary markets 49
6.4 Player resentment 51
6.5 The spectrum of virtual world license agreements 52
6.6 Proposed solutions 53

CHAPTER 7 – VIRTUAL IDENTITY 55
7.1 Real life identity vs. virtual identity 55
7.2 The privacy paradox 57
7.3 Social engineering 61
7.3.1 Pretexting 62
7.3.2 Third party in-game merchants 63
7.3.3 Prevention 63
7.3.3.1 User awareness 63
7.3.3.2 Tools 64
7.4 Phishing 64
7.4.1 What is phishing? 64
7.4.2 Phishing attack vectors 65
7.4.2.1 Man-in-the-middle attack (MITM) 65
7.4.2.2 URL obfuscation attacks 66
7.4.2.3 Cross-site scripting attacks 68
7.4.3 Present session attacks 68
7.4.3.1 Hidden attacks 68
7.4.3.2 Observing customer data 69
7.4.3.3 Client-side vulnerabilities 70
7.4.3.4 Phishing through compromised web servers 70
7.4.3.5 Phishing using botnets 70
7.4.4 Message delivery 71
7.4.4.1 Email and spam 71
7.4.4.2 Web-based delivery 71
7.4.4.3 Fake banner advertising 71
7.4.4.4 In-game mails/messages 72
7.4.5 Prevention 73
7.4.5.1 User awareness 73
7.4.5.2 Email protection tools 75
7.4.5.2.1 Yahoo 75
7.4.5.2.2 Google Gmail 76
7.4.5.2.3 Microsoft-Hotmail 77
7.4.5.3 Browser protection tools 78
7.4.5.4 In-game solutions 79
7.4.5.5 Authenticators 80
7.4.5.5.1 Authenticator functionality 80
7.4.5.5.2 Weaknesses 82
7.4.5.5.3 Improvements 83
7.4.5.5.4 Further ideas 84

CHAPTER 8 – GAME SPECIFIC CHEATS 85
8.1 What are game specific cheats? 85
8.2 Cheating by exploiting a bug or loophole 85
8.3 Tweaking 86
8.4 Scripting and macros 87
8.5 Add-ons 87
8.6 Camping 87

CHAPTER 9 – CONCLUSION 88
9.1 Assessment 88
9.2 Human related cheats prevention - summary 88
9.3 Future work 88
9.4 Personal views 89

REFERENCES 90
Main References 90
Secondary References 91
TABLES:
Table 1: Cheating in a chess game…………………………………………………………………...9
Table 2: Motivation for cheating and the effect of that specific cheat…………………….………..11
Table 3: Video game types classification……………………………….……………………….….16
Table 4: Video game genres classification……………………………….…………………………19
Table 5: Cheating classification according to Yan & Randell (2005) ……………………………...26
Table 6: Cheating classification according to Gauthier Dickey et al. (2004) …………………….32
Table 7: Proposed taxonomy for cheating in online games……………………………….………..37
Table 8: Authentication inside the video game……………………………….…………………….56
Table 9: List of heuristics and cognitive biases……………………………….…………………….58
Table 10: List of motivational errors……………………………….……………………………….60
Table 11: Example of fake banner advertising……………………………….……………………..72
Table 12: Opera's security badges……………………………….…………………………………79
IMAGES:
Image 1: Risk analysis on game types…………………………….………...………………………18
Image 2: MMO and MO architecture ……………………………….……………………………21
Image 3: Picture taken on the 5th of June 2011 from eBay.com depicting the sale of a
level 85 World of Warcraft account……………………………….……………………………..…51
Image 4: Virtual identity……………………………….…………………..……………………….56
Image 5: Factors defining virtual identity……………………………….…………………….……57
Image 6: Man-in-the-Middle Attack……………………………….……………………………….66
Image 7: Example of in-game phishing ……………………………………………………………73
Image 8: User awareness for Yahoo.com……………………………….……………………..……73
Image 9: Example of digital tag for yahoo emails……………………………….…………………76
Image 10: Example of certified email icon next to the sender's ID……………………………….76
Image 11: Example of Gmail themes………………………………………………………...……..77
Image 12: Blizzard Authenticator activation page…………………………………………..……80
Image 13: Plaintext initialization request ………………………………………………………….81
Image 14: Authenticator initialization response message ………………………………………….81
Image 15: The initialization data encryption and decryption ………………………………………82
ABSTRACT
The video game industry is rooted in a multi-billion dollar economy that is constantly growing and, at the same time, capturing more and more of the population's interest by creating new ways of interaction and engagement, captivating millions of players. Assuring and maintaining a state of security in these virtual environments is therefore crucial. However, current research on cheating in online games has concentrated on technical solutions, ignoring those cheats and attacks that take advantage of the player's shortcomings. These cheats, referred to in this paper as "human-related cheats", do not seem to be well understood, and little academic research has been conducted into them. This paper provides a new definition and an extended taxonomy for cheating in video games, so as to eliminate the ambiguities around what it means to cheat and to identify the characteristics of each individual cheating category. The focus then shifts to the main topic: human-related cheats and methods of decreasing their risks. Additionally, the paper provides a detailed view of a newly introduced cheating category ("game specific cheats") in order to further clarify what can and what cannot be considered a cheat in certain games.
Chapter 1
Introduction
1.1. Why video games?
"Is That Just Some Game? No, It's a Cultural Artifact" (Chaplin, 2007). According to Tristan Donovan, author of "Replay: The History of Video Games", the video game* industry "spends, makes and loses billions creating experiences that thrill millions, and governments compete to woo game companies to their shores with generous tax breaks" (Donovan, 2010). The game industry was estimated at $47.5 billion (Go Rumors, 2010) and, according to a new report by Global Industry Analysts Inc., is expected to exceed $61.9 billion by the end of 2011 (GIA, 2011). Like every industry that deals in large amounts of money, it must take security seriously to make sure that its investments are not lost.
Moreover, games have long escaped the confined walls of the arcade*, becoming a vibrant part of
our lives, from cafes in China, to commuters playing games on their mobile phones, to the Korean
Air Force Starcraft II division (Jimmy & Hwang, 2009). Games have opened gateways to other
cultures. For instance, the success of the Japanese gaming company Nintendo, and in particular, the
Video game = in this paper, video game refers to any type of console game (on the Xbox 360, PS3, Wii, etc.) or PC game, excluding games played on arcade systems (Dictionary.com, 2011)
Arcade = a specially designed area with coin-operated terminals that allow the gamer to play specific games, usually only one game per terminal (they did not provide the flexibility of today's consoles and computers) (Dictionary.com, 2011)
success of the game Pokémon, have opened the West to a great part of Japanese culture: anime films and manga comics.
As the main driver of computer sales in the '80s and '90s, games have encouraged the acceptance of new technologies: the PlayStation 2 console helped make the transition from cassettes to DVDs, and the latest consoles are paving the move to high-definition TV (Donovan, 2010). The Wii and Kinect have completely revolutionized the interaction between the player and the game, changing control from pressing buttons to using gestures and spoken commands (Project Natal "101", 2009). Moreover, Microsoft's Kinect has enabled researchers to create a wide range of image-dependent applications, such as hologram-like images and 3-D models of homes, as well as helping NASA scientists hold teleconferences in three dimensions (GamrConnect, 2010). The Kinect is but one example where technology was pushed to the limit for entertainment and then became a catalyst for research and innovation.
Although video games were created for the purpose of entertainment, recent studies show that they can be used as educational tools (Masnick, 2009; Gee, 2003; Squire, 2005), improving skills such as business skills, physics skills, teamwork and leadership (Kimak, 2009). Additionally, video games have been used for training purposes by the military (Erwin, 2005) and in clinical rehabilitation for conditions such as amblyopia* (Hubert-Wallander et al., 2010); they can also have therapeutic effects on side effects associated with the treatment of cancer (such as nausea), help with anxiety management, or decrease asthma and diabetes attacks (Kato, 2010). However, these positive effects can be experienced only if the player remains immersed in the game; being an active participant is the most important part of acquiring these skills (Gee, 2003). In online video games*, whenever a community of gamers is overrun by cheaters, non-cheating players stop playing (Pritchard, 2000). In order to preserve the educational value of a game, its integrity and availability must be protected against cheaters.
Moreover, no matter how the user interacts with the game or how intricate the storyline, video games are still applications, and in this respect cheats are "in some ways similar to security holes in other software" (Mørch, 2003). Methods that prevent cheating in online games can therefore also help avoid similar security threats in other software.
1.2. Why is cheating detrimental to a game?
"I do not cheat; I just play by different rules." Cheating has a definite negative impact on the gaming community: cheaters disrupt normal gameplay*, either making progression through the game impossible or manipulating the outcome (Pritchard, 2000). As games have evolved over the years, so have cheating methods. Games no longer require players to be in the same room, and the evolution of the internet has allowed players from all over the world to play together, making it more difficult for players to identify cheaters. In single-player games, such as Solitaire, the cheater can only "hurt" himself, finishing a game in a shorter amount of time with less effort and even skipping important scenes, which may decrease his overall immersion. Additionally, closed multi-player video games do not offer the anonymity of the internet, so cheating is usually not much of a problem (Kuecklich, 2004). In this respect, the video game is similar to a game of chess. Cheating in chess can take place before, during, or even after the game, and can occur in different ways: collusion with spectators or other players, rating manipulation, misuse of the touch-move rule, the pre-arranged draw, or the use of psychological tactics to unsettle an opponent (see Table 1).
Amblyopia = "dimness of sight, without apparent organic defect" (Dictionary.com, 2011)
Online video game = any type of video game that requires a client-server connection; the term excludes games that run through peer-to-peer connections (Castronova, 2005)
Gameplay = goal-oriented activity based on overcoming challenges within game rules through game mechanics (Sicart, 2009)
Table 1: The different methods which can be employed to cheat in a face-to-face chess game

1. Collusion with spectators or other players. Example: a pre-arranged draw.
2. Misuse of the touch-move rule. The touch-move rule states that if a player has touched a chess piece, the player is required to move that particular piece; if the player wishes to adjust a piece's position, he must inform the other player.
3. Cheating with technology. Example: communication with an accomplice who has access to a chess program.
4. Rating manipulation. Example: the manipulation occurs when game results are determined before the game starts.
5. Use of psychological tactics to unsettle an opponent. Example: passive-aggressive behaviour.
Any of these cheats is easier to identify in a face-to-face game than in an online game, where the players are not in the same room and as such cannot keep an eye on their opponent's moves. In traditional games, cheating is usually obvious: stuffing an ace in a sleeve, moving a chess piece when the opponent goes to the bathroom. Discovering a cheater in online games, however, is more difficult. In open multi-player games cheating can be virtually undetectable and can destabilize whole game-worlds. Unlike traditional, single-player video games in which a human player competed against a non-human character* driven by artificial intelligence, online games have opened the door to player communication, allowing players to compete against other human players over computer networks (Webb, 2004). Cheating is thus no longer an activity that harms no one (but the cheater's skill or self-esteem); it harms other human players, ruining the gaming experience* for non-cheating players. In multi-player video games, cheats do not only change the experience of the cheater but affect other players as well. For example, in a game like Counter-Strike, players equipped with automatic aiming algorithms, or "aimbots", are superior to honest players to such an extent that their avatars are virtually invulnerable (Kuecklich, 2004).
Most of the time, a player cheats in order to artificially increase their in-game ranking without actually putting in the effort and skill required. Once a game is infested with cheaters, honest players either stop playing altogether or play only amongst people they trust. During the last 20 years many online games have lost a considerable number of players due to cheaters: Age of Empires, America's Army and EverQuest are amongst the games affected (Spohn, 2002). Hardy (2009) states that one out of five people who participated in his poll stopped playing or avoided a game due to cheating. Another famous example of cheating in games is Blizzard's Diablo, amongst the first truly successful commercial online games, where the gaming experience was seriously affected by the amount of cheating among participants (Smith, 2007). In a survey conducted by the gamer magazine Games Domain (Greenhill, 1997), 35% of the respondents who played Diablo confessed to having cheated in the game (n=594). Interestingly, when asked whether a cheat- and hack-free gaming environment would have decreased or increased the game's life, 89% of the professed cheaters stated that they would have preferred not being able to cheat. This gives rise to a social dilemma: the players queried are tempted to cheat, but knowing that others face the same temptation, they would prefer that no one can do it (Smith, 2007).
Non-human character = in-game characters controlled exclusively by artificial intelligence; in role playing games they are also called NPCs (Non Player Characters) (Ellison, 2008)
Gaming (user) experience = "a momentary, primarily evaluative feeling (good-bad) while interacting with a product or service" (Hassenzahl, 2008)
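The arithmetic implied by the survey figures above can be made explicit. The following is only an illustrative back-of-the-envelope calculation; the rounding, and the assumption that the 89% figure applies to the professed cheaters, are this sketch's, not Greenhill's:

```python
# Rough counts behind the Games Domain survey figures (Greenhill, 1997),
# as quoted in the text: n = 594 Diablo players, 35% admitted cheating,
# and 89% of those professed cheaters preferred a cheat-free environment.
respondents = 594
cheater_share = 0.35
prefer_cheat_free_share = 0.89

professed_cheaters = round(respondents * cheater_share)
prefer_cheat_free = round(professed_cheaters * prefer_cheat_free_share)

# Roughly 208 professed cheaters, of whom roughly 185 would rather
# have been unable to cheat at all -- the social dilemma in numbers.
print(professed_cheaters, prefer_cheat_free)
```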
Therefore, one of the main reasons to protect online games from cheaters is the player's user experience, which is very important to game companies and to the success of a game, as "the customer experience [...] is the key driver of online success" (Pritchard, 2000).
Cheating in online games affects not only the game-play and the enjoyment of non-cheating players but also the quality of service (Hardy, 2009) that the game developer provides. In addition, most games are not free: the player has to buy the game itself and in some cases also pay a monthly fee. These people expect to get their money's worth and to play in environments where they can compete on equal terms and test their skill against other players' skills. Games with a low quality of service are not considered "fun" to play and usually lose players, a situation that has an economic impact on the game developer. The loss of gamers not only brings financial loss to the company but also creates a bad reputation which may affect its future work. In this situation the game company stands to lose the most as players move on to more "secure" games. The players' frustration and negative view of the game attaches to the developers, who are blamed for the game's security flaws.
As Pritchard (2000) argues: "Cheating hit closer to home for me while I was working on the final stages of Age of Empires II: The Age of Kings. Cheating online became a widespread problem with the original Age of Empires. Tournaments had to be cancelled due to a lack of credibility, the number of online players fell, and the reputation of my company took a direct hit from frustrated users. Unable to spare the resources to fix the game properly until after Age of Kings was done, we just had to endure our users turning their anger upon us -- probably the most personally painful thing I've experienced as a developer."
However, game companies are not the only ones who make money from games. Some online games offer potential economic revenue for cheaters. For instance, a fully geared level 85 World of Warcraft account can be sold on the IGN Account website (IGN, 2011) for prices ranging from 100 to 360 Euros. Moreover, game accounts are not the only in-game items that can be exchanged for real-world money: the same website also sells, and allows players to sell, in-game currency and items. If cheaters can find a way to gain more in-game currency*, high-level characters or special items, they can sell these virtual assets* for real money. Because of the appeal of earning a living while playing, players may resort to fraud, increasing the overall number of cheaters and attacks in video games (McCurley, 2010).
1.3. Motivation for cheaters
Pritchard (2000) claims that a lot of cheaters use cheating tools and mechanisms in order to win games, to "dominate and crush opponents, trying to make other players think they are gods at the game". He states that there is no ethical dilemma that would concern them, since the anonymity and artificiality of the Internet "seems to encourage a moral vacuum where otherwise nice people often behave in the worst possible way" due to the lack of serious consequences (if a cheater is caught he might be rejected by other players or banned by the gaming company, but he can always establish another identity and continue from where he left off).
Consalvo (2005) identifies four main reasons why players cheat in video games: (1) when they get stuck, (2) to "play God", (3) to "fast forward" through unpleasant or boring parts, and (4) to annoy other players. She argues that the most cited and accepted reason players offer for cheating in games is getting stuck: they reach a point where they cannot progress further without help, and turn to guides, codes, or friends to help them (Consalvo, 2005).
In-game currency = similar to real life currency, in-game currencies are virtual tokens used as a medium of exchange, the "money" used in the game's world (Corwin, 2009)
Virtual asset = anything that has value for the gamer or the gaming community: items, characters, in-game currencies, etc. (Castronova, 2005)
Some players cheat to "play God" or to have fun, not wanting to get ahead of or defeat another human player but to "bring more pleasure to an already pleasurable experience", such as by doing everything possible in a game, discovering all the secret options, etc.
Players in the third category (who cheat to "fast forward" through content) cheat in order to avoid unpleasant or boring parts of the game. In this case, cheating is usually instrumental in nature, the player wishing to complete a game without fully engaging in all its aspects.
The last category includes players who cheat to distress other players or to defeat them without having the necessary qualifications ("to level the playing field"). For these players cheating focuses on the reaction of other players and may not be tied to actual self-advancement: "The sad truth is that the Internet is full of people that love to ruin the online experience of others". On the other hand, some claim that they have the right to cheat because they are superior players.
Spohn (2002) and Pritchard (2000) also introduce the monetary incentive for cheating: "wanting to have a heap of game currency to sell on eBay" (Spohn, 2002); "Now look at the real money changing hands for virtual characters and items. [...] Let's not overlook the growth of tournaments and contests for online games" (Pritchard, 2000).
The following table (Table 2) lists the cheating motivations as presented by the previously mentioned researchers and their respective effects on the gaming community:

Table 2: Motivation for cheating (according to Spohn, 2002; Consalvo, 2005; and Pritchard, 2000) and the possible effect of each cheat on the overall game experience.

1. To dominate and crush opponents. Effect: creates frustration and ruins the spirit of sportsmanship that is a key driver in most competitive games.
2. To get unstuck. Effect: usually no real damage is done; players who cheat to get unstuck do so in single player games, where cheating is "allowed" (for details see single player games in chapter 3.1).
3. To play God. Effect: depending on the game type, this category of cheats can have little to very high effect (see chapter 3.1).
4. To fast forward through unpleasant/boring parts. Effect: same behaviour as for "getting unstuck".
5. To annoy other players. Effect: this type of cheat is directly aimed at other players, having a high negative impact on the gaming community.
6. To gain in-game items/money/game accounts that can be sold for real life currency. Effect: highly damaging to the game infrastructure, with a high chance of the cheat occurring due to the real life currency gain and little to no consequences for the cheater if he gets caught.
1.4. Understanding the player
Emma Woollacott (2010) mentions a BitDefender study stating that gamers are more prone to phishing and social engineering attacks than other social network users, since gamers are more willing to add a player to their friend list in order to have someone to play with or to win higher scores. BitDefender created three fake accounts containing different amounts of information and signed them up to a generic interest group on a social network. They noticed that social network users befriended the profiles that had more information (23 friends for the first profile, 47 for the second and 53 for the last). In a games group, on the other hand, they noticed that players were friendlier and more trusting (85 for the first profile, 108 for the second and 111 for the third). Since players are more susceptible to attacks than normal users, it is vital that we identify what makes players more vulnerable, how human-related attacks differ for gamers and how they can be stopped.
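The gap reported by BitDefender can be roughly quantified from the acceptance counts quoted above. The averaging and the ratio below are this sketch's own illustration, not part of the study:

```python
# Friend-request acceptances for three fake profiles (least to most
# detailed), as quoted in the text for the BitDefender experiment.
generic_group = [23, 47, 53]   # generic interest group on a social network
gamer_group = [85, 108, 111]   # games group on the same network

avg_generic = sum(generic_group) / len(generic_group)   # 41.0 per profile
avg_gamers = sum(gamer_group) / len(gamer_group)        # ~101.3 per profile

# Gamers accepted roughly two and a half times as many requests
# per profile, regardless of how much detail the profile contained.
ratio = avg_gamers / avg_generic
print(round(ratio, 2))
```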
1.5. Human-related cheats
A lot of attention has been devoted to technical cheats that disrupt the gameplay and experience of other people, while cheats that exploit the human element have been almost completely ignored. However, as technical cheats usually receive solutions (it being easier for the gaming company to protect against them), players are left to fend for themselves against human-related cheats. Due to the monetary incentive for cheating (money that can be obtained by accessing the credit card attached to the account or by selling the items or accounts), attacks on the gaming experience and on game accounts have increased, with attackers using more than one medium to gain access to the "assets": attacking players through their email clients or social networks, or going directly to the source and employing social engineering attacks on the game developer's customer service. The problem becomes more serious when we realize that the gamer's "whole life" may be attached to the gaming account, since most games require an email account and a credit card to register. Such was the case of editor Dan 'Shoe' Hsu (2008), who found out that his MSN account had been hacked through his Xbox Live* account and realized that the attacker had deleted his friend list, changed his Gamertag*, made purchases using his credit card, spent his Microsoft Points* and actively harassed his industry friends and contacts through his MSN Messenger list.
Stephen "Stepto" Toulouse, lead program manager for policy and enforcement on Xbox Live, mentioned that attackers can employ three human-related types of attacks on the player, attacks to which the player can be oblivious (Hsu, 2008). The first attack involves social engineering during game play: attackers will play alongside honest players, chat with them, get friendly and then start asking about their interests: "they'll ask you if you have a favorite car…they'll bring it up in the casual flow of conversation… and get you to divulge certain things". We have to take into consideration that gaming is a social activity: you go online to play and make friends, so this type of conversation will not seem suspicious to many players, and thus they are open to attacks. In most cases, attackers will try to exploit cognitive biases in order to get players to "open up", such as by sharing a bit of personal information to coax the player into doing the same. A need therefore arises to educate players on identifying these attacks.
The second attack mentioned by Toulouse involves online research done by the attacker: "As our core audience is very technologically savvy – they're all on Facebook, MySpace, or they're on various social networking sites – they'll put up a wealth of information about themselves. They'll blog about their very first pet … or [other] personal details. So with a little bit of research and time, the attacker can try to get information that way".
Xbox Live = the online service provided for Xbox players (XBOX, 2011)
Gamertag = the name used for a gamer's Xbox Live player profile (Qualls, n.d.)
Microsoft Points = the currency of the Xbox Live marketplace, which allows players to purchase content without a credit card (XBOX Live, 2011)
The third path mentioned by Toulouse involves "trying their luck" with customer service: after having gathered enough information about the player (using one or both of the above-mentioned methods), attackers will start "making dozens and dozens of phone calls to customer support and try to get customer support to divulge information" or convince the employees that they are indeed the player. This scenario happened to GameSpy Console Editor-in-Chief Will Tuttle (Hsu, 2008), who called customer support after realizing that his account had been hacked: "Oh Mr. Tuttle, didn't we speak yesterday?" the support rep asked. They hadn't. As Toulouse mentions, in these attacks the technology itself is not the issue: there are no software vulnerability gaps that allow these attacks to happen, and in each case the game company representatives realized that the attack occurred through human-related methods. In some cases, players do not even realize how the information became available, since there are situations where a friend will divulge it, seeing no harm in telling anyone that a different player might have a cat named Waffles. Therefore, more attention has to be given to such attacks, and new protection and prevention methods must be identified.
1.6. Overview
Chapter 2 ("Goals") highlights the hypothesis and research questions, as well as the success criteria for the paper. Chapter 3 ("Game types and video game architecture") familiarizes the reader with game genres* and game types*, and gives basic insight into which types of games have a higher threat level, based on a qualitative risk analysis. Chapter 4 ("Defining and Classifying Cheating") provides a definition for cheating in online games as well as an up-to-date taxonomy of the different methods employed, based on the taxonomies offered by Yan and Randell (2005), Webb et al. (2007), Neuman et al. (2007), Gauthier Dickey et al. (2004) and Junbaek et al. (2004). Chapter 5 ("Cheating by Collusion") introduces cheating by collusion, presents the categories into which this cheat is divided and proposes methods of prevention. Chapter 6 ("Virtual Property") presents cheating related to virtual assets or real-money transfer, and the effect such cheats have on the online gaming community and honest players. Chapter 7 ("Virtual Identity") examines two cheating categories in detail, grouped because of their similar aspects. More precisely, chapter 7 presents the risk of social engineering, the motivation behind the success of these attacks and methods of preventing them. The chapter also describes phishing methods in online games, as well as the implemented methods of preventing phishing attacks and other methods of mitigating them, such as the authenticator, a method that could prevent most attacks based on cognitive biases*. Chapter 8 ("Game specific cheats") focuses on a new category of cheats, game specific cheats, giving examples of such cheats and explaining in more detail why they are considered "game specific" and why they cannot be interpreted as any other cheat type. Chapter 9 ("Conclusion") presents conclusions based on the preceding research. Additionally, the paper is assessed in relation to the success criteria specified in Chapter 2, and future research is proposed. Finally, the reference section has been split into two groups, main references and secondary references, in order to highlight the most important sources (those that have been referenced the most and have had an impact upon the current research).
Game genre = a kind or category of video games defined by common traits (Apperley, 2006)
Game type = in this paper, refers to the basic architecture of the game, e.g. single player, multiplayer online, etc.
Cognitive biases = defined in chapter 7 "Virtual Identity"
Chapter 2
Goals
2.1. Hypothesis
Although gaming is a multimillion-dollar industry that grows significantly every year, not enough
research has been done in the field of video game cheating. The lack of research in this field can be
observed in the definition (which leaves room for ambiguity) and in the taxonomy provided by
Yan & Randell (2005). Unfortunately, current cheat-prevention solutions focus only on technical
issues*; examples include "Catching Cheaters with Their Own Computers" (Naone, 2007), DMW
Anti-Cheating (DMW, 2011), nProtect (2011) and Even Balance (2011). The changes required by such
work target the core architecture of video games, and even the general architecture of today's
computers, and very little research has been done in the field of human-related video game cheats.
Many video game companies, however, have already realized the need to increase player awareness
through internal campaigns (such as Blizzard Support, 2011), which leads to a disjuncture between
the methods used by various game developers, since there is no generally accepted research in this
specific field.
The paper starts by providing a sound definition of cheating in video games and extends existing
taxonomies based on the new definition. The remainder of the research focuses solely on
human-related cheats in online games and on the new categories of cheats proposed by the new
taxonomy. The types of cheats discussed in detail in this thesis are: cheating by collusion; cheating
through social engineering and phishing (both concerning virtual identity); cheating related to
virtual assets (virtual property); and game specific cheats (with a few examples of behavior that
may be considered a cheat in some games yet be allowed in another, similar game).
The study shows in which circumstances human-related cheats occur: what the necessary
conditions are and how they affect the game for non-cheating players. Based on this, the paper
provides methods used to prevent or mitigate these cheats, taking into consideration the existing
literature (research studies, as well as forums and magazine articles). Additionally, the thesis
proposes new methods of prevention, based on how heuristics and biases affect the way players
deal with risk and uncertainty. The current paper therefore addresses four important categories:
virtual property, virtual identity, player collusion, and game specific cheats.
2.2. Research objectives
The paper plans to meet the following research objectives:
• Search and critically analyze literature on relevant topics.
• Provide a new, comprehensive and unambiguous definition of cheating.
• Describe and analyze existing taxonomies and propose a new taxonomy of cheating in video games based on the existing literature and on the newly proposed definition of cheating. The taxonomy will then be used to differentiate between cheats that are technical and cheats that are human related*.
• Analyze in detail selected categories (those that can be categorized as human-related cheats in the previously defined taxonomy, as well as any newly proposed categories in that taxonomy).
• Provide feasible solutions for decreasing the chances of successful cheats, and even of attempted cheats, for each of the categories analyzed in detail.
• Draw attention to categories of cheats that require future research.

Technical issues = in this case, cheats that are employed through technical methods; the category excludes cheats that are purely based on human interaction. Synonymous with technical cheats.

2.3. Main contributions
The thesis contributes an unambiguous definition of cheating as well as an up-to-date taxonomy of
cheating in online games (extending the existing taxonomy given by Yan & Randell, 2005).
Additionally, since the majority of research on cheating in online games revolves around technical
cheats and ignores the types that involve human interaction, the current paper presents the
human-related cheats and offers solutions to help prevent future attacks (based on existing
solutions put into practice by companies), as well as proposing new prevention solutions. Moreover,
the thesis analyzes in detail any new category added to the taxonomy after the initial research
phase is complete. Thus the paper intends to contribute both to the core of cheating in video games
(a new definition and taxonomy) and to specific categories of cheats, where its results may be used
directly to decrease successful attacks of the types presented in detail.
2.4. Success criteria
The paper is considered successful if it meets all the research objectives and adds to the existing
literature in the field of video game cheats: it should provide a definition and taxonomy that are
unambiguous and easily applicable, and ensure that the categories examined in detail are clearly
explained. Additionally, the paper should provide feasible solutions for decreasing the number of
attacks and/or the number of successful attacks in the specific categories.
Human-related cheats = as opposed to technical cheats, human-related cheats rely purely on human interaction (definition proposed in the thesis).
Chapter 3
Game types and video game architecture
In this chapter a general introduction to video games is provided, focusing on the possible game
genres and the game types supported by each genre. This introduction is aimed at readers who
require general information about video games (classifications, definitions, architecture, etc.).
3.1. Game types
The following table (Table 3) describes the possible game types, with an emphasis on the
consequences of cheating and the number of players affected by cheating:

Table 3: Video game types classification: the rules related to cheating, the game's goal, as well as cheating consequences and specific observations.

Single Player (SP) (Schreier, 2010)
Rules: Cheating "allowed", even facilitated by cheat codes provided by the game's system; however, the user must accept the EULA terms in order to install the game.
Goal: Exploring the game story, completing the game.
Cheating consequences: None.
Observations: Gamers playing in an instance of the game: 1.

Local Area Network / Peer To Peer (LAN/P2P) (Lehn et al., 2010)
Rules: Rules specified and agreed upon verbally by the players/local admin.
Goal: Competing/working together with friends.
Cheating consequences: Disturbing the game flow and competitive spirit of the other players; may result in local penalties.
Observations: Gamers playing in an instance of the game: 2-16.

Multiplayer Online (MO) (Shenglong et al., 2007)
Rules: Rules specified and constantly updated in the EULA and ToS; players must agree to the rules in order to join the online world.
Goal: Playing with allies or against other players in order to enjoy the challenge and gain levels/ranks on the game's leaderboard.
Cheating consequences: Artificially increasing standings, disturbing game flow etc. May result in permanent banning of the account or even legal action taken against the cheater.
Observations: Gamers playing in an instance of the game: 2-16; however, their ranks are compared to the ranks of all other players active on the same server.

Massive Multiplayer Online (MMO) (Shenglong et al., 2007)
Rules: Rules specified and constantly updated in the EULA and ToS; players must agree to the rules in order to join the online world.
Goal: Socializing, working together in large groups to surpass otherwise impossible obstacles.
Cheating consequences: Cheats in MMOs may have colossal effects; in some games, such as Second Life, in-game items can be bought with real money and some people earn their living just by selling virtual items; a cheater disrupts the normal sales, or even steals the items and sells them himself. May result in permanent banning of the account or even legal action taken against the cheater.
Observations: Gamers playing in an instance of the game: 1000+; cheating can have widespread effects due to the butterfly effect (Butterfly Effect in EVE Online).

Hot Seat (HS) (GiantBomb, 2011b)
Rules: Rules set only between friends; usually the system is completely based on reciprocal trust.
Goal: Competing or playing with friends/having fun against AI using the same computer/console.
Cheating consequences: Cheating in this case ruins the spirit of competition and can disrupt the game experience of the other players to such an extent that they need to establish rules (e.g. players that have ended their turn must not look towards the screen while waiting for their next turn) or even completely give up the hot seat type of game.
Observations: Gamers playing in an instance of the game: 2-8; cheating is easily achieved: normally, when a player has ended his turn he needs to look away from the screen and allow the other player to perform his own actions; if the player who is not active at the moment looks at the screen, he can gain information about the actions of his competitors.

Split Screen Multiplayer (SS) (GiantBomb, 2011a)
Rules: Rules set only between friends; usually the system is completely based on reciprocal trust.
Goal: Competing or playing with friends/having fun against AI using the same computer/console, with the screen split into 2-4 regions, one for each player.
Cheating consequences: Cheating ruins the spirit of competition and upsets the flow and immersion of other players.
Observations: Gamers playing in an instance of the game: 2-4; cheating in such games is very easy, since at any moment a player can look at another player's part of the screen to gain sensitive information he would not normally have known.
Note: Cheating in SP games on the Xbox 360 is allowed, but achievement points are no longer
granted to players that use such cheats (achievement points are used to compare the rankings of
players for their single player activities); trying to use cheats to trick the system into awarding
achievement points (e.g. bots controlling the player's character) may result in banning of the
account. Also, the cheats provided by game developers for single player are only intended to help
the player progress in the game, enabling him to see the full story even if he is unable to complete
some tasks; these codes are automatically disabled in any other game type.
3.2. Qualitative risk analysis
A qualitative risk analysis can now be derived from the previous table, taking into account
the chance of a cheat occurring and the value of loss (given by the gravity of the loss multiplied by
the number of players such a loss affects) when such a cheat occurs (see Image 1).
Image 1: Qualitative risk analysis on game types, taking into account the chance of a
cheat occurring and the value of loss (the gravity of loss multiplied by the number of players affected)
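The value-of-loss calculation used in this risk analysis can be sketched in a few lines of code. The sketch below is purely illustrative and not part of the original analysis: the likelihood, gravity and player-count scores assigned to each game type are assumed values, chosen only to demonstrate how the risk ordering (MMO highest, SP lowest) falls out of the formula.

```python
# Illustrative sketch of the qualitative risk analysis above:
# risk = chance of a cheat occurring * value of loss,
# where value of loss = gravity of loss * number of players affected.
# All numeric scores below are assumptions, not measured data.

GAME_TYPES = {
    # name: (cheat_likelihood 0-1, gravity 1-5, players_affected)
    "SP":  (0.9, 1, 1),      # cheating is common but harms nobody else
    "HS":  (0.5, 2, 4),
    "SS":  (0.5, 2, 4),
    "LAN": (0.4, 2, 16),
    "MO":  (0.6, 3, 16),
    "MMO": (0.6, 4, 1000),   # widespread knock-on effects
}

def risk(likelihood: float, gravity: int, players: int) -> float:
    """Risk = likelihood * (gravity of loss * number of players affected)."""
    return likelihood * gravity * players

# Rank the game types from highest to lowest risk.
ranked = sorted(GAME_TYPES.items(), key=lambda kv: risk(*kv[1]), reverse=True)
for name, params in ranked:
    print(f"{name:4s} risk = {risk(*params):8.1f}")
```

With these assumed scores, the ordering matches the one used later in section 3.4: MMO comes out with the highest risk and single player with the lowest.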
3.3. Game genres
In order to link the game types (and their estimated risk) to specific games, the following table
(Table 4) describes the basic game genres, listing which game types are supported by each genre
and providing a general description of each. Additionally, the table presents specific details on
cheating, or on the skills required of gamers when playing games from the genre in question (these
skills will most likely form a good target for exploits; e.g. if a game requires a player to be fast,
there is a high chance of cheating in the form of "bot" applications which execute commands with
lightning speed), as well as a few examples of recent games that belong to each category.
The game genres, examples and descriptions have been taken from ign.com, gamespot.com and
Rollings & Adams (2006). In some cases the sources did not fully agree on the definition of a
genre. For instance, City Builder games were categorized as simulations by gamespot.com;
however, the other two sources agreed that city builders form a genre of their own, and as such the
following table recognizes the category as a genre and not a subgenre. All subsequent conflicts
have been resolved in a similar manner. Moreover, new games tend to combine features of multiple
game genres, leading to hybrid types such as action-adventure and Action-MMORPGs, which
usually provide the game modes resulting from the union of the parent genres.
Table 4: Video game genres classification: game genre, types supported (see Table 3), description, specific details and examples (based on Apperley, 2006).

Action/Beat-them-up
Description: Games featuring melee combat between the player and a large number of AI-controlled enemies.
Specific details: Cheating in such games is not an issue since they only support single player.
Examples: Devil May Cry, Bayonetta.

Adventure
Description: "Video game in which the player assumes the role of protagonist in an interactive story driven by exploration and puzzle-solving instead of physical challenge" (Rollings & Adams, 2006).
Specific details: Cheating in such games is not an issue since they only support single player.
Examples: Uncharted.

Arcade
Description: Simple, intuitive games with short levels, usually based on games that ran on arcade machines (coin operated).
Examples: Space Invaders.

Board Games
Description: Games in which pieces are moved on a board according to a set of rules.
Specific details: Cheating in such games can even be based on running several instances of the game at the same time, using the other instances to see which moves the opponent is most likely to make in the future.
Examples: Hoyle Board Games, Chessmaster 2000.

City Builder
Description: Players act as both architects and rulers, creating an optimal infrastructure while ensuring economic growth.
Specific details: The player's success is based on his ability to properly manage his resources.
Examples: SimCity.

Fighting
Description: The player controls a character fighting against another character; both characters have a limited amount of hit points and abilities.
Specific details: The easiest cheat occurs when using an ordinary keyboard (which has a limit of 5 simultaneous key presses); player 1 can press 5 keys and keep them pressed in order to nullify the effect of any of player 2's inputs (when such a game is played on the same computer).
Examples: Mortal Kombat, Street Fighter, Soul Calibur.

Gambling
Description: Games of chance that usually allow betting with either virtual currency or real money.
Specific details: Cheating in such games could cost the game company and/or other players a substantial amount of money/virtual currency.
Examples: Poker, Roulette.

Massive Multiplayer Online Role Playing Game (MMORPG)
Description: Online RPGs which allow thousands of gamers to play in the game's evolving virtual world at the same time.
Specific details: In MMORPGs a player's actions can resonate throughout the entire game world (Butterfly Effect in EVE Online).
Examples: World of Warcraft, Guild Wars, Rift, Second Life, EVE Online.

Puzzle
Description: Games that focus on logical and conceptual challenges.
Examples: Bejeweled, Limbo.

Racing
Description: Games built around races with any type of vehicle.
Specific details: Examples of cheats: 1. creating a bot that controls the car perfectly over a specific race track (with pre-specified turns, braking times etc.); 2. intentionally driving in the opposite direction in order to collide with other players (allowing a "friendly" player to gain an advantage).
Examples: Need For Speed, Gran Turismo.

Real Time Strategy (RTS)
Description: Strategy games in which the in-game timer is constantly moving, allowing the players to control their buildings and units at any point in time; as opposed to TBS, where the player has control only when his turn has arrived.
Specific details: Constant adaptation to the enemy's movements and lightning-fast actions are the main features required from gamers of this genre.
Examples: Starcraft, Warcraft, Shogun.

Rhythm
Description: Games which focus on dancing or the use of musical instruments in rhythm to music.
Specific details: Success is based on the player's ability to synchronize key presses with the music.
Examples: Dance Dance Revolution, Rock Band.

Role Playing Game (RPG)
Description: The player controls one or more characters in order to fulfill quests and immerse himself in the story of the game.
Specific details: Duping, a cheat which duplicates items, can be abused in such cases to sell an infinite amount of very rare items, which has serious effects on "Virtual Property".
Examples: The Witcher, Dragon Age, Diablo.

Shooter (first person shooters: FPS; third person shooters: TPS)
Description: The player uses one of several weapons in order to dispose of enemies: either first person, in which the player sees the game as if looking directly through the eyes of the controlled character, or third person, where the player sees the entire body of the in-game character.
Specific details: Precision aiming and environmental awareness are the key features required from gamers of this genre.
Examples: Medal Of Honor (FPS), Call Of Duty (FPS), Gears of War (TPS).

Simulators
Description: Games striving to achieve the maximum level of realism; they may also be used for training purposes.
Specific details: Flight simulators are the most common; however, a variety of games fit this genre, such as surgery and hazard simulators.
Examples: Microsoft Flight Simulator X, Aces High.

Sports
Description: "Games that strive to simulate sporting events as they are in real life" (Rollings & Adams, 2006).
Specific details: Since sports games are simulations of real sports, where cheating is considered taboo, cheating in these games is also looked upon as a lack of sportsmanship and generally ruins the gaming experience of other players.
Examples: Fifa, NHL, SSX.

Turn Based Strategy (TBS)
Description: Strategy games in which the players are limited to a small number of actions in each turn; once a player's actions are used up, he has to wait for the other players to perform their own actions.
Specific details: As opposed to RTS games, TBS games do not require quick decisions and fast movements; their focus is on the tactical side of battles.
Examples: Heroes Of Might And Magic, Disciples, Risk.
3.4. General architecture of MMO and MO games
Since the highest risk can be observed in MMO (very high) and MO (high) games, Image 2 presents
the general architecture of an MMO game from the perspective of the user (the normal player) and
of the Game Master (who has special privileges within the same game application as the player, and
who can even "materialize" as a player avatar next to the gamer to assist him). In multiplayer
games, on the other hand, the game master application can be completely separate from the client
application, or just an extension of the normal game client with unlocked command line
capabilities, such as "/goto Location X", which "teleports" the game master to the specified location.
Having specified the game genres and their game types, as well as a risk analysis based on game
types, it becomes easy to estimate the risks for a game genre: if the game genre supports a specific
game type, then the game genre inherits the risk from that game type.
Image 2: MMO and MO architecture from the perspective of a normal player and a
game master (moderator or game administrator).
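The split between normal-player and game-master capabilities described above can be sketched as a command dispatcher that checks the caller's role before executing privileged commands. This is a hypothetical illustration only: the command names, role labels and handler logic below are assumptions, not code from any real game client.

```python
# Hypothetical sketch of a game-master command dispatcher: privileged
# commands such as "/goto" are accepted only for clients whose role is
# "game_master"; normal players receive a refusal.
# All command names and role labels are illustrative assumptions.

PRIVILEGED_COMMANDS = {"/goto", "/summon", "/ban"}

def handle_command(role: str, command: str) -> str:
    name = command.split()[0]
    if name in PRIVILEGED_COMMANDS and role != "game_master":
        # The check must run server-side: a client-side check could be
        # patched out by a cheater (see the client-tampering techniques
        # discussed in Chapter 4).
        return f"refused: {name} requires game-master privileges"
    if name == "/goto":
        location = command.split(maxsplit=1)[1]
        return f"teleported to {location}"
    return f"executed {name}"

print(handle_command("player", "/goto Location X"))       # refused
print(handle_command("game_master", "/goto Location X"))  # teleported
```

The design point is that the role check belongs on the server: if the "unlocked command line capabilities" were merely hidden in the client, modifying the client would unlock them.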
Note: due to space limitations the diagram presents a simplified model containing a single user, a
single game master and a single server. In practice, however, there are several thousand users
connected to several servers, each server with a number of game masters supervising it to ensure
optimal gameplay.
The diagram also presents the hardware and software components that are usually used by games.
Each of these components can be individually exploited; for optimal protection, all of the
components in the diagram must be secured, including the activities of the user and the game
master (and of the server administrators, who are not shown in the diagram; as with any other
application, there are always server administrators who have absolute access to the game's
databases and settings and who usually run with fully trusted privileges).
Chapter 4
Defining and classifying cheating
The purpose of this chapter is to provide a suitable definition of cheating in online video games.
The definition is needed in order to distinguish between human-related cheats, which are the focus
of this paper, and technical cheats.
4.1. Cheating definition
Cheating is a major concern for online games, since the presence of a cheater inside the gaming
community can have a detrimental effect on the whole game (Pritchard, 2000). In order to
properly classify the methods of cheating employed in most online games, we must first define what
cheating in online games is.
A precise and unambiguous definition of cheating is vital for both video game players and game
publishers, to ensure that both sides are clear on what represents illegal behavior. Unfortunately,
the definition of cheating varies between games and their publishers. For instance, some games
allow players to use macros or scripts to repeat certain sequences of key strokes (such as World of
Warcraft), while others forbid their usage (e.g. Guild Wars). Additionally, it is often difficult to
differentiate between a player who cheats and a player who is using special tactics or possesses
out-of-the-ordinary abilities. For instance, the "camping" behavior in first person shooter games,
in which a player lies in wait in a particular location and kills other players as they pass, is
considered cheating by some. They argue that camping spoils the game, since the location in which
the camper hides is hard for other players to spot, and it is often hard (if not impossible) for
another player to kill the camper. On the other hand, as Yan (2005) points out, in war games hiding
in locations is a simulation of real combat, mimicking the behavior of a sniper, and so it can be
considered legitimate behavior. Thus the need arises for a proper understanding of what is a cheat
and what is not; defining cheating in online video games is a difficult task, since cheats are as
diverse as the games in which they occur, and resemble each other only superficially (Kuecklich, 2004).
A game is a system composed of interactive elements, connected to each other by a context that
forms a complex and consistent whole (Parker, 2007). When a person is playing a game, the system
is engaged and forms a world with its own rules and goals, referred to as the magic circle
(Huizinga, 1955). Huizinga (1955) writes that "…the cheat… pretends to be playing the game, and,
on the face of it, still acknowledges the magic circle"; thus the cheater pretends to obey the rules of
the game, but secretly subverts them to gain an advantage over another player. On the other hand,
Consalvo (2005) considers the player who trespasses against the rules, or ignores them, a
"spoil-sport". The "spoil-sport" is not the same as the cheater, though, since the cheater pretends to
be playing the game by the rules and, on the face of it, still acknowledges the game's magic circle
(Consalvo, 2005).
Yan & Choi (2002) define cheating in computer games as: "Any behavior that a player may use to
get an unfair advantage, or achieve a target that he is not supposed to, is cheating". Kuecklich (2004)
claims that most cheats "give the player an advantage that the rules of the game do not allow for".
Yan & Randell (2005) define cheating as: "Any behavior that a player uses to gain an advantage
over his peer players or achieve a target in an online game is cheating if, according to the game
rules or at the discretion of the game operator (i.e. the game service provider, who is not
necessarily the developer of the game), the advantage or the target is one that he is not supposed to
have achieved." Additionally, GauthierDickey (2004) considers cheating to be any action that gives
a player an unfair advantage over another player. Unfortunately, this definition is ambiguous, since
it is not clear what "unfair" means with respect to the game rules. To an FPS player, camping may
give an unfair advantage over the other players, but not according to the game rules.
Similarly, Neumann (2007) defines cheating as "an unauthorized interaction with the game
system aimed at offering an advantage to the cheater." Here, too, we are dealing with an
ambiguity: what does "advantage" mean? How do we differentiate between skilled players and
cheaters? In both cases, they have an advantage over the other players in the game. Is it an
advantage if a player has more experience in a game than another player? Or is it an advantage if
a player has a much better internet connection or a faster computer?
Additionally, Consalvo (2007) argues that cheating is not limited to breaking the rules or laws of a
game, but also covers instances where the player is "bending the rules" or "re-interpreting" them to
his advantage. Moreover, Consalvo (2007) states that cheating is social, since "you can only cheat
another player", by introducing deception and possible chaos into the game world. It is true that in
many cases there is a cheating entity and a cheated one; however, cheating in online games is not
limited to this. For instance, a player can acquire an in-game item through an illegal method
without affecting the game play of the players in his community, yet it is still considered cheating,
since the acquisition method was against the game rules.
In most situations players are not equal in skill or experience, and as such one player will always
have an advantage over another. It becomes cheating when the player uses means different from
the ones intended by the game to gain that "advantage".
Since the previous definitions leave room for interpretation, this paper proposes a slightly modified
definition compared to the one previously stated by Yan & Randell (2005), and also emphasizes
that illegal actions in games should be clearly defined by the EULA and its supporting appendices.
4.1.1. Proposed definition
Thus we can state that: cheating occurs whenever a player uses techniques, tools, activities or
procedures that enable him to gain an advantage through a method that compromises the
confidentiality, integrity or availability of any of the services connected to the game's environment,
or that is stipulated as a cheat in the end-user agreement of the game.
4.2. Cheating classification
A proper classification of cheating in online games is required in order to identify the types of
cheats related to the paper's main interests (virtual identity, virtual property and collusion).
Among the first proposed taxonomies regarding cheating in games is Matt Pritchard's (2000)
article. As one of the players and developers of Age of Empires, he provides a framework to
classify and understand why players cheat, how they do it, how cheating can be prevented, and the
limitations of certain game architectures in relation to multiplayer cheating. Pritchard states that no
game is truly safe from cheating ("If you build it, they will come -- to hack and cheat.") since no
application is truly secure, and is thus always susceptible to attacks. Moreover, he mentions that
preventing cheating altogether is impossible, but argues that if the difficulty of cheating is greater
than the effort required to play the game as the developers intended, players will not cheat.
Neumann et al. (2007) classify cheating according to the game property threatened:
confidentiality (focused on the network; the cheater gains information about another player that he
is not supposed to get, such as elements of the global state of the game or confidential information
about the opponent's avatar), integrity (targeting the game state; the cheater modifies the game
state or its fundamental laws) and availability (targeting the game application itself; the cheater
prevents the intended progression of the game by delaying or switching off parts of the game). The
authors mention that they did not consider methods that use other communication channels (e.g.
disclosing information about opponents over the phone).
GauthierDickey et al. (2004) argue that cheats can be classified by the layer in which they occur,
in four categories: game, application, network and protocol. Game cheats include situations
where the rules of the game are broken without external modifications, such as bugs and loopholes
in the development of the game; e.g. "Eve discovers that by dropping an object while casting a
spell allows her to keep a copy of the object in her inventory, even though she just dropped it
(granting her the ability to duplicate any object in the game)". Application cheats, on the other
hand, require the modification of either the code of the game or the operating system (e.g.
modifying the rendering code to make walls invisible). Network cheats occur when the cheater
interferes with the infrastructure of the network over which the game traffic is sent
(denial-of-service is an example of a network level cheat). Protocol cheats require the cheater to
interfere with the game's communication protocol (e.g. modifying the contents of packets sent).
Junbaek et al. (2004) classify game cheats and attacks by their "objectives" and "methods".
Objective-based cheats include ID/password theft, hidden information exposure, data modification
and gaining an advantage from cheating.
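The object-duplication example quoted above (a game-layer cheat in GauthierDickey et al.'s classification) can be illustrated with a deliberately buggy inventory sketch. All of the classes and names below are hypothetical, written only to show how a missed state check in a drop handler can produce a duplicate without any external modification of the game:

```python
# Deliberately buggy sketch of a game-layer duplication exploit, in the
# spirit of GauthierDickey et al.'s example: the drop handler forgets to
# remove the item from the inventory while the player is mid-cast.
# All names are hypothetical; no real game is modeled here.

class Player:
    def __init__(self):
        self.inventory = ["sword"]
        self.casting = False      # True while a spell is being cast

    def drop(self, item, world):
        world.append(item)        # item appears on the ground...
        if not self.casting:
            self.inventory.remove(item)
        # ...but when casting, the buggy state check above is skipped,
        # so the player keeps the inventory copy as well.

world = []
eve = Player()
eve.casting = True          # Eve starts casting a spell...
eve.drop("sword", world)    # ...and drops the item mid-cast
# The sword now exists both on the ground and in Eve's inventory.
print(world, eve.inventory)
```

The point of the sketch is that the exploit lives entirely in the game's own rules: no client code is modified, which is precisely what distinguishes game-layer cheats from application-, network- and protocol-layer cheats.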
Kuecklich (2004) classifies cheats in digital games along three different axes: (1) platform (PC,
console and arcade games; unlike handheld console and arcade games, PC games allow access to
the game files, which enables hacking), (2) game mode (indicating the differences between
single-player games, "closed" multiplayer games and "open" multiplayer games, and how cheating
in these games affects the players), and (3) genre (genres can be explained as a triangular matrix,
according to their levels of narrativity, interactivity (the frequency of the players' physical
interaction with the game) and openness (the range of actions the players can choose from)).
According to Kuecklich's (2004) matrix, game-specific cheats fall into three genre-specific
categories: (1) cheats that speed up the narrative progression, (2) cheats that increase the player's
frequency of interaction, and (3) cheats that enhance the possibilities of the player's options.
On the other hand, Consalvo (2005) presents three types of cheating, based on the players'
perspective: (1) the "purist" perspective, (2) the "code is law" perspective, and (3) "you can only
cheat another player". Purists consider anything other than a solo effort to be cheating; in their
view, strategy guides, walkthroughs, cheat codes and hacking are all cheats. The second group,
"code is law", does not consider walkthroughs and guides cheating, but draws the line at cheat
codes. The last group, "you can only cheat another player", claims that cheating occurs only in
relation to another player.
However, the most widely used taxonomy in research is the one proposed by Yan & Randell (2005),
presented in the table below (Table 5). The current paper assesses this taxonomy, extending and
clarifying it where necessary.
Table 5: Cheating classification according to Yan & Randell (2005)
1. Cheating by Exploiting Misplaced Trust
2. Cheating by Collusion
3. Cheating by Abusing the Game Procedure
4. Cheating Related to Virtual Assets
5. Cheating by Exploiting Machine Intelligence
6. Cheating by Modifying Client Infrastructure
7. Cheating by Denying Service to Peer Players
8. Timing Cheating
9. Cheating by Compromising Passwords
10. Cheating by Exploiting Lack of Secrecy
11. Cheating by Exploiting Lack of Authentication
12. Cheating by Exploiting a Bug or Loophole
13. Cheating by Compromising Game Servers
14. Cheating Related to Internal Misuse
15. Cheating by Social Engineering
Based on Yan & Randell's (2005) classification of cheats, the paper presents an extended
taxonomy which includes cheats that were not previously taken into consideration, while
eliminating cheats that are either no longer feasible or are not included in the end-user agreement
(the so-called "gray area" cheats). As such, the proposed taxonomy contains 15 categories of game
cheats, described in the following pages.
4.2.1. Cheating by exploiting misplaced trust
Yan & Randell (2005) argue that too much trust is placed on the client and such the player might
abuse that trust and modify the game code, configuration data or both. Similarly, the cheater can
also tamper with the game client program and access sensitive data which are normally unavailable
for normal players. Since the game client is in the cheater‘s possession, it can be reversed
engineered passively and can be modified using a disassembler, decompilers, debuggers, coverage
tools, fault injection engines or virtual machine simulators (Hoglund & McGraw, 2007; Joshi,
2008).
Techniques
A decompiler translates the binary code of a platform into human-readable code, making it easier for the cheater to understand how the game works and how to attack it (Hoglund & McGraw, 2007). Although decompilers are not completely accurate in the source code they generate, they can help the cheater gain significant understanding of the code. A tutorial on hacking Flash games using such a method can be found on the website durabe.com, where the cheater uses a decompiler to find the functions of important pointers saved in the kernel on execution and then modifies their values on the fly (Anonymous, 2010).
Instead of transforming binary code into source code, disassemblers convert it into assembly language. This method is more robust than using decompilers, and unlike with decompilers the resulting code is reliable as long as the cheater can distinguish between code and data (Hoglund & McGraw, 2007). Once the attacker understands how the code works, he can modify important variables, for instance to disclose opponents' locations (as happens in maphacks and wallhacks) or to give himself more ammunition (Joshi, 2008).
Debuggers allow direct interaction with the program as it runs, enabling the cheater to watch what happens as the program executes. Hoglund & McGraw (2007) argue that a properly used debugger can force the program to do almost anything it is inherently capable of doing, regardless of whether the activity was part of the design. An example is the debugger OllyDbg (OllyDbg, 2010), used to reverse games created for the Microsoft Windows platform.
A code coverage tool is designed as a measure of software testing, describing the degree to which the source code of a program has been verified by performing white-box testing* directly on the code. Coverage tools keep track of which possible program paths execute as the program runs, allowing the tester to ascertain test effectiveness by determining exactly which parts are verified and which are not. These tools are useful to cheaters interested in exploiting the software, since they might reveal an exploitable defect inside the program.
Fault injection relies on modifying the data state, messages, and other basic conditions, as would be done with a debugger, but using a special-purpose engine to run very large numbers of injections in an automated fashion (Hoglund & McGraw, 2007). The cheater can supply malformed data to important parts of the program (such as its input) in order to make it crash or behave differently – e.g. forcing error conditions in code to see how the program handles errors. For instance, Joshi (2008) shows how a Dynamic Link Library (DLL) – Microsoft's implementation of the shared library concept – can be used to introduce new code into an executable, enabling cheaters to develop and run exploits. He argues that there are two ways of applying DLL injection: static (the injection occurs before the code is executed; the attacker inserts a jump instruction in a file that points to the address of the code he wants executed) and dynamic (the injection occurs after the program has started; the attacker attempts to load his code into the process's memory space).
In all the cases presented above, the servers send the information to the clients and it is the purpose of the client program to hide the confidential information from the player. Thus the attacks presented above are all based on the cheater's ability to reverse engineer the client code and modify it to behave in the cheater's desired manner.
Examples
Hacks that result from modifying the client include wallhacks and maphacks, which are created by modifying the graphics engine in the game client to initialize the map control values in a way advantageous to the cheater, allowing him to see parts of the map that should remain undisclosed (Joshi, 2008). Wallhacks are usually encountered in first-person shooters and allow the cheater to locate all his opponents by making the walls in the game transparent. Maphacks are a similar cheat that appears in real-time strategy games and give the cheater an advantage by revealing his opponents' positions, moves, and resources.
White-box testing = testing that takes into account the internal mechanism of a system or component (IEEE, 1990)
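Since the client cannot be trusted with hidden state, a common server-side mitigation is to never send it: the server filters what each client receives, so a modified renderer has nothing extra to reveal. The sketch below is a minimal illustration of this "interest management" idea; the data layout and the distance-only visibility rule are simplifying assumptions, not any particular game's design.

```python
import math

def visible_entities(player, entities, view_range=10.0):
    """Send a client only the entities inside its view range; entities
    outside it never reach the client, so no wallhack can display them."""
    px, py = player["pos"]
    out = []
    for e in entities:
        ex, ey = e["pos"]
        if math.hypot(ex - px, ey - py) <= view_range:
            out.append(e["id"])
    return out

player = {"id": "p1", "pos": (0.0, 0.0)}
entities = [
    {"id": "near_enemy", "pos": (3.0, 4.0)},    # distance 5: visible
    {"id": "far_enemy", "pos": (30.0, 40.0)},   # distance 50: never sent
]
sent = visible_entities(player, entities)
```

The trade-off is extra server-side computation and more frequent state updates as players move, which is why many titles still ship the full map state to the client and rely on detection instead.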
4.2.2. Cheating by collusion
Cheating by collusion (Yan and Randell, 2005; Webb, 2007) refers to the situation where players group up to gain an unfair advantage over honest players in the game. The grouping can extend beyond the bounds of the virtual world, creating possibilities for exchanging information through other applications or even by phone. Usually the shared information helps its receiver anticipate and adapt to the enemy's moves, even though those moves should be known only to the player performing them and to privileged users such as that player's team members or "observers*".
The method is very appealing to cheaters because the risk of getting caught is minimal: in-game chat is usually not used for this type of cheat, making automatic detection impossible, and under analysis the extra information the cheater had during a game may be mistaken for a mix of skill and luck. Since cheaters go unpunished, even more players are encouraged to take the same sort of action to "even the odds".
Techniques
This cheat splits into several categories, explained in detail in Chapter 5. Most of them are used to enhance the player's statistics (where there is no directly wounded party) or to gain an advantage over an opponent (a directly wounded party) using a channel that is not controlled by the game client (phones, instant messaging, and even live communication).
Examples
As a direct example, consider a situation in Warcraft 3 where a cheater is in the same room as an observer. The observer can see all the actions and resources of the opponent and can thus reveal this information to the cheating player.
4.2.3. Cheating by abusing game procedure
Yan & Randell (2005) describe this cheat as a situation where the player abuses the way the game operates or the way it rewards or punishes players. Sometimes it results in tilting events that should be random in the cheater's favor.
Techniques
Some games record losses and wins as well as disconnects. In some cases, disconnects are counted as losses, since there is no way to distinguish players who get disconnected legitimately from players who disconnect intentionally to avoid a loss. In this situation it is difficult to differentiate between cheating by abusing game procedure and cheating that results from exploiting a game bug; indeed, the example given here (disconnecting to avoid a loss) was considered a bug and was corrected. This attack was noticed on Battle.Net, the online gaming service provided by Blizzard which allows players to take part in multiplayer games. Hackers took advantage of the fact that only wins and losses were recorded (and not game disconnects) and thus avoided a loss by leaving the game. In this case the cheat was created by abusing a neglected part of the game procedure – and can be considered a game bug rather than an abuse of the game procedure, since the game provider did not intend the game to behave in this manner, and as such Battle.Net corrected the issue (Joshi, 2008).
Observer = (here) a person who does not play the game but acts only as an invisible entity following the game's progress in real time (ObserverMode, n.d.).
In another case, the attacker can reverse engineer a process that should be hidden, gaining a deeper understanding of the game procedure and revealing loopholes that may be exploited to his advantage.
Examples
Although Blizzard counts disconnects as losses in both Warcraft 3 and Starcraft 2, this is not the case with some Microsoft Xbox games. For instance, Xbox players have complained that some Soulcalibur IV and Tekken 6 players leave the game so as to avoid having a loss documented on their Live Gamertag profile (Gamespot, 2010). Rage-Quitters.com documents cases of such players who drop their connections on purpose when a loss is imminent.
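The Battle.Net-style fix mentioned above – scoring a disconnect exactly like a loss, so that leaving the game no longer protects a player's record – can be sketched in a few lines. The statistics layout and function names are illustrative assumptions only.

```python
def record_result(stats, player, outcome):
    """Server-side scoring: 'disconnect' is treated identically to
    'loss', removing the incentive to rage-quit an imminent defeat."""
    stats.setdefault(player, {"wins": 0, "losses": 0})
    if outcome == "win":
        stats[player]["wins"] += 1
    else:                        # "loss" and "disconnect" score alike
        stats[player]["losses"] += 1
    return stats

stats = {}
record_result(stats, "rage_quitter", "disconnect")
record_result(stats, "rage_quitter", "loss")
record_result(stats, "rage_quitter", "win")
```

The cost of this policy, as noted above, is that legitimately disconnected players are punished too, which is why some platforms instead track disconnect rates separately and penalize only habitual offenders.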
4.2.4. Cheating related to virtual assets
According to Yan (2005), this category includes players who have received real-life money for an in-game item but did not deliver it as promised. This classification is incorrect, since in this situation both players are cheating according to the game rules, which state that in-game objects must not be sold in real life. Of course, one can argue that by entering such a situation the player is "cheated" out of real money. The bottom line, however, is that according to the game rules he should not have been there in the first place. This view is also supported by the punishment inflicted by most game companies: both players get banned and the "injured party" does not receive any monetary compensation. Additionally, any other kind of loss (a compromised game account, a compromised real-life bank account) that might result from this situation is not covered by the game company.
The only exception to this rule is found in Second Life, where players get ownership of the in-game items they produce and can consequently sell them for real money. Second Life pushes the limits of virtual economics by allowing and supporting the trade of in-game objects and services. Although the game economy is strictly regulated by the provider, Linden Lab, there have been occasions that ended in lawsuits over the value of virtual property (Hoglund & McGraw, 2007).
Techniques
Most cheating techniques that involve virtual assets revolve around real-money transactions, practices referred to by Webb (2007) as "gold selling". Millions of people play online games that contain a virtual economy, an economy that has become so complex that it can rival the real-life economy. Castronova (2006b) argued that the per-capita gross domestic product of the MMORPG EverQuest exceeded that of countries such as India, Bulgaria, or China (as measured at the time of the article). Consequently, there is nowadays a series of "companies" that provide in-game objects and currency for real-life money. Although the practice is forbidden by most game companies, "currency farming" has become common in poor countries such as China and Russia, where people are paid to work 12-hour shifts of "gold farming" – obtaining virtual gold within the game that is later sold outside the game to other players (Warner & Raiter, 2005).
The phenomenon of "gold farming" has caused outrage in player communities due to its effect on the game economy. Moreover, the middleman companies that offer in-game currencies do not always state the method through which the currency has been gathered. It could be the result of "gold farmers", but it can also come from hacking a player's account and selling that character's items and currency to a gold seller. The gold can also come from a character controlled by a bot that automatically roams around gathering items to be sold later for currency, giving even more incentive for the abuse of other methods of cheating.
Examples
An example of a gold-selling company is the in-game item trading firm IGE, founded in 2001 by Brock Pierce and Alan Debonneville in order to ensure a "safe, reliable trading platform" for the "gray-market RMT [Real Money Transactions] of virtual items between MMOG [Massively Multiplayer Online Game] players" (Carless, 2006). More details are presented in Chapter 6 (Virtual Property).
4.2.5. Cheating by exploiting machine intelligence
This type of cheating occurs when the player uses artificial intelligence in online games to gain an
advantage over his opponent.
Techniques
Joshi (2008) gives as an example of this cheat the situation where the cheater mirrors a human opponent's moves into a computer chess program in order to determine the "best" next move. Webb & Soh (2007) introduce a similar category called bots/reflex enhancers, where the player uses an external program to generate inputs that speed up the player's actions (reactions) in a manner impossible for an honest player. In this category we can include color aimbots – bots which shoot automatically depending on the color chosen for detection. Their performance is lower than that of other aimbots: since the detection is purely color-based, the aimbot might shoot at any texture that has that color – dead bodies, walls, flags, and sometimes even teammates. Since color aimbots do not modify any file or hook the game, they are hard to detect by anti-cheat applications (Posey, 2004).
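The weakness of purely color-based detection described above can be made concrete with a toy sketch: the "bot" merely scans a frame for pixels near a target color, so a red flag matches just as readily as a red enemy model. Frame layout, colors, and tolerance are all invented for illustration.

```python
TARGET = (200, 0, 0)   # the color chosen for detection (assumed enemy red)

def color_matches(pixel, target=TARGET, tolerance=30):
    """Purely color-based detection, as in a color aimbot."""
    return all(abs(p - t) <= tolerance for p, t in zip(pixel, target))

def find_targets(frame):
    """Return coordinates of every pixel the bot would shoot at; note
    that it cannot tell an enemy apart from any other red texture."""
    return [(x, y)
            for y, row in enumerate(frame)
            for x, pixel in enumerate(row)
            if color_matches(pixel)]

frame = [
    [(0, 0, 0),    (210, 10, 5)],   # (1, 0): an enemy model
    [(190, 20, 0), (0, 0, 255)],    # (0, 1): a red flag, a false positive
]
hits = find_targets(frame)
```

The false positive at (0, 1) is exactly the misfire on flags and dead bodies noted above, and it also explains why such bots need no file modification: they only read rendered pixels.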
Examples
These types of cheats are not very widespread due to their difficulty of implementation. In Starcraft 2, a player created artificial intelligence (AI) scripts that would take appropriate action based on information read from the kernel's variables. The AI's reactions are performed at speeds impossible for humans, going as high as 800 actions per minute (whereas usual players average 100 actions per minute and professional gamers reach an average maximum of 400). Actions per minute (APM) is a metric computed by Starcraft 2 itself. As mentioned before, these types of AI are difficult to implement and, although fast, can be defeated due to their inability to adapt properly and to choose between decisions with the same success probability as a player with average APM but better reasoning.
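The APM figures quoted above also suggest a simple detection heuristic: sustained action rates far beyond what professionals reach are a strong automation signal. The sketch below computes APM from a list of action timestamps; the 400 APM threshold is taken from the figures above and is illustrative, not a real anti-cheat rule.

```python
def apm(action_timestamps):
    """Actions per minute over the interval the timestamps (in seconds)
    cover."""
    if len(action_timestamps) < 2:
        return 0.0
    duration = action_timestamps[-1] - action_timestamps[0]
    return len(action_timestamps) * 60.0 / duration

def looks_automated(action_timestamps, human_limit=400):
    """Flag sustained APM beyond what professional players reach."""
    return apm(action_timestamps) > human_limit

# 100 actions spread evenly over about 7.4 seconds: roughly 800 APM.
bot_actions = [i * 0.075 for i in range(100)]
```

In practice short bursts of high APM are normal for humans, so a real detector would look at sustained rates and at the unnaturally even spacing of bot inputs rather than a single threshold.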
4.2.6. Cheating by exploiting client infrastructure
This cheat is similar to exploiting misplaced trust, with one difference – instead of the game program or the data received by the client, it is the client infrastructure (device drivers or the operating system) that is attacked.
Techniques
A version of the wallhack can be created by modifying the graphics card driver. The end result is the same as for a game client wallhack – the cheater makes the walls transparent, revealing the location of his opponents. In this category we can also include aimbots – cheats that allow the player to automatically aim and fire his weapon, ensuring that he never misses his targets and thus giving the cheater superhuman killing abilities (Joshi, 2008). Aimbots intercept the 3D coordinates passed between the game and the OpenGL video library on the graphics card and automatically calculate the angle and position of the weapon needed to land a killing shot on the opponent.
Another method of cheating by exploiting the client infrastructure is hacking the API (defined by OpenGL and Direct3D) that developers use for rendering game graphics. Cheating can be accomplished by replacing the DLL that does the graphical rendering in the game; the Trojan DLL inserted in place of the real one can intercept and alter the render function.
Cheating can also be accomplished by hacking the client hardware. The graphics card is one piece of hardware that can be hacked: it is responsible for converting the data it receives into the objects displayed. The data needed to display the in-game graphics conforms to the OpenGL or Direct3D standards; however, since most video cards use a separate internal format, at some point in the rendering process the data will be in a format specific to the graphics card. Accessing the data while it is in this format allows the cheater to get information about his opponent with 100% accuracy (Joshi, 2008). Moreover, sensitive data used by the game client is sent to and stored on the player's machine, and as such the game publisher trusts the player not to tamper with it. Whenever this data is leaked, the result is a cheat.
Examples
An example of a client API hack is XQZ's cheat for Counter-Strike – an aimbot with an optional wallhack feature that replaced the OpenGL DLL with a hacked version that altered the rendering to make walls transparent (Joshi, 2008).
4.2.7. Cheating by denying service to peers
Since being disconnected from the game is usually evaluated as a forfeit, cheaters can launch a Distributed Denial of Service (DDoS) attack (CERT, 2001) to flood a network, preventing legitimate traffic from taking place, or to disrupt the connection between two machines, which results in the enemy player being disconnected from the game. In some cases the attack is used to prevent another player from connecting to a service (which could block certain players from playing the game at all, as a form of revenge) or to disrupt service to a specific system or person, which again blocks functionality for enemy players. This method is very appealing to cheaters since they can use predefined tools that give instant results with no technical skill required, unless they are the ones who designed the application. The drawback, however, is that the method usually leaves a clear trace, which can result in a ban or even more serious consequences.
Techniques
In this category we can include the disconnect hack, which floods the opponent's network connection, causing the player to be dropped from the game (the game notices that the player's connection is slower than required and drops him to avoid stalling the game). A complete taxonomy of attack methods, with listings of prevention methods and other defense mechanisms, including an analysis of their strengths and weaknesses, can be found in "A Taxonomy of DDoS Attacks and DDoS Defense Mechanisms" (Mirkovic & Reiher, 2002).
Examples
An example of such an attack occurred in September 2010, when a 17-year-old from Manchester launched a DDoS attack against Call of Duty using an application called Phenom Booter, disrupting the online version of the game and the playtime of other players (Oates, 2010). Phenom Booter can be found on unofficial forums, where a license can be bought for $10 per month, or for $20 per month one can become a reseller of the malicious application. In this case a method of attacking a game became widespread enough to create an entity that supports, and makes money from, helping other players cheat.
4.2.8. Timing cheating
In some online games a cheater can delay his own move until he knows what the opponent is doing
and thus gain an advantage.
Techniques
A relevant example of timing cheating is the "look-ahead" cheat – a method of cheating within a peer-to-peer multiplayer gaming architecture based on the player delaying his actions in order to gain the advantage of seeing what the other player does (Smed & Hakonen, 2006). The cheating player will appear to the honest one as suffering from high latency: the outgoing packet is time-stamped prior to the moment when it is actually sent. GauthierDickey et al. (2004) identify five types of cheats that belong to this category (presented in Table 6).
Table 6: Cheating classification according to GauthierDickey et al. (2004)

Fixed delay cheat – The cheater adds a fixed amount of delay to his/her outgoing packets, allowing the cheater to receive packets faster than he/she is sending them and thus to react quicker than the opponents.

Timestamp cheat – Since events are ordered due to consistency needs, a global clock is used for time stamping. The cheater waits to receive the opponent's update and sends him/her an update with a time stamp that is before the opponent's.

Suppressed update cheat – The cheater suppresses all updates to one or more players while continuing to receive updates from them, and is thus able to hide from other players.

Inconsistency cheat – The cheater sends different updates to different players; e.g. the cheater sends different updates regarding his location, so the players will disagree about the cheater's position.

Collusion cheat – This cheat occurs when several players collude and either share packets or modify them in order to gain an advantage over other players. This type of cheating is different from the one presented in Chapter 5, where cheaters are not required to have prior knowledge of the game's technical procedures.
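A basic countermeasure against the timestamp cheat in Table 6 is a plausibility check: an update whose claimed send time lies further in the past than the worst plausible network latency is rejected. The sketch below is a simplified referee check, with an assumed latency bound; real protocols such as lockstep and its variants are considerably more elaborate.

```python
def plausible(claimed_ts, arrival_ts, max_latency=0.25):
    """Reject updates back-dated beyond the worst plausible latency
    (max_latency, in seconds, is an assumed bound), as well as updates
    claiming to come from the future."""
    return 0.0 <= arrival_ts - claimed_ts <= max_latency

# Honest packet: sent at t=10.00, arrives at t=10.08 (80 ms latency).
honest = plausible(10.00, 10.08)

# Timestamp cheat: the cheater waits to see the opponent's move, then
# back-dates his packet to t=10.00 although it really arrives at t=10.62.
cheat = plausible(10.00, 10.62)
```

The check assumes reasonably synchronized clocks and a known latency ceiling; a cheater can still back-date within the max_latency window, which is why lockstep-style protocols additionally require players to commit to a move before revealing it.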
4.2.9. Cheating by compromising passwords
By compromising a user's password, the cheater gains access to his data, authorizations, and resources. A cheater can find another player's password through various methods such as brute force, keyloggers (either software* or hardware*), phishing, or social engineering. Once the cheater has gained access to the player's character, he also gains access to the player's virtual assets, which he can then sell. Joshi (2008) relates a case presented in "PlayNoEvil Game Security News & Analysis" from December 2006, where 44 people were arrested after stealing almost $90,000 worth of virtual items from players whose passwords they had hacked.
Techniques
Password secrecy can be broken through spyware* or by manipulating the user into divulging confidential information (social engineering, phishing). Spyware is usually installed without the user's consent or knowledge, monitoring his actions and collecting personal information about him, as well as interfering with his control of the computer (Posey, 2004). Unlike viruses and worms, spyware does not self-replicate, but it does exploit infected computers (Posey, 2004). Spyware gets installed on the user's computer through deception (social engineering, phishing) or through exploitation of software vulnerabilities. It is usually presented as a useful application, deceiving users into installing the software without suspecting that it will cause harm (Posey, 2004).
Social engineering refers to the act of manipulating people into willingly giving out confidential information, rather than hacking the user's computer. Social engineers create a pretext, an invented scenario used to engage a targeted victim in a way that makes him give out information he would not divulge in normal circumstances (Goodchild, 2010). By exploiting people's inability to properly assess certain risks and the natural human tendency to trust, the social engineer has all the tools needed to entice certain people into a certain course of action (Defending the Net, 2010). Phishing, on the other hand, gathers personal information (usernames, passwords) from the user by masquerading as a trustworthy entity. Phishing is typically carried out by email or instant messaging, while social engineering can be done by phone, instant messaging, or even in real life. Phishing emails direct users to fake websites that look almost identical to legitimate ones.
Examples and procedures have been described in detail in Chapter 7 (―Virtual Identity‖).
4.2.10. Cheating by exploiting lack of secrecy
Pritchard (2000) states that "any communication over an open line is vulnerable to interception, analysis, and modification". This situation is further aggravated when communication packets are exchanged in plain text, allowing the cheater to eavesdrop and enabling him to affect the integrity of the game (by modifying, inserting, or deleting events or commands transmitted over the network).
Spyware = software that transmits personally identifiable information from your computer to some place on the internet without your knowledge (Spybot, n.d.).
Hardware keylogger = a gadget that must be physically attached to a computer in order to record keypresses and/or mouse clicks and locations (SpyCop, 2007).
Software keylogger = similar to the hardware keylogger, a software keylogger is an application that logs the inputs of a computer (Dictionary.com, 2011).
Techniques
Packet sniffing programs enable the cheater to examine, alter, block, or send these packets whenever it is convenient. To highlight the gravity of this type of cheat, an article posted on Dissident Voice (Burghardt, 2010) states that the "Einstein 3" software developed by the National Security Agency is in the first stages of being deployed on the United States' telecommunications infrastructure, which would result in the interception of all messages, including emails and private information. Imagine a hacker who knows every action you have performed in a game as soon as you have performed it, even if he is far away or even in a different game: he can simply react to the "sniffed" actions, making him close to unstoppable against an opponent whose every action is revealed to his enemy. Even though packet sniffers can also be used in timing attacks, in the case of exploiting lack of secrecy the goal is not centered on the time variable but on information exposure.
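The modification half of Pritchard's warning can be countered even without full encryption by authenticating each packet: appending a message authentication code (MAC) keyed with a session secret makes any in-transit alteration detectable. The sketch below covers integrity only (an eavesdropper can still read the payload; confidentiality would additionally require encryption), and the shared key and packet format are invented for illustration.

```python
import hashlib
import hmac

SHARED_KEY = b"session key negotiated at login"   # hypothetical secret

def seal(payload):
    """Append an HMAC-SHA256 tag so a packet altered in transit is
    rejected by the receiver."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def unseal(packet):
    """Verify the 32-byte tag before trusting the payload."""
    payload, tag = packet[:-32], packet[-32:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("packet tampered with")
    return payload

packet = seal(b"MOVE x=10 y=4")
ok = unseal(packet)
tampered = packet[:5] + b"9" + packet[6:]   # a sniffer flips one byte
```

Without knowledge of the session key, an attacker cannot forge a valid tag for a modified payload, which removes the "insert, modify, delete" options and leaves only passive eavesdropping.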
Examples
After finding the enemy's IP address using an IP sniffer*, a third-party application called DisconnectHack could block the communication from that user, which resulted in the attacked opponent being kicked from the game. This cheat is hard to detect unless the attacker uses it at the beginning of the game. Unfortunately, attackers were aware that the game logs were being monitored for this type of cheat, so they disconnected only a small number of players (around 20%) near the real end of the game (at a time considered plausible by the monitoring tool), whenever they felt they were about to lose.
4.2.11. Cheating by exploiting lack of authentication
In games that lack a proper authentication mechanism, cheaters can collect information about players' accounts (user names and passwords) by setting up a dummy game server. Unfortunately, the example given by Yan & Randell (2005) in this category presents a cheat that can happen only under "special" conditions. Their example regarding unattended computers in Chinese or Korean internet cafés is not specific to game cheating, since in the same manner an attacker can gain access to any of the user's personal information, not just the game account. For instance, some online games send the user an email regarding a request for a password change. If a player who has left his computer unattended had his email client open in a browser, an attacker can access it, change the game password in a way that cannot be detected by the game company, and gather additional information about the user.
Techniques
For this method, separate servers are created. These servers emulate the same properties as the original servers but use separate administration and different databases. Users either willingly join the new "pirated" server, or unwillingly, by being infected by a worm that, instead of doing any serious damage, simply changes the URL of the servers to which the game client is allowed to connect (this can easily be done by replacing one or two lines in a config file*). For the first category of users, if they use the same password for other accounts, those accounts will be compromised. The second category of users, who are unknowingly sending their game account information to a fake server, have their official game account compromised, regardless of whether the game account password is used for other accounts.
IP sniffer = computer software or hardware that can intercept traffic passing over a digital network (EffeTech Sniffer, 2003).
Config file = when associated with games, "config files" are files directly accessible by users which allow them to set values for various system properties related to the game (e.g. allocating the minimum and maximum RAM usage allowed for the game) (Rybka, n.d.).
Examples
Joshi (2008) gives as an example for this category of cheats the case of multiplayer first-person shooters such as Quake 3 that allow players to host the game on their own servers, where the game can be customized according to how they want it played. These servers allow all players to connect to them, giving rise to the possibility that a cheater will use them to harvest valid user login credentials. This method is more powerful than the password theft presented in the previous category, since it can gather more passwords at a time and does not require the intervention of the legitimate user (in the previous case the legitimate user was either coerced into revealing the information through phishing or social engineering, or malware and keyloggers were installed on his computer).
4.2.12. Cheating by compromising game servers
Pritchard (2000) argues that client-server games are only as good as the server's security: "trust in the server is everything in a client-server game". Game server programs and configurations can be altered by cheaters to their advantage.
Techniques
Pritchard states that in some cases players will purposely probe the server to see what can be "exploited in the name of cheating". He argues that modifying the configuration is acceptable in some situations, when the modifications are made public to all players on the server, but there will always be situations where the configuration gives advantages only to some players (such as the administrators' friends), which becomes a cheat. Whenever this attack is employed on an official server, it can have devastating effects on the players, as the attackers gain access to their credentials as well as their credit card information.
Examples
Cesar Cerrudo, lead researcher for Application Security (Higgins, 2009), demonstrated at the 2009 Black Hat DC Technical Security Conference that a hacker can take control of a database and manipulate it at will. If a hacker takes control of a game's database, he can change various values, making his character or units stronger, directly modify his account's history to place himself in a better position, or, in the case of games with a monthly subscription, give himself and anyone he wants free "game time". The exploits possible from this vulnerability are limitless and usually have a great impact on the game's community; the incident can even end in the servers being shut down for a period of time in order to perform a rollback.
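One of the most common routes to a database takeover of the kind described above is SQL injection through player-supplied input, and the standard defense is parameterized queries: the input is bound as data and can never rewrite the statement. The sketch below uses an in-memory SQLite database with an invented schema purely for illustration.

```python
import sqlite3

def get_gold(conn, name):
    """Parameterized query: the player-supplied name is bound via the
    '?' placeholder, never spliced into the SQL string, so input such
    as "x'; UPDATE characters SET gold=999999; --" stays inert data."""
    cur = conn.execute("SELECT gold FROM characters WHERE name = ?", (name,))
    row = cur.fetchone()
    return row[0] if row else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE characters (name TEXT, gold INTEGER)")
conn.execute("INSERT INTO characters VALUES ('hero', 100)")

honest = get_gold(conn, "hero")
malicious = get_gold(conn, "x'; UPDATE characters SET gold=999999; --")
after = get_gold(conn, "hero")
```

The injection attempt simply matches no character name, and the table is left untouched; had the query been built by string concatenation, the same input could have rewritten the gold balance.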
4.2.13. Cheating related to internal misuse
This category includes cheats that result from an abuse of privileges by an insider – a system administrator, game master, or employee of the game operator*. Insiders have access to both user accounts and in-game items. A game administrator can sell information pertaining to a user – such as personal data – or can sell the account itself. Additionally, since game masters are usually called on to "fix" elements of the game that do not work as intended, they can abuse their privileges and create items or currency where they are not supposed to. Cheating related to internal misuse is most damaging when coupled with another type of cheating. Moreover, cheating administrators can temporarily or permanently ban players in order to achieve a certain goal.
Game operator = (here) the company that has developed and/or maintains the game servers and supervises the game to ensure an optimal gaming experience.
Techniques
Game server administrators and company insiders have privileged access to the servers and their databases and are thus able to modify the stored data. They can create valuable in-game items and sell them for a profit (Zetterström, 2005). The tools provided by developers to fix issues in the games can therefore be used to the administrator's advantage, or to the advantage of another cheating player or group of players.
Examples
Joshi (2008) includes in this category the case of the three Chinese employees of Shanda Interactive who in 2006 sold fake rare virtual items from the MMORPG Legend of Mir II for a huge profit (Associated Press, 2006). The three employees were charged with embezzlement relating to virtual assets. Examples of cheating related to internal misuse coupled with other types of cheating can be seen in online poker games, where game administrators can collude with players to the disadvantage of honest players.
4.2.14. Cheating using social engineering
This category includes cheaters that use psychological methods (exploiting heuristics and cognitive biases) to coerce players into willingly giving away confidential information about their accounts.
Techniques
The techniques employed depend on the victim – the attacker will use different tactics to convince the owner of a game account to willingly give out information than those used to coerce a game administrator into "helping" him recover his "lost" credentials. Unlike phishers, social engineers have to research information about their victim in order for the attack to be effective. Sometimes social engineering attacks are coupled with "spear phishing" attacks. The techniques employed, as well as the reasons why people fall for these attacks, are presented in detail in chapter 7 ("Virtual Identity"), section 7.3 (Social Engineering).
Currently in World of Warcraft, cheaters have begun sending in-game messages stating that the player has won a specific item (such as a free Celestial Steed, an in-game mount* that costs 20 Euros in the Blizzard Store*) and that he should log in through a given link to receive it. The link usually leads to a page that seems legitimate to most users, but which most of the time contains small details that would identify it as fake (details to which many players are oblivious). With the aid of this page the cheaters can either steal the account outright or install malware and keyloggers on the user's computer in order to gather more information (such as real bank account details). Using social engineering, the player can also be convinced to give out his information to the cheater directly. For instance, a cheater can tell a player that he will play on his character, level it, and then give it back for a sum of real-life money. Once the player gives out the information, the cheater can change the password and sell the account. Examples and procedures are described in detail in Chapter 7 ("Virtual Identity").
In-game mount = (here) a means of transport usually available in RPG games, e.g. a horse or a dragon that can be ridden in order to increase the speed of the character or even unlock special events (GiantBomb, 2011d).
Blizzard Store = an online shop provided by Blizzard Entertainment which allows the acquisition of virtual pets and mounts, as well as other items, for real-life currency (Blizzard Store, 2011).
4.2.15. Game specific cheats
The taxonomy provided by Yan & Randell (2005) covers most of the cheats present in the games of the time of the article. However, it does not take into consideration several types of cheats. More precisely, Yan & Randell do not take into consideration the gray area of cheats, such as camping, griefing or using scripts and macros. For a more detailed look at this category, refer to Chapter 8 ("Game specific cheats") of the current paper.
4.3.
Proposed taxonomy
Cheats in online games can be split into two main categories: technical and human-related cheats, as described in Table 7.
Table 7: Proposed taxonomy for cheating in online games (Name / Description / Origin / Threat / Example)

Technical cheats:

1. Cheating by exploiting misplaced trust
   Description: The player might abuse the trust placed in him and modify the game code, configuration data or both.
   Origin: Yan & Randell, 2005. Threat: Integrity. Example: maphack, wallhack.

2. Cheating by abusing game procedure
   Description: A situation where the player abuses the procedure of the game – such as disconnecting when the player is about to lose.
   Origin: Yan & Randell, 2005. Threat: Integrity. Example: disconnecting to avoid loss.

3. Cheating related to virtual assets
   Description: Includes players that have received real-life money for an in-game item.
   Origin: Yan & Randell, 2005; Webb, 2007. Threat: Integrity. Example: real money transactions.

4. Cheating by exploiting machine intelligence
   Description: The player uses artificial intelligence in online games to gain an advantage over his opponent.
   Origin: Yan & Randell, 2005; Joshi, 2008; Webb et al., 2007. Threat: Integrity. Example: aimbots.

5. Cheating by exploiting client infrastructure
   Description: The client infrastructure (the device or the drivers of the operating system) is the one that is attacked.
   Origin: Yan & Randell, 2005. Threat: Integrity. Example: wallhack.

6. Cheating by denying service to peers
   Description: Methods that disrupt the gameplay by denying the service to the players.
   Origin: Yan & Randell, 2005. Threat: Availability. Example: dischack.

7. Timing cheating
   Description: In some online games a cheater can delay his own move until he knows what the opponent is doing, and thus gain an advantage.
   Origin: Yan & Randell, 2005; GauthierDickey et al., 2004. Threat: Integrity. Example: look-ahead cheating.

Human-related cheats:

8. Cheating by exploiting lack of secrecy
   Description: Communication packets are eavesdropped on by a cheater, who will affect the integrity of the game to his advantage.
   Origin: Pritchard, 2000; Yan & Randell, 2005. Threat: Integrity / Confidentiality / Availability. Example: packet sniffing.

9. Cheating by exploiting lack of authentication
   Description: The cheater collects information about the player's account by setting up a fake game server.
   Origin: Yan & Randell, 2005; Joshi, 2008. Threat: Confidentiality / Availability. Example: fake game server.

10. Cheating by compromising game servers
    Description: Game server programs and configurations can be altered by cheaters to their advantage.
    Origin: Yan & Randell, 2005. Threat: Integrity. Example: database manipulation.

11. Cheating by compromising passwords
    Description: By compromising a user's password the cheater gains access to his data, authorization and resources.
    Origin: Yan & Randell, 2005. Threat: Confidentiality / Availability. Example: spyware, malware, keyloggers, phishing, social engineering.

12. Cheating by collusion
    Description: Cheating players might group up and gain an unfair advantage over honest players in the game.
    Origin: Yan & Randell, 2005; Webb, 2007. Threat: Confidentiality / Integrity.

13. Cheating related to internal misuse
    Description: Includes cheats resulting from an abuse of privileges by an insider.
    Origin: Zetterström, 2005; Yan & Randell, 2005. Threat: Integrity. Example: abuse by a system administrator.

14. Cheating using social engineering
    Description: Cheaters use psychological methods to coerce the player into willingly giving confidential information from his account.
    Origin: Yan & Randell, 2005. Threat: Confidentiality. Example: phishing.

15. Game specific cheats
    Description: Cheats that are specific to games – based on the EULA agreement.
    Origin: Newly proposed. Threat: Integrity / Confidentiality / Availability. Example: tweaking.
Technical cheats are comparable to security holes in software. Like any security hole, a technical cheat eventually gets fixed or patched, forcing every player to upgrade in order to be allowed to continue playing on the official servers. Additionally, many third-party companies and existing research have created tools to prevent cheating in online games, such as "DMV Anticheat" (DMW, 2011), "GameGuard" (nProtect, 2011), "PunkBuster" (Even Balance, 2011), "VAC" (Steam, 2011), "HackShield" (HackShield, 2011), or "Warden" (Ward, 2005). On the other hand, human-related cheats cannot be fixed by a "patch". The confidentiality and integrity of a player's account lie in the hands of the player. As in every application that relies on a human in the loop to perform security-critical functions, humans fail in their security role. It is the player's responsibility to protect his username and password. The following chapters provide a more detailed look at human-related cheats and game-specific cheats; thus the focus of this research will be on the least researched categories of the taxonomy explained above.
Chapter 5
Cheating by collusion
The current chapter presents how cheating by collusion occurs, how it affects the players and how it
can be prevented and detected.
5.1.
What is collusion?
A player can cheat independently in both single-player and multiplayer online games. However, in multiplayer games there are cases where two or more players cheat through malicious cooperation, also called collusion. Collusion refers to a fraudulent agreement between two or more persons, acting with a common secret strategy, to limit open competition by deceiving or misleading other players in order to obtain an advantage that is difficult to work against. Collusion occurs whenever players cooperate when it is not allowed by the game rules. For instance, in World of Warcraft it is not illegal for a group of players to collude to kill another player, and EVE Online is an MMO based entirely on manipulating the other members of the team. On the other hand, many games assume that players are rivals and therefore forbid collusion, especially games such as online poker or other competitive games. Cheaters do not require any technical sophistication to carry out this cheat.
Cheats in this category are most commonly found in online card games such as bridge or poker, where players can illegally cooperate to gain more information about the cards in play in order to gain an advantage over honest players (Yan & Randell, 2005; Yan, 2003). Collusion can occur between players that know each other, who are friends inside the game or outside it, or between strangers that share a common goal. According to online poker websites, a partner in a colluding team who actively helps the other members is called an "agent". The agent is usually another player in the game or, in some cases, an observer. When cheaters are physically together, they can collude using spoken language or body language, whilst online they can use channels that cannot be controlled by the game developers: telephone, email, instant messenger or online chat. From this point of view, cheating by collusion can be split into three categories: express collusion (the players have an explicit agreement to cooperate), tacit collusion (there was no prior agreement) and semi-collusion (certain decisions are agreed upon, whilst at others the players still compete).
5.2.
Classification of collusion
Actual cases of collusion are complex and usually involve multiple forms of cheating. For instance, a beginner or unskilled chess player can use an expert to make better moves, or participants in a tournament can pre-arrange the outcomes of games to eliminate honest players, or send numerous and seemingly independent complaints to an administrator to get innocent players banned (Smed et al., 2006). Outside online card games, the types of collusion that most often occur are: collusion to gain experience or rating, trading information or information exposure (Yan & Randell, 2005), collusion to deadlock a system, enemy team manipulation and insider collusion. To these, a new type of collusion is proposed: collusion by impersonation.
5.2.1. Collusion aimed at gaining experience* or rating*
There are two types of cheating by collusion included in this category: win trading (van Summeren, 2011) and cheating to gain an advantage. Win trading is present in competitive games that have ladder systems – the list of players and their positions amongst their peers – where players and teams of players battle each other to progress on the ladder. This type of collusion, particularly popular in StarCraft (Yan & Randell, 2005), involves players agreeing to take turns winning against each other instead of fighting unknown players or teams. In World of Warcraft's arena system, on the other hand, cheaters raised their rank by creating extra teams and using them to increase the ratings of their teams one by one.
On the other hand, not all collusion between players is illegal, although in some cases it is not ethical. For example, a method to win the World of Warcraft Stranglethorn Vale Fishing Tournament is to have two or more friends around to kill the competition from the opposing faction(s)*, even though these friends are not actively participating in the tournament itself (Nat_Pagle_Dude, 2006). This situation cannot be considered cheating, since the game allows players from the two different factions to kill each other and the tournament rules do not mention anything against this behavior. It is unethical and unfair to honest players, since all players should have an equal chance to win the tournament, but it is not a cheat unless this behavior takes place on a server where attacking each other is not allowed (such as an RP – role-playing – server). Although the cited example is from 2006, this method of bypassing the tournament rules without cheating is used even today; so even though it is unethical and decreases the overall value of the tournament, it cannot be stopped, since it is not considered illegal by the rules of the game.
5.2.2. Collusion aimed at trading information
In games where information is not known to all players, some players might cooperate and trade information, or sell it to other players who are willing to pay for it. This type of collusion involves the trade of information that is supposed to be hidden from the other player. In this situation, one player (or group) is playing while the other has access to the game, most commonly by being an observer. An observer is a player that is allowed to view a game live without participating. Although such a player cannot communicate with the other players through the game interface, they can communicate through other means: instant messaging, phones, or even by being in the same room as a player from the opposing team. Since many officially sponsored tournaments have live streaming of the games, it is easy even for people who are not invited as observers to access information that would be beneficial to either of the two players involved in the competition. Unless the players involved in the game are properly controlled so as not to have access to any people outside the game, a person can contact one of the players and give out secret information. As a direct example, consider a situation in Warcraft III where a cheater is in the same room as an observer (a player allowed to watch the game "live"). The observer can see all the actions and resources of the opponent and can thus reveal this information to the cheating player.
Experience = in RPG video games, characters gain experience points for every kill/completed quest, which allows them to gain stronger abilities and unlock access to new features of the game (GiantBomb, 2011c).
Rating = (here) a value associated with a player's skill level; it is usually shown in a leaderboard/ladder where each player is ranked based on his rating.
Faction = "a group or clique within a larger group, party, government, organization, or the like: a faction in favor of big business." (Dictionary.com, 2011)
5.2.3. Collusion to deadlock a system
Yan (2003) presents this type of collusion to deadlock* a system, as observed in a free online bridge game provided by Pogo. To help games progress faster, Pogo implemented a procedure to "boot out" (expel) a player who might be stuck due to network lag* or who annoyed the other players. Assume that North (N), South (S), West (W) and East (E) are playing at the same table and that N wants W expelled: N can initiate a "boot out" procedure, which sends N's request to both S and E. If both S and E agree, then W will be automatically kicked out of the game; otherwise he will remain part of the game. If two of the playing partners collude (N and S), they can misuse this feature to stall the game when they are going to lose. For instance, N launches a request to expel W. E will refuse to expel W, since W did not do anything wrong and they were going to win. At the same time, S ignores the "boot out" request, never answering with a vote. Even if E and W realize that they are dealing with cheaters, they can do nothing against them, since they are involved in an ongoing vote and no further requests can be accepted. Both W and E can do nothing but wait for Pogo to automatically destroy the game (to conserve resources) due to inactivity from the players. The agent involved in the collusion does not need to know the other cheater personally, just to know the trick and have the same goal as the cheater. Therefore, the deadlock can be achieved without an explicit prior agreement.
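The stalling behavior described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not Pogo's actual implementation; the class and method names are invented, and it assumes (as Yan's description implies) that a vote closes only once every eligible voter has answered and that only one vote may be open at a time.

```python
# Sketch of the "boot out" deadlock: a colluding partner who never votes
# keeps the only vote slot open forever, locking out the honest players.

class BootOutVote:
    def __init__(self, initiator, target, voters):
        self.initiator = initiator
        self.target = target
        self.pending = set(voters)   # players who have not answered yet
        self.approvals = 0

    def cast(self, voter, approve):
        if voter in self.pending:
            self.pending.remove(voter)
            if approve:
                self.approvals += 1

    def resolved(self):
        # The vote only ends once every eligible voter has answered.
        return not self.pending


class Table:
    def __init__(self, players):
        self.players = players
        self.active_vote = None

    def request_boot(self, initiator, target):
        # No further requests are accepted while a vote is still open.
        if self.active_vote is not None and not self.active_vote.resolved():
            return False
        voters = [p for p in self.players if p not in (initiator, target)]
        self.active_vote = BootOutVote(initiator, target, voters)
        return True


table = Table(["N", "E", "S", "W"])
table.request_boot("N", "W")         # N (colluding with S) starts a vote
table.active_vote.cast("E", False)   # honest E refuses to expel W
# S, the second colluder, simply never casts a vote:
assert not table.active_vote.resolved()       # the vote never closes
assert table.request_boot("E", "N") is False  # honest players are locked out
```

The deadlock arises purely from the voting protocol: no prior agreement between N and S is needed, only that S knows to stay silent.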
Van Summeren (2011) misinterprets Yan's description, believing that this type of cheating by collusion can be applied to World of Warcraft's dungeon finder tool, which generates five-person groups from the players that use it. This tool allows players to complete 5-man dungeons* (instances) faster by forming the group and allowing a majority of players to "vote-kick" a player who misbehaves or is away from the keyboard (or slows the group down in any way). Van Summeren (2011) argues that if the majority is formed by players who collude, they will use the tool against the other player and kick him out of the group for reasons that were not intended by the developers, such as refusing to speak a certain language (usually English) or not being skilled enough. However, the tool was created to help players advance fast, not to help players who are too slow, unskilled or lacking the proper equipment. Moreover, having an unskilled player in the group who also acts recklessly leads to "wipes" (situations where all group members die), and every in-game death in World of Warcraft (WoW) damages the player's equipment, forcing him to repair it, which is often very costly. Additionally, a player who refuses to speak the language of the majority can be kicked because the group cannot communicate strategies or attack order, the player thus only hindering the group's success. WoW is a social game and thus requires cooperation. Sometimes the group is composed of complete strangers, chosen randomly by the tool, who have a specific goal and who are already frustrated (due to limitations such as a small chance for an item they hope for, or the need to finish the dungeon in a certain amount of time) and as such do not care much about the comfort of the player in the minority. In fact, if the dungeon finder did not provide this option, chances are that the group would simply have disbanded and re-grouped with the same people minus the one who should have been kicked.
5.2.4. Enemy team manipulation
This category is similar to Yan's (2003) "friend or foe". As mentioned, collusion does not necessarily happen only between players that know each other, and it may not involve illegal transfer of information. Yan (2003) gives an example where North could collude with East to beat West. This option was available to Warcraft III Free For All players, where four players (who did not know each other and whose names were obscured) had to fight each other until only one was left standing. Additionally, in Warcraft III, when a player wanted to lose a team game in order to decrease his estimated level on the ladder and fight against weaker opponents in future games, he could not just quit the team game (since the remaining allies might still win it); instead he would resort to attacking his allies, destroying their economy before they even had time to develop, as well as sharing information with the opposing team.
Deadlock = a situation where two or more threads are blocked forever, waiting for each other (Oracle, The Java Tutorials, n.d.).
Network lag = delay that appears in a network, causing synchronization issues in games as well as causing the game to stall and even completely ignore actions which time out (Mitchell, n.d.).
Dungeons = in World of Warcraft, "realms" that are completely isolated from the rest of the players on the server, allowing only the initial 5-player group to complete the tasks and receive the rewards (Henry, 2011).
Another example of this type of cheat can be found in Warcraft III's free-for-all competitive games, where four anonymous players are matched up randomly against each other and only one survives; alliances are not allowed and private communication between players is strictly prohibited. In reality, though, because of the relatively low number of active players, two or more friends could synchronize their game searches in order to trick the system into selecting them for the same game. After that, even though their names were anonymous and they could not communicate directly or form alliances (the units of player A always attacked the units of player B if left unchecked), they were able to communicate with each other, since they had already shared contact information before the game started. This enabled them to form pseudo-alliances, helping each other to defend and synchronizing attacks.
5.2.5. Insider collusion
A type of collusion that can be seen in card games is the operator-player variant, which involves the cooperation of a player and an insider. In this situation, the illegal operation is twofold: collusion as well as internal misuse.
5.2.6. Collusion by impersonation – ―Boosting‖
A collusion method that has not been specified by any of the previous researchers is collusion by impersonation, or boosting. Since most gamers who play online are not situated in the same room, it is easy for one player to impersonate another. In some cases, players will ask friends who are more skilled to play on their account in order to increase their account statistics, a technique called "boosting" inside the gaming community. However, most game companies state that sharing an account, for whatever reason, is illegal. An even greater problem arises when a player who is considered highly skilled (due to "boosting" done by his friend or friends) resumes normal activities on his account, interacting with players of similar level: due to the sudden gap between the expected skill level* and the real skill level* of the player, the entire group suffers. Since the player's real skill level is lower than the estimated one, he will lose a very high number of games until his level reaches a realistic one, thus nullifying the "boosting" effect. Additionally, when playing in a group, he also ruins the game experience of his allies by accumulating a large number of losses.
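The convergence effect described above can be illustrated with the standard Elo update rule. The concrete numbers (a K-factor of 32, a boosted rating of 2000, a true skill of 1400) are illustrative assumptions; real games use their own rating formulas, and the expected score is used in place of random outcomes to keep the run deterministic.

```python
# Rough illustration of why a "boosted" account bleeds rating until it
# converges back to the player's real skill level.

def elo_expected(r_a, r_b):
    """Expected score of a player rated r_a against one rated r_b."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def simulate_boosted(boosted, true_skill, games=200, k=32):
    rating = boosted
    for _ in range(games):
        # Matchmaking pairs the account with opponents at its current rating,
        # so the expected score demanded of it is 0.5 -- but the outcome is
        # driven by the player's true skill, which is much lower.
        actual = elo_expected(true_skill, rating)
        rating += k * (actual - 0.5)
    return rating

final = simulate_boosted(boosted=2000, true_skill=1400)
# The rating decays from 2000 toward the true skill level of 1400.
```

Each loss moves the rating closer to the true skill; the closer it gets, the smaller the expected loss, so the "boosting" effect is nullified gradually rather than instantly, exactly the losing streak the text describes.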
5.3.
Collusion detection
According to Smed et al. (2006), collusion recognition has to observe the following factors:
- Learning curves – the way the player progresses from a novice. Advanced players' actions are usually predictable from their perspective.
- Familiar company – better success in the company of friends.
- Sensible play – each participant tries to minimize the worst possible outcome of the game.
- Conflicting interests – searching for interests that can be mutually beneficial for players in opposing teams.
Real skill level = (here) the level at which the player can perform in reality.
Estimated skill level = (here) the level at which the game system estimates the player's skill to be, or to which the player's skill is estimated to converge.
True Poker (2006) mentions that collusion detection requires:
- The detection of opportunities, in order to recognize the possibilities for gain.
- Forming the player's profile using pattern recognition, taking into consideration risk-taking, impatience, and actions that suggest knowledge of information that should be secret.
- Statistical analysis of the player's actions to determine the player's reputation.
- Analysis of player history and past games to observe changes from usual habits which can indicate collusion.
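As a toy illustration of the statistical-profiling idea above, the sketch below flags a pair of players when one beats the other far more often than he beats anyone else, a pattern consistent with win trading or soft play. The record format, threshold and function name are invented for the example; a production system would use proper significance tests over much richer features.

```python
# Toy collusion detector: compare a player's win rate against one specific
# opponent with his overall win rate, and flag the pair when the gap is
# unusually large. Thresholds are illustrative assumptions.

from collections import defaultdict

def flag_suspicious_pairs(results, min_games=20, threshold=0.35):
    """results: list of (winner, loser) tuples from finished games."""
    wins, games = defaultdict(int), defaultdict(int)
    pair_wins, pair_games = defaultdict(int), defaultdict(int)
    for winner, loser in results:
        for p in (winner, loser):
            games[p] += 1
        wins[winner] += 1
        pair_games[(winner, loser)] += 1
        pair_games[(loser, winner)] += 1
        pair_wins[(winner, loser)] += 1

    flagged = []
    for (a, b), n in pair_games.items():
        if n < min_games or games[a] == 0:
            continue  # too few games for a meaningful comparison
        overall = wins[a] / games[a]
        versus = pair_wins[(a, b)] / n
        if versus - overall > threshold:
            flagged.append((a, b))
    return flagged

# "alice" beats "bob" far more often than she beats anyone else:
history = [("alice", "bob")] * 25 + [("carol", "alice")] * 25
print(flag_suspicious_pairs(history))  # → [('alice', 'bob')]
```

Note that, as with the rank-tracking approach the thesis discusses, a colluding pair that stays under `min_games` escapes this kind of check.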
5.3.1. Rank tracking
Yan (2003) mentions that rank tracking, which monitors the promotion/demotion of each player, can help in identifying cheating by collusion by looking for unusually quick rank promotions. On the other hand, in online bridge it is not a good method, since a partnership that wins often is not necessarily cheating. Additionally, colluding partners do not necessarily win. The rank tracking tool can detect unusual patterns only after a cheater has played a sufficient number of hands; thus a cheater who plays less than the threshold will not be detected by the system. However, this method can be used to detect enemy team manipulation cheating: if a player constantly attacks his allies or hinders their progress in the game, then the tracking tool can detect this behavior and flag the player.
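The core of rank tracking can be sketched as a simple rate check over each player's rating history. The numbers (minimum games, maximum plausible gain per game) and the history format are illustrative assumptions, not taken from any deployed system.

```python
# Minimal rank-tracking sketch: flag accounts whose rating climbs
# unusually fast, once they have played enough games to be judged.

def flag_fast_climbers(histories, min_games=30, max_gain_per_game=15.0):
    """histories: dict mapping player -> list of ratings after each game."""
    flagged = []
    for player, ratings in histories.items():
        n = len(ratings)
        if n < min_games:
            continue  # below the detection threshold: cannot be judged yet
        gain_per_game = (ratings[-1] - ratings[0]) / (n - 1)
        if gain_per_game > max_gain_per_game:
            flagged.append(player)
    return flagged

histories = {
    "steady": [1200 + 5 * i for i in range(40)],   # +5 per game: normal
    "trader": [1200 + 25 * i for i in range(40)],  # +25 per game: suspicious
    "newbie": [1200 + 25 * i for i in range(10)],  # too few games to judge
}
print(flag_fast_climbers(histories))  # → ['trader']
```

The `newbie` case makes the limitation from the text concrete: a cheater who stays below the minimum-games threshold is invisible to this detector.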
5.3.2. An AI-based detection approach
Another approach would be an AI-based tool that monitors the critical points in the progress of a game, looking for repeated behavior that does not follow logically from the state of the game. The behavior must be repeated to count as a sign of cheating, in order to rule out error, luck or even honest but ingenious play. The tool can at least be used to flag suspicious behavior that can later be investigated by a game master or administrator. Yan (2003) proposes an AI model created specifically to identify cheating by collusion in online bridge games.
5.4.
Collusion prevention
5.4.1. Collusion aimed at gaining experience or rating
Win trading can be countered by changing the way ratings are given to players – by giving rewards for winning and punishment for losing, as seen in Warcraft III and StarCraft II. World of Warcraft's arena exploit has been fixed by linking ratings to teams as well as to players. Randomized partnering can also be used to prevent win trading, a solution employed by Warcraft III and StarCraft II. In both games, the player cannot choose his opponents at will, but is paired with an opponent of equal or approximately equal rank and skill. Games played with friends are allowed, but do not influence the players' positions on the ladder. This does not eliminate win trading completely, since in-game chat allows players on the same team to communicate and agree to search for a game at the same time, thus increasing their chances of being randomly selected into the same game.
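The randomized-partnering idea can be sketched as follows: the ladder, not the player, picks the opponent, choosing randomly among queued players of approximately equal rating. Function and parameter names are invented for the example; this is not Battle.net's actual matchmaker.

```python
# Sketch of randomized partnering: a ladder match is drawn at random from
# queued players within a fixed rating band, so colluders cannot simply
# pick each other as opponents.

import random

def match_opponent(player, rating, queue, band=100, rng=random):
    """Pick a random queued opponent within `band` rating points."""
    candidates = [(name, r) for name, r in queue
                  if name != player and abs(r - rating) <= band]
    if not candidates:
        return None  # nobody of comparable skill is searching right now
    return rng.choice(candidates)

queue = [("alice", 1500), ("bob", 1520), ("carol", 1900)]
opponent = match_opponent("dave", 1510, queue)
# carol (1900) is outside the band and can never be matched; the choice
# between alice and bob is random.
```

As the text notes, this only reduces the risk: two colluders who queue at the same moment with similar ratings still have a chance of landing in each other's candidate pool.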
5.4.2. Collusion aimed at trading information
Collusion to trade secret information can be countered by making sure that people with the same IP family are in the same game only as a team, and not as opposing members or observers. When players share the same IP family, the developers can assume that they either live together or close by, and as such – unless they are playing against each other, or in arranged teams with their friends – they should not be allowed to play on different teams with random people, or to be observers in those games. If they wanted to view the game, they would have to view it from their friends' point of view (without seeing any hidden information from the opposing player or team) and watch the game's replay afterwards (if they are interested in seeing the point of view of the opponents). Tournament matches can be protected against cheating by collusion by restricting the players' access to the outside world, or by delaying the streaming of the match by five minutes, so that the information seen by an accomplice is of no use to the cheating player.
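A toy version of the same-IP-family restriction can be written with the standard library, here taking the /24 network prefix as the notion of "family". This is an assumption made for the example: real services would need something far more robust, since NAT, mobile carriers and shared campus networks all break such a simple heuristic.

```python
# Sketch of the IP-family restriction: players whose addresses share a
# /24 prefix may only play together, never against each other or as
# observers of each other's games.

import ipaddress

def same_ip_family(addr_a, addr_b, prefix=24):
    """True if both addresses fall in the same /prefix network."""
    net_a = ipaddress.ip_network(f"{addr_a}/{prefix}", strict=False)
    return ipaddress.ip_address(addr_b) in net_a

def may_oppose(player_a, player_b):
    """Same-family players are assumed to live together or close by."""
    return not same_ip_family(player_a["ip"], player_b["ip"])

a = {"name": "p1", "ip": "192.0.2.17"}
b = {"name": "p2", "ip": "192.0.2.80"}
c = {"name": "p3", "ip": "198.51.100.5"}
print(may_oppose(a, b), may_oppose(a, c))  # → False True
```

As the following paragraph points out, this check catches only co-located colluders; players on unrelated networks who coordinate by phone or instant messaging remain undetectable by it.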
The type of collusion which is hard to detect is the one that involves players who do not have the same IP family and thus cannot be easily tracked, since nobody can see the members of the groups and it would be too intrusive to block the players' access to instant messaging programs, emails or telephones. In this situation, one can determine whether cheating is taking place by watching the player's pattern: if the player knows the opponent's every move, unit or location – elements which should be unknown to him – then the player might be cheating.
5.4.3. Collusion to deadlock a system
A deadlock attack can be prevented by putting a timer on the "boot out" tool – if no answer has been given within 60 seconds of the launch, then the game proceeds from where it left off. World of Warcraft's dungeon finder tool allows players to continue the game while the vote is still ongoing.
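The timer fix can be sketched as a small state function: a vote that has not collected all its answers within 60 seconds simply expires and play resumes. The function name and state labels are invented; timestamps are passed in explicitly to keep the example deterministic.

```python
# Sketch of the timed "boot out" vote: stalling voters can no longer
# deadlock the game, because an unanswered vote expires after 60 seconds.

VOTE_TIMEOUT = 60.0  # seconds, as suggested in the text above

def vote_state(started_at, votes_needed, votes_cast, now):
    """Return 'open', 'passed' or 'expired' for a boot-out vote."""
    if now - started_at >= VOTE_TIMEOUT:
        return "expired"  # the game proceeds from where it left off
    if votes_cast >= votes_needed:
        return "passed"
    return "open"

assert vote_state(0.0, 2, 1, 30.0) == "open"     # one voter is stalling
assert vote_state(0.0, 2, 1, 61.0) == "expired"  # timer fires, play resumes
assert vote_state(0.0, 2, 2, 10.0) == "passed"   # honest expulsion still works
```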
5.4.4. Enemy team manipulation
Enemy team manipulation is hard to detect – the ally of the cheater realizes that he is dealing with an enemy only after the first betrayal, while his units and buildings are being destroyed. Additionally, since the motivation is to lose their position on the ladder, punishment will not deter the cheaters. Although it might seem logical to stop this method of cheating by not allowing players to attack their own allies, this would open another door to the cheater – he could construct buildings to block his ally's access out of his base, or surround his troops so that they are unable to attack. Allies cannot simply be stopped from constructing on the player's base, since in many games different players have different advantages. For instance, in StarCraft II* a Zerg* player cannot block his own choke point with buildings and has to rely on a Protoss* or Terran* ally. These types of attacks cannot be eliminated completely, but they can be prevented and mitigated by studying the behavior of the player and flagging his account.
StarCraft II: Wings of Liberty = a competitive science-fiction real-time strategy video game, developed and released by Blizzard Entertainment for PC, that pioneered the concept of dramatically different races in one game (Battle.net, 2011).
Zerg, Terran, Protoss = the different races available for playing in StarCraft (IGN Staff, 2000).
Chapter 6
Virtual property
This chapter presents how virtual in-game economies can affect real-life economies and how cheaters can take advantage of this situation. The focus is on showing how cheaters can earn real-world money using in-game items and accounts. Additionally, the chapter presents the effect these types of cheaters have on the gaming community, as well as the legal issues that may arise from the crossing of video games into the real world. Finally, the chapter presents solutions to prevent this type of cheating.
6.1.
What is virtual property?
“Not long ago, a 43-year-old Wonder Bread deliveryman named John Dugger logged on to eBay
and, as people sometimes do these days, bought himself a house. Not a shabby one, either. Nine
rooms, three stories, rooftop patio, walls of solid stonework - it wasn't quite a castle, but it put to
shame the modest redbrick ranch house Dugger came home to every weeknight after a long day
stocking the supermarket shelves of Stillwater, Oklahoma. Excellent location, too; nestled at the foot
of a quiet coastal hillside, the house was just a hike away from a quaint seaside village and a quick
commute from two bustling cosmopolitan cities. It was perfect, in short, except for one detail: The
house was imaginary.” (Dibbell, 2003)
Although it may seem strange, property such as that bought by John Dugger is not the only kind available for sale. Many massive online games do not have a distinct win or lose scenario, but are designed to allow gamers to build up their characters (avatars) by earning virtual currency and increasing their avatar's fighting experience (leveling*). Additionally, property and items can be created, bought, and sold inside the game and used by the player's avatar. A player's success in the game is based on the skills he possesses, as well as on the items he carries. Many MMORPGs allow players to buy and sell virtual goods (within the game). If a character needs a weapon, he can approach another player inside the game world and exchange a number of virtual coins for it. The ability to exchange items for virtual currency creates an internal economy inside the games (Bartle, 2004).
In the academic literature, such online games and services are referred to as "virtual worlds" and are compared to the real world (Castronova, 2003; Castronova, 2006a; Castronova, 2006b; Nash & Scheneyer, 2004; Lastowka & Hunter, 2004; Taylor, 2006) in order to communicate the scale and complexity of these systems and the activities that take place within them. Bartle (2004) defines virtual worlds as "computer-moderated, persistent environments through and with which multiple individuals may interact simultaneously".
Vili Lehdonvirta (2005) classifies "virtual assets" into four categories: currency, personal property, realty and avatar attributes. Most virtual worlds have their own internal currency, such as "gold pieces" in Ultima Online, "platinum pieces" in EverQuest, "copper", "silver", and "gold" in World of Warcraft, or "Linden dollars" in Second Life. Personal property consists of weapons, armor,
Leveling = gaining experience in-game; as characters gain experience they gain power, usually represented by the character's level (GiantBomb, 2011e)
costumes, furniture, and other transferable assets that an avatar can possess (Sezen & Isikoglu, 2007). The third virtual asset category is realty: in some games the player can own a building or land. The last category is represented by the avatar's attributes, which are among the few assets that cannot be transferred partially (the whole account, including all the avatar's skills and possessions, must be transferred).
On the other hand, the presence of virtual economies has sparked a debate concerning the relationship between virtual worlds and real worlds (Lehdonvirta, 2010). "The more people who accept an illusion, however, the more it becomes real. A 'share in a company' is not a tangible thing, for example, but folk deal in them on stock exchanges every day" (Bartle, 2004).
Edward Castronova (2001) declared in the paper "Virtual Worlds: A First-Hand Account of Market and Society on the Cyberian Frontier" that the gross domestic product of the MMO EverQuest exceeds that of many real countries such as India, Bulgaria or China (as measured in 2001): "It was the seventy-seventh richest country in the world. And it didn't even exist" (Castronova, 2001). Although Castronova's estimation might be exaggerated, it is clear that virtual worlds manage to create real monetary value out of little more than the time the players invest in them (Hoglund & McGraw, 2007).
Since some players do not want to go through all the hassle of creating an avatar from scratch and improving it through many hours of work, or to acquire the in-game currency through the means intended by the developers, a secondary market has arisen to cater to their needs. Thus, offers for digital goods, property and avatars have appeared on IGE, eBay and other e-commerce websites in exchange for real money. In some cases, these items have sold for large sums of money. For instance, a virtual island in Project Entropia has sold for $30,000 and a space station in the same game has sold for $100,000. In another case, a virtual representation of Amsterdam in Second Life has sold for $50,000 (Kane & Duranske, 2008). Another example is the case of Tod Kellen, who sold his one-and-a-half-year-old Jedi knight for $510 over eBay (Musgrove, 2005). According to a 2008 European Network and Information Security Agency study, the real-money value of virtual world objects traded every year is $2 billion. Since the majority of game-related transactions are carried out on the black market, that figure might be much higher (Hill, 2010).
However, as Bartle (2004) mentions, the fact that virtual goods can be sold inside virtual worlds does not imply that they can be sold outside of the game, in the real world. The trade of virtual property for real money (real-money trade, or RMT) is said to open the door for real-world rationalism and economic inequalities inside the virtual world (Castronova, 2004).
Although these transactions seem relatively recent (around 2000), in actuality they have been around for a very long time, since the appearance of MUDs (multi-user dungeons, the first multiplayer games of this kind). McCurley (2010) gives as an example the case of one of his professors, who confessed to sending a decent amount of real-life money to MUD administrators in the Netherlands since "the time, energy and devotion to get items in these games when the opportunities just don't present themselves for one reason or another seems utterly worth it sometimes. My heart almost broke".
6.2. Ownership of virtual property
Within the virtual world, property is a meaningful concept; outside that world, however, its meaningfulness depends on whether the virtual world explicitly recognizes it. As Bartle (2004)
argues, owning property in a game of Monopoly is meaningful only during that specific game session – once the game is over, the player cannot take the "property" home, since the game does not belong to him. The same reasoning can be applied to a virtual world – you as a player do not own anything inside the game; the character does. Some players argue that if you can buy and sell an item inside the game, you can do the same with the item in the real world. However, the virtual world does not function under the same rules as the real world – the virtual world is a separate realm, with its own rules and "laws". Before entering the world, the player has to agree to a set of rules, called the Terms of Use, which tell the player what they are allowed or not allowed to do inside the game.
Joshua Fairfield (2005) argues that virtual property shares three legally relevant characteristics with real-world property: rivalrousness (the ability to use something to the exclusion of others), persistence (longevity – the player's virtual item continues to exist in the virtual world and remains the property of the player's character) and interconnectivity (the ability to convey or transmit virtual objects among different players, allowing the trade of virtual goods). These properties mimic those of real-world property, and based on them Fairfield argues that virtual property should be treated like real-world property. However, as mentioned before, access to the world where the player owns property is granted on a contractual basis with the game developing company (the EULA and TOU). Therefore the laws of the real world cease to function inside the world, being replaced by the ones mentioned in the TOU.
According to the Terms of Use (TOU), the player never owns anything in the virtual world, and thus cannot transfer ownership of anything to another person, for gain or for free (Ruch, 2009). The account, characters and objects are all owned by the game developing company. Although ownership is a prerequisite for sale, this fact has not prevented players from claiming to own virtual goods in the same manner as they own real-world goods (Bartle, 2004). Real-money transfer happens inside the game despite the views of the game company because "it is technically possible" (Ruch, 2009). As Ruch (2009) mentions: "A day without an advertisement for a 'cheap gold site' being posted in the general chat channels of major cities is remarkable and could be considered a holiday".
Bartle (2004) identifies five claims that players have made regarding this ownership: (1) "I own it because I bought it"; (2) "I own it because I stole it"; (3) "I own the product of my labor"; (4) "I'm selling my time"; and (5) "I own it because you made me buy it".
"I own it because I bought it." The first claim is based on the idea that the player has purchased something "in good faith". To explain this, Bartle (2004) gives the real-life example of a person buying a stolen good without knowing it to be stolen. The author mentions that the person is honest, and thus should not be punished by having to return the purchase to its rightful owner without compensation. However, if the developer never sells the property and punishes those who sell it (as mentioned in the End User License Agreement), then a player can have no claim to have bought something "in good faith".
"I own it because I stole it" is based on the idea that if the game company does not ask for the property back, then the player has a right over it. However, Bartle (2004) points out that players are paying a monthly fee to rent the in-game property.
"I own the product of my labor." Some believe that since the items and characters took time and effort from the player, players should be allowed to do what they want with them. The EULA for most virtual worlds mentions that players are "invited" not to "make things", but to "make things for fun" (Bartle, 2004). "It's not work, it's play. If you start regarding it as work,
you're breaking the implicit conditions under which you were given access to the necessary materials" (Bartle, 2004). Therefore, the relationship between the player and the game developing company is similar to the relationship between a family and a restaurant with a children's play area – the child cannot keep any of the things they have built using the restaurant's toys.
"I'm selling my time." This argument justifies the actions of players who sell their items, and it is the most common argument used by gold-selling companies that hire people from poor countries to farm virtual goods for real money, such as Black Snow, which farmed in Dark Age of Camelot (Ruch, 2009).
"I own it because you made me buy it." Many gamers use this excuse when justifying their motivation for buying items and currency from gold-sellers. They state that the virtual world, through its demands, encourages real-world trade in virtual items (Bartle, 2004). In this situation, cheaters classify themselves as "time-poor people" (Bartle, 2004), who have to work and thus do not have enough time to play, but have real-life money to buy the items that they require, as opposed to "time-rich people" who can spend hours playing.
Additionally, there are three main reasons why players choose to buy and sell virtual characters and items (Bartle, 2004): (1) as an investment, (2) for group-play reasons, and (3) to inflate their status. The first reason applies to players who hope to sell the character, or the objects that came with it, for more than they paid. The second applies to players who have returned to the game after a period of absence and wish to play in the same group as their friends. The third applies to players who want to appear more skilled than they actually are.
However, there have been ways of handling the confusion regarding the ownership of virtual property. Julian Dibbell (2006) states in "Play Money" that in-game items and the account are intangible and thus cannot be transferred or insured. However, Dibbell suggests that the login information is the only part of the account that does not belong to the game company, and thus the sale of the password and login information written on a piece of paper would be similar to the sale of a ticket to a concert. In this case the sale would be of access to the concert and not of the paper itself; or, from the gamer's point of view, the sale is of access to the account and not of the account itself (Dibbell, 2006).
6.3. Tertiary markets
Some games, such as ZT Online from Giant Interactive, explicitly encourage real-money trading (RMT), whilst other games, such as Ultima Online, tolerate this trade. Second Life, for instance, is built on the idea that players are allowed to create and own objects inside the game, objects that have real-world value. On the other hand, games such as World of Warcraft do not tolerate RMT.
"The existence of virtual worlds as a practical phenomenon has two major side-effects: it activates real-world property laws; it stops the game being a game" (Bartle, 2004). Whenever property created inside the virtual world is given real-world value, a bridge between virtual economies and the real-world marketplace takes shape, bringing with it real-world legal implications (Abramovitch & Cummings, 2007). To date, virtual property rights have not been fully litigated (Lastowka & Hunter, 2004); however, there have been steps in the direction of virtual property litigation. One such example involves Black Snow Interactive's "point-and-click sweatshop" situated in Tijuana, Mexico, which employed cheap labor to play the video game Dark Age of Camelot and then sold the virtual assets thus earned (Kayser, 2006). In 2004 the value of one hour of labor in the virtual world equaled
$3.42 in the real world (Lastowka & Hunter, 2004). The labor consisted of players gathering virtual raw materials (such as virtual iron ore) and then "farming" the materials into virtual chattels, which would be sold inside the virtual world for virtual money or in a real-world online auction house such as eBay (Kayser, 2006). The owner of Dark Age of Camelot, Mythic Interactive, had attempted to prevent the commoditization of virtual items by forcing eBay to remove them from its auction listings based on the infringement of intellectual property. Black Snow Interactive responded with a suit in federal court in California claiming unfair business practice, but dissolved as an organization, and as such the litigation concluded with no precedential value (Kayser, 2006).
Another example is Sony's desire to end the real-money trade of EverQuest items and accounts. The resulting lawsuit centered on the claim that, although Sony owns the virtual items, the users own their time and labor (Kayser, 2006). As in the previous example, the issue remained unresolved.
On the other hand, as in-game items gain value, real-world problems are making their way inside the game world. For instance, a Chinese man was stabbed to death for selling a virtual sword that did not belong to him, while in Japan a student was arrested for creating a hack that robbed and killed other characters in Lineage II (Musgrove, 2005).
Considering the value of in-game items (see Image 3 for an example), characters, and currency, many virtual "entrepreneurs" have started to cash in on it in various ways: setting up virtual sweatshops to gain in-game currency (gold farmers), companies that play instead of the real account owner, or even companies that provide high-level avatars for sale. Additionally, online markets such as IGE – Internet Gaming Entertainment (IGE, n.d.) – have emerged, in which players buy and sell currencies for many MMORPGs as well as see the in-game currency's value in real money. Hoglund & McGraw (2007) believe that getting in-game currency fast is the main incentive behind players cheating. Although middleman companies like IGE seem more legitimate and claim to offer services to gamers, they are targets for money laundering, both real and virtual. Additionally, these companies are used as a method of selling virtual items that were created through cheating, from stolen accounts or through in-game bug exploits.
"Farming*" sweatshops are most common in third-world countries. These companies persist although their existence is against the EULA, and in some cases they get big enough to appear on the radar of the game companies, which ultimately leads to the game developers taking action. Sweatshops hire low-wage hourly workers to play constantly in order to level up or "farm" gold. Additionally, these companies offer the "service" of players who are willing to play on certain characters for a fee. Most farmers use macros – automated programs – to do all the work, and continuously look for game exploits and in-game "dupes" (bugs that allow items or gold to be duplicated). Although everything is automated, human players are required whenever another player is around (looking to fight) or a game master is looking for automated programs (Lee, 2005). A farmer sometimes controls more than one character at the same time in order to increase "productivity" as well as "launder" the in-game currency: a duper account, a filter account and a delivery account (Lee, 2005). All accounts are created using different IPs, credit cards and computers to make it difficult to trace the source.
(Gold) Farming = users harvesting gold and selling it in-game for real-world currency (Adams, 2005)
Image 3: Picture taken on the 5th of June 2011 from eBay.com depicting the sale of a level 85
World of Warcraft account
6.4. Player resentment
"A major source of unfairness is the buying and selling of virtual goods as opposed to characters [...] A manifestly inferior player with a manifestly superior weapon is harder to stomach when you know that to buy such a weapon would cost you a month's salary: you're never going to get a weapon like it because you simply don't have the money" (Bartle, 2004). In 2003, Li Hongchen, a player in the virtual world of Hongyue, or Red Moon, a game provided by Arctic Ice Technology Inc., became invincible inside the game through the purchase of tens of thousands of yuan* worth of in-game weapons, accumulating an overwhelming reserve of virtual biological weapons (Kayser, 2006; China Daily, 2003).
Bartle (2004) presents the case of a player who wants to get a powerful sword that can help him gain experience faster, but which is guarded by a monster that can kill the player faster than he can retaliate. The player has the choice to wait until he can kill the monster or to buy the sword from eBay. Thus, the player can "now rise in levels faster than people who don't have a few hundred bucks to spare: more to the point, you can do so faster than people who do not have the cash but are playing by the rules" (Bartle, 2004).
Cheaters argue that players and gold-selling companies are just providing a service: players who do not want to pay are not forced to; they can get the items and in-game money by playing the game. However, players who pay are "taking an unfair short-cut and cheapening the successes of people who don't take it. [...] The status of the character should reflect the status of the player behind it." (Bartle, 2004). Whenever a player is found not to be entitled to wear this "marker of [player] status", it annoys the players who are entitled to wear it to such an extent that they may quit the game altogether. Independent traders (gold-sellers) do not care whether players who do not buy their characters will leave the game, since they have no monetary gain from their presence, and thus feel free to sell as many characters and items as they can.
Yuan = renminbi yuan, the standard monetary unit of China, divided into 10 jiao and 100 fen (Dictionary.com)
Another argument given by cheaters is that players want to enjoy themselves while playing and do not want to do the things that are not as "fun", such as leveling or grinding (repetitive and/or non-entertaining gameplay). However, as Bartle (2004) mentions, players do not buy virtual items to skip the content that they consider boring or uncompelling, but buy items to give the impression that they have not skipped it. By suggesting that they are "helpless victims of bad design" (Bartle, 2004), they try to hide the motives that might make other players consider them cheaters. On the other hand, if selling players what they need did not matter so much, the developers would have included such a feature in the game themselves. However, the players would then have complained that the developers had added boring content to force players to pay to skip it (Bartle, 2004).
One side effect of the presence of real-money transaction companies is the wave of virtual crime that has appeared in the community. Seduced by the opportunity to make easy money, a group of unscrupulous players has emerged that extorts money from players, creates computer programs to beat up and rob other characters, as well as programs to "farm" expensive minerals and plants. Holisky (2008) mentions the story of a WoW player who had his account taken hostage by a gold seller who demanded payment from his guild. The gold seller held the account hostage until the guild members paid the ransom. Holisky (2008) states that a stolen World of Warcraft account is worth more than a stolen credit card, claiming that the security of credit cards is harder to bypass than that of an in-game account. As the value of online commodities increases, these trends will continue.
Another mentioned side effect is that in some cases players were literally unable to get hold of a particular in-game item, since its source was tied up by farmers who were camping (staying in a certain place in the virtual world waiting for useful objects to appear in an area, rather than actively seeking them out) where rare objects were due to appear. This is one of the reasons why buying and selling was banned on eBay, since farming was "blocking normal play" (Bartle, 2004).
Many companies and game forums advise against employing the services of RMT companies because they can bring vulnerabilities to the player's accounts and computers. AION Account Protection (2010) states that these companies' websites can attempt to load Trojans onto the player's system, as well as try the login information they require for the transactions on the player's in-game account in an attempt to steal it. Another reason is that these transactions devalue the efforts of honest players.
In order to curb players' desire to buy accounts, some games offer methods of starting out ahead: Ultima Online allows players to buy levels (Castronova, 2003); Dark Age of Camelot allows players to start at level 20 if they already have a level 50 character (Castronova, 2003); and since the "Wrath of the Lich King" expansion, Blizzard allows World of Warcraft players to start a Death Knight character at level 55 if the player has a level 80 character (Blizzard Entertainment, n.d.).
6.5. The spectrum of virtual world license agreements
Each game has an End User License Agreement (EULA), a set of rules to which the player must agree if he wishes to play the game. These rules are created by the designers and their lawyers to control the in-game behavior of players and to differentiate the virtual world as a "closed" space that is not subject to real-world laws and requirements (Kayser, 2006). By accepting the EULA, players agree to give up certain individual rights. If a player fails to comply with the rules, the developers have the right to punish the player by temporarily or permanently banning their account.
Virtual worlds such as the one portrayed in Blizzard's World of Warcraft are known as "leveling worlds" (Kayser, 2006). This system allows game companies to create an engaging, persistent world in which users will continue to play in order to increase an avatar's abilities, allowing Blizzard to profit from the monthly subscription fees (Kayser, 2006). For such games, it is vital that players of different levels move slowly through the content. Thus, the availability of high-level characters on the black market implies that players will bypass the leveling process, making the low-level content less valuable. As a consequence of third-party virtual object transactions, the game company would be forced to create more high-level content to keep players interested in paying the monthly fee (Kayser, 2006). Thus, in order to protect its interests, Blizzard has banned the commoditization of virtual goods, defining World of Warcraft as a game world (Kayser, 2006).
In Blizzard's case, account selling is an unforgivable offense. The "grand scam" refers to stolen accounts that have been sold to other players, only to be taken away by Blizzard and returned to their original owners after the theft has been reported (McCurley, 2010). In this situation the buyer loses both the money and the account, while the seller has nothing to lose. Moreover, the buyer cannot even report the theft, since the transaction was illegal to begin with (Simon, 2010). For instance, in 2007 a Rogue World of Warcraft character with a rare sword dropped by Illidan Stormrage* and a set of Tier 6* armor was sold for 7,000 Euros, only to be banned by Blizzard shortly afterwards.
Raph Koster, chief creative officer at Sony Online Entertainment, pointed out that it is not in the game's best interest to imitate the real-world economy, since in the game world the point, "the fun of playing", is to do things, not to pay money in order not to do them: "The economies in the real world are designed to grow and progress toward an improved standard of living so that eventually you don't have to slay dragons for food -- you go to a supermarket and get dragon burgers. We don't want people to get to a point where they just go out for dragon burgers. That would not make for an interesting game." (Musgrove, 2005)
However, Sony has begun recognizing that a significant portion of their users prefer making transactions of virtual items for real-world money, despite the threat of being expelled from the virtual world (Kayser, 2006). In this respect, they have reached a compromise position: players who object to RMT can play without the intrusion of real-world economics, while players who wish to participate in the trade can purchase their items from Sony's Station Exchange servers (presented in the next section). Consequently, Sony tries to achieve customer satisfaction as well as to profit from the trade: Sony charges a one-dollar listing fee for each item listed, and collects 10% of every transaction (Kayser, 2006).
6.6. Proposed solutions
Exploited duping bugs can be detected by companies by examining economic statistics from the virtual world. Any time such a bug is exploited, the supply of gold increases and the virtual-money-to-real-money exchange rate drops. EverQuest suffered from a gold duping bug* which had catastrophic effects on the MMO's economy – the hyperinflation caused the price of virtual items to rise to such an extent that basic items cost huge amounts of gold, and thus the honest players could no longer afford any items and could not progress in the game. Websites such as GameUSD (n.d.) track the exchange rate between real money and virtual currencies.
Illidan Stormrage = one of the main storyline characters in the game, as well as a "Raid Boss", an enemy that can be defeated only with the combined work of a large number of players (25-40) (GiantBomb, 2011f)
Tier 6 = in World of Warcraft the most valuable items at a specific moment in the game's history are called tier items; e.g. a Tier 6 item was the best item that could be obtained in 2007; with the passing of time and the addition of new content, new item tiers have been included in the game (Hecht, 2006).
Gold duping bug = a game bug which allowed the infinite duplication of in-game currency (Oster, 2004).
To curb illegal item and currency trading, Sony Online Entertainment (SOE) developed Station Exchange, an official auction service that provides players with a secure method of buying and selling in-game currency, items or characters in accordance with SOE's license agreement and guidelines. Many of Sony's customer service calls were related to virtual item disputes. Whenever a player wants to sell an item, that specific item is removed from the game world, stored on a Station Exchange server, and listed for auction in a manner similar to in-game auction houses. When the auction is won, the successful bidder receives a notification, makes the payment via PayPal, and gets the item sent as an in-game email attachment. As opposed to IGE, Station Exchange acts as a trusted third party (TTP) in the exchange between buyers and sellers; since Sony would not risk its reputation being tarnished by fraudulent behavior, both buyer and seller can rest assured. Players who do not agree with this exchange do not have to participate. The idea behind Station Exchange is to shut down the middleman companies by offering a legitimate option, and to make sure that stolen accounts can be sold only on the official website, thus making it easier for the owners of the accounts to notify the publishers and have them returned. Sony reported a 30% fall in virtual-item-trading-related calls.
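The trusted-third-party flow described above can be summarized as a small state machine: the escrow holds the item while it is listed, collects the listing fee and commission, and releases the item only on settlement. This is a minimal sketch under stated assumptions; the class and method names are hypothetical, and only the $1 listing fee and 10% commission come from the text.

```python
# Illustrative sketch of a trusted-third-party (TTP) escrow auction in the
# style the text attributes to Station Exchange. Not Sony's actual design.

class EscrowAuction:
    LISTING_FEE = 1.00   # flat fee per listing, per the text
    COMMISSION = 0.10    # 10% of the final sale price, per the text

    def __init__(self, seller, item):
        self.seller = seller
        self.item = item          # the item is held by the escrow, not the world
        self.state = "listed"
        self.fees_collected = EscrowAuction.LISTING_FEE

    def settle(self, buyer, price):
        """Winning bidder pays; the escrow takes its cut and releases the
        item to the buyer (e.g. as an in-game mail attachment)."""
        assert self.state == "listed", "auction already settled"
        self.fees_collected += price * EscrowAuction.COMMISSION
        self.state = "delivered"
        return {"to": buyer, "item": self.item,
                "seller_payout": price * (1 - EscrowAuction.COMMISSION)}
```

Because the escrow holds the item from listing to delivery, neither party can defraud the other: the seller cannot withhold the item after payment, and the buyer cannot receive it without paying.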
By employing bot detection, in-game farming and virtual muggings can be averted. Additionally, by keeping track of virtual items and their virtual owners, virtual item fraud can be averted. By keeping snapshots of virtual item databases and associating unique identifiers with all virtual items, companies can better investigate virtual muggings (Joshi, 2008).
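The item-tracking idea can be illustrated with a small ledger: each item receives a unique identifier when it is created, and every ownership transfer is recorded, so a disputed item's chain of custody can be reconstructed later. The class and method names are assumptions for illustration only.

```python
# Hypothetical sketch of item provenance tracking: unique identifiers per
# item plus a transfer history, as suggested by Joshi (2008).

import uuid

class ItemLedger:
    def __init__(self):
        self.owner = {}      # item_id -> current owner
        self.history = {}    # item_id -> list of (from_owner, to_owner)

    def mint(self, owner):
        """Create an item with a globally unique identifier."""
        item_id = str(uuid.uuid4())
        self.owner[item_id] = owner
        self.history[item_id] = []
        return item_id

    def transfer(self, item_id, new_owner):
        """Record an ownership change (trade, sale, or theft)."""
        self.history[item_id].append((self.owner[item_id], new_owner))
        self.owner[item_id] = new_owner

    def provenance(self, item_id):
        """Full chain of custody, used when investigating a virtual mugging."""
        return self.history[item_id]
```

With periodic snapshots of such a ledger, an investigator can answer exactly when a stolen item left its rightful owner and through which accounts it passed.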
On the other hand, sweatshops cannot be easily detected. The only element that can be used for detection is the presence of accounts that are played constantly. Joshi mentions that accounts that are used only a few hours a week, then re-used 24 hours a day, followed by a sudden decrease in use, are a sign. A threshold detection mechanism can automatically flag as suspicious any activity lasting longer than 24 hours in a row. Additionally, by legalizing the trading, companies can spot sweatshops as repeat sellers, which will also make it easier to notice dupe exploits. Ironically, this is how Sony discovered the EverQuest duping bug – by noticing high amounts of gold being sold.
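The threshold mechanism described above can be sketched in a few lines: any account with a single continuous play session longer than the limit is flagged for review. The session representation and function name are illustrative assumptions; only the 24-hour threshold comes from the text.

```python
# Sketch of the session-length threshold check: a single uninterrupted
# session longer than max_hours is a possible sign of sweatshop play.

def flag_suspicious_accounts(sessions, max_hours=24):
    """sessions: dict mapping account name -> list of (start_hour, end_hour)
    play sessions, measured on a common clock. Returns the accounts with at
    least one session longer than max_hours."""
    flagged = []
    for account, plays in sessions.items():
        if any(end - start > max_hours for start, end in plays):
            flagged.append(account)
    return flagged
```

A production system would combine this with the usage-pattern signal Joshi mentions (light use, then round-the-clock use, then a sudden drop), since farmers can trivially split sessions to stay under any fixed limit.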
This, however, might lead to an increase in account theft, since middlemen might need accounts to scatter their "business". Moreover, this might also lead to breaches of PayPal accounts – the thief now knows the name on the player's PayPal account, and thus the methods presented in the Phishing chapter to prevent identity theft might not be sufficient.
However, by legalizing the in-game trade, game companies might open the door to a well-known real-life rule – tax paying. So far, since the trading officially occurs only in-game, players and companies do not have to pay taxes no matter how much gold they gain from farming or auctioning. According to Masnick (2006), items that have been converted to real dollars are already taxable, and the next to become taxable may be items within the game, based on their perceived value.
Chapter 7
Virtual identity
This chapter presents the concept of identity inside the virtual world and how the player can authenticate himself inside the game world. Based on this description, the chapter will show how the properties and attributes that are used for authentication can be stolen or attacked by exploiting cognitive biases (social engineering and phishing). The chapter will also show different methods and tools for preventing these attacks.
7.1. Real life identity vs. virtual identity
In real life, the identity of a person is the collection of all his personal attributes/properties, such as name, profession, date of birth, sex, signature, etc., each with a different level of discrimination; i.e., the social security number is more discriminant than the first name or date of birth (Jaquet-Chiffelle, 2002). Therefore, in real life a partial identity (a subset of an identity) is sufficient for authentication: physical evidence, visual recognition, presentation of official papers (Jaquet-Chiffelle, 2002).
Identity is a key part of virtual communities: since communication is an important activity, knowing with whom you communicate is essential for understanding and evaluating an interaction (Donath, 1998). The virtual identity, or avatar, is the manifestation of the player in the digital world, perceived by the player as an independent identity existing separately from their physical body. In role-playing games (RPGs) and massive multiplayer online role-playing games (MMORPGs), improving and maintaining the avatar is the most important part of the game. Edward Castronova (2003) defines the avatar as the "physical representation of the self in virtual reality... The avatar mediates our self in the virtual world, we inhabit it, and we drive it, we receive all of our sensory information about the world from its standpoint". In his view, players have two bodies: the real avatar (the physical body) and the virtual avatar (the in-game identity).
On the other hand, the Oxford English Dictionary defines identity as "the sameness of a person or thing at all times or in all circumstances; the condition or fact that a person or thing is itself and not something else; individuality, personality". However, this definition is problematic in connection with virtual identities, since an individual has a single real-life identity but can have multiple virtual identities. Additionally, a virtual identity can sometimes be controlled by a person other than the original owner. Thus there is a many-to-many relationship between physical and virtual identities (Jaquet-Chiffelle et al., eds). For a comparison between real-life identity and virtual identity, please refer to Image 4.
Image 4: Identity in the physical world vs. identity in the virtual world
Virtual identities are also a collection of attributes or properties required for authentication. Usually,
inside the virtual world the authentication consists of knowledge of a specific secret related to that
virtual identity. The strength of the authentication depends on the security of the cryptographic
mechanism (Jaquet-Chiffelle, 2002). Authentication inside the video game can be defined as a set of
four components (see Table 8): authentication, accountability, billing and maintaining relationships
(Brooke et al., 2004). Additionally, virtual identity can be described as a tuple comprising the user's
characteristics (the attributes used to distinguish them from other players), the player's
avatar characteristics, and the hardware that enables the interaction between the player and the system
(Brooke et al., 2004).
Table 8: Authentication inside the video game
- Authentication: providing a collection of attributes or a secret that establishes someone's identity in the system.
- Accountability: whenever a player breaks the terms and conditions of the agreement with the game company, the developers can enforce punishments.
- Billing: usually for games with a monthly subscription, billing is a method of identifying the legal entity in the real world (usually represented by credit card information or pre-paid gaming cards).
- Maintaining relationships: the player and game developer are in a relationship in which the game developer provides a long-running persistent system that maintains the player "account" and recognizes the player when they return to the game.
Virtual identity can be defined through the following factors (see Image 5): something you know
(knowledge factors, such as a password and username), something you have (ownership factors, such
as a security token), and inherence factors: something you are (such as fingerprints) and something
you do (such as a signature).
Image 5: Factors defining virtual identity
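The interplay of these factors can be sketched in code. The following is a minimal illustration only, not the scheme of any actual game: a server-side check requiring both a knowledge factor (a salted password hash) and an ownership factor (a one-time code from a security token). All names and parameters are hypothetical.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Knowledge factor: derive a salted hash of the password."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def authenticate(password: str, token_code: str,
                 salt: bytes, stored_hash: bytes,
                 expected_code: str) -> bool:
    """Require both factors; constant-time comparisons avoid leaking
    partial matches through timing."""
    knows = hmac.compare_digest(hash_password(password, salt), stored_hash)
    has = hmac.compare_digest(token_code, expected_code)
    return knows and has

salt = os.urandom(16)
stored = hash_password("correct horse", salt)
print(authenticate("correct horse", "493021", salt, stored, "493021"))  # True
print(authenticate("correct horse", "000000", salt, stored, "493021"))  # False
```

An attacker who phishes only the password still fails the ownership-factor check, which is why the later sections treat token interception as a separate, harder attack step.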
7.2. The privacy paradox
Studies suggest that many users do not know or do not employ methods of protecting their online
identities. For instance, Jensen, Potts and Jensen (2005) noticed that most of their study subjects
overestimate their knowledge of privacy-related technologies and practices, whilst Carey & Burkell
(2009) state that most respondents were preoccupied with the privacy concerns arising from
their immediate social relations (ex-girlfriends, friendships that went sour) rather than from unknown
people. Carey & Burkell (2009) also mention that users can be ambivalent about the
information they choose to make public over the internet and the online risks associated with it. This
ambivalence stems from how users perceive the risks associated with their privacy
and how those perceptions influence their privacy-protecting behaviors. Often users spend a large amount of time
protecting some aspects of their online life while completely ignoring other kinds of threats.
When people face complex or uncertain situations they tend to rely on seemingly non-rational
motives: mental shortcuts that help them make decisions when a full assessment of the available
information is difficult or time-consuming, or when the information itself is insufficient. These shortcuts
influence both how they assess the harms of a threat and the consequences of using privacy-protecting
behaviors. Why would users believe that their close relations may be more harmful to
their privacy than unknown attackers? The decision about which information to make available, and
to whom, rests on the user's calculation of the relative costs and benefits of disclosure: not
on the harm the user believes may or may not happen, but on the perceived likelihood, the risk,
of it happening. On the other hand, risks are often unknown, and the decision
maker may or may not be able to determine the frequency of certain harms
occurring, a situation Kahneman and Tversky (1974) call "judgment under uncertainty", where
users are forced to rely on heuristics to make their judgments.
The main elements which need to be examined are heuristics (see Table 9), the mental shortcuts,
and cognitive biases. These cognitive shortcuts help the decision maker when resources such as time
and attention are limited, and let them bypass complex probability calculations. The
result of such a shortcut is a response that "satisfies" the user and meets their immediate need, although
it may not be the optimal response.
When using heuristics, the outcome of a single choice does not necessarily coincide with the
outcome predicted by rational choice theory. That does not mean, however, that the outcomes are
always incorrect or that they deviate systematically from the predictions of rationality. Although in
some cases heuristics produce better results than a rational approach (Gigerenzer
and Goldstein, 1996), in others heuristics lead to systematic, predictable deviations from rational
choice, deviations called biases (Office of Fair Trading, 2009).
Table 9: List of heuristics and cognitive biases that might affect a player in his decision making.
1. Affect heuristic
2. Availability heuristic
3. Anchoring heuristic
4. Adjustment heuristic
5. Representativeness heuristic
6. Control heuristic
7. Privacy decision making heuristics
8. Valence effect bias
9. Background knowledge and overconfidence
10. Optimism bias (positive illusions bias)
11. Rational ignorance bias
12. Norm activation bias
Protection motivation theory (PMT) suggests that people invoke protective behaviors after assessing
the advantages and disadvantages of responding to a perceived threat. Cognate approaches such as
the theory of reasoned action (TRA) and subjective expected utility (SEU) theory share a cost-benefit
analysis component in which the person weighs the costs of taking the precautionary
action against its benefits (Floyd et al., 2000). These theories state that the user is able to calculate
the probabilities of harms or rewards deriving from an action.
A type of heuristic processing that deals with the aforementioned behavior is the affect heuristic:
the positive or negative feeling arising from the perception of a stimulus, feelings that are crucial in
the determination of risk. If a person's feelings towards an activity are favorable, the risks are
viewed as low and the benefits as high; if the situation is reversed, the risks are considered high
and the benefits low (Carey & Burkell, 2009). People in a good, happy mood tend to use heuristics
associated with top-down processing, such as relying on pre-existing knowledge with little attention
to detail (Baddeley, 2010). People in a sad mood, on the other hand, will use bottom-up heuristics
and thus pay more attention to details than to existing knowledge.
Retrievability has an important influence on the affect heuristic: the individual making the
decision will likely base their judgment on events or objects that are easy to recall or evaluate. The
individual will therefore judge an event to be more likely if they are able to imagine or construct it. The
availability heuristic, on the other hand, means that the perception of risk increases with direct
experience of negative outcomes. If a person's negative experience is memorable, that person will
associate all similar situations with the same kind of risk (a person who had a cancer case in the
family will overestimate cancer risk). Thus, in the case of online threats and risks, individuals will
draw on the harms already available to them, related to uncomfortable consequences that
resulted from the release of "destructive information". Anchoring and adjustment heuristics are
involved in creating an initial estimate of a probability (the anchor) and then revising and
updating it (the adjustment) as new information is added [unknown], resulting in a bias towards the
selected anchor value.
The control heuristic represents the tendency of people to believe they can influence situations over which
they have no control, such as believing that lottery tickets with chosen numbers are worth more
than those with random numbers, despite the probabilities being identical in both cases.
Using the representativeness heuristic, individuals use the similarity between two events to estimate the
probability of one from the other. Further heuristics appear in privacy decision making:
individuals consider events that are difficult to picture mentally as improbable, or
associate trustworthy behavior with the organized appearance and design of a website (Carey &
Burkell, 2009).
Due to the valence effect of prediction, individuals tend to overestimate the likelihood of favorable
events. As a self-serving bias, individuals overestimate the likelihood of favorable events
happening to them relative to other individuals; for instance, some individuals believe that revealing
personal information on online social networks will cause privacy problems for others, but not for
them (Acquisti and Grossklags, 2006). Another interesting aspect presented by the Office of Fair
Trading (2009) is that trust is higher among frequent internet users than among users
unfamiliar with the internet, resulting in overconfidence. In scenarios where probabilities are
difficult to predict, individuals tend to be overconfident in their knowledge or abilities, especially
when estimating exposure to online privacy risks. Research (Schulz-Hardt et al., 2008; Kray & Galinsky,
2003) shows that the more knowledge people have in a specific area, the more competent they feel,
and thus they overestimate their ability to make good decisions in it. Additionally, many victims
claimed they believed the law and the government protected them against social engineering
attacks, an illusion of control that results from such overestimation (Office of Fair Trading,
2009).
"Optimism bias" (Schneider, 2008) or "positive illusions" (Office of Fair Trading, 2009) convince
users that they are better than others involved in the same activity: individuals believe that car
accidents happen only to other people and thus continue driving recklessly, or that identity theft will never
happen to them and thus they do not worry about the personal information they may reveal in an
online conversation. Another important aspect of positive illusions is the "regret effect" (Office
of Fair Trading, 2009): the feeling the victim anticipates they would have experienced had they not answered
the scam and won what it offered.
On the other hand, ignorance becomes rational when the cost of learning enough about a scenario to
inform a rational decision would be higher than the potential benefit deriving from the
decision. Acquisti and Grossklags (2006) mention that individuals might not feel compelled or
interested to read the data holder's privacy policy, as they believe the benefit will not
compensate for the time spent reading it. Additionally, they state that individuals choose not to look
for solutions that deal with their personal information since they prefer situations to remain the same
(the so-called status quo bias), which explains why the majority of users do not change their default
privacy settings.
Attackers target specific norms in their victims, such as the norm to help others or to be a good citizen.
Following a norm is not in itself an error of judgment; the error occurs when the individual
misclassifies a scam situation as one where a particular norm applies. For example, when an
attacker calls a game company claiming to be a player who has trouble logging in, some employees
might give out the information in order to "help" him. Social norms also play an important role in
decision making, telling individuals what they "ought" to choose. An attacker who knows the
norms but does not respect them may use this knowledge to their advantage to find possible
exploits.
Another key element in heuristics is represented by motivational errors shown in Table 10.
Table 10: List of motivational errors that can cloud or influence a person's
judgment when faced with a possible attack
1. Visceral influences
2. Reduced motivation for information processing
3. Preference for confirmatory information
4. Lack of self-control
5. Mood regulation and phantom fixation, sensation seeking
6. Liking and similarity
7. Reciprocation
8. Commitment and consistency
Visceral influences such as fear or greed can cloud a person's judgment, making decisions simpler
and the biases larger: clues that might indicate an attack are no longer detected, making the
individual more liable to become a victim. By appealing to strong motivational forces, the depth at
which the message is processed is reduced, making the victim "grab" at the superficially
attractive reward. Some victims mentioned that they were under the impression that they had been
offered something rare and unique; the urgency of the response request and the visceral influences
made it easier to fall for the scam. Research (Frey, 1986; Schwarz, Frey, & Kumpf, 1980) shows
that people who are highly motivated do not process information thoroughly, do not elaborate on the
pros and cons of the decision, and neglect possible side effects or long-term consequences.
Additionally, reduced motivation for information processing makes individuals disregard the
attributes that help distinguish between real and fake messages (Langenderfer & Shimp, 2001); this
decreased motivation can be induced by visceral influences or by scarcity and uniqueness cues that
place victims in an urgent situation. A persistent error in human decision making is the preference for
confirmatory information, based on the individual's tendency to seek information that
confirms their initial hypothesis rather than information that might prove it wrong (Festinger,
1957; Frey, 1986; Wason, 1971). Another important element is the lack of self-control:
people who respond to scams are usually less able to regulate their emotional responses and
impulsively answer the attacker without putting much thought into it.
The Office of Fair Trading (2009) states that falling for a scam can also result from the individual's
attempts to control their mental states (mood regulation), such as replacing a negative mood or enhancing
a positive one, with attackers taking advantage of the individual's needs and desires. By offering a
prize (the bait or lure), attackers intend to create a fixation (phantom fixation) in the victim,
over-motivating them to obtain the alleged rewards and thus creating distortions in the decision-making
process. Falling for a scam may also induce emotional effects which elicit
excitement and arousal through the hope of winning the offered prize or reward (sensation seeking),
making victims more willing to engage in risky behaviors in order to increase their
psychological arousal (Fischer et al., 2007). Sensation seeking is a cognitive and affective
psychological state that is at the root of both falling victim to scams and
creating them (Office of Fair Trading, 2009).
Social engineering revolves around making victims trust the attacker enough to offer
information without much effort, which enables attackers to use another motivational error: liking and
similarity. People tend to like people who like them back, so attackers can create scenarios to
which the victims can relate, increasing empathy for the scammer. Additionally, using
reciprocation errors, attackers make victims believe that they have been offered something unique,
that the rules are being bent just for them, creating the need for the victims to
"give something back". Moreover, since people tend to appreciate consistency in both their own behavior
and the behavior of others, attackers can use the commitment and consistency error and manipulate
people's desire for a sense of control. Cialdini et al. (1978) describe a similar effect as the
"lowball" sales technique, whilst Arkes & Blumer (1985) state that the desire for consistency might
explain the "sunk cost effect" (people take past costs into account in present decisions
even though these are irrelevant from the perspective of rational choice theory).
7.3. Social engineering
Joan Goodchild (2010) mentions that regardless of the money invested in security measures or the
quality they offer, they can be rendered useless by a social engineering attack. Social engineers
take advantage of human behavior and of people's innate desire to consider all people good, and
using these weaknesses they can bypass any security system. "Social engineering is essentially the
art of gaining access to buildings, systems or data by exploiting human psychology, rather than by
breaking in or using technical hacking techniques" (Goodchild, 2010). For instance, instead of
trying to find vulnerabilities in the software, a social engineer will call an employee pretending to be
an IT support person and "convince" the employee to give out his password. Social engineering
attacks are semantic attacks (Schneier, 2000) since they are aimed directly at people rather than
taking advantage of system vulnerabilities.
Hadnagy (Olavsrud, 2011), a co-founder of Social-Engineer.org and operations manager at
Offensive Security, states: "We define social engineering as understanding what makes a person
think, tick, and react and then using those emotional responses to manipulate a person into taking
an action that you want them to take". A defining aspect of social engineering is that most of
the time the attacker does not know his victim, and the deception is carried out for the purpose of
information gathering, to gain access to a computer system or account. Social engineering, as a
psychological manipulation method for gathering information, was popularized by Kevin Mitnick,
consultant and former hacker, who stated in his 2002 book "The Art of Deception" that all his
attacks were conducted using only this technique. Deception consists of a set of acts created to
increase the chances that a specific set of targets will behave in a desired fashion when they would
be less likely to behave similarly if they were aware of those acts beforehand (Cohen et al., 2007).
All techniques used in these attacks take advantage of specific human decision-making attributes,
known as cognitive biases or "bugs in the human hardware" (Garth et al). The Office of Fair
Trading (2009) suggests that there are certain psychological reasons that would make an individual
respond to scams, reasons that involve a mixture of cognitive and motivational processes, such as
appeals to trust and authority or the induction of behavioral commitment.
By inducing behavioral commitment (The Office of Fair Trading, 2009), attackers use small steps of
compliance to draw victims in and thus make them feel committed to giving out more
information. If attackers can provide cues that make the victims believe they are
in a position of authority (an employee of a game company, a game master), then the appeal to trust
and authority (The Office of Fair Trading, 2009) and the need to obey authorities will compel the
victims to reveal personal information. The Office of Fair Trading (2009) mentions that a
minority of people (between 10 and 20%) are more susceptible to scams than others and are
consistently more likely to respond to a scam again. On the other hand, the same
study mentions that this minority is not necessarily composed of people who
are poor decision-makers, but of people who are more vulnerable to persuasion. In some cases, these
individuals become "chronic" or serial scam victims (The Office of Fair Trading, 2009).
7.3.1. Pretexting
Due to the value of in-game items and accounts, gamers are an attractive target for attackers. As
Landesman (2009) mentions, most of the time the compromised information is not
the website or online game credentials themselves, but related information such as the email address specified
in the account. Many players use the same username and password for
their email and Instant Messaging (IM), which opens the door to attacks coming from their
own IM contact list, such as a link claiming to be something else (a video, for instance) that asks for their IM
credentials to view it. Alternatively, the attacker can "dig deeper", find more information about the player such
as his phone number, and call him claiming to be an employee of the game company in order to obtain the
missing information. Unfortunately, Landesman (2009) states that attackers do not need to find out
much information about the player; they can simply persuade him that they are a game master,
regardless of the small chance of actually being contacted by one in real life ("if they want to play
the game, they aren't going to advertise the fact that they are also a GM. Otherwise, players will
never stop badgering them with questions").
Although most of the attacks against players have been conducted using phishing emails and
websites, there have also been cases where social engineering, as an act of psychological
manipulation, has been employed to convince players to willingly give out their account
information. One such case was the breach of Microsoft's Bungie.net, which exposed a portion of the
Xbox Live accounts to an online group called "Clan Infamous" (Naraine, 2007). The group's
website mentions that their attack method was not technological but social: it consisted of
convincing the support personnel at Microsoft to help them take control of the
accounts by telling a story about something going wrong with an account, from a system crash to
a sibling changing the password, and asking for help in "recovering" the information.
Additionally, attackers do not need to pretend to be a person of authority (a game master, an
employee of the game company, etc.) to convince players to give up their personal information. In
some cases it is sufficient for another player to befriend the victim, offer to help, and start
a friendly conversation about things the player might enjoy (favorite movies, colors,
where they live) to learn as much as possible about the victim, and then present a supposed new feature
offered by the game company that allows players to share the same account, so that the victim
gives away their login information in order to be further "helped" (MMOWNED, 2010).
The attack employed by Clan Infamous is called pretexting (Tetzlaff, 2010), a more focused social
engineering attack since it requires the attacker to know some information about the victim or the
business being attacked. The attackers call the company or person and make false claims, asking the
victim to reveal passwords and other confidential information. The term appeared in 2006 after the
CEO of Hewlett-Packard hired private investigators to impersonate board members of the
company and obtain information from the employees (McVey, 2011). As a consequence, in March
2008 the Federal Trade Commission intervened by implementing CPNI (Customer Proprietary
Network Information) rules, which require more than the mother's maiden name or place of
birth to identify a person.
Another type of social engineering attack was conducted (Leyden, 2010) through an exploit that
allowed tampering with the data on the Xbox hard drive, letting players join game sessions
with temporary names and allowing attackers to impersonate game developers and ask for login
information.
7.3.2. Third party in-game merchants
Sites and advertisements that provide "boosting" services for achievement points* (such as
Gamerscore*) or "leveling" for MMORPGs can lead to a compromised account. Typically these
sites or persons offer to play on your account to increase your level or Gamerscore and require you to
give them your username and password in exchange for some form of "payment" (Xbox Forums,
2011). This category also includes the sellers in the games' trade chat, although their end purpose is to
redirect players to phishing sites. They are included in the social engineering group
since they hold an actual in-game conversation with the player to convince them that the item they
are selling is real and that they are not trying to trick them. One example of such a scam is the
"Spectral Tiger Mount" in World of Warcraft (Torres, 2009b), where a high-level, server-known
player offers this mount or any other tradeable in-game loot for a price high enough to be
believable but low enough to attract "customers". When contacted, the attacker only asks via the
in-game tools whether the player has the in-game currency, before agreeing to send the code using an
out-of-game mail which contains the code and a link to a phishing site.
7.3.3. Prevention
The best prevention against social engineering lies in understanding how users behave, what in
their behavior makes them vulnerable, and designing security systems that correct and eliminate
those vulnerabilities.
7.3.3.1. User awareness
The best method to prevent these types of attacks is to raise awareness among both players and game
companies' employees. As the social engineering contest at the 2010 18th DEF CON Hacking
Conference (Olavsrud, 2011) demonstrated, it does not matter how much a company invests in
security if the company's employees give out information to social engineers. Since these scams are
based on human weaknesses and not software, it is harder to devise a method that would stop the
attacks in their tracks. For instance, McVey (2011) mentions that his cable/internet provider company
copes with social engineering attacks by forbidding employees to discuss phone records, email
addresses, passwords and other sensitive information without an access code or personal
identification number (either a random one or a provided one).
On the other hand, some of the biases presented at the beginning of the chapter can be manipulated to
encourage people to modify their behavior in a more positive way. For instance, the status quo bias,
which leads people not to change their default privacy settings and to avoid any effort in changing
their choices, can be overcome by making the maximum privacy protection option the default. As
Baddeley (2010) mentions, the majority of users will be too lazy to change the default options.
Additionally, setting the default to maximum might also correct biases such as the valence
effect, rational ignorance or overconfidence.

* Achievement points = points awarded for the completion of predefined game-specific challenges (GiantBomb, 2011g)
* Gamerscore = achievement system that measures the number of achievement points accumulated by an Xbox player on the LIVE profile (GiantBomb, 2011g)
7.3.3.2. Tools
Although social engineering cannot be stopped completely using tools and software technologies, its
effects can be mitigated in-game with some simple tools, such as filters. The limitation of such a
filter is that it does not prevent the player from giving out the information through other means,
but at least it would stop the in-game "game master" social engineering scams (scams where another
player poses as a game master to obtain the password and account name). Every time the password is
written in the clear, it will be hidden.
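A minimal sketch of such a filter follows (illustrative only; a real client would hook this into its outgoing chat pipeline). It masks any occurrence of the locally stored account password in an outgoing chat message before the message leaves the client:

```python
import re

def mask_password(message: str, password: str) -> str:
    """Replace any case-insensitive occurrence of the account password
    in an outgoing chat message with asterisks."""
    if not password:
        return message
    pattern = re.compile(re.escape(password), re.IGNORECASE)
    return pattern.sub("*" * len(password), message)

# A player tricked by a fake "game master" types their password in chat:
print(mask_password("ok, my password is Hunter2", "hunter2"))
# -> "ok, my password is *******"
```

Because the comparison is done client-side against the player's own credentials, no password material needs to be sent to or stored on the chat server for the filter to work.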
7.4. Phishing
Although phishing is a form of social engineering, the methods employed by the attackers differ from
those presented in the previous section. Phishing attacks take advantage of both
technical and social vulnerabilities (cognitive biases and judgment errors). One reason people
often fall for phishing scams is visceral triggers (The Office of Fair Trading, 2009): phishing
attacks usually exploit basic human desires and needs, such as greed, fear, the desire to be liked or the desire to
avoid physical pain. In this situation, the attackers intend to provoke intuitive reactions and reduce
the motivation to process the content deeply.
Additionally, scarcity cues (The Office of Fair Trading, 2009), the elements used to personalize
the attack, give the impression that the offer is unique and urgent, further reducing the victim's
motivation to scrutinize the scam. Another motivation is the disproportionate relation between the size
of the alleged reward and the cost of obtaining it: the victims concentrate more on the reward,
making the cost look insignificant in comparison. Expectations are very important in a target's
susceptibility to an attack; if the attack presents rewards outside the normal
range of expectations (a free in-game mount or a free beta key), it will be hard for the victim to
ignore them. Lack of emotional control (The Office of Fair Trading, 2009), a trait not
found in non-victims, plays an important role in the victim's decision making; coupled with the
promise of a reward, it makes the victim impulsive.
7.4.1. What is phishing?
"Phishing" ("password harvesting fishing") is a form of social engineering described as the
fraudulent acquisition of personal sensitive information (usernames, passwords and credit card
details) through deception (making the user think that the attacker is a trustworthy entity) in an
electronic communication (FSTC, 2005; Jagatic et al., 2005). The term originates from the
early Internet analogy that criminals use emails as lures to fish for account and personal information
(Ollmann, 2007). Hacked accounts or personal and financial data are called "phish" and are actively
traded as a form of electronic currency.
Phishing has been referred to as a "two-fold scam" or "cybercrime double play" since it almost
always involves two separate acts of fraud (Stevenson, 2005): first stealing the identity and then
acquiring the personal information. It is typically carried out by e-mail ("spam" emails) or instant
messaging and generally convinces the user to click on a link that directs him to a fake website,
which looks almost identical to the legitimate one and which steals the login information once
the user enters his details (Tan, 2006).
Regardless of the method employed, the objective is the same: create a fake copy of a trusted brand
that can process input data and make it available to the attacker. The next step of the attack is
redirecting legitimate users to the fake website instead of the legitimate one. Here the
phisher has three options: alter the DNS for the target website, redirect traffic using pharming (by
changing the hosts file on the victim's computer), or use spam email. The list of email addresses
used in the attack is purchased from the same sources as conventional spam (junk e-mail).
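A common cue in the spam-email variant is a link whose visible text names the trusted brand while its actual target points elsewhere. The sketch below (illustrative only; the function name and the example domains are made up) flags such mismatches by comparing the hostname the text displays with the hostname the link actually resolves to:

```python
from urllib.parse import urlparse

def link_mismatch(display_text: str, href: str) -> bool:
    """Return True when a link's visible text names one domain but the
    actual target points to another, a common phishing cue."""
    text = display_text.strip()
    # Only compare when the visible text itself looks like a URL/domain.
    if " " in text or "." not in text:
        return False
    shown = urlparse(text if "://" in text else "http://" + text).hostname
    target = urlparse(href).hostname
    if shown is None or target is None:
        return False
    return shown != target

# Text shows the trusted brand; the link goes to the phisher's copy:
print(link_mismatch("www.worldofwarcraft.com",
                    "http://wow-freemount.example.net/login"))  # True
```

Mail clients and browsers apply similar (if more elaborate) checks; a hostname comparison alone misses look-alike subdomains, so this is a first-pass heuristic rather than a complete defense.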
Another important aspect is the fact that phishing does not occur in isolation, but operates within a
complex network where the individuals involved may or may not understand how to stage the whole
attack (Cristopher, 2005). Additionally, they are tied to a phishing market where they can trade
goods (credentials valued according to their level of detail), services and money (Cristopher, 2005).
Also called "context aware phishing" (Jagatic et al., 2005), "spear phishing" (Report on Phishing,
2006) is a colloquial term for any highly targeted phishing attack that incorporates
elements of context to become more effective. Spear phishers send emails that appear genuine to a
specific group: users of a specific online product or service, or employees or members of a
certain company, government agency or social networking website. The emails appear to come from
a trusted source, such as an employee or a colleague, making the request for valuable information more
plausible.
7.4.2. Phishing attack vectors
7.4.2.1. Man-in-the-middle attack (MITM)
The man-in-the-middle attack is one of the most successful vectors for gaining access to customer information: the phisher positions himself between the customer and the legitimate website and proxies all communications between the two (see Image 6). The attack succeeds only if the attacker manages to impersonate each endpoint of the communication, at which point he can observe and record all transactions. The customer connects to the attacker's server believing it to be the real one, while the attacker's server makes a simultaneous connection to the real site. When secure HTTPS communications are used, an SSL connection is made between the customer and the attacker, while the attacker creates its own SSL connection between itself and the legitimate server (Ollmann, 2007). The attacker may or may not modify the communications. When tokens are used, the MITM attacker can intercept them and replay the one-time password before the token expires. Additionally, when challenge questions are used, the attacker can observe the challenge question and present it to the user, replaying the user's answer to the legitimate site (McMaster University Wiki, 2009). The customer can be redirected to the fake server using several techniques: transparent proxies, DNS cache poisoning, URL obfuscation and browser proxy configuration.
Image 6: Man-in-the-Middle Attack
Transparent Proxies
"A 'transparent proxy' is a proxy that does not modify the request or response beyond what is
required for proxy authentication and identification" – (Hypertext Transfer Protocol—HTTP/1.1 RFC 2616, 1999). Assuming that the SSL has not been tempered with, web filtering proxies should
not be able to see inside HTTPS transaction. Thus phishers, who want to bypass the filtering will
use and open and anonymous HTPPS transparent proxy. The web filter cannot distinguish between
the transactions using a transparent proxy and a legitimate one. A transparent proxy combines a
proxy server with a gateway or router, forcing connections made by the client browser to the
gateway, and diverting the proxy without the knowledge of configuration of the client.
DNS Cache Poisoning
The Domain Name System (DNS) is responsible for translating the host names to IP addresses (and
vice versa). Cache Poisoning (or pollution) disrupts the normal traffic route by injecting false IP
addresses for key domain names. When the DNS caches the non-authentic data (fake IP address) for
performance optimization, it becomes poisoned since it supplies the fake data to the server clients
and diverts traffic to another computer (US-CERT 2009).
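The mechanism can be sketched in a few lines of Python. This is an illustrative toy, not a real resolver: the domain, the addresses and the `CachingResolver` class are all invented for the example, and the "poisoning" step stands in for the network-level record injection described above.

```python
# Toy model of DNS cache poisoning. A resolver caches answers for
# performance; once an attacker injects a fake record, every later
# lookup is silently diverted. All names/addresses are hypothetical.

AUTHORITATIVE = {"battle.net": "10.0.0.1"}     # the real record

class CachingResolver:
    def __init__(self):
        self.cache = {}

    def resolve(self, name):
        if name not in self.cache:             # cache miss: ask upstream
            self.cache[name] = AUTHORITATIVE[name]
        return self.cache[name]                # cache hit: trust the cache

resolver = CachingResolver()
print(resolver.resolve("battle.net"))          # legitimate address

# The attacker injects a forged answer into the cache ("poisoning"):
resolver.cache["battle.net"] = "192.0.2.66"    # fake phishing server
print(resolver.resolve("battle.net"))          # traffic is now diverted
```

The key point is that the client code never changes: it keeps asking the same question, but the poisoned cache answers with the attacker's address.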
URL Obfuscation
Obfuscation techniques are used by the attacker to hide the name of the fake server. URL
obfuscation methods will be presented in the following section in more detail.
Browser Proxy Configuration
Traffic can also be redirected by changing the customer‘s web-browser setup and setting proxy
configuration options (Ollmann, 2007). Usually this attack is carried out before the actual Phishing
message. On the other hand, this method is not transparent, the user being able to verify his web
browser setting and identify the modifications.
7.4.2.2. URL obfuscation attacks
The main objective of a phisher is to convince the victim to follow a hyperlink to the attacker's server instead of the real one. The most common methods used are bad domain names, friendly login URLs, third-party shortened URLs, host name obfuscation and URL obfuscation (Ollmann, 2007). For instance, the official URL of Blizzard's Battle.net service is Battle.net. An attacker can set up websites with the URLs Batt1e.net or BattIe.net; in each case the letter "l" was replaced with a character that looks similar ("1" or "I").
One of the easiest methods of hiding the URL is using bad domain names. Two popular methods of obtaining such bad domain names are typosquatting and homograph (homoglyph) attacks. Typosquatting (Slavitt, 2004) relies on intentional mistakes such as typographical errors, usually natural human typos: the phisher can set up a server with the name Battel.net or Batle.net. Homograph attacks (Namkara, 2009), on the other hand, are harder for human users to detect, since they exploit the way Unicode characters work, replacing characters with different ones that look alike (homographs). For instance, in our example the Latin character "a" is replaced by the Cyrillic character "а".
To show the effectiveness of Unicode character swapping, Raskin (2010) proposes an easy experiment: using elements from Unicode (such as the Cyrillic letter "а") in a Google query (or that of any other search engine) makes it seem as though the search engine is censoring certain news. He also mentions a method of making a website rank first on any search engine. Raskin further states that these attacks can be nullified by watching for such strange characters in the middle of normal words and quietly replacing them with the correct ones.
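Raskin's countermeasure (watching for out-of-place characters inside normal words) can be sketched with Python's standard `unicodedata` module. This is a minimal heuristic, not a complete defense: it only flags domains that mix letters from more than one Unicode script, using the script prefix of each character's official Unicode name.

```python
import unicodedata

def scripts(domain):
    """Collect the script prefixes (e.g. 'LATIN', 'CYRILLIC') of the letters."""
    names = set()
    for ch in domain:
        if ch.isalpha():
            # unicodedata.name('a') == 'LATIN SMALL LETTER A'
            # unicodedata.name('\u0430') == 'CYRILLIC SMALL LETTER A'
            names.add(unicodedata.name(ch).split()[0])
    return names

def looks_like_homograph(domain):
    # A normal domain label uses a single script; mixing e.g. Latin and
    # Cyrillic letters in one name is a strong homograph signal.
    return len(scripts(domain)) > 1

legit = "battle.net"          # all Latin letters
spoof = "b\u0430ttle.net"     # Cyrillic 'а' (U+0430) replaces Latin 'a'

print(looks_like_homograph(legit))   # False
print(looks_like_homograph(spoof))   # True
```

Real-world checks (such as those in browsers) are more involved, handling legitimate single-script non-Latin domains and confusable characters within one script, but the mixed-script test above catches the Battle.net example directly.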
Another method uses the friendly login URLs allowed by many web browser implementations: URI://username:password@hostname/path. Phishers can substitute the "password" or "username" fields with details related to the target organization. Due to the success of this attack, many browsers have dropped support for this URL encoding method.
Since many web-based applications have long and complex URLs, third-party organizations are
offering free services to provide shortened URLs, services which can be used by the attackers to
obfuscate the true destination behind a URL. Examples of such third parties are http://smallurl.com
and http://tinyurl.com.
Host Name Obfuscation. Instead of using the domain name (www.fakesite.com), the attacker can include the IP address to hide it and possibly bypass content filtering systems. For example, http://mygame.com:login@fakesite/phishing/page.htm can be transformed into http://mygame.com:login@213.145.178.38/login.html, exploiting the fact that some users do not know how to interpret IP addresses.
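The trick is visible as soon as the URL is parsed by its actual grammar: everything before the "@" is userinfo, not the destination. A short Python check with the standard `urllib.parse` module, applied to the example URL above, makes the split explicit.

```python
from urllib.parse import urlsplit

# The obfuscated link from the example above: "mygame.com" is merely the
# username field, while the real destination is the raw IP address.
url = "http://mygame.com:login@213.145.178.38/login.html"

parts = urlsplit(url)
print(parts.username)   # 'mygame.com'     (what the victim thinks is the site)
print(parts.password)   # 'login'
print(parts.hostname)   # '213.145.178.38' (where the browser actually goes)
```

A filter or a cautious user should therefore always judge a link by the parsed hostname, never by whatever familiar-looking text precedes the "@".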
URL Obfuscation can be done by using one or a mix of the following encoding schemes (Ollmann,
2007):
 Escape Encoding – the method of representing URLs that may need special syntax handling so they can be correctly interpreted. This is done by encoding each character that needs special interpretation as a sequence of three characters: "%" followed by two hexadecimal digits (representing the octet code of the original character). For instance, US-ASCII represents a space with the octet code 32, or hexadecimal 20, and thus its URL-encoded representation is %20.
 Unicode Encoding – a method of referencing and storing characters with multiple bytes by providing a unique reference number for every character regardless of language or platform, allowing a Universal Character Set (UCS) to contain most of the world's writing systems. Many communication standards (XML, Java, JavaScript), operating systems and web clients/servers use Unicode characters. For instance, %u0020 represents a space, while %u01FC represents the accented Ǽ.
 Inappropriate UTF-8 Encoding – Unicode UTF-8 preserves the full US-ASCII character range and thus provides many opportunities for disguising standard characters in longer escape-encoded sequences; for instance "." can be represented as %2E, or %C0%AE, or %E0%80%AE, or %F0%80%80%AE, or %F8%80%80%80%AE, or even %FC%80%80%80%80%AE.
 Multiple Encoding – guidelines explain the methods of decoding escape-encoded characters and hint at the issues associated with decoding multiple times at multiple layers of an application. Nevertheless, many applications still incorrectly parse escape-encoded data multiple times. Thus phishers may try to obfuscate the URL information by encoding characters several times: the back-slash "\" character is normally encoded as %5C, but could be extended to %255C, %%35%63, %25%35%63, etc.
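The danger of multiple encoding is easy to demonstrate with Python's `urllib.parse.unquote`. The sketch below shows a hypothetical single-pass filter (the function name and its policy are invented for illustration) being bypassed by a doubly encoded back-slash: the filter decodes once and sees nothing suspicious, while an application that decodes a second time sees the forbidden character.

```python
from urllib.parse import unquote

def naive_filter(url):
    # Hypothetical single-pass filter: decode once, then reject back-slashes.
    return "\\" not in unquote(url)

payload = "%255C"                   # back-slash, percent-encoded twice
print(naive_filter(payload))        # True: the filter sees '%5C' and lets it pass
print(unquote(payload))             # '%5C'  (first decode)
print(unquote(unquote(payload)))    # '\'    (second decode reveals the back-slash)
```

The mismatch between how many times the filter decodes and how many times the application decodes is exactly what the multiple-encoding trick exploits; the standard advice is to decode to a canonical form once, at one layer, before any security checks.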
7.4.2.3. Cross-site scripting attacks
Using cross-site scripting (CSS or XSS), attackers craft a custom URL or inject code into a valid URL or embedded data fields, taking advantage of poor programming techniques. CSS injection into a legitimate URL can be done using full HTML substitution (http://game.com/forum?URL=http://fakesite.com/page.htm), inline embedding of scripting content (http://game.com/forum?page=1&client=<SCRIPT>fakecode...) or forcing the page to load external scripting code (http://game.com/forum?page=1&response=fakesite.com%21fakecode.js&go=2).
7.4.3. Present session attacks
Both HTTP and HTTPS are stateless protocols, so users are tracked through a page and given access to resources that require authentication through Session Identifiers (SessionIDs), implemented through cookies, hidden fields or fields contained within the page URLs. Poor state management allows client connections to define a SessionID while requiring the user to authenticate only before accessing the "restricted" areas. In this case, the phishing message contains a link to the real server with a predefined SessionID field appended. The attacker's system constantly polls the server, trying to access a restricted page; until a valid user authenticates against the fixed SessionID, the attacker will receive error messages from the server. Once the victim has authenticated, the attacker can access the restricted page and continue his attack.
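The flawed state management described above (often called session fixation) can be reduced to a few lines. This is a deliberately simplified sketch: the `visit`, `login` and `restricted_page` functions and the in-memory session table are invented for the example and stand in for a real web framework.

```python
import secrets

# SessionID -> authenticated user (None until someone logs in under it).
sessions = {}

def visit(session_id=None):
    # The flaw: the server accepts a client-supplied SessionID instead of
    # always generating a fresh one.
    sid = session_id or secrets.token_hex(8)
    sessions.setdefault(sid, None)
    return sid

def login(sid, user):
    # Second flaw: the SessionID is not regenerated after authentication.
    sessions[sid] = user

def restricted_page(sid):
    return sessions.get(sid) is not None

fixed = visit("attacker-chosen-id")   # phishing link pre-sets the SessionID
print(restricted_page(fixed))         # False: attacker polls, gets errors
login(fixed, "victim")                # victim authenticates via the real site
print(restricted_page(fixed))         # True: the attacker's SessionID now works
```

The standard fix is the inverse of both comments above: never honor a client-proposed SessionID, and issue a brand-new identifier at the moment of authentication.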
7.4.3.1. Hidden attacks
Apart from the URL obfuscation techniques mentioned before, the attacker may use HTML, DHTML or other scriptable code to manipulate the display of the rendered information and to pass off fake content (particularly the source of the page content) as belonging to the real page, either as a man-in-the-middle attack or as an entirely fake site hosted on the attacker's server. There are three main methods of achieving this: hidden frames, overriding page content and graphical substitution.
Hidden Frames
Frames are commonly used in this type of attack since they have uniform browser support and an easy coding style. Hidden frames can also be used to hide code from the customers; the code will not be visible through the standard "View Source" functions. In the following example, the page linked to within the hidden frame can be used to hide additional content (only the URL of the master frameset document will be visible in the browser interface unless the user follows a link with the target attribute set to "_top"), provide a fake secure HTTPS wrapper (forcing the browser to display the padlock or a similar visual security clue), retrieve confidential information (by implementing code that will report back to the attacker) or execute screen-grabbing and key-logging observation code (Ollmann, 2007).
<frameset rows="100%,*" framespacing="0">
<frame name="real" src="http://mybank.com/" scrolling="auto">
<frame name="hiddenContent" src="http://evilsite.com/bad.htm" scrolling="auto">
</frameset>
Overriding Page Content
One of the most common methods used by phishers to override page content takes advantage of the DIV element, which allows the attacker to place content into a "virtual container" with absolute position and size that can be positioned to hide or replace the underlying content (by sitting "on top" of it). This method allows the attacker to build a complete page on top of the real one.
Graphical Substitution
Although phishers can modify the page content easily, they still face the problem of hiding the visual clues that would warn the user: the padlock representing an HTTPS secure connection, the URL and the Zone of the page source. These clues can be overcome using browser scripting languages (JavaScript, VBScript or Java) to "write" fake information over them. It is relatively easy for the attacker to determine the browser name and version, as well as to prepare images and code suitable for the most common browsers.
Additionally, it is important to mention that phishers have combined graphical substitution with additional scripting code to fake other browser functionality (implementing "right-click" functionality and menu access, presenting fake popup messages, displaying fake SSL certificate details or security settings through the use of images). Using the window.createPopup() and popup.show() commands, the attacker can construct an entire fake interface to capture and manipulate what the customer sees:
op=window.createPopup();
op.document.body.innerHTML="...html...";
op.show(0,0,screen.width,screen.height,document.body);
7.4.3.2. Observing customer data
Additionally, phishers can use key-loggers (which observe and record the victim's key presses, either recording all key presses regardless of the application or observing only key presses within the context of the web browser) and screen-grabbers (code designed to take screen shots of the data entered) to observe confidential data as it is entered into web applications. The information is collected locally and retrieved through methods such as:
 Continuous streaming of data using a custom data sender/receiver pair while keeping a connection open to the customer's computer.
 Local collection and batching of information to upload to the attacker's server using protocols such as FTP, HTTP or SMTP.
 Backdoor collection by the attacker – observational software that allows the attacker to remotely connect to the customer's computer and take the data when needed.
Brandt & Fechner (2010) mention that in 2010 a new key-logger appeared that attacks Microsoft's DirectX libraries. DirectX is the Windows engine used by most 3D games to render graphics, play sound effects and manage game controllers. According to Brandt & Fechner, the Trojan attacks DirectX only when the engine is in use, which means that it attacks only when the player is in the game. According to the Webroot authors, the Trojan (identified as Trojan-PWS-Cashcab) can be defeated by simply reinstalling DirectX on top of itself (Brandt & Fechner, 2010). Additionally, they mention another technique in which the key-logger replaces the Input Method Editor (IME), which acts as a map for the keyboard and is particularly important in Asian languages, where the number of characters exceeds the keyboard's 104 keys and characters require two or more key presses.
7.4.3.3. Client-side vulnerabilities
The client's browser, like any piece of software, is vulnerable to attacks regardless of the vendors' efforts to eliminate the threats. Additionally, in some cases, even though protection methods exist, the clients do not know how to apply them. Moreover, since the user is allowed to install add-ons (such as Flash Player, Real Player or plug-ins), there are many opportunities for the phisher to attack. As with viruses and worms, the vulnerabilities in browsers can be exploited in several ways; such attacks are also much harder for anti-virus software to detect, and thus much harder to prevent (usually the attack is noticed only after the damage has been done).
7.4.3.4. Phishing through compromised Web Servers
A common method of phishing involves breaking into vulnerable servers and installing malicious web content. Once the server is compromised, a rootkit or password-protected backdoor is installed to allow the phisher easy access, and phishing web sites are set up on the compromised server (The Honeynet Project, 2011). The next step is for the phisher to advertise the fake web site via spam email.
7.4.3.5. Phishing using botnets
Botnets consist of networks of compromised computers that can be controlled remotely by an attacker. Although they are usually used to launch Distributed Denial-of-Service attacks, they are sometimes used to send out spam emails and phishing attacks. A CipherTrust study from October 2004 suggested that 70% of the monitored phishing spam was sent using one of five active botnets, and the authors assume that many more might be used for phishing attacks (BBC News, 2004).
In order to improve the quality of the deception, the phisher might use one of these techniques:
 Using IP addresses instead of domain names in the links to the fake website. Most of the time, users do not check the address.
 Embedding hyperlinks from the real website into the email contents, so that when the user connects to the fake web server after reading the email, the user's browser makes most of its connections to the legitimate server and only a few to the fake one.
 Exploiting browser weaknesses, using techniques such as address bar spoofing, modifying the "From" field to hide the true author of the email, or IFrame element bugs.
 In some cases, having the phishing web site record the input data given by the user and then silently log the user in to the real site (The Honeynet Project, 2011).
 Redirecting victims to a fake website using malware to install a malicious Browser Helper Object (Brandt & Fechner, 2010) (a tool that helps the attacker create a custom, specialized module built on top of the WebBrowser control) on their local PC, tricking the victims into believing that they are accessing legitimate content.
 Using malware to manipulate the hosts file used to maintain the local mappings between DNS names and IP addresses, inserting fake DNS entries.
7.4.4. Message delivery
7.4.4.1. Email and spam
The techniques used within phishing emails are similar to those presented above: "personalized" official-sounding emails, a layout almost identical to the real one (with minor URL changes), HTML-based email used to obfuscate the target URL information, virus/worm attachments, many anti-spam-detection inclusions, fake postings to popular message boards or mailing lists, and fake "Mail From:" addresses to disguise the actual source of the email.
7.4.4.2. Web-based delivery
The techniques used to deliver web-based phishing attacks include HTML-disguised links on popular web sites or message boards, use of third-party or fake banner advertising graphics to lure customers to the fake websites, use of web bugs to identify potential attack victims, use of pop-up frameless windows to hide the true origin of the message, and embedding of malicious content by abusing the customer's browser vulnerabilities (installation of key-loggers, screen-grabbers, back-doors or Trojan horse programs).
7.4.4.3. Fake banner advertising
Fake banner advertising is a very convenient method for phishers to redirect a certain organization's customers to a fake website, by placing a copy of the legitimate banner advertising on popular websites and using URL obfuscation to hide the final destination. For example, in Table 11 we can see an example of fake banner advertising: the image on the left is the real World of Warcraft Cataclysm banner, and the image on the right is a fake. Although the graphics of the images are similar, the fake image has one telling characteristic: it states "World of Warcraft the Cataclysm" instead of "World of Warcraft Cataclysm".
Table 11: Example of fake banner advertising: the banner on the left is the official banner and the right is the fake one.
Real banner: (Cataclysm, 2010). Fake banner: (Ziebart, 2009).
7.4.4.4. In-game mails/messages
The easiest method of directing traffic to a phishing site is using in-game mails and messages. Usually these messages are attached to an in-game event such as a tournament, an expansion* release (such as the World of Warcraft Cataclysm scams that offered a free invite to a "closed friends and family alpha release*" of the game), free in-game pets and rewards, or impending account bans. As the Blizzard Account Security Section warns, "if an email asks for your password, makes urgent appeals, sounds too good to be true, or links to 'account management' sites outside of Blizzard's sites (you can find a list of all official Blizzard domains here), you are dealing with a phishing attempt".
Another method of phishing account information is to sell in-game mounts on the in-game trading channel. A notable example is the Spectral Tiger Mount scam, involving a mount that was available as a code on a rare World of Warcraft trading card (from the WoW Trading Card Game). In this scam, the attacker would advertise an offer to sell the code for a particular sum of in-game money, a transaction that is legal according to Blizzard's Terms of Service (purchasing in-game items with in-game money) and thus attracted many "wealthy" players. Additionally, to increase the authenticity of such claims, attackers used compromised accounts with high-level characters (Torres, 2009b).
There have also been phishing attacks combined with social engineering inside the game, where players received an in-game mail from a sender whose name is a slight misspelling of a guild member's or friend's name, containing a website address and instructions to look for a picture or something similar (see Image 7). This attack works because the player trusts the person who apparently delivered the mail and, if he is not paying attention, his account can be stolen. Furthermore, it can be hard for some players to verify a guild member's exact name: with players from different countries playing on the same server, some names are difficult to spell, some contain characters (specific to a country's alphabet) that other players cannot reproduce, and some players even have alternate characters whose names are deliberate misspellings of the original.
Expansion = an expansion provides extra content for an already existing game; it is sold separately and requires the original version of the game in order to run (GiantBomb, 2011h).
Alpha release = "The alpha phase of the release life cycle is the first phase to begin software testing." (Bethke, 2003)
Image 7: Example of in-game phishing (Casual WoW, 2008)
7.4.5. Prevention
In order to mitigate and minimize the impact of phishing, three key elements have to be taken into consideration: the methods of protection, the methods of detection and the response measures (Bindra, 2010). Usually, the best way to deal with phishing attacks is to increase user awareness and offer tools and built-in applications that help the user distinguish fake emails from legitimate ones (see Image 8).
7.4.5.1. User awareness
Image 8: User awareness provided by Yahoo.com
However, no matter how effective software is in preventing attacks, it is not foolproof. In this respect, Agarwal et al. (n.d.) propose a set of axioms, called "Rusty's Axioms", that have to be taken into consideration when dealing with anti-phishing protection:
1. Anything a phisher can see, he can spoof.
2. Anything a user knows, he can reveal to the phisher.
3. Any phishing solution is only as good as its first step.
"It is a cat-and-mouse game. Due to the social and human components, there are no completely effective solutions. Only through learning from our shared experiences can we hope to better protect Internet users." (Bindra, 2010). Since software solutions are not foolproof, an extra level of security can be added through user awareness: a set of tips and good practices that the player can employ to protect himself. These rules are usually presented on the official websites of games and on forums. Blizzard, for instance, has a section dedicated to account security, advising players not to share their account information with anybody, not to employ the services of gold-sellers and leveling companies, and to be careful with phishing emails (Blizzard Entertainment). Additionally, Blizzard offers a set of guidelines and methods for identifying attacks, the list of domains affiliated with the company, and advice regarding what tools (anti-virus, firewall software) players can use to protect their computers.
NCsoft provides such rules for the games it offers, as well as forum threads where players can discuss any security issues they might encounter. Similarly, Microsoft proposes a set of rules to protect Xbox Live accounts against attacks. Moreover, email providers (Yahoo, Google, Microsoft) and browsers (Firefox, Chrome, Internet Explorer, Safari, Opera) give guidelines on how the user can identify attacks without the use of software. For instance, Yahoo offers rules for protection against email phishing such as "Don't believe every warning you read", "Do NOT click any buttons in these pop-ups" and "There is no Yahoo! Lottery". WoW Insider advises all World of Warcraft players: "Please remember that account safety and computer security is your responsibility! While WoW.com has provided you with resources to additional information, do your homework and make sure you know what you're doing before installing any antivirus or other software. And if your account does get stolen, please see our guide on what to do next." (WoW Insider, n.d.)
Additionally, the mindset of the player has to change: any player can become the victim of an attack. As Andrew Brandt (2010) mentions, many players consider these security measures unnecessary, offering justifications that range from "I use a Mac, I'm not vulnerable to getting hacked" to "I am smarter than these people that need authenticators, I don't need extra security." Moreover, MMOCrunch (2009) describes a case where the hacked player was a security-conscious person, and mentions that the only thing the attacked persons have in common is their "love for gaming".
As presented in the previous sections, phishers have a large number of attacks at their disposal. Therefore, no single solution is able to combat all these different vectors, and it is not wise to rely on just one particular browser anti-phishing tool, software package or anti-virus. However, it is possible to prevent attacks by utilizing a mix of information security technologies and techniques, as well as good practices that the user must employ.
A mechanism inside the game client can be used to provide a scanning facility for the user's computer: the game client scans the user's computer for any keyloggers, Trojans or malware. This solution could be beneficial for users who do not have an anti-virus or who know nothing about threats to their computers. However, it has a series of disadvantages. Firstly, it would be intrusive for the player (although there is no transfer of data from the user's computer to the game server) and more costly: game developers would have to contract with an anti-virus company and would force the player to buy an antivirus bundled with every game. Additionally, having the game scan the computer would annoy most players, since performance is slowed down considerably (most players claim that they turn off their anti-virus and firewall for a smoother game). Moreover, due to the overconfidence bias (presented in the previous chapter), players will not turn on the anti-virus or do regular manual scans, since they believe that the possibility of an attack on their computer is too low to take into consideration.
Despite differences of content, all phishing attacks have the same features in common, suggesting that people can be educated to recognize and resist them. Thus, the most effective way to stop players from falling victim to phishing attacks is to educate them: making them aware of the risks and of how phishers operate, in order to combat some of the cognitive biases that would make them fall for the scam in the first place, and giving them more information so that they can reach a logical conclusion rather than use a mental shortcut. On the other hand, there are several problems with this approach. One is the fact that users rarely read the rules they have to follow in order to keep their accounts safe. Most online game publishers state that they would never ask for a password or username, but this has not stopped players from succumbing to visceral influences and giving out the information in the hope that they will receive the "special" prize the phishing email advertises. Another issue is background knowledge and the overconfidence bias, which make players believe that they are competent in the field and that an attack will never happen to them because they are "too smart".
Making people focus on the potential losses rather than the gain would reduce their tendency to respond to attacks and would also reduce the "regret effect". The Office of Fair Trading (2009) mentions that awareness-raising campaigns should encourage people to change their view on phishing by approaching all unsolicited communications with the question "How much could I lose here?" instead of "What's in this for me?". This would make them more likely to look for reasons not to respond to the attack: "if you think an offer might be a scam, it almost certainly is – your gut instinct is almost invariably right." (The Office of Fair Trading, 2009)
A method of making these rules known to players is to show periodic pop-ups in the game (when the player is in the game's chat application or in a loading screen) containing messages from the rules. One example is a message presented in World of Warcraft's loading screens: "A Blizzard employee will never ask for your user id or password".
Since most game accounts are tied to an email address, email is the main target of most phishing attacks and the most common vector for diverting the player's traffic to fake websites. The author presents below the most common tools and features offered by three important email providers: Yahoo, Google and Microsoft.
7.4.5.2. Email protection tools
This part presents the various protection tools employed by the major email client providers.
7.4.5.2.1. Yahoo
Yahoo's "Prevent Password Theft" Sign-In-Seal (Yahoo Protect Login, 2011)
To protect Yahoo accounts from malicious attacks, Yahoo has proposed a phishing detection method based on an image selected by the user that Yahoo will subsequently display on the user's account page every time he accesses it. Similar to the Bank of America SiteKey, Sign-In-Seals are usually secret messages or images chosen by the user to assure them that they are dealing with the legitimate website. The advantage of the Yahoo Sign-In-Seal is that the images displayed are chosen from the user's personal image gallery and can be anything from a personal photograph, a stock picture or a drawing, or the image can be generated from text provided by the user. This variety makes phishing attacks harder, since the user has a way of authenticating the real website. Additionally, unlike passwords and other account information, this piece of information cannot be "shared" by a normal user. A downside of Yahoo's seal is the fact that it is set only on the user's computer, in a cookie; thus it offers no protection when the user accesses his account from a different machine. Moreover, the effectiveness of this tool is nullified if the user does not protect against browser or page flaws (Agarwal et al., n.d.).
DomainKeys (Yahoo Help, 2010)
DomainKeys is a tool that tries to identify forged email by verifying its origin against information published in the DNS of the purported sending domain. Each legitimate email carries a digital tag, shown as a small icon of an envelope and key in the email header, as presented in Image 9.
Image 9: Example of digital tag for Yahoo emails
Yahoo also mentions that this tool is most effective when coupled with user verification of the email's "From" field.
Certified Email (CertifiedEmail Icon, 2011)
Yahoo's certified email feature adds another level of protection by distinguishing real from fake senders using the certified email icon next to the sender's ID (see Image 10).
Image 10: Example of certified email icon next to the sender's ID
7.4.5.2.2. Google – Gmail
Report Spam. Gmail provides a "Report Spam" button that moves reported spam to a separate folder and notifies Google's anti-spam software.
Hypertext Links. This security feature disables "hypertext links" inside e-mails (Perez, 2004).
Warning Message. Phishing emails are displayed with a large red box warning the user of a
possible attack.
Gmail Labs: Authentication icon for verified senders. An icon shaped as a key is placed next to the
sender if the sender is "super-trustworthy" (Bindra, 2010). As an additional protection measure, the
Official Gmail Blog (Santerre, 2009) suggests that users choose a "smart" password.
Domain-Keys. According to the Official Gmail Blog, DomainKeys and DomainKeys Identified
Mail (DKIM) have been available since 2004 (Taylor, 2008). Taylor mentions that DKIM is a key
tool for eliminating spam from Gmail inboxes. On the other hand, he notes that these features are
only effective when "high volume senders consistently use them to sign their mail -- if they're sending
some mail without signatures, it's harder to tell whether it's phishing or not". Additionally, Gmail
provides an equivalent to Yahoo's Sign-in Seal based on features such as Gmail themes (Chen,
2008), presented in Image 11.
Image 11: Example of Gmail themes (image google.com)
7.4.5.2.3. Microsoft – Hotmail
SmartScreen Technology (Arbogast, 2003). Unveiled at COMDEX Las Vegas 2003,
SmartScreen is a machine-learning filtering technology that uses a probability
algorithm to "learn" what is spam and what is legitimate email based on the characteristics of
both. The "learning material" is gathered from e-mail users who contribute to the feedback
loop program.
Hotmail Filters (Hotmail, 2011). In addition to its anti-spam technologies, Hotmail also allows users
to adjust the filters' levels: adding a sender or domain name to the Safe Senders and Domains List
ensures that the sender will never be considered junk, regardless of the message's content.
Users can also enable a mode that accepts only messages from the Safe Senders List, or block
certain email addresses or domains (Blocked Senders List). Additionally, emails
marked as "Junk" by users are used as feedback for the other filtering technologies.
Anti-Phishing Technology – Symantec Brightmail (Hotmail, 2011). As an additional filtering
technology, the Symantec Brightmail anti-spam content filter uses a collection of more
than 200,000 email addresses designed to attract junk mail and eliminate it before it reaches the end user.
Sender ID (Microsoft Safety, 2011). Like Yahoo and Google, Hotmail provides the Sender ID
Framework, an e-mail authentication protocol intended to work against spoofing and
phishing by verifying the domain name from which the e-mail originated. Sender ID validates
the origin of the message by comparing the sender's IP address against that of the alleged owner of
the sending domain.
7.4.5.3. Browser Protection Tools
Microsoft
Microsoft's Internet Explorer promises protection from phishing sites through its
integrated phishing filter add-in. According to Microsoft Corporation (n.d.), the add-in is updated
several times an hour; it protects the user by scanning visited web sites, warning the user if
a site is suspicious, and blocking the user from sharing personal information on known phishing
web sites.
Mozilla Firefox
Mozilla Firefox offers a built-in, default feature that checks visited sites against lists of reported
phishing and malware sites, updated approximately every 30 minutes (Firefox Phishing
and Malware Protection, n.d.). The technical details of the safe-browsing mechanism are publicly
available as the Google Safe Browsing Protocol, Protocolv2Spec, which is integrated in both
Firefox and Google Chrome (Google Safe Browsing, 2011). Additionally, Firefox supports a
great array of plugins, sidebars and toolbars to fight these attacks, such as the FirePhish
Anti-Phishing Extension (Fire Phish, 2007), which works on the Open Phishing Database;
LocationBar (Locationbar, 2011), which emphasizes domain names to reduce spoofing;
DontPhishMe (Don't Phish Me, 2010), an initiative of MyCERT (CyberSecurity Malaysia) to
provide safe online banking for local Malaysian banks; Smart Text (Smart Text, 2010), which
makes common operations on URLs easier by highlighting hostnames, linking elements of URLs
such as sub-domains, and offering easy ways of selecting whole URLs or parts of them; and
Web of Trust (Web of Trust, 2010).
Google Chrome
Similarly, Google Chrome offers built-in phishing and malware detection designed to protect the
user's computer and privacy, while conserving bandwidth by sending only small amounts of data
to and from the user's computer (Chrome, 2011). The technologies used by Chrome are Safe
Browsing, sandboxing and auto-updates (Chrome, 2011). When the Safe Browsing option is
selected, Google downloads to the user's browser a list of information about known or suspicious
phishing sites. The list does not include the full URL of each site, but a hashed version, broken
into portions. While surfing, the browser creates hashed versions of the URLs visited and checks
them against the list. Whenever a hashed fragment matches one in the list, the browser contacts
Google's servers to request the full list of hashed URLs. The user's computer then determines
whether the site is indeed suspicious and warns the user. When the browser contacts the servers,
Google receives standard log information (including the user's IP address and a cookie).
Sandboxing prevents malware from installing itself on the user's computer to monitor the user's
web activities or steal private information from the hard drive. Through auto-updates, Chrome
ensures that the browser is automatically updated with the latest security features, without any
action required from the user.
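The hash-prefix lookup described above can be sketched in a few lines of Python. This is an illustration of the idea only: it does not reproduce Google's actual list format, prefix length, or wire protocol, and the URLs are made up.

```python
import hashlib

# Hypothetical local list of 4-byte SHA-256 prefixes of known bad URLs,
# standing in for the list the browser downloads periodically.
local_prefixes = {hashlib.sha256(b"http://evil.example/login").digest()[:4]}

def needs_full_check(url: str) -> bool:
    """Return True when the URL's hash prefix is in the local list, i.e.
    the browser must then ask the server for the full hashes to confirm."""
    return hashlib.sha256(url.encode()).digest()[:4] in local_prefixes

print(needs_full_check("http://evil.example/login"))  # prints True
```

Keeping only short prefixes locally means the full URLs of suspicious sites never need to be shipped to every browser, and the server is contacted only on a prefix hit.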
Depending on the suspicious site, the user might get one of these warnings:
1. "Warning: Visiting this site may harm your computer!" or "Warning: Something's Not Right Here!" (in the event of a malware attack).
2. "This is probably not the site you are looking for!" (the URL listed does not match the actual URL).
3. "The site's security certificate is not trusted!" (the site's certificate did not come from a trusted organization).
4. "The site's security certificate has expired!" or "The server's security certificate is not yet valid!" (the site's certificate is not up to date).
5. "The server's security certificate is revoked!" (a trusted organization has marked the certificate as invalid).
Safari
Safari uses Google's anti-phishing database to detect phishing websites and warns the user using a
dialog box, a webpage overlay, or a combination of the two.
Opera
Opera's Fraud and Malware Protection (Opera Software, n.d.), a default feature, prevents attacks
by sending the domain names of visited websites, together with a hash of the domain names, to
Opera's Fraud and Malware Protection server. HTTPS sites are checked via an encrypted channel,
while IP addresses on the local intranet are never checked. The security status of each page is
displayed in a security badge in the address field. Opera states that a maximally secure site,
marked with a padlock, has an encryption level sufficient to protect the traffic between the user
and the website (and thus protect against man-in-the-middle attacks and other eavesdroppers) and
should have a valid security certificate (providing assurance that the intended website has been
reached). The table below (Table 12) describes the security badges used in the Opera browser:
Table 12: Opera's security badges (badge icons not reproduced)
1. Maximally secure site, with Extended Validation (EV), where the identity of the owners of the website has been thoroughly verified.
2. Secure site, where the credentials of the site owner have been checked.
3. Normal site, or a site where there are problems with encryption, or where information is not available to enable verification.
4. File or folder on your computer.
5. Site that has been listed as a known fraudulent site.
6. Site that has been listed as a known malware site.
7.4.5.4. In-game solutions
Since most gamers turn off their anti-virus software while playing to ensure the system runs at
maximum performance, their systems become vulnerable. To stop these attacks, NCsoft Europe,
which publishes games such as Aion, Guild Wars and City of Heroes, uses the SafeConnect
application from the behavioural security software company Sana Security to protect gamers
while playing (Gamasutra, 2008). Sana says its SafeConnect software "detects and removes
malicious software, including difficult new variants designed to steal password and identity
information via 'phishing,' spyware, malware and other methods."
7.4.5.5. Authenticators
To help reduce the number of account thefts, Blizzard offers players the Blizzard Authenticator,
a Two-Factor Authentication (TFA) device similar to the authenticators banks provide to their
clients, which generates a unique, one-time eight-digit code to be used in addition to the
regular password every time the user logs in (Blizzard Store, 2011). To encourage players to invest
in the extra security option, Blizzard also offers an in-game pet to players who buy the $6.50
device, the 99-cent Mobile Authenticator for iPhone, or the free Android version (Brandt, 2010).
The connection between the player's account and the 8-digit codes is made by linking the
authenticator's serial number to the account (The Stoppable Force, 2009), as seen in Image 12.
Image 12: Blizzard Authenticator activation (image The Stoppable Force, 2009)
The codes are unique to the authenticator's serial number and last for only one login or 30 seconds
(whichever comes first). Authentication methods are usually based on one of the following factors:
1. A secret known by the individual (password, PIN, etc.).
2. A token owned by the individual (passport, ID card, physical token, etc.).
3. Data identifying the bearer's individuality (static biometric data such as fingerprints or iris patterns).
4. Something the individual does (dynamic biometrics – voice, handwriting).
To establish two-factor authentication, any two of the factors mentioned above must be combined.
7.4.5.5.1. Functionality
In essence, the authenticator provides the user with a nonce (number used once) which is sent along
with the password and is checked on the server side for authenticity (Wowpedia, n.d.). If either the
password or the nonce does not match the expected result, the login request is rejected. However,
this description merely scratches the surface. On a deeper level, the authenticator has two important
stages: the initial configuration (also known as authenticator initialization) and the code
calculation.
Initial Configuration
During the initial configuration, the authenticator creates an initialization request which is then
encrypted with RSA-1024 using the public key provided by Blizzard for authenticators. The public
exponent is 0x101 and the modulus is:
0x955e4bd989f3917d2f15544a7e0504eb9d7bb66b6f8a2fe470e453c779200e5e3ad2e43a02d06c4adbd8d328f1a426b8
3658e88bfd949b2af4eaf30054673a1419a250fa4cc1278d12855b5b25818d162c6e6ee2ab4a350d401d78f6ddb99711e7
2626b48bd8b5b0b7f3acf9ea3c9e0005fee59e19136cdb7c83f2ab8b0a2a99
The plaintext initialization request is split into 4 parts (see Image 13).
Image 13: Plaintext initialization request (image from Wowpedia, 2010)
The function byte is predefined and always set to 1. The response encryption key is generated using
the Java random number generator, seeded with the system's current time; the output is hashed
with SHA1, and the hash output becomes the encryption key. The region code and mobile model
are purely descriptive and not used by the code-calculation algorithm itself: the region code directs
the authenticator to the correct server, and the mobile model can be used as a variable that signals
possible anomalies (e.g. the authenticator may have been installed on a new phone by cloning the
information from the first authenticator).
After the encrypted message is received, the server first decrypts the initialization request using its
private key and then sends back the authenticator initialization response message (see Image 14).
Image 14: Authenticator initialization response message (image from Wowpedia, 2010)
The server's current time is included in order to synchronize the clocks of the authenticator
and the server. The initial data is encrypted by XOR-ing the message with the response
encryption key the server has received. The information is then sent to the authenticator,
which decrypts the encrypted initialization data into the secret key for code calculation and the
authenticator serial number (see Image 15).
Image 15: The initialization data encryption and decryption (image from Wowpedia, 2010)
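The XOR step described above can be sketched as follows. The key and data values here are made up for illustration; the point is that XOR-ing with the same key both encrypts and decrypts.

```python
def xor_bytes(data: bytes, key: bytes) -> bytes:
    # XOR each data byte with the key, repeating the key if data is longer.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"example-37-byte-response-encrypt-key!"   # 37 bytes, as in the text
plaintext = b"secret-key-and-serial-number"       # stands in for the init data
ciphertext = xor_bytes(plaintext, key)

# XOR is its own inverse: applying the same key again recovers the plaintext.
assert xor_bytes(ciphertext, key) == plaintext
```

This is what makes the scheme only as strong as the secrecy and unpredictability of the response encryption key, a point returned to in the Weaknesses section.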
The serial number is used to link the user's account to the respective authenticator: the user
manually enters the serial number into the account options designed for adding authenticators.
The secret key for code calculation is used each time the authenticator generates a new code.
As mentioned before, a noteworthy aspect is that Battle.net provides the authenticators either as
mobile applications or as standalone physical tokens. Their functionality is identical, but some
bytes may differ: for example, the 16 bytes attributed to the mobile model are not required in the
case of a physical token.
Code Calculation
For the code calculation, the secret key is used in combination with the code interval number. The
interval number is based on the time elapsed since midnight, January 1, 1970 UTC: a new code
interval number is generated every 30 seconds, so time synchronization between the server and the
authenticator is important. In case of de-synchronization, a simple resynchronization algorithm can
be applied to correct the time on the client. The code interval number is used as the message input
to an HMAC-SHA-1 algorithm, with "the secret key for code calculation" as the secure key. The
result is a 20-byte MAC. The last 4 bytes of the MAC determine which 4 bytes in the MAC are
considered significant, and from these 4 bytes the last 8 decimal digits are displayed by the
authenticator as the current code.
7.4.5.5.2. Weaknesses
Although the authenticator provides extra protection, it is not foolproof. According to an article
published on MMO Crunch (2010), the first successful attack on the authenticator used a
man-in-the-middle approach: a Trojan installed on the client's computer intercepted the account
name, password and authenticator code, blocked the user's login by sending a wrong
authentication code (keeping the nonce unused), and used the real nonce to connect in place of
the user. Attacks like these are very hard to prevent, but as stated in the previous section, there are
methods users can adopt to minimize the risk of such attacks. Even considering this
vulnerability, the authenticator still adds protection: once the Trojan is disabled, the attacker
cannot reconnect to the account, even if the user does not change the password, because the
attacker will not know the correct nonce (TTH, 2010).
Another possible flaw lies in the weak response encryption key (WoW Wiki, n.d.), which is
generated using the Java random number generator seeded with the current time. This makes the
authenticator prone to higher-level man-in-the-middle attacks: an attacker who intercepts the
initialization response message also knows that the message is XOR-ed with a 37-byte key
generated by the Java random number generator with the local computer's current time as the
seed value. By trial and error, the attacker can fairly quickly estimate the time the client used as
the seed, and then the security rests only on the random number generator, which is known to give
identical results for the same seed value. If the attacker manages to decrypt the message, he learns
the secret key for code generation as well as the time offset between the authenticator and the
server, which allows him to generate the correct codes using the same code-generation algorithm
as the actual authenticator.
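The seed-search attack described above can be sketched as follows. For readability the weak key is modelled simply as SHA1(seed); a real attack would first reproduce java.util.Random's output for each candidate seed before hashing. All concrete values are invented for illustration.

```python
import hashlib

def weak_key(seed_millis: int) -> bytes:
    # Illustration only: key modelled as SHA1(seed). A real attack would
    # replicate java.util.Random's byte stream for the seed, then hash it.
    return hashlib.sha1(str(seed_millis).encode()).digest()

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The attacker intercepts the ciphertext and knows (a) roughly when it was
# created and (b) what the plaintext should look like.
true_seed = 1_300_000_123_456                      # hidden from the attacker
known_plaintext = b"SERIAL:ABCD-1234"
ciphertext = xor_bytes(known_plaintext, weak_key(true_seed))

def recover_seed(ciphertext: bytes, approx_millis: int, window: int = 10_000):
    # Try every candidate millisecond timestamp around the estimated time.
    for guess in range(approx_millis - window, approx_millis + window):
        if xor_bytes(ciphertext, weak_key(guess)) == known_plaintext:
            return guess
    return None

assert recover_seed(ciphertext, 1_300_000_120_000) == true_seed
```

Even a generous 10-second uncertainty window requires only 20,000 cheap trial decryptions, which is why a time-seeded PRNG gives essentially no security here.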
Additionally, the mobile authenticator has a vulnerability the physical token does not: the phone's
software may be compromised, allowing an attacker to run the authenticator application at will
and have a Trojan forward the generated codes, giving the attacker the nonce. However, the
attacker still requires the victim's account name and password before being able to log in, which
in theory he could only acquire through scams or by installing another Trojan/keylogger on the
computer the user logs in from. Unfortunately, Blizzard now provides a number of phone services
that require users to log in (e.g. the mobile armory and auction house). Therefore, if the phone's
security is compromised, the attacker gains all the information he needs to hijack the user's
account, including the ability to remove the authenticator from the account, change the password
and add a new authenticator, thus usurping the legitimate user.
7.4.5.5.3. Improvements
Probably the biggest weakness of the authenticator is the weak encryption key generated using
Java's random number generator with the current time as seed. If a cryptographically secure
pseudo-random number generator were used instead, the problem would be fixed at very little
extra computational cost. A number of such algorithms are presented very clearly in NIST SP
800-90 (Barker & Kelsey, 2007), with step-by-step implementations of both the algorithms
themselves and the methods available for selecting seed values.
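In a modern language the suggested fix is essentially a one-line change. For example, Python's `secrets` module draws key material from the operating system's CSPRNG rather than from a time-seeded PRNG:

```python
import secrets

# Draw the 37-byte response encryption key (the length given in the text)
# from the OS CSPRNG instead of a time-seeded java.util.Random.
response_encryption_key = secrets.token_bytes(37)
assert len(response_encryption_key) == 37
```

Because the key no longer depends on a guessable seed, the trial-and-error seed search described in the Weaknesses section becomes infeasible.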
Another problem is that the login password and nonce are sent in plaintext over insecure channels,
allowing an attacker to intercept the information, block the initial transmission and use the
password and nonce to log in instead of the legitimate user during the timeframe in which the
nonce is still active. One resolution is to encrypt the outgoing data with the server's public key.
This alone is not sufficient, because an attacker who intercepts the encrypted message can block
the original and replay it; but if a unique identifier, such as the computer's IP address, is appended
to the message prior to encryption, the server can compare the IP of the incoming connection with
the IP sent inside the encrypted message. The only remaining way to attack this scheme would be
to spoof the attacker's IP address to match that of the initial sender, which is not trivial. Another
possibility is to use symmetric encryption instead of public-key encryption for the login message,
but in that case a key establishment protocol such as Diffie-Hellman (Diffie & Hellman, 1976)
should first be used. The protocol can be run for each new login attempt, removing the need to
save the shared secret key on the client's computer, which would create new vulnerabilities.
Since today's video games already have very high hardware requirements, none of these methods
would create noticeable additional overhead on the client's side, though public-key encryption
would add some extra strain to the servers. Alternatively, if the shared secret key is saved in a
secure manner, it could also be viable to establish new secret keys only when the client's cached
password is unavailable.
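The per-login Diffie-Hellman exchange suggested above can be sketched as follows. These are toy parameters chosen only for readability: a real deployment would use standardized large primes (e.g. the RFC 3526 groups) and would authenticate the exchange, since unauthenticated Diffie-Hellman is itself vulnerable to man-in-the-middle attacks.

```python
import secrets

p = 0xFFFFFFFB   # a small prime - toy parameter, NOT secure in practice
g = 5            # public base

a = secrets.randbelow(p - 2) + 1      # client's ephemeral secret
b = secrets.randbelow(p - 2) + 1      # server's ephemeral secret

A = pow(g, a, p)                      # client sends A to the server
B = pow(g, b, p)                      # server sends B to the client

client_key = pow(B, a, p)             # client computes g^(ab) mod p
server_key = pow(A, b, p)             # server computes g^(ab) mod p
assert client_key == server_key       # both sides derive the same secret
```

Because fresh ephemeral secrets are drawn for every login, no long-lived shared key ever needs to be stored on the client, which is exactly the property the text argues for.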
7.4.5.5.4. Further ideas
Although Blizzard's authenticator approach is very effective, it does not scale to a market of
gamers who play several different games, because players would need a new token for each game,
or at least for each company/game publisher. A better solution would be a third-party company
that provides the nonce, allowing game publishers access to the information of accounts linked
directly to their games, while allowing users to protect an unlimited number of games with a
single nonce generator. The granularity of such a device could be a research topic on its own
(should a single nonce be used for all games, or should each game use a different nonce?).
The main advantage of keeping the authenticator off the computer, on a mobile phone or physical
token, is theoretically based on the separation-of-concerns principle. In practice, however, the
benefit is almost insignificant while the disadvantages are considerable (tokens and phones can
easily be lost or damaged, and can de-synchronize from the server in an area lacking the
connectivity needed for resynchronization). The only gain is that if the computer is compromised,
the attacker must wait for the client to input his login data before being able to hijack the account,
and can only reconnect when the user next generates a nonce. If the authenticator were stored on
the same PC as the game, a Trojan could automatically start the authenticator and obtain nonces at
will, or even extract the code-generation secret and use it in an application that clones the
authenticator's code-generation process. This, however, is not a large advantage, since in both
cases the account would be compromised; the separate device merely increases the window in
which the client can detect and remove the Trojan. Taking into account the advantages and
disadvantages of running authenticators on computers, the author believes it would be best to
leave this as a choice for the user (who should, of course, be informed of the advantages and
disadvantages of each option).
Chapter 8
Game specific cheats
This chapter discusses examples of cheats situated in the so-called "grey zone": behaviors that are
illegal in some video games but acceptable in others.
8.1. What are game specific cheats?
Some forms of in-game activity invite discussion as to whether they should be classified as cheats
or as permissible behavior. For instance, a player may discover an aspect of the game that grants
an advantage the developers did not plan. In single-player games, such an advantage, if not
destructive to the gameplay, is considered a sign of sophisticated game design (Smith, 2007). In
multiplayer games, however, such activities are labeled "exploits": activities aimed at intentionally
achieving an advantage afforded but not intended by the game design, whose use is usually
punishable. Knowingly using an exploit is considered punishable by many video game developers.
On the other hand, the player is acting within the framework of the game code, and it should not
be up to him to determine whether a certain phenomenon is intended by the game designers.
Therefore, the player must have an understanding of what the developers intended (and,
consequently, did not intend) the game to do, and the developers must determine whether the
players "knowingly" used an exploit (Smith, 2007).
Additionally, Zetterström (2005) argues that if two parties meet and agree not to "cheat", the two
will end up arguing over what "not cheating" actually means. He further states that cheating can
be defined as deliberately violating the game rules, and that those rules are rarely clearly spelled
out. Under the definition proposed in this thesis, a cheat is "anything that was not intended by the
game developer and was not stipulated in the end-user agreement"; accordingly, the following
activities can be considered cheats only if they are specified in the game's EULA. Since these
cheats are specific to the type of game (what is illegal according to one game's EULA can be
accepted behavior in another), it is difficult to analyze all of them, so the author will briefly
present only the most common ones. The techniques differ from game to game and from cheat to
cheat and should be analyzed as a separate topic.
8.2. Cheating by exploiting a bug or loophole
Yan et al. (2005) argue that abusing bugs or loopholes in the game without modifying the client
code or data is still cheating. Zetterström (2005) states that this "cheat" is sometimes called
"sploitz" and is mostly the result of a programming error. A bug is defined as "an error, flaw,
mistake, failure, or fault in a computer program that prevents it from working correctly or
produces an incorrect result. Bugs arise from mistakes and errors, made by people, in either a
program's source code or its design" (FYICenter, n.d.). This category is considered a cheat, but it
is not a hack. According to Zetterström (2005), the bug is sometimes called a "glitch" and the
person who abuses it a "glitcher". Moreover, he states that there is more than one way to exploit a
bug: for instance, in some first-person shooters players can reach areas of the map that are not
supposed to be accessible and become invisible to other players by sitting in them. He also states
that bugs are not as harmful as hacks, since people who exploit them are easily spotted and the
bug can always be corrected through patches. Additionally, the effect of the exploit is directly
proportional to the severity of the bug.
On the other hand, despite the opinions of researchers, it is incorrect to consider exploiting bugs a
cheat unless it is stipulated in the EULA. Since most games do not advertise the possibility of
imperfect code, they do not include it in the user agreement. Zetterström (2005) argues that
exploiting a bug is not a punishable cheat unless the player exploits a very serious bug that has the
same effect as a hack (e.g. seeing through walls). Pritchard (2000) adds that there are situations
where developers have a hard time identifying abuses because "cheaters actively try to keep
developers from learning their cheats". Regardless of the opinion of gamers or researchers, if
exploiting a bug is not specifically prohibited in the EULA, it cannot be considered a cheat and
the player cannot be punished (since such bugs occur due to an element neglected by the
developers).
8.3. Tweaking
As mentioned in the section on the definition of cheating, some advantages over another player
can be unfair without being considered cheating. For instance, a faster Internet connection is not
considered unfair or a cheat, although it provides an advantage over players who suffer from
increased latency (network lag). The desire to have a machine that offers an advantage has led to
the rise of an activity called "tweaking" (Zetterström, 2005), where players alter certain settings
on the computer and in the game to maximize performance. Zetterström offers the example of
FPS gamers who, striving to achieve as many frames per second as possible (to ensure that the
game runs smoothly), modify the "configuration" (cfg) file that allows them to personalize their
settings (as intended by the developers). He argues that in some cases these modifications are
considered cheats by the gaming community: turning the fog option off, for example, reduces the
workload on the video card and enhances performance, but also allows the player to see farther
than a player with the option turned on, and thus gain an "unfair" advantage. Another example is
turning off ambient sounds, allowing the player to filter out everything but the sounds he
considers necessary (such as the footsteps of an opponent). Zetterström states that in some
communities any tweak made outside the game's own interface (e.g. changing the screen
resolution) is considered a cheat, and that playing only with the tweaks provided by the game is
called playing "out-of-the-box".
8.4. Scripting and macros
This method is most common in MMOs, where players write a few lines of script code to
automate certain tasks, for example assigning multiple different actions to specific keys
(Zetterström, 2005). In World of Warcraft, players might create macros that automatically
congratulate guild mates on completing an achievement, or that instantly produce ASCII
drawings. While simple scripts allow the creation of "combos*", which are accepted in most
gaming communities, more advanced scripts give birth to bots, which are considered a cheat in
any game. The opinion on macros and scripts varies from one game to another: they might be
accepted in some and considered a cheat in others. Blizzard's terms of use, for instance, state that
it is illegal for any player to use delay-casting or the scheduling of casts in macros (WoW Wiki,
n.d.).
* Combo = a game-play mechanic consisting of actions performed in sequence, usually with strict timing
limitations, usually performed in fighting games (GiantBomb, 2011i)
8.5. Add-ons
Add-ons are extra files that, placed in a game's interface directory, supplement the existing
interface. They are very common in World of Warcraft, where they are accepted as long as the
Lua or XML files are run through the Blizzard interpreter, which, according to a post on the
Blizzard World of Warcraft forum, enables the administrators to turn them off if they violate the
ToS (World of Warcraft Forums, 2009): "The only thing you need to know is that if the
developers deem that some proposed functionalities alter game play, they will render their use
impossible". Additionally, to prevent the misuse of add-ons, Blizzard has published a list of rules
that these files must follow (World of Warcraft Policy, 2009). Even so, we have to ask ourselves
whether players who use add-ons that tell them what to do in certain situations do not have an
advantage over those who do not have them installed.
8.6. Camping
Kimppa & Bissett (2005) define camping as "reserving a spot which is optimal for spotting and
killing other characters, typically near a respawn* area". In this behavior, typical of first-person
shooters, players choose an area that enables them to kill their opponents while shielding
themselves completely (e.g. sniping other players while hiding in a tree where the opponents
cannot aim). According to most game forums, camping and other "cowardly" tactics are
honorless, but not cheats (Call of Duty Black Ops, 2010). The game itself allows the player to
hide in one spot and wait patiently for other players to approach, killing them without much
effort. Moreover, in order to reach the safe spot the camper himself has to survive, and is thus a
potential victim of other players; and, as Smith (2004) notes, the same option is available to the
enemy.
Despite camping being an integral part of the game, Smith (2004) argues that it is against the set
of rules created by players who have reached a mutual understanding of how the game should be
played, implicit rules that are the subject of intense debate among players. Here we are confronted
with what is and is not against the "spirit of the game" (Smith, 2004). Interestingly, these in-game
discussions about ethics can reach such a degree that developers add the rule to the actual game
rules: there have been games where the player is forced to move if he spends too much time in
one place.
Yan et al. (2005) state that camping is a cheat when the player sits next to a spawn point, a place in the game where the player revives immediately (or with a certain delay) after he dies. The authors note that a player might be prevented from joining the game by being killed as soon as he revives. In order to counter this cheat, some games do not allow players to be killed in the spawn point, nor will killing a player in such a point give the camper experience points. For instance, in Wintergrasp*, a player cannot be killed while he is next to a "spirit healer*". However, if the opponent is close enough, the player revived in the spawn point can kill the camper.
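The two server-side countermeasures just described (blocking damage inside the spawn area and withholding experience for spawn kills) can be sketched in code. This is a minimal illustration, not taken from any particular game; the radius, grace period, and experience values below are hypothetical:

```python
import math
from dataclasses import dataclass

SPAWN_RADIUS = 20.0   # hypothetical protected radius around the spawn point
GRACE_PERIOD = 5.0    # hypothetical seconds of invulnerability after reviving
BASE_KILL_XP = 100    # hypothetical experience reward for an ordinary kill

@dataclass
class Player:
    name: str
    x: float = 0.0
    y: float = 0.0
    protected_until: float = 0.0  # game time at which revive protection expires

def in_spawn_area(p, spawn):
    """True if the player stands within the protected spawn radius."""
    return math.hypot(p.x - spawn[0], p.y - spawn[1]) <= SPAWN_RADIUS

def can_damage(victim, spawn, now):
    """Deny all damage while the victim is inside the spawn area or the
    post-revive grace period is still running."""
    return not in_spawn_area(victim, spawn) and now >= victim.protected_until

def kill_reward(victim, spawn):
    """Award no experience for kills made inside the spawn area, removing
    the camper's incentive even if a kill does get through."""
    return 0 if in_spawn_area(victim, spawn) else BASE_KILL_XP

def on_revive(p, spawn, now):
    """Place the revived player at the spawn point and start the grace timer."""
    p.x, p.y = spawn
    p.protected_until = now + GRACE_PERIOD
```

Combining a spatial rule with a short grace timer covers both variants mentioned above: the camper gains nothing from shooting into the spawn area, and the revived player cannot be killed before leaving it.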
Respawn area = In most video games the death of the avatar is not permanent; when the character dies he is usually teleported to a special location where he is revived (GiantBomb, 2011j).
Spirit healer = an entity marking a respawn area (Stickney, 2011)
Wintergrasp = a location in WoW which hosts constant sieges between the game's factions (Yonzon, 2008)
Chapter 9
Conclusion
9.1. Assessment
Based on the success criteria provided in chapter 2.4 and the research objectives provided in chapter 2.2, the thesis is successful: not only did it analyze and find flaws in existing taxonomies and definitions of cheating, but it has also provided a new definition and an extended, complete taxonomy with clear examples (based on the taxonomy provided by Yan, 2005). Additionally, the thesis draws attention to a new category: game specific cheats. These cheats are called "game specific" because they can be considered cheats in some games, whereas the same activities done in another game could be considered normal behavior or even encouraged (e.g. games such as Eve Online allow stealing items and infiltration, actions which in other games would be considered immoral and even prohibited explicitly by the game's terms of service).
Furthermore, the thesis focuses on human-related attacks by analyzing the specifics of each of these cheats and providing an account of existing solutions practiced by some, but not all, companies and game developers, as well as extensions to these solutions and newly proposed solutions that may be used to mitigate the risks in such domains.
In Chapter 8 the thesis also gives a few examples of "game specific cheats" in order to clarify the need for a new category and the need for specific research by each game developer into their existence and into specific prevention methods.
The drawback of the thesis is that, because the topic of cheating in video games is so vast, it would not have been feasible to examine every type of cheating in detail and provide prevention/mitigation methods for each; due to time limitations, this was never part of the original research objectives. Additionally, the methods of prevention proposed here should be thoroughly assessed in a simulated environment as well as in real-life case studies to confirm their effectiveness.
9.2. Human-related cheats - prevention summary
In the case of the human-related cheats for which a detailed analysis has been provided, the thesis has insisted upon the variations in the methods different companies deploy to cope with such attacks, and has laid out the weaknesses and strengths of each. The thesis thus gives feasible solutions that generally provide protection against human-related cheats, and shows that if the standard of monitoring tools, cryptography, and user awareness within a game environment is high, the game will have a high security standard, which, in turn, will improve the gaming experience of players and the game developer's reputation, as well as keep gamers more interested in the game and in future games from the same company.
However, since the many prevention methods used by different companies all have certain advantages and disadvantages, and the newly proposed methods are imperfect on their own, it becomes clear that hybrid methods (resulting from combining several different prevention methods with complementary strengths and weaknesses) would bring the best overall results.
9.3. Future work
The new definition and the proposed taxonomy (which includes the new category of cheats) aim towards standardizing the way cheats are categorized and handled in video games. Although human-related cheats have been given an in-depth presentation, game specific cheats have only been presented superficially, since they were only intended to clarify the category and attract the attention of researchers and game developers. Additionally, the inclusion of such a category ("game specific cheats") within the taxonomy highlights the importance of and need for research done on individual games, as opposed to game genres or game types, in order to eliminate ambiguity and give the gaming community a clear understanding of what is and what is not allowed in a game, as well as to raise awareness of the importance of the rules and regulations offered by the developers. Moreover, further research on this topic will prepare game developers to cope with cheaters that fall within this category.
Additionally, the technical cheats have only been described in general terms, without an in-depth analysis. Further research should analyze the technical cheats thoroughly and provide clear methods of prevention/mitigation against such attacks. By providing a clear taxonomy and prevention methods for all cheats (both technical and human-related), the first steps towards a fully standardized methodology for cheating prevention in the gaming environment would be established.
9.4. Personal views
Although the thesis has achieved its initial goals (I am personally satisfied with the new definition and extended taxonomy, which will make future classifications less ambiguous), I must admit that certain cheating categories have attracted my attention more than others. The methods of prevention against virtual identity theft have been very interesting and could have been extended further with even more detailed solutions, implemented and tested in specific simulation scenarios to certify their validity.
Additionally, one of my favorite parts has been writing about the authenticator, which, I believe, can be transformed into a great business opportunity: a company specializing in the manufacturing and delivery of authenticators could act as a globalized distributor for all game companies and even extend its reach to companies not related to video games (such as email providers). The player would then be able to choose an authentication method from one of the following: a single authenticator for multiple applications, an authenticator for each application, or a master authenticator for an account that may contain several private authenticators (the master being able to replace any of the private authenticators at any point in time).
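The authenticator discussed above is essentially a one-time-password device. As an illustration of the underlying mechanism (not of any vendor's actual, proprietary implementation), the standard time-based one-time-password algorithm TOTP (RFC 6238) can be sketched in a few lines; the three-window drift tolerance in `verify` is an illustrative choice:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, at=None, step=30, digits=6):
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant).

    The authenticator device and the server derive the same code from a
    shared secret and the current 30-second time window, so entering the
    code proves possession of the device without sending the secret.
    """
    counter = int((time.time() if at is None else at) // step)
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret, code, at, step=30):
    """Accept the current window and its immediate neighbours, an
    illustrative tolerance for clock drift between device and server."""
    return any(totp(secret, at + drift * step, step) == code
               for drift in (-1, 0, 1))
```

A "master" authenticator of the kind proposed here would amount to an extra shared secret authorized to enroll or revoke the per-application secrets on the server side.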
Therefore, I believe that if I were to restart the research on this subject, I would restrict the thesis's scope even further and, after providing a clear definition and taxonomy, focus on a single category for the remainder of the thesis.
REFERENCES
MAIN REFERENCES
Yan, J. (2003) "SECURITY DESIGN IN ONLINE GAMES". In Proc. of the 19th Annual Computer Security Applications Conference (ACSAC'03), IEEE Computer Society, Las Vegas, U.S.A., December, 2003. Available online: http://homepages.cs.ncl.ac.uk/jeff.yan/yan_acsac03.pdf. Last accessed: 03.06.2011.
Yan, J., & Randell, B. (2005) "A SYSTEMATIC CLASSIFICATION OF CHEATING IN ONLINE GAMES". In Proc. ACM NetGames '05, pp. 1-9.
Yan, J., & Randell, B. (2009) "AN INVESTIGATION OF CHEATING IN ONLINE GAMES", IEEE Security & Privacy, vol. 7, no. 3, pp. 37-44.
Yan, J.J., & Choi, H.J. (2002) "SECURITY ISSUES IN ONLINE GAMES". The Electronic Library, Volume 20, Number 2. Available online: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.99.8270. Last accessed 09.06.2011
Zetterström, J. (2005) "A LEGAL ANALYSIS OF CHEATING IN ONLINE MULTIPLAYER GAMES". Available online:
http://www.gamecareerguide.com/education/theses/20050610/A%20Legal%20Perspective%20on%20Cheating%20in%20On
line%20Multiplayer%20Games.pdf. Last accessed: 03.06.2011.
Castronova, E. (2003) "ON VIRTUAL ECONOMIES" The International Journal of Computer Game Research, volume 3, issue 2, December 2003. Available online: http://www.gamestudies.org/0302/castronova/. Last accessed: 03.06.2011
Castronova, E. (2006b) "VIRTUAL WORLDS: A FIRST-HAND ACCOUNT OF MARKET AND SOCIETY ON THE CYBERIAN FRONTIER". In K. Salen, & E. Zimmerman (Eds), The Game Design Reader: A Rules of Play Anthology (pp. 814-863). Cambridge, MA: MIT Press.
Ki, J., Cheon, J.H., Kang, J.-U., & Kim, D. (2004) "TAXONOMY OF ONLINE GAME SECURITY", The Electronic Library, Vol. 22, Iss. 1, pp. 65-73.
Schneier, B. (2000) "INSIDE RISKS: SEMANTIC NETWORK ATTACKS" Communications of the ACM, Volume 43, Issue 12, Dec. 2000
Schneier, B. (2008) "THE PSYCHOLOGY OF SECURITY" AFRICACRYPT 2008, LNCS 5023, Springer-Verlag, 2008, pp. 50-79. Available online: http://www.schneier.com/paper-psychology-of-security.html. Last accessed: 03.06.2011.
SECONDARY REFERENCES
Abad, C. (2005) "THE ECONOMY OF PHISHING: A SURVEY OF THE OPERATIONS OF THE PHISHING MARKET".
First Monday.org. Available online at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/1272/1192.
Last accessed: 11.02.2011
Abramovitch, S., & Cummings, D. (2007) "VIRTUAL PROPERTY, REAL LAW: THE REGULATION OF PROPERTY IN VIDEO GAMES," Canadian Journal of Law and Technology, July 2007
Acquisti, A., & Grossklags, J. (2006) "WHAT CAN BEHAVIORAL ECONOMICS TEACH US ABOUT PRIVACY?" Draft, preliminary version. Presented as Keynote Paper at ETRICS 2006. To appear in: Digital Privacy: Theory, Technologies and Practices (Taylor and Francis Group, 2007)
Adams, D. (2005) "BLIZZARD CRACKS DOWN ON 'GOLD FARMING'" IGN Available online:
http://pc.ign.com/articles/595/595918p1.html. Last accessed: 02.08.2011
Agarwal, N., Renfro, S., & Bejar, A. (n.d.) "YAHOO!'S SIGN-IN SEAL AND CURRENT ANTI-PHISHING SOLUTIONS" Yahoo Inc. Available online: http://www.security-science.com/pdf/yahoo-sign-in-seal-and-current-anti-phishing-solutions.pdf. Last accessed: 03.06.2011.
AION Account Protection (2010) Available online: http://na.aiononline.com/board/notices/view?articleID=197&page=5.
Last accessed: 03.06.2011.
Ambrogi, R. J. (2007). "VIRTUAL INCOME, REAL WORLD TAXATION". Legal Blog Watch. Available online:
http://legalblogwatch.typepad.com/legal_blog_watch/2007/10/for-friday-virt.html. Last accessed: 03.06.2011.
Anonymous (2010) "HOW TO HACK/CHEAT ALMOST ANY FLASH GAME". Available online at: http://www.durabe.com/how-to-hackcheat-almost-any-flash-game-netlog-facebook.html. Last accessed: 03.06.2011.
Apperley, T. H. (2006) "GENRE AND GAME STUDIES" University of Melbourne. Available online:
http://unimelb.academia.edu/ThomasApperley/Papers/358573/Genre_and_Game_Studies_Toward_a_Critical_Approach_to_
Video_Game_Genres. Last accessed 10.08.2011
Arbogast, B. (2003) "Q&A: MICROSOFT ADDS NEW SPAM FILTERING TECHNOLOGY ACROSS E-MAIL
PLATFORMS" http://www.microsoft.com/presspass/features/2003/nov03/11-17spamfilter.mspx. Last accessed: 03.06.2011.
Arkes, H. R., & Blumer, C. (1985) "THE PSYCHOLOGY OF SUNK COST". Organizational Behavior and Human Decision Processes, 35, 124-140.
Associated Press (2006) "3 CHINESE TRIED FOR COUNTERFEITING WEAPONS IN ONLINE GAME". Available online: http://newsinfo.inquirer.net/breakingnews/infotech/view_article.php?article_id=19103. Last accessed: 03.06.2011.
Avizienis, A., Laprie, J.C., Randell, B., & Landwehr, C. (2004) "Basic concepts and taxonomy of dependable and secure computing". IEEE Transactions on Dependable and Secure Computing 2004; 1(1): 11-33.
Bacon, R. (2010) "Distributed Denial of Service (DDoS) Attack Timeline: Be Informed; Be Prepared" Parabon Computation Inc. Available online: http://www.parabon.com/faqs/DDoS-timeline.html. Last accessed: 03.06.2011.
Baddeley, M. (2010) "SECURITY AND HUMAN BEHAVIOR" Security: Foundations from behavioral economics.
Barker, E., & Kelsey, J. (2007) "RECOMMENDATION FOR RANDOM NUMBER GENERATION USING DETERMINISTIC RANDOM BIT GENERATORS (REVISED)" NIST Special Publication 800-90. Available online: http://csrc.nist.gov/publications/nistpubs/800-90/SP800-90revised_March2007.pdf. Last accessed: 03.06.2011
Battle.net (2011) Available online: http://eu.battle.net/sc2/en/ Last accessed 10.08.2011
BBC News Technology (2004) "CYBER CONMEN 'HIJACK DESKTOP PCS'" Available online: http://newswww.bbc.net.uk/2/hi/technology/3762264.stm. Last accessed: 03.06.2011.
Bethke, E. (2003) "GAME DEVELOPMENT AND PRODUCTION". Texas: Wordware Publishing, Inc. ISBN 1-55622-951-8.
Blazer, C. (2006) "THE FIVE INDICIA OF VIRTUAL PROPERTY". Pierce Law Review 5: 137. Available online: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=962905. Last accessed: 03.06.2011.
Blizzard Entertainment (n.d.) Battle.net Security. Available online: http://us.battle.net/en/security/. Last accessed 23.06.2011
Blizzard Store (2011) Available online: http://eu.blizzard.com/store Last accessed 10.08.2011
Blizzard Store (n.d.) Available online: http://us.blizzard.com/store/details.xml?id=1100000822. Last accessed: 03.06.2011.
Blizzard Support (2011) Available online: http://us.blizzard.com/support/article.xml?locale=en_US&articleId=20572. Last
Accessed: 23.05.2011
Brandt, A. (2010) "PHISHERS BREAK WOW'S MAGIC SPELL OVER GAMERS". Available online: http://www.businesscomputingworld.co.uk/phishers-break-wows-magic-spell-over-gamers/. Last accessed: 03.06.2011.
Brandt, A., & Fechner, C. (2010) "GAME TROJANS' BIGGEST TRICKS IN 2010" Webroot Threat Blog. Available online: http://blog.webroot.com/2010/10/22/game-trojans-biggest-tricks-in-2010/. Last accessed: 03.06.2011.
Burghardt, T. (2010) "GHOST IN THE MACHINE: SECRET STATE TEAMS UP WITH AD PIMPS TO THROTTLE PRIVACY" Dissident Voice. Available online: http://dissidentvoice.org/2010/12/ghost-in-the-machine-secret-state-teams-up-with-ad-pimps-to-throttle-privacy/. Last accessed: 03.06.2011.
Buro, M. (2003) "ORTS: A HACK-FREE RTS GAME ENVIRONMENT" in Proceedings of the International Joint Conference on AI 2003.
Call of Duty Black Ops (2010) "CHEATING IN COD" Available online: http://www.mw2forum.com/forum/viewtopic.php?f=13&t=15474. Last accessed: 03.06.2011.
Carey, R., & Burkell, J. (2009) "A HEURISTICS APPROACH TO UNDERSTANDING PRIVACY-PROTECTING BEHAVIORS IN DIGITAL SOCIAL ENVIRONMENTS". In I. Kerr, V. Steeves & C. Lucock (Eds.), Lessons From the Identity Trail. New York: Oxford University Press.
Carless, S (2006) "IGE: Inside The MMO Trading Machine" Available online at:
http://www.gamasutra.com/view/feature/1837/ige_inside_the_mmo_trading_machine.php. Last accessed: 03.06.2011.
Castronova, E. (2003) "THEORY OF THE AVATAR", p. 3/5. SSRN 385103.
Castronova, E. (2004) "THE RIGHT TO PLAY". New York Law School Law Review 49, 185-210.
Castronova, E. (2005) "SYNTHETIC WORLDS: THE BUSINESS AND CULTURE OF ONLINE GAMES". Chicago: The University of Chicago Press. p. 164. ISBN 0226096262.
Castronova, E. (2006a) "ON THE RESEARCH VALUE OF LARGE GAMES: NATURAL EXPERIMENTS IN NORRATH AND CAMELOT". Games and Culture 1(2), 163-186.
Casual WoW (2008) "WOW PHISHING" Available online: http://casualwow.blogspot.com/2008/01/wow-phishing.html.
Last accessed: 03.06.2011.
Cataclysm, Blizzard Entertainment, available online: http://eu.blizzard.com/en-gb/games/cataclysm/ Last accessed:
31.07.2011.
CertifiedEmail Icon (2011) Available online: http://antispam.yahoo.com/certifiedemailicon. Last accessed: 03.06.2011.
Chaplin, H. (2007) "IS THAT JUST SOME GAME? NO, IT'S A CULTURAL ARTIFACT" The New York Times. Available online: http://www.nytimes.com/2007/03/12/arts/design/12vide.html. Last accessed: 03.06.2011.
Chen, A. (2008) "SPICE UP YOUR INBOX WITH COLORS AND THEMES", Gmail engineer, The Official Gmail Blog. Available online: http://gmailblog.blogspot.com/2008/11/spice-up-your-inbox-with-colors-and.html. Last accessed: 03.06.2011.
China Daily (2003) "LAWSUIT FIRES UP IN CASES OF VANISHING VIRTUAL WEAPONS". Available online: http://www.chinadaily.com.cn/en/doc/2003-11/20/content_283094.htm. Last accessed: 31.07.2011.
Chrome (2011) "GOOGLE CHROME AND BROWSER SECURITY" Available online: http://www.google.com/chrome/intl/en/more/security.html. Last accessed: 03.06.2011.
Cialdini, R. B., Bassett, R., Cacioppo, J. T., & Miller, J. A. (1978) "LOW-BALL PROCEDURE FOR PRODUCING COMPLIANCE: COMMITMENT THEN COST". Journal of Personality and Social Psychology, 36, 463-476.
Cluley, G. (2011) "PHISHING IN A WORLD OF WARCRAFT" Sophos Naked Security Available online:
http://nakedsecurity.sophos.com/2011/01/20/phishing-in-a-world-of-warcraft/ Last accessed: 11.02.2011
Cluley, G. (2011) "SOPHOS REPORT REVEALS INCREASE IN SOCIAL NETWORKING SECURITY THREATS" Sophos Naked Security. Available online: http://nakedsecurity.sophos.com/2011/01/19/sophos-security-threat-report-2011-social-networking/ Last accessed: 11.02.2011
Cluley, G. (n.d.) "STEAM PHISHING TARGETS VIDEO GAME PLAYERS" Sophos Naked Security Available online:
http://nakedsecurity.sophos.com/2011/02/16/steam-phishing-targets-video-game-players/. Last accessed: 12.02.2011
Cohen, F., Johnson, T. A., Taylor, E. & Francis (2007) "A FRAMEWORK FOR DECEPTION", (one chapter in National Security Issues in Science, Law, and Technology), (in press).
Consalvo, M. (2005) "GAINING ADVANTAGE: HOW VIDEOGAME PLAYERS DEFINE AND NEGOTIATE CHEATING", paper presented at Changing Views: Worlds in Play, Vancouver, BC. Available online: http://www.waikato.ac.nz/film/2005papers/319B/docs/Consalvo.pdf. Last accessed: 09.06.2011
Consalvo, M. (2007) "CHEATING: GAINING ADVANTAGE IN VIDEOGAMES". Cambridge, MA: MIT Press.
Corwin, P. (2009) "VIRTUAL CURRENCIES AND VIRTUAL GOODS — DEFINITIONS AND REVENUE STREAMS
IN SOCIAL NETWORKS" How to Start a Social Network. Available online: Last accessed 10.08.2010.
Davis, S.B. (2001) "WHY CHEATING MATTERS: CHEATING, GAME SECURITY, AND THE FUTURE OF GLOBAL ON-LINE GAMING BUSINESS", in Proceedings of Game Developer Conference. Available online: http://www.secureplay.com/papers/docs/WhyCheatingMatters.pdf. Last accessed: 03.06.2011.
DDoS, CERT (2001) Available online: http://www.cert.org/tech_tips/denial_of_service.html. Last accessed: 03.06.2011.
Defending the Net (2010) "SOCIAL ENGINEERING: YOU HAVE BEEN A VICTIM". Available online:
http://www.defendingthenet.com/NewsLetters/SocialEngineering.htm. Last accessed: 03.06.2011.
der Sloot, B.V. (2011) "VIRTUAL IDENTITY AND VIRTUAL PRIVACY: TOWARDS A CONCEPT OF REGULATION BY ANALOGY" eGov Präsenz, 2011-1, pp. 41-43. Available online: http://www.ivir.nl/publications/sloot/eGov_prasenz_2011_1.pdf. Last accessed: 03.06.2011
Dibbell, J. (2003) "THE UNREAL ESTATE BOOM". Wired (11.01). Available online: http://www.wired.com/wired/archive/11.01/gaming.html. Last accessed: 03.06.2011.
Dibbell, J. (2006) "PLAY MONEY: OR, HOW I QUIT MY DAY JOB AND MADE MILLIONS TRADING VIRTUAL LOOT". Basic Books. ISBN 0465015352
Dictionary.com (2011) Available online: www.dictionary.com
Diffie, W., & Hellman, M. E. (1976) "NEW DIRECTIONS IN CRYPTOGRAPHY", IEEE Transactions on Information Theory, vol. IT-22, Nov. 1976, pp. 644-654.
DMW (2011) Available online: http://www.dmwpro.com/. Last accessed: 23.05.2011
Donath, J.S. (1998) "IDENTITY AND DECEPTION IN THE VIRTUAL COMMUNITY" Identity and deception in the
virtual community. Available online: http://smg.media.mit.edu/papers/Donath/IdentityDeception/IdentityDeception.pdf. Last
accessed: 03.06.2011
Donovan, T. (2010) "WHY ARE VIDEOGAMES IMPORTANT?" The Sunday Times. Available online: http://technology.timesonline.co.uk/tol/news/tech_and_web/article7135392.ece. Last accessed: 03.06.2011
Don't Phish Me (2010) "MOZILLA SAFE BROWSING ADD-ONS". Available online: https://addons.mozilla.org/en-US/firefox/addon/dontphishme/. Last accessed: 03.06.2011.
EffeTech Sniffer (2003) Available online: http://www.ip-sniffer.com/ Last accessed 10.08.2011
Ellison, B. (2008) "Defining Dialogue Systems". Gamasutra. Available online: http://www.gamasutra.com/view/feature/3719/defining_dialogue_systems.php Last accessed 10.08.2011.
Erwin, S. (2005) "VIRTUAL MANEUVERS: GAMES ARE GAINING GROUND, BUT HOW FAR CAN THEY GO?" National Defense: NDIA's Business & Technology Magazine (December 2005), pages 44-49.
EVE Online (2011) "ACCOUNT SECURITY IMPROVEMENTS PART 1 - PHISHING". Available online:
http://www.eveonline.com/devblog.asp?a=blog&bid=845. Last accessed: 03.06.2011.
Even Balance (2011) Available online: http://www.nprotect.com/index.html. Last accessed: 23.05.2011
Fairfield, J. (2005) "VIRTUAL PROPERTY". Boston University Law Review 85: 1047. Available online:
http://www.bu.edu/law/central/jd/organizations/journals/bulr/volume85n4/Fairfield.pdf. Last accessed: 03.06.2011.
Festinger, L. (1957) "A THEORY OF COGNITIVE DISSONANCE". Stanford, CA: Stanford University Press.
Firefox Phishing and Malware Protection (n.d.) Available online: http://www.mozilla.com/en-US/firefox/phishing-protection/. Last accessed 23.06.2011
FirePhish Anti-Phishing Extension (2007) "MOZILLA SAFE BROWSING ADD-ONS" Available online: https://addons.mozilla.org/en-US/firefox/addon/firephish-anti-phishing-extens/. Last accessed: 03.06.2011.
Fischer, P., Kubitzki, J., Guter, S., & Frey, D. (2007) "VIRTUAL DRIVING AND RISK TAKING: DO RACING GAMES INCREASE RISK-TAKING COGNITIONS, AFFECT AND BEHAVIOURS?" Journal of Experimental Psychology: Applied, 13, 22-32.
Floyd, D.L., Prentice-Dunn, S., & Rogers, R. W. (2000) "A META-ANALYSIS OF RESEARCH ON PROTECTION MOTIVATION THEORY," Journal of Applied Social Psychology 30, no. 2: 408.
Ford, S. (2011) "EVE ONLINE NEWS - SECURITY IMPROVEMENTS: PHISHING & FLOWCHARTS" Available
online: http://www.mmorpg.com/gamelist.cfm/loadNews/19336. Last accessed: 03.06.2011.
Frey, D. (1986) "RECENT RESEARCH ON SELECTIVE EXPOSURE TO INFORMATION". In L. Berkowitz (Eds.), Advances in experimental social psychology, pp. 41-80. San Diego, CA: Academic Press.
FYICenter (n.d.) "SOFTWARE QA AND TESTING RESOURCE CENTER". Available online: http://sqa.fyicenter.com/FAQ/Why-Bugs-in-Software/What_is_the_difference_between_a_bug_a_defect_.html. Last accessed 22.06.2011
Gamasutra (2008) "NCSOFT EUROPE CHOOSES SANA ONLINE SECURITY PACKAGE" Available online: http://gamasutra.com/view/news/17677/NCsoft_Europe_Chooses_Sana_Online_Security_Package.php. Last accessed: 03.06.2011.
GameConnect (2010) "FORUMS - MICROSOFT DISCUSSION - MICROSOFT'S XBOX KINECT BEYOND HACKERS, HOBBYISTS" Available online: http://gamrconnect.vgchartz.com/thread.php?id=122626. Last accessed: 03.06.2011.
Gamespot (2010) "THOSE WHO DISCONNECT BEFORE LOSING" Available online: http://www.gamespot.com/xbox360/action/soulcaliburiv/show_msgs.php?topic_id=m-1-57393329&pid=940048. Last accessed: 03.06.2011.
GameUSD (n.d.) Available online: www.gameusd.com Last accessed: 03.06.2011
GauthierDickey, C., Zappala, D., Lo, V. & Marr, J. (2004) "LOW LATENCY AND CHEAT-PROOF EVENT ORDERING FOR PEER-TO-PEER GAMES" in Proceedings of the 14th international workshop on Network and operating systems support for digital audio and video, pages 134 to 139.
Gee, J. P. (2003) "WHAT VIDEO GAMES HAVE TO TEACH US ABOUT LEARNING AND LITERACY". New York: Palgrave/St. Martin's.
GIA (2011) "WORLD VIDEO GAMES MARKET TO EXCEED US$61.9 BILLION BY 2012, ACCORDING TO A NEW
REPORT BY GLOBAL INDUSTRY ANALYSTS, INC." Global Industry Analysts (GIA) Inc. Available online:
http://www.prweb.com/releases/video_games_online/console_pc_handheld/prweb720734.htm. Last accessed: 03.06.2011.
GiantBomb (2011a) "SPLIT-SCREEN MULTIPLAYER" Available online: http://www.giantbomb.com/split-screen-multiplayer/92-322/ Last accessed 10.08.2011.
GiantBomb (2011b) "HOTSEAT" Available online: http://www.giantbomb.com/hotseat/92-354/ Last accessed 10.08.2011
GiantBomb (2011c) "EXPERIENCE POINTS" Available online: http://www.giantbomb.com/experience-points/92-39/ Last
accessed 10.08.2011
GiantBomb (2011d) "MOUNT" Available online: http://www.giantbomb.com/mounts/92-409/ Last accessed 10.08.2011
GiantBomb (2011e) "LEVELING UP" Available online: http://www.giantbomb.com/leveling-up/92-475/ Last accessed
10.08.2011
GiantBomb (2011f) "ILLIDAN STORMRAGE" Available online: http://www.giantbomb.com/illidan-stormrage/94-637/ Last accessed 10.08.2011
GiantBomb (2011g) "ACHIEVEMENTS" Available online: http://www.giantbomb.com/achievements/92-29/ Last accessed
10.08.2011
GiantBomb (2011h) "EXPANSION" Available online: http://www.giantbomb.com/expansion/92-344/. Last accessed
10.08.2011
GiantBomb (2011i) "COMBO" Available online: http://www.giantbomb.com/combo/92-18/ Last accessed 10.08.2011
GiantBomb (2011j) "RESPAWN" Available online: http://www.giantbomb.com/respawn/92-989/ Last accessed 10.08.2011
Gigerenzer, G., & Goldstein, D. G. (1996) "REASONING THE FAST AND FRUGAL WAY: MODELS OF BOUNDED RATIONALITY," Psychological Review 103, no. 4: 650-669.
Go Rumors (2010) "GLOBAL VIDEO GAME MARKET GROWTH" Available online: http://gorumors.com/crunchies/video-gaming-console-market/. Last accessed: 03.06.2011.
Goodchild, J. (2010) "SOCIAL ENGINEERING: THE BASICS". Available online: http://www.csoonline.com/article/514063/Social_Engineering_The_Basics. Last accessed: 03.06.2011.
Google Safe Browsing (2011) Available online: http://code.google.com/p/google-safe-browsing/wiki/Protocolv2Spec. Last
accessed: 03.06.2011.
Greenhill, R. (1997) "DIABLO, AND ONLINE MULTIPLAYER GAME'S FUTURE". GamesDomain. Available online: www.gamesdomain.com/gdreview/zones/shareware/may97.html. Last accessed: 09.10.2010.
Hack Forums (2010) "HIV BACKTRACK" Available online: http://www.hackforums.net/archive/index.php/thread-773367.html. Last accessed: 03.06.2011.
HackShield (2011) Available online: http://hackshield.ahnlab.com/hs/site/en/main/main.do Last accessed: 23.05.2011
Halfpap, B. (2010) "SOCIAL ENGINEERING: HACKING BY ASKING" Available online: http://ctovision.com/2010/08/social-engineering-hacking-by-asking/. Last accessed: 03.06.2011
Hardy, R. S. (2009) "CHEATING IN MULTIPLAYER VIDEO GAMES" Thesis, Master of Science in Computer Science and Applications, Faculty of the Virginia Polytechnic Institute and State University. Available online: http://scholar.lib.vt.edu/theses/available/etd-04242009-125827/unrestricted/Thesis.pdf. Last accessed: 03.06.2011.
Hassenzahl, M. (2008) "USER EXPERIENCE (UX): TOWARDS AN EXPERIENTIAL PERSPECTIVE ON PRODUCT QUALITY". In: Proceedings of the 20th French-speaking conference on Human-computer interaction (Conférence Francophone sur l'Interaction Homme-Machine) IHM '08 (Metz, France, September 2008).
Hecht, E. (2006) "WOW MOVIEWATCH: TIER 6" Available online: http://wow.joystiq.com/2006/12/29/wow-moviewatch-tier-6/ Last accessed 10.08.2011
Henry, A. (2011) "BLIZZARD UNVEILS DUNGEON FINDER: CALL TO ARMS" Available online: http://azeroth.metblogs.com/2011/04/07/blizzard-unveils-dungeon-finder-call-to-arms/ Last accessed 10.08.2011
Higgins, K. J. (2009) "SQL SERVER DATABASE HACK TRICKS FORENSICS" Available online: http://www.darkreading.com/security/attacks-breaches/212903514/index.html. Last accessed: 03.06.2011.
Hill, S. (2010) "MOST EXPENSIVE ITEMS EVER SOLD IN AN MMO". Available online: http://www.brighthub.com/video-games/mmo/articles/29070.aspx#ixzz1Ffz89J3H. Last accessed: 03.06.2011
Hoglund, G., & McGraw, G. (2007) "EXPLOITING ONLINE GAMES: CHEATING MASSIVELY DISTRIBUTED
SYSTEMS" Addison Wesley Professional, ISBN: 0-13-227191-5
Holisky, A. (2008) "GOLD SELLERS HOLD ACCOUNT HOSTAGE". Available online: http://wow.joystiq.com/2008/03/08/gold-sellers-hold-account-hostage/. Last accessed: 03.06.2011.
Holisky, A. (2010) "MAN IMPRISONED ON FRAUD AND THEFT CHARGES OVER ACCOUNT SELLING SCAM"
Available online: http://wow.joystiq.com/2010/01/13/man-imprisoned-on-fraud-and-theft-charges-over-account-selling-s/.
Last accessed: 03.06.2011.
Hotmail (2011) Available online: http://mail.live.com/mail/junkemail.aspx. Last accessed: 03.06.2011.
Hsu, D. (2008) "HACKED ON XBOX 360: A VICTIM'S TALE" IGN. Available online: http://xbox360.ign.com/articles/933/933332p1.html. Last accessed: 01.08.2011.
Hu, J., Bertok, P. & Tari, Z. (2008) "TAXONOMY AND FRAMEWORK FOR INTEGRATING DEPENDABILITY AND SECURITY," Chapter 6, Part II: Modelling the Interaction between Dependability and Security, in Information Assurance: Dependability and Security in Networked Systems, Y. Qian, J. Joshi, D. Tipper, P. Kirshanamurthy (eds.), Elsevier, 149-170. ISBN: 978-0-12-373566-9. Available online: http://seit.unsw.adfa.edu.au/staff/sites/hu/Sample_Publication/bookchapter1.pdf. Last accessed: 03.06.2011.
Hubert-Wallander, B., Green, C.S., & Bavelier, D. (2010) "STRETCHING THE LIMITS OF VISUAL ATTENTION: THE CASE OF ACTION VIDEO GAMES". Published online Nov 17 2010. Available at: http://wires.wiley.com/WileyCDA/WiresArticle/wisId-WCS116.html. Last accessed: 03.06.2011.
Huizinga, J. (1955) "HOMO LUDENS: A STUDY OF THE PLAY ELEMENT IN CULTURE". Boston: Beacon Press.
Huizinga, J. (2000) "HOMO LUDENS: A STUDY OF THE PLAY-ELEMENT IN CULTURE". Routledge: London.
Humphries, M (2010) "STARCRAFT II ACCOUNTS BEING TARGETED IN PHISHING SCAM " Geek.com Available
online: http://www.geek.com/articles/games/starcraft-ii-accounts-being-targeted-in-phishing-scam-2010083/. Last accessed:
11.02.2011
Huy, J., & Zambetta, F. (2006) "PLAYNOEVIL GAME SECURITY NEWS & ANALYSIS: VIRTUAL ITEM THEFT RING BUSTED". Available online at: http://playnoevil.com/serendipity/index.php?/archives/1051-Virtual-Item-Theft-Ring-Busted.html#extended. Last accessed: 03.06.2011
IEEE (1990) "IEEE STANDARD 610.12-1990", IEEE Standard Glossary of Software Engineering Terminology.
IGE (n.d.) Available online: www.ige.com. Last accessed: 03.06.2011
IGN Accounts (2011) Available online: http://www.ignaccount.com/buywowusaccount.html. Last accessed: 23.05.2011
IGN Staff (2000) "STARCRAFT" IGN Available online: http://pc.ign.com/articles/152/152159p1.html. Last accessed:
2.08.2011
IT Security Blog (2010) "PEERING INTO THE STORM WORM" Available online: http://www.maikmorgenstern.de/wordpress/?tag=bot-and-botnet-research. Last accessed: 03.06.2011.
Jaquet-Chiffelle, D.-O. (2002) "AUTHENTICATION AND/OR IDENTIFICATION THROUGH THE VIRTUAL WORLD". Position paper, Stork, 3 p.
Jaquet-Chiffelle, D.-O., Benoist, E., & Anrig, B. (eds) "IDENTITY IN A NETWORKED WORLD: USE CASES AND SCENARIOS", 16 p., 08-2006.
Jensen, C., Potts, C., & Jensen, C. (2005) "PRIVACY PRACTICES OF INTERNET USERS: SELF-REPORTS VERSUS OBSERVED BEHAVIOR," International Journal of Human-Computer Studies 63: 203-227.
Jimmy, N., & Hwang, H. (2009) "SOUTH KOREA ACE TEAM DRAWS ROCK-STAR STATUS: PROGRAM HELPS COUNTRY'S PRO GAMERS FILL MANDATORY SERVICE REQUIREMENTS" Stars and Stripes. Available online: http://www.stripes.com/news/south-korea-ace-team-draws-rock-star-status-1.89236. Last accessed: 03.06.2011.
Joshi, R. (2008) "CHEATING AND VIRTUAL CRIMES IN MASSIVELY MULTIPLAYER ONLINE GAMES" Royal Holloway, University of London. Available online: http://www.ma.rhul.ac.uk/static/techrep/2008/RHUL-MA-2008-06.pdf. Last accessed: 03.06.2011.
Jowett, G.S., & O'Donnell, V. (1999) "PROPAGANDA AND PERSUASION". Published by Sage Publications, Inc; 3rd
edition. ISBN-10: 0761911472
Kahneman, D., & Tversky, A. (1974) "JUDGMENT UNDER UNCERTAINTY: HEURISTICS AND BIASES," Science 185: 1124-1131.
Kane, S. F., & Duranske, B.T. (2008) "VIRTUAL WORLDS, REAL WORLD ISSUES" Available online: http://www.americanbar.org/content/dam/aba/migrated/intelprop/magazine/LandslideSep08_Kane.authcheckdam.pdf. Last accessed: 03.06.2011.
Kato, P. M. (2010) "VIDEO GAMES IN HEALTH CARE: CLOSING THE GAP". Review of General Psychology, Vol 14(2), Jun 2010, 113-121. doi: 10.1037/a0019441
Kayser, J. J. (2006) "THE NEW NEW-WORLD: VIRTUAL PROPERTY AND THE END USER LICENSE
AGREEMENT" Entertainment Law Review Loyola of Los Angeles Entertainment Law Available online:
http://elr.lls.edu/issues/v27-issue1/documents/08.Kayser.pdf. Last accessed: 03.06.2011
Kimak, J. (2009) "6 IMPORTANT REAL WORLD SKILLS YOU LEARNED FROM VIDEOGAMES" Available online at:
http://www.bspcn.com/2009/03/28/6-important-real-world-skills-you-learned-from-videogames/. Last accessed: 03.06.2011.
Kimppa, K. K., & Bissett, A. (2005) "THE ETHICAL SIGNIFICANCE OF CHEATING IN ONLINE COMPUTER GAMES". International Review of Information Ethics, Vol. 4 (12/2005). Available online: http://www.i-r-i-e.net/inhalt/004/Kimppa-Bissett.pdf. Last accessed: 03.06.2011
Kray, L. J., & Galinsky, A. D. (2003) "THE DEBIASING EFFECT OF COUNTERFACTUAL MIND-SETS: INCREASING THE SEARCH FOR DISCONFIRMATORY INFORMATION IN GROUP DECISIONS" Organizational Behavior and Human Decision Processes, 91, 69-81.
Kuecklich, J. (2004) "OTHER PLAYINGS - CHEATING IN COMPUTER GAMES", Available online: http://www.cs.uu.nl/docs/vakken/vw/literature/03.kuecklich.pdf. Last accessed 09.06.2011
Landesman, M. (2009) "ONLINE GAME COMPROMISE AND SOCIAL ENGINEERING" Available online:
http://antivirus.about.com/od/securitytips/a/wowscams.htm. Last accessed: 03.06.2011.
Langenderfer, J., & Shimp, T. A. (2001) "CONSUMER VULNERABILITY TO SCAMS, SWINDLES, AND FRAUD: A NEW THEORY OF VISCERAL INFLUENCES ON PERSUASION". Psychology and Marketing, 18, 763-783.
Lastowka, F. G. (2007) "RULES OF PLAY" (paper presented at AoIR 8, October 17). Available online: http://terranova.blogs.com/RulesofPlay.pdf. Last accessed: 03.06.2011
Lastowka, F. G. (2009) "PLANES OF POWER: EVERQUEST AS TEXT, GAME AND COMMUNITY" Game Studies. Available online: http://gamestudies.org/0901/articles/lastowka. Last accessed: 03.06.2011
Lastowka, F. G., & Hunter, D. (2004) "THE LAWS OF THE VIRTUAL WORLDS" California Law Review 92, 1-73.
Lee, J. (2005) "FROM SWEATSHOPS TO STATESIDE CORPORATIONS, SOME PEOPLE ARE PROFITING OFF OF MMO GOLD." Available online: http://www.1up.com/features/wage-slaves-mmo-goldfarming. Last accessed: 03.06.2011.
Lehdonvirta, V. (2005) "ECONOMIC INTEGRATION STRATEGIES FOR VIRTUAL WORLD OPERATORS", Available online: http://www.hiit.fi/~vlehdonv/documents/economic_integration_thesis.pdf. Last accessed 09.06.2011
Lehdonvirta, V. (2010) "VIRTUAL WORLDS DON'T EXIST: QUESTIONING THE DICHOTOMOUS APPROACH IN
MMO STUDIES" Available online: http://gamestudies.org/1001/articles/lehdonvirta. Last accessed: 03.06.2011
Lehn, M., Triebel, T., Gross, C., Stingl, D., Saller, K., Effelsberg, W., Kovacevic, A., & Steinmetz, R. (2010) "DESIGNING BENCHMARKS FOR P2P SYSTEMS" From active data management to event-based systems and more. Available online: http://portal.acm.org/citation.cfm?id=1985642. Last accessed 10.08.2011
Lemos, R. (2007) "ACCOUNT PRETEXTERS PLAGUE XBOX LIVE" Available online: http://www.securityfocus.com/news/11452. Last accessed: 03.06.2011.
Leyden, J. (2010) "EVERYTHING YOU EVER WANTED TO KNOW ABOUT XBOX HACKING" Available online:
http://www.theregister.co.uk/2010/02/21/xbox_hacking_phishing_analysis/page3.html. Last accessed: 03.06.2011.
Locationbar (2011) "MOZILLA SAFE BROWSING ADD-ONS". Available online: https://addons.mozilla.org/en-US/firefox/addon/locationbar%C2%B2/ Last accessed: 03.06.2011.
MacInnes, I. (2005) "THE IMPLICATIONS OF PROPERTY RIGHTS IN VIRTUAL WORLD BUSINESS MODELS". Former Departments, Centers, Institutes and Projects. Paper 63. Available online: http://surface.syr.edu/cgi/viewcontent.cgi?article=1062&context=ischool_other&seiredir=1#search=%22THE+IMPLICATIONS+OF+PROPERTY+RIGHTS+IN+VIRTUAL+WORLD+BUSINESS+MODELS%22. Last accessed: 03.06.2011
Masnick, M. (2006) "NICE WORK RETRIEVING THAT MAGIC SWORD... BUT NOW YOU NEED TO PAY UNCLE SAM FOR IT" Techdirt. Available online: http://www.techdirt.com/articles/20061017/163943.shtml. Last accessed: 03.06.2011.
Mäyrä, F. (2010) "GAMING CULTURE AT THE BOUNDARIES OF PLAY" Game Studies: the international journal of computer game research, volume 10 issue 1, April 2010, ISSN: 1604-7982. Available online: http://gamestudies.org/1001/articles/mayra. Last accessed: 03.06.2011
McMaster University Wiki (2009) "MAN IN THE MIDDLE ATTACK" Available online: http://www.cas.mcmaster.ca/wiki/index.php/Man_in_the_Middle_Attack. Last accessed: 03.06.2011.
McCurley, M. (2010) "THE LAWBRINGER: ACCOUNT SECURITY AND YOU" WoW Insider. Available online: http://wow.joystiq.com/tag/account-theft/. Last accessed 23.05.2011
McCurley, M. (2010) "BLIZZARD ANNOUNCES AUTOMATED ACCOUNT RECOVERY FROM HACKED ACCOUNTS". Available online: http://wow.joystiq.com/2010/09/22/blizzard-announces-automated-account-recovery-form-for-hacked-ac/#continued. Last accessed: 03.06.2011.
McVey, B. (2011) "SOCIAL ENGINEERING AND PRETEXTING" Available online: http://ezinearticles.com/?Social-Engineering-and-Pretexting&id=1148633. Last accessed: 03.06.2011.
Microsoft Corporation (n.d.) "PHISHING FILTER AT A GLANCE". Available online: http://www.microsoft.com/mscorp/safety/technologies/antiphishing/at_glance.mspx. Last accessed: 03.06.2011.
Microsoft Safety (2008) Microsoft Corporation. Available online: http://www.microsoft.com/mscorp/safety/technologies/senderid/default.mspx. Last accessed: 03.06.2011.
Mirkovic, J., Martin, J., & Reiher, P. (2002) "A TAXONOMY OF DDOS ATTACKS AND DDOS DEFENSE MECHANISMS" Available online: http://lasr.cs.ucla.edu/DDoS/ucla_tech_report_020018.pdf. Last accessed: 03.06.2011.
Mitchell, B. (n.d.) "INTRODUCTION TO COMPUTER NETWORK SPEED" Available online: http://compnetworking.about.com/od/speedtests/a/network-speed.htm Last accessed 10.08.2011
MMOCrunch (2009) "THIEVERY AND SHENANIGANS" Available online: http://www.mmocrunch.com/2009/05/06/thievery-and-shenanigans/. Last accessed: 03.06.2011.
MMOCrunch (2010) "WORLD OF WARCRAFT AUTHENTICATOR HACKED" Available online: http://www.mmocrunch.com/2010/02/28/world-of-warcraft-authenticator-hacked/ Last accessed: 03.06.2011.
MMOWNED (2010) "[IN-GAME SCAM] BOOST SCAM" Available online: http://www.mmowned.com/forums/world-of-warcraft/general/scam-prevention/184261-game-scam-boost-scam.html Last accessed: 03.06.2011.
Mørch, K. H. T. (2003) "CHEATING IN ONLINE GAMES – THREATS AND SOLUTIONS" Available online:
http://publications.nr.no/Cheating_in_Online_Games.pdf. Last accessed: 03.06.2011.
Mozilla Firefox (n.d.) "PHISHING AND MALWARE PROTECTION" Available online: http://www.mozilla.com/en-US/firefox/phishing-protection/. Last accessed: 03.06.2011.
Musgrove, M. (2005). "VIRTUAL GAMES CREATE A REAL WORLD MARKET". The Washington Post. Available
online: http://www.washingtonpost.com/wp-dyn/content/article/2005/09/16/AR2005091602083.html. Last accessed:
03.06.2011.
Namkara (2009) "HOMOGRAPH DOMAIN NAMES" Namkara Domain Name Information. Available online:
http://www.namkara.com/homograph. Last accessed: 03.06.2011.
Naone, E. (2007) "CATCHING CHEATERS WITH THEIR OWN COMPUTERS: ANTI-CHEATING HARDWARE COULD KEEP ONLINE GAME PLAYERS HONEST" MIT's Technology Review. Available online: http://www.technologyreview.com/Infotech/19005/?a=f. Last accessed: 20.07.2011.
Naraine, R. (2007) "MICROSOFT: XBOX LIVE ACCOUNT THEFT WAS SOCIAL ENGINEERING ATTACK"
Available online: http://www.zdnet.com/blog/security/microsoft-xbox-live-account-theft-was-social-engineering-attack/134.
Last accessed: 03.06.2011.
Naraine, R. (2007) "XBOX LIVE HACKED, ACCOUNTS STOLEN" Available online: http://www.zdnet.com/blog/security/xbox-live-hacked-accounts-stolen/131. Last accessed: 03.06.2011.
Nash, J., & Schneyer, E. (2004) "VIRTUAL ECONOMIES: AN IN-DEPTH LOOK AT THE VIRTUAL WORLD OF FINAL FANTASY XI: ONLINE". Available online: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.95.6615. Last accessed: 03.06.2011.
Nelson, J. (2010). "THE VIRTUAL PROPERTY PROBLEM: WHAT PROPERTY RIGHTS IN VIRTUAL RESOURCES
MIGHT LOOK LIKE, HOW THEY MIGHT WORK, AND WHY THEY ARE A BAD IDEA". McGeorge Law Review 41:
281, 285–86. Available online: http://ssrn.com/abstract=1469299. Last accessed: 03.06.2011.
Neumann, C., Prigent, N., Varvello, M., & Suh, K. (2007) "CHALLENGES IN PEER-TO-PEER GAMING". ACM SIGCOMM Computer Communication Review, 37, 1, pp. 79-82. Available online: http://matteovarvello.com/pdf/p79v37n1p-neumann.pdf. Last accessed: 03.06.2011.
nProtect (2011) Available online: http://www.nprotect.com/index.html. Last accessed: 23.05.2011
Oates, J. (2010) "CALL OF DUTY DDOS ATTACK POLICE ARREST TEEN" The Register. Available online:
http://www.theregister.co.uk/2010/12/09/hacker_held_gaming_attack/. Last accessed: 03.06.2011.
ObserverMode (n.d.), Guild Wars. Available online: http://www.guildwars.com/gameplay/pvp/observermode/ Last accessed
10.08.2011
Office of Fair Trading (2009) "THE PSYCHOLOGY OF SCAMS: PROVOKING AND COMMITTING ERRORS OF JUDGEMENT". Prepared for the Office of Fair Trading by the University of Exeter School of Psychology. Available online: http://www.oft.gov.uk/shared_oft/reports/consumer_protection/oft1070.pdf. Last accessed: 03.06.2011.
Olavsrud, T. (2011) "COMPANIES FAIL DEFCON SOCIAL ENGINEERING SECURITY TEST" Available online:
http://www.esecurityplanet.com/news/article.php/3896386/Companies-Fail-DefCon-Social-Engineering-Security-Test.htm.
Last accessed: 03.06.2011.
Ollmann, G. (2007) "THE PHISHING GUIDE (PART 1): UNDERSTANDING AND PREVENTING PHISHING ATTACKS" Technical Info - making sense of security. Available online: http://www.technicalinfo.net/papers/Phishing.html. Last accessed: 03.06.2011.
OllyDbg (2010). Available online: http://www.ollydbg.de/. Last accessed: 03.06.2011.
Olzak, T. (2009) "SOCIAL ENGINEERING V. PHYSICAL SECURITY" CSO Security and Risk Available online:
http://blogs.csoonline.com/. Last accessed: 03.06.2011.
Opera Software (n.d.) "OPERA'S FRAUD AND MALWARE PROTECTION" Available online: http://www.opera.com/browser/tutorials/security/fraud/. Last accessed: 03.06.2011.
Oracle (n.d.) The Java Tutorials. Available online: http://download.oracle.com/javase/tutorial/essential/concurrency/deadlock.html Last accessed: 02.08.2011
Oster, R. (2004) "ECONOMY STATS" Available online: http://www.simbaspaws.org/Aziri/SWG/economy.htm Last
accessed 10.08.2011
Parker, J. (2007) "CHEATING BY VIDEO GAME PARTICIPANTS" Available online: http://journals.sfu.ca/loading/index.php/loading/article/viewPDFInterstitial/24/23. Last accessed: 09.06.2011
Paypal (2008) "A PRACTICAL APPROACH TO MANAGING PHISHING" Available online: https://www.paypal-press.fr/imagelibrary/downloadMedia.ashx?MediaDetailsID=166. Last accessed: 03.06.2011.
Perez, J. C. (2004) "GOOGLE UPGRADES GMAIL" PC World. Available online: http://www.pcworld.com/article/118567/google_upgrades_gmail.html. Last accessed: 03.06.2011.
Play No Evil (2007) "WORLD OF WARCRAFT REAL MONEY TRANSACTION (RMT) CRIME - UPDATED" Available online: http://www.playnoevil.com/serendipity/index.php?/archives/1211-World-of-Warcraft-Real-Money-Transaction-RMT-Crime-UPDATED.html. Last accessed: 03.06.2011.
Posey, B. (2004) "HOW SPYWARE AND THE WEAPONS AGAINST IT ARE EVOLVING". WindowsSecurity.com.
TechGenix Ltd. Available online: http://www.windowsecurity.com/articles/Spyware-Evolving.html. Last accessed:
03.06.2011.
Pritchard, M. (2000), ―HOW TO HURT THE HACKERS‖, in Game Developer Magazine, Jun. 2000. pp. 28-30. Available
online: http://www.gamasutra.com/view/feature/3149/how_to_hurt_the_hackers_the_scoop_.php. Last accessed: 03.06.2011.
Project Natal (2009) Microsoft, 2009-06-01. Available online: http://blog.seattlepi.com/digitaljoystick/archives/169993.asp. Last accessed: 03.06.2011.
Qualls, E. (n.d.) "WHAT IS A GAMERSCORE?" Available online: http://xbox.about.com/od/xbox360faqs/f/gamerscorefaq.htm Last accessed 10.08.2011
Raskin, A. (2010) "DOES GOOGLE CENSOR TIANANMEN SQUARE? HOW TO CREATE AN INTERNET HOAX"
Available online: http://www.azarask.in/blog/post/does-google-censor-tiananmen-square-how-to-create-an-internet-hoax/.
Last accessed: 03.06.2011.
Report on Phishing (2006) "REPORT ON PHISHING: A REPORT TO THE MINISTER OF PUBLIC SAFETY AND EMERGENCY PREPAREDNESS CANADA AND THE ATTORNEY GENERAL OF THE UNITED STATES". Binational Working Group on Cross-Border Mass Marketing Fraud. Available online: http://www.justice.gov/opa/report_on_phishing.pdf. Last accessed: 03.06.2011.
Robles, R.J., Sang-Soo, Y., Young-Deuk, M., Gilcheol, P., Seoksoo, K. (2008) "ONLINE GAMES AND SECURITY
ISSUES" Future Generation Communication and Networking, 2008. FGCN '08. Second International Conference, ISBN:
978-0-7695-3431-2. Available online: http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?tp=&arnumber=4734193. Last accessed:
03.06.2011
Ruch, A. (2009) "WORLD OF WARCRAFT: SERVICE OR SPACE?" Game Studies: the international journal of computer game research, volume 9 issue 2, November 2009, ISSN: 1604-7982. Available online: http://gamestudies.org/0902/articles/ruch. Last accessed: 03.06.2011
Rybka, J. (n.d.) "CRYSIS CHEATS PC" Available online: http://vgstrategies.about.com/od/pccheatsc/a/CrysisPCCheats.htm Last accessed 10.08.2011
Salen, K., & Zimmerman, E. (2004) "RULES OF PLAY: GAME DESIGN FUNDAMENTALS". Cambridge, MA: MIT Press.
Santerre, M. (2009) "CHOOSING A SMART PASSWORD" Gmail Blog. Available online: http://gmailblog.blogspot.com/2009/10/choosing-smart-password.html. Last accessed: 03.06.2011.
Schreier, J. (2010) "EA: SINGLE-PLAYER GAMES ARE 'FINISHED'" Wired. Available online: http://www.wired.com/gamelife/2010/12/ea-single-player/ Last accessed 10.08.2011
Schulz-Hardt, S., Fischer, P., & Frey, D. (2008) "SELECTIVE EXPOSURE TO INFORMATION: A NEW EXPLANATION BASED ON BIASED ARGUMENT EVALUATION". Submitted for publication.
Schwarz, N., Frey, D., & Kumpf, M. (1980) "INTERACTIVE EFFECTS OF WRITING AND READING A PERSUASIVE ESSAY ON ATTITUDE CHANGE AND SELECTIVE EXPOSURE". Journal of Experimental Social Psychology, 16, 1-17.
Sezen, T. I., & Isikoglu, D. (2007) "FROM OZANS TO GOD-MODES: CHEATING IN INTERACTIVE ENTERTAINMENT FROM DIFFERENT CULTURES" Available online: http://web.mit.edu/commforum/mit5/papers/sezen_isikoglu.pdf. Last accessed: 03.06.2011.
Shenglong, B.C., Sheng, G.C., Han, K.Z, Jiaqi, L., and Kuangwei, D.S (2007) "HISTORY OF MULTIPLAYER GAMES"
Available online: http://www.ssagsg.org/LearningSpace/EntertainmentGaming/HistoryMPG.htm Last accessed 10.08.2011.
Shilov, A. (2010) "MICROSOFT WINDOWS 8 FEATURES LEAKED: INSTANT-ON, FACIAL RECOGNITION, NEW TECHNOLOGIES" Available online: http://www.xbitlabs.com/news/other/display/20100629221134_Microsoft_Windows_8_Features_Leaked_Instant_On_Facial_Recognition_New_Technologies.html. Last accessed: 03.06.2011.
Simon, C. (2006) "IGE: INSIDE THE MMO TRADING MACHINE" Available online: http://www.gamasutra.com/view/feature/1837/ige_inside_the_mmo_trading_machine.php. Last accessed: 03.06.2011
Slavitt, K. M. (2004) "PROTECTING YOUR INTELLECTUAL PROPERTY FROM DOMAIN NAME TYPOSQUATTERS". Available online: http://library.findlaw.com/2004/May/11/133410.html. Last accessed: 03.06.2011.
Smart Text (2010) "MOZILLA SAFE BROWSING ADD-ONS" Available online: https://addons.mozilla.org/en-US/firefox/addon/smart-text/. Last accessed: 03.06.2011.
Smed, J., & Hakonen, H. (2006) "ALGORITHMS AND NETWORKING FOR COMPUTER GAMES". Published by Wiley; 1st edition (July 10, 2006). ISBN-10: 0470018127
Smed, J., Knuutila, T., & Hakonen, H. (2006) "CAN WE PREVENT COLLUSION IN MULTIPLAYER ONLINE GAMES?", in Honkela, Raiko, Kortela, and Valpola (eds.) Proceedings of the Ninth Scandinavian Conference on Artificial Intelligence (SCAI 2006), pp. 168–175, Espoo, Finland. Available online: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.76.1764. Last accessed: 03.06.2011.
Smith, J.H. (2004) "PLAYING DIRTY – UNDERSTANDING CONFLICTS IN MULTIPLAYER GAMES" Paper presented
at the 5th annual conference of The Association of Internet Researchers, The University of Sussex.
Smith, J. H. (2007) "TRAGEDIES OF THE LUDIC COMMONS - UNDERSTANDING COOPERATION IN MULTIPLAYER GAMES" Game Studies: the international journal of computer game research, volume 7 issue 1, 2007, ISSN: 1604-7982. Available online: http://gamestudies.org/07010701/articles/smith. Last accessed 09.06.2011
Sony (2004) Station.com Knowledge Base: Star Wars Galaxies: "HOW DO I KNOW IF I AM EXPLOITING OR NOT?"
Spohn, D. (2002) "CHEATING IN ONLINE GAMES" Available online: http://internetgames.about.com/od/gamingnews/a/cheating.htm. Last accessed: 03.06.2011.
SpyBot (n.d.) Available online: http://www.safer-networking.org/en/dictionary/spyware.html. Last accessed 2.08.2011
SpyCop (2007) "HARDWARE KEYLOGGER DETECTION" Available online: http://spycop.com/keyloggerremoval.htm
Last accessed 10.08.2011
Squire, K. (2005) "CHANGING THE GAME: WHAT HAPPENS WHEN VIDEO GAMES ENTER THE CLASSROOM?" Innovate 1, 6.
Steam (2011) Available online: http://forums.steampowered.com/forums/forumdisplay.php?f=35. Last accessed: 23.05.2011
Stevenson, R. L. B. (2005) "PLUGGING THE "PHISHING" HOLE: LEGISLATION VERSUS TECHNOLOGY" 2005
Duke L. & Tech. Rev. 0006. Available online at: http://www.law.duke.edu/journals/dltr/articles/2005dltr0006.html. Last
accessed: 03.06.2011.
Stickney, A. (2011) "KNOW YOUR LORE: THE MYSTERIOUS CONNECTION BETWEEN SPIRIT HEALERS AND
THE VAL'KYR" Available online: http://wow.joystiq.com/tag/spirit-healer/ Last accessed 10.08.2011
Tan, K. T. (2006) "PHISHING AND SPAMMING VIA IM (SPIM)". Available online: http://isc.sans.edu/diary.html?storyid=1905. Last accessed: 03.06.2011.
Taylor, B. (2008) "FIGHTING PHISHING WITH EBAY AND PAYPAL". Official Gmail Blog. Available online:
http://gmailblog.blogspot.com/2008/07/fighting-phishing-with-ebay-and-paypal.html. Last accessed: 03.06.2011.
Taylor, T. L. (2006) "PLAY BETWEEN WORLDS: EXPLORING ONLINE GAME CULTURE". Cambridge, MA: MIT Press.
TCPDUMP/LIBPCAP (2010) Available online: http://www.tcpdump.org/. Last accessed: 03.06.2011.
techPowerUP! forums (2010) "STEAM .T35 LINKS. BEWARE!" Available online: http://www.techpowerup.com/forums/showthread.php?t=116135. Last accessed: 03.06.2011.
Tetzlaff, R. (2010) "UNDERSTANDING SOCIAL ENGINEERING - TECHNIQUES USED" Available online:
http://www.brighthub.com/computing/enterprise-security/articles/64737.aspx. Last accessed: 03.06.2011.
The Ancient Gaming Noob (2010) "CATACLYSM BETA PHISHING". Available online: http://tagn.wordpress.com/2010/06/15/cataclysm-beta-phishing/. Last accessed: 03.06.2011.
The Honeynet Project (2011) Available online: http://www.honeynet.org/node/88. Retrieved 11.02.2011. Last accessed:
03.06.2011.
The Stoppable Force (2009) "THE BLIZZARD AUTHENTICATOR: A JOURNEY IN PICTURES" Available online:
http://thestoppableforce.net/2009/04/03/the-blizzard-authenticator-a-journey-in-pictures/. Last accessed: 03.06.2011.
Thompson, C. (2007) "WHAT TYPE OF GAME CHEATER ARE YOU?" Games Without Frontiers. Available online: http://www.wired.com/gaming/virtualworlds/commentary/games/2007/04/gamesfrontiers_0423. Last accessed: 09.06.2011
Torres, R. (2009a) "AN INTERVIEW WITH A SCAMMER" WoW Insider. Available online: http://wow.joystiq.com/2009/06/06/an-interview-with-a-scammer/. Last accessed: 03.06.2011.
Torres, R. (2009b) "BEWARE OF BLOOD ELVES SELLING MOUNTS" WoW Insider. Available online:
http://wow.joystiq.com/2009/06/04/beware-of-blood-elves-selling-mounts/. Last accessed: 03.06.2011
Torres, R. (2009c) "POPULAR SCAMS AND HOW TO AVOID THEM" WoW Insider. Available online:
http://wow.joystiq.com/2009/06/12/popular-scams-and-how-to-avoid-them/. Last accessed: 03.06.2011.
True Poker (n.d.) "ANTI-COLLUSION, FRAUD DETECTION, AND RANDOM CARD SHUFFLING". Available online: http://www.truepoker.ag/poker-support/security-and-integrity/anti-collusion/. Last accessed: 03.06.2011.
TTH (2010) "WOW - VIRUS BYPASSES BNET AUTHENTICATOR" Ten Ton Hammer. Available online: http://www.tentonhammer.com/node/81582. Last accessed: 03.06.2011.
US-CERT (2009) "VULNERABILITY NOTE VU#800113". Available online: http://www.kb.cert.org/vuls/id/800113#pat. Last accessed: 03.06.2011.
van Summeren, R. (2011) "SECURITY IN ONLINE GAMING" Radboud University Nijmegen. Available online: http://www.cs.ru.nl/bachelorscripties/2011/Rens_van_Summeren___0413372___Security_in_Online_Gaming.pdf. Last accessed: 03.06.2011.
Ward, M. (2005) "WARCRAFT GAME MAKER IN SPYING ROW". BBC News. Available online: http://news.bbc.co.uk/1/hi/technology/4385050.stm. Last accessed: 03.06.2011.
Warner, D., & Raiter, M. (2005) "SOCIAL CONTEXT IN MMOGS: ETHICAL QUESTIONS IN SHARED SPACE" International Review of Information Ethics, vol. 4 (12/2005). Available online: http://www.i-r-i-e.net/inhalt/004/Warner-Raiter.pdf. Last accessed: 03.06.2011.
Wason, P. C., & Shapiro, D. (1971) "NATURAL AND CONTRIVED EXPERIENCE IN A REASONING TASK". Quarterly Journal of Experimental Psychology, 23, 63-71.
Web of Trust (2010) "MOZILLA SAFE BROWSING ADD-ONS". Available online: https://addons.mozilla.org/en-US/firefox/addon/wot-safe-browsing-tool/. Last accessed: 03.06.2011.
Webb, S. D., & Soh, S. (2007) "CHEATING IN NETWORKED COMPUTER GAMES – A REVIEW". Available online:
http://portal.acm.org/citation.cfm?id=1306839. Last accessed: 03.06.2011.
Weeks, M. (2010) "COLLUSION AND CONSEQUENCES" Available online: http://worldchesschampionship.blogspot.com/2010/02/collusion-and-consequences.html. Last accessed: 09.06.2011
Woollacott, E. (2010) "GAMERS MORE PRONE TO SPAM AND PHISHING" TG Daily Available online:
http://www.tgdaily.com/security-features/49202-gamers-more-prone-to-spam-and-phishing Last accessed: 05.05.2011
World of Warcraft Forums (2009) Available online: http://forums.worldofwarcraft.com/thread.html?topicId=442927843&sid=1&#5. Last accessed: 03.06.2011.
World of Warcraft Policy (2009) Available online: http://www.worldofwarcraft.com/policy/ui.html. Last accessed:
03.06.2011.
WoW Insider (n.d.) Security News. Available online: http://wow.joystiq.com. Last accessed 23.06.2011
WoW Vault (2006) "STRANGLETHORN FISHING EXTRAVAGANZA EXPLAINED". Comment by Nat_Pagle_Dude. Available online: http://wowvault.ign.com/View.php?view=Guides.Detail&id=124. Last accessed: 03.06.2011.
WoW Wiki (n.d.) "MAKING A MACRO" Available online: http://www.wowwiki.com/Making_a_macro. Last accessed: 03.06.2011.
Wowpedia (n.d.) "BATTLE.NET MOBILE AUTHENTICATOR SPECIFICATION" Available online: http://www.wowpedia.org/Battle.net_Mobile_Authenticator_Specification. Last accessed: 03.06.2011.
WoWWiki (n.d.) "BATTLE.NET MOBILE AUTHENTICATOR" Available online: http://www.wowwiki.com/Battle.net_Mobile_Authenticator. Last accessed: 03.06.2011.
XBOX (2011) Available online: http://www.xbox.com/en-US/live Last accessed 10.08.2011.
XBOX Live (2011) "MICROSOFT POINTS" Available online: http://www.xbox.com/en-US/Live/MicrosoftPoints Last
accessed 10.08.2011
Yahoo Help (2010) "DOMAINKEYS HELP DETECT FORGED EMAIL" Available online: http://help.yahoo.com/l/us/yahoo/mail/classic/context/context-07.html. Last accessed: 03.06.2011.
Yahoo Protect Login (2011) Available online: https://protect.login.yahoo.com/. Last accessed: 03.06.2011.
Yonzon, Z. (2008) "WRATH 101: WINTERGRASP ZONE OVERVIEW" Available online: http://wow.joystiq.com/2008/12/14/wrath-101-wintergrasp-zone-overview/ Last accessed 10.08.2011
Ziebart, A. (2009) "PSA: DON'T GET SCAMMED BY CATACLYSM PHISHING" Available online: http://wow.joystiq.com/2009/08/12/psa-dont-get-scammed-by-cataclysm-phishing/. Last accessed: 03.06.2011.
Ziebart, A. (2010) "IN DEFENSE OF CARE PACKAGES AND MANDATORY AUTHENTICATORS". Available online:
http://wow.joystiq.com/2010/01/11/in-defense-of-care-packages-and-mandatory-authenticators/. Last accessed: 03.06.2011.