
The danger of in-game data collection


Nick Yee sells secrets.

He knows what you like and dislike and, more importantly, why. He knows what motivates you, what keeps you from your goals, what repels you, and everything in between - at least when it comes to games. Over many years he has collected information about players, conducting more than 200,000 voluntary surveys that reveal their ages, their favorite and least favorite things, and their preferred genres.

And he sells this knowledge to developers. Some of them change their games based on it so that you buy them, play them, and tell others about them.

Yee founded Quantic Foundry in 2015; it sells its data to gaming companies such as Tencent (owner of League of Legends developer Riot Games), PopCap (the studio behind Plants vs Zombies), and Wizards of the Coast (publisher of Magic: The Gathering).

“In the past, the game development industry couldn't get at the real data,” Yee says. “When people played on consoles without an Internet connection, developers got no feedback. They couldn't build a clear picture of how users played their games.”

Now, developers say, they have large amounts of data, both from telemetry (players' in-game behavior) and from external sources (such as Yee's surveys). And some are beginning to fear that they may have too much.

According to many developers, frequent leaks of passwords from social networks, companies, and other sources make privacy in games an important concern. In addition, the rise of interconnected government systems, such as China's social credit system, raises a question: can in-game behavior influence how you are perceived in the real world?

The first kind of data is the information Nick Yee collects from players. It is fairly generalized and anonymous, and it helps developers identify common traits and personality types. For example, Quantic Foundry's software lets developers select a game (say, Civilization) and see a graph of what motivates its players.
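As a rough illustration of this first kind of data, here is a minimal sketch of how survey answers could be averaged into an aggregate motivation profile. The motivation names, questions, and scoring are invented for the example; they are not Quantic Foundry's actual model.

```python
from statistics import mean

# Hypothetical mapping of survey questions (scored 1-5) to motivations.
SURVEY = {
    "competition": ["enjoys_duels", "cares_about_rankings"],
    "strategy":    ["plans_ahead", "likes_complex_decisions"],
    "fantasy":     ["enjoys_roleplay", "values_story"],
}

def motivation_profile(responses):
    """responses: {question: score 1-5} for one anonymous player."""
    return {m: mean(responses[q] for q in qs) for m, qs in SURVEY.items()}

def aggregate(players):
    """Average many anonymous profiles, e.g. all surveyed players of one game."""
    return {m: mean(p[m] for p in players) for m in SURVEY}

p1 = motivation_profile({"enjoys_duels": 2, "cares_about_rankings": 1,
                         "plans_ahead": 5, "likes_complex_decisions": 5,
                         "enjoys_roleplay": 3, "values_story": 4})
p2 = motivation_profile({"enjoys_duels": 1, "cares_about_rankings": 2,
                         "plans_ahead": 4, "likes_complex_decisions": 5,
                         "enjoys_roleplay": 2, "values_story": 3})
print(aggregate([p1, p2]))  # strategy scores highest for this invented sample
```

The point of the aggregation step is that no single player's answers need to be exposed: the developer only sees the averaged curve for a game's audience.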

The second kind of data is more specific. How do players interact with the game? What choices do they make?

This information can be used to create better games. It can also be combined with other types of information to build detailed personal profiles. Such profiles are usually used for targeted advertising, but privacy experts warn that in the future this information could be used in frightening, unexpected ways.

"Sometimes the data collection infrastructure is created for one purpose ... but then people start thinking about other ways to use it," says Jay Stanley, a senior policy analyst at the American Civil Liberties Union.

And if that becomes reality, developers should start designing games in ways that prevent such manipulation before it occurs.


Data collection concerns


Many games (if not most) have embedded systems that track how users play them. Developers can use this information to change storylines, difficulty levels, and also take it into account when adding new content.

This data is usually isolated. For example, a game like XCOM can track which of two missions you chose. But it is hard to learn anything about the player's identity from such a simple decision.
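What "isolated" telemetry looks like in practice can be sketched in a few lines. The event names and fields below are hypothetical; the key property is that the session ID is random and not tied to an account or device.

```python
import json
import time
import uuid

class TelemetryLogger:
    """Minimal sketch of isolated, session-scoped event logging."""

    def __init__(self):
        # A fresh random per-session ID: nothing links it to a person.
        self.session_id = uuid.uuid4().hex
        self.events = []

    def log(self, event, **fields):
        self.events.append({
            "session": self.session_id,
            "ts": time.time(),
            "event": event,
            **fields,
        })

    def export(self):
        # What gets sent back to the developer: choices, not identities.
        return json.dumps(self.events)

log = TelemetryLogger()
# The game records only which of two missions was chosen.
log.log("mission_selected", chosen="extract_vip", rejected="destroy_relay")
```

On its own, a record like this says almost nothing about who the player is - which is exactly the article's point: the risk appears only when such records are combined with richer data.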

But in some cases games collect more personal data, which improves accuracy: choices made in dialogue, or outright personality tests that can give insight into who the player is.

Privacy experts and some developers fear that such information can be linked to a network of online services and used in questionable ways. That is why, according to writer Sam Barlow, he would approach his 2009 game Silent Hill: Shattered Memories differently if it were released today.

“It would certainly have added complexity to development,” he says.

At the beginning of the game, players take a psychometric test, and the game's content changes based on their answers. For example, Barlow says, some players consistently defer to authority figures. If the Silent Hill personality test detects this, those players meet a policeman who first appears as an ally but ultimately turns out to be hostile and rude.

Players whose test results show distrust of authority instead meet a caring, responsive policeman who genuinely worries about them ... and then leaves the player alone. According to Barlow, this approach uses personality traits to subvert expectations and heighten the drama.

“We logged all this data and analyzed it further. To be honest, it feels like spying on someone,” Barlow admits.

Barlow recalls showing the game at E3 and watching players grow more and more nervous as they read questions like “have you ever cheated on a partner?” At the same time, he says, it was amazing to see interactive gameplay created in such a “direct and personal” way.

But since then, Barlow has begun asking himself: how can he create interactive stories while keeping the data anonymous?

“It makes you think twice about what information we collect,” says Barlow.

“In Shattered Memories there is a moment when the player walks down a corridor, a path that usually takes about 15 seconds. But there is also a conversation between the main character and his wife that lasts 30 seconds - the game remembers whether the player listened to it or ignored it.”

"Based on this and some other variables, the game gives the player the ending they deserve."

It is these microscopic decisions that build up the player's psychometric profile. Although this is by no means a rigorous psychological test, the consequences of collecting such data are becoming more pronounced.
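The corridor example above can be sketched as code. Everything here - the trait names, the thresholds, the ending logic - is invented for illustration, not Shattered Memories' actual implementation; the point is how trivially micro-decisions accumulate into a profile.

```python
# One micro-decision: did the player linger long enough to hear a
# 30-second conversation, or rush through the corridor?

def update_profile(profile, walk_time_s, conversation_s=30):
    if walk_time_s >= conversation_s:
        profile["attentive"] = profile.get("attentive", 0) + 1
    else:
        profile["ignored"] = profile.get("ignored", 0) + 1
    return profile

def choose_ending(profile):
    # One variable among many: the attentive player gets the warmer ending.
    if profile.get("attentive", 0) > profile.get("ignored", 0):
        return "warm"
    return "cold"

p = {}
update_profile(p, walk_time_s=32)   # player waited and listened
update_profile(p, walk_time_s=15)   # player rushed through
update_profile(p, walk_time_s=40)   # player listened again
print(choose_ending(p))  # prints "warm"
```

A handful of counters like this is not a psychological test - yet it is exactly the kind of record that, stored and linked elsewhere, starts to look like one.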

“Could it happen that in ten years you won't be able to get a job because a game showed you're not a team player?” Barlow asks.

“I've spoken with people in marketing who are very interested in interactive storytelling, and they usually said there was a lot of money involved,” says Barlow.

In 2018, Netflix experimented with a “choose your own adventure” narrative: it released the interactive Black Mirror episode Bandersnatch and recorded the decisions viewers made. Tracking user choices plays a huge role in marketing: by collecting the many small decisions viewers (or players) make, companies like Netflix can build profiles not only of their personalities but of the kinds of products and services they like and dislike. And such data can also be sold.

Netflix has known about your viewing habits for years. The important difference is how granular such systems are becoming. Instead of learning which show you prefer - Gilmore Girls or Breaking Bad - Netflix can now learn, through an interactive narrative, whether you want a storyline in which Walter White kills his enemies or sets them free.

“Any data point can easily be extrapolated. If you track a husband's and a wife's activity, information about when they play games can yield an analysis of the time they spend together. You can measure the state of their relationship,” says Barlow.

According to Barlow, such inferences make these questions more interesting, but also more dangerous. That means creating games like Silent Hill: Shattered Memories poses moral questions for developers about the data they collect.

“If it were on the iPhone, you could theoretically link your game to a psychological profile,” says Barlow.

Anonymous data does not exist


Privacy experts and developers note that every piece of data a company records can be matched against other databases. Individual pieces of data, such as a decision made in a game, may seem harmless, but combined with other datasets they become a detailed model of behavior and psychology.
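The matching itself requires no sophistication. Below is a hedged sketch with entirely invented data: two datasets, each harmless alone, joined on quasi-identifiers (here, timezone plus habitual play hour) so that an "anonymous" gameplay record acquires a name.

```python
# "Anonymous" game telemetry: random player IDs, no names.
game_telemetry = [
    {"player": "a91f", "tz": "UTC-5", "peak_hour": 23, "risky_choices": 14},
    {"player": "07bc", "tz": "UTC+1", "peak_hour": 19, "risky_choices": 2},
]

# A separate ad-network dataset that does carry identities.
ad_profiles = [
    {"email": "alice@example.com", "tz": "UTC-5", "active_hour": 23},
    {"email": "bob@example.com",   "tz": "UTC+1", "active_hour": 19},
]

def link(telemetry, profiles):
    """Join the two datasets on shared quasi-identifiers."""
    linked = []
    for t in telemetry:
        for p in profiles:
            if t["tz"] == p["tz"] and t["peak_hour"] == p["active_hour"]:
                linked.append({"email": p["email"],
                               "risky_choices": t["risky_choices"]})
    return linked

for row in link(game_telemetry, ad_profiles):
    print(row)  # each in-game behavior record now has an email attached
```

Real re-identification work uses far more signals than two fields, but the mechanism is the same: the join key is whatever both datasets happen to share.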

Obsidian design director Josh Sawyer shares this fear. He worked on Fallout: New Vegas, which also used a player personality test.

Like Barlow, Sawyer says that if the game were released today, he would make different design decisions.

“We would approach it differently,” says Sawyer. “We collect telemetry in all our games, but we always discuss information security. We ask users whether they want data collection enabled by default.”

Although Sawyer says the test wasn't based on any psychological framework and was simply a way to pick skills for players' characters, it is a curious questionnaire in itself. It asks players whether they prefer certain traits (for example, honesty) to others, such as modesty.

“There's no psychological validity to it,” says Sawyer - but he also notes that this is exactly the problem.

If this information is collected and fed into new algorithms, the idea that programs, apps, or services could extract psychological data from tests with no psychological rigor places a great responsibility on game designers.

“The most frightening thing about Amazon's algorithms is that they can determine whether a woman is pregnant, or whether a man is gay ... and they don't extract this from obvious actions but from incidental signals that no one considers unique,” says Barlow.

In one well-known case, the Target retail chain sent a catalog of baby products to a teenage girl, predicting her pregnancy from patterns in previous customers' behavior.

"That is why I believe we need to be very careful and cautious with the data we collect and transmit, because it can be used in far more sophisticated ways."

Nick Yee says the personal-habit data collected through such tests is fairly crude and unreliable - but those who hold it may understand that and use it anyway.

And this is not hypothetical. The Chinese government's recently introduced social credit system relies on thousands of input signals, including data on reckless spending and on playing video games for too long. The municipal authorities of one city have said they will eventually impose penalties for cheating in games.

Nintendo's Miitomo app also experimented with this type of data collection, asking users about their preferences for various products. Although Nintendo stated it would not sell the information to third parties, users began to see relevant advertising on other Nintendo devices linked to the same account.

Nick Yee readily admits that future credit systems like China's could, among other things, take into account decision-making data from video games.

“If you repeatedly choose the riskier option despite the presence of a less aggressive alternative ... could that tell a lender something about your creditworthiness?” Yee asks.

Breaking the barrier between the digital and the real world


“When we use telemetry today, we are very careful about how information from games is collected and used,” says Sawyer.

"In games like New Vegas, I would be very worried that we let players make choices that can be very dark or sad."

This is not just a problem for AAA developers. Independent developer Michael Hicks released The Path of Motus in early 2018. The game experiments with how players react to different forms of bullying.

“Almost everyone immediately responded to the bullies with violence, even though the game never tells you how to play,” says Hicks.

"A very small percentage of people instantly regretted using violence and tried to find other solutions, while most had that realization and changed tactics only toward the middle of the game."

Hicks says that although his game uses minimal player tracking and does not link data to personal profiles, he realized during development that the information collected amounted to a kind of survey. However, as with New Vegas, the test is not psychologically rigorous and is not stored in a database from which an individual could be identified.

Callie Schröder is a legal assistant in the transactions, data protection, and intellectual property practice at Lewis Bass Williams & Weese. She believes that regardless of developers' intentions, such data can be collected and combined with other information.

“There are many interconnections that people don’t think about, and thanks to the rapid development of technology, it is easier and cheaper to establish such connections today than five years ago,” she says.


Data and addiction


Alex Champandard is an AI specialist and veteran developer who worked at Rockstar on Max Payne 3 and at Guerrilla on the Killzone series. He doesn't consider himself an alarmist, but he says that building psychological profiles from game data is simply the logical evolution of game development and business.

“I'm not so much concerned with privacy,” says Champandard. “The big danger is that this data becomes a weapon of addiction: developers tune a game's content to manipulate players' physiology and dopamine release.”

“Sometimes you need to entice the player and immerse them in the game. But if we combine this with procedural systems, we essentially get a catastrophic situation,” says Champandard. “Imagine a microtargeted pack of cigarettes that delivers a cigarette into your hand at the very moment you feel most vulnerable.”

Ubisoft, where Yee previously worked as a data analyst, has collected data on Assassin's Creed for years, directly asking players to rate individual missions after completing them. Such information can be processed in many ways to encourage users to return to the game.

Yee processes information the players themselves provide and uses it to build “personas” - generalized profiles that developers can use to better target their games. The information is anonymous, so email addresses and names cannot be tied to it, but it can feed hyper-targeted advertising campaigns.

This is the kind of valuable information Yee collects - including the dataset Blizzard published in the World of Warcraft Armory. Released several years ago, it lets any user look up a character's name, the details of their past actions, their preferred class ... and even how many times they have been hugged.

Yee says he cannot share details of his client work, but mentions one case in which his company helped with Crusaders of the Lost Idols, a game by Codename Entertainment. It was an idle clicker - a game requiring almost no input beyond pressing a button.

Using his player models, Yee was able to show the company that most clicker players also enjoy more complex games like Diablo 3 and EVE Online.

An analysis of open Steam data allowed Yee and his co-founder to conclude that clickers and these games have something in common: a sense of progression as the player moves from level to level. Codename's marketing campaign leaned on this emphasis on progression, which led to a severalfold increase in sales.
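The kind of audience-overlap analysis described here can be sketched simply. The library data below is invented for illustration; a real analysis over public Steam profiles would work the same way at a much larger scale.

```python
# Invented Steam-style data: each user's set of owned games.
libraries = {
    "u1": {"Crusaders of the Lost Idols", "Diablo 3", "EVE Online"},
    "u2": {"Crusaders of the Lost Idols", "Diablo 3"},
    "u3": {"Portal 2"},
    "u4": {"Crusaders of the Lost Idols", "EVE Online"},
}

def overlap(game_a, game_b):
    """Share of game_a's owners who also own game_b."""
    owners_a = {u for u, lib in libraries.items() if game_a in lib}
    owners_b = {u for u, lib in libraries.items() if game_b in lib}
    if not owners_a:
        return 0.0
    return len(owners_a & owners_b) / len(owners_a)

# In this sample, 2 of the clicker's 3 owners also own Diablo 3.
print(overlap("Crusaders of the Lost Idols", "Diablo 3"))
```

A high overlap score is exactly the sort of signal that let Yee argue clicker players also enjoy deeper progression-driven games, and that shaped the marketing angle.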

According to Yi, this is a fairly standard user research process.

What is unusual is that digital data is used to fingerprint players' actions and to tune an intricate network of sales funnels.

In a 2011 academic paper documenting an experiment inside World of Warcraft, Yee raised the issue of potential privacy problems in online worlds.

"... before the advent of the Armory, players could expect a reasonable level of privacy in WoW," Yee wrote.

"... but now they can no longer hope for it."

A look into the future


According to Yee, one barrier to building psychological profiles from games is that people play different kinds of games for different reasons. Carrying behavior over from one game to another often doesn't work.

“Some games have such constrained mechanics that it's impossible to infer the players' preferences. With others it's easier.”

“But a more serious problem is transferring this behavior [...] for example, in some games you can compete with opponents more aggressively than in others.”

But Champandard says we shouldn't relax. The moment when in-game behavior becomes a target will come sooner than we think ... and artificial intelligence will help it along.

“What AI programmers have been doing for years can be built into a system and fed new data. If someone pulls that off, it will have serious consequences.”

Source: https://habr.com/ru/post/451790/

