2 minute read

Gaming Series #5: Computer games – player data and AI

AI advances bring new opportunities

Computer games have a long history of using complex algorithms to control NPCs, power chess opponents and improve gameplay.  More recently, AI has been used to detect cheaters, guide new players and analyse player behaviour. 

The rapidly growing sophistication of AI provides significant opportunities for computer games to enhance the player experience, improve game development and offer new revenue streams. However, as the games industry seeks to push boundaries in implementing these technology advances, care needs to be taken to address the regulatory and reputational risks.

Enhancing player experience 

AI can be highly effective at profiling players and analysing gameplay, offering the right in-game purchases at the right time to maximise take-up.

AI can assist new players to learn about features or navigate maps.  It can also improve playability for experienced players. 

However, an overly effective AI-powered microtransaction (MTX) strategy can lead to a negative, over-commercialised experience for players. Similarly, AI-enhanced features and gameplay can raise concerns about over-engagement.

Regulators will expect you to protect players, especially children

Data protection and consumer regulators from a range of jurisdictions have clearly indicated that they are monitoring the use of AI in computer games for commercialisation and engagement.

Regulatory scrutiny is likely to increase around the use of AI and player profiling in connection with loot boxes and other more engaging or interactive forms of MTX.

Regulators will also focus on whether games are using AI to increase engagement in a way that is unfair or detrimental to players. 

As always, regulators will be particularly interested in any use of AI involving children’s data.

Identifying risks and safeguarding players 

The key to commercialising player data successfully is to identify and manage your regulatory and reputational risks.

Undertaking a data protection impact assessment (DPIA) before deploying AI in-game will provide you with a framework to assess whether your solution uses players’ data fairly. It will also allow you to check for any unwanted bias or detrimental impacts on children or other groups.

Where you do identify risks from using players’ data with AI, your DPIA will help you determine when and how to implement appropriate safeguards to protect players.

Some safeguards data protection regulators have highlighted recently include:

  • Clear and age-appropriate explanations of when players’ data is used with AI
  • Opportunities for players to object to AI-powered personalised content, gameplay or offers 
  • Regular MTX spending updates
  • Prompts to plan breaks and regulate gameplay
  • Parental controls over personalisation and privacy settings

Development teams and engineers will understandably be keen to capitalise on recent AI advances. Undertaking a DPIA and identifying appropriate safeguards will allow you to support your development teams while reducing the risk of unwanted attention from data protection and consumer regulators.


Visit our dedicated Games and Interactive Entertainment page to find out about our Linklaters gaming offering.

Tags

gaming, data and cyber, ai, online safety