Think, fight, feel: how video game artificial intelligence is evolving.
In May, as part of an otherwise unremarkable corporate strategy meeting, Sony CEO Kenichiro Yoshida made an interesting announcement. The company’s artificial intelligence research division, Sony AI, would be collaborating with PlayStation developers to create intelligent computer-controlled characters. “By leveraging reinforcement learning,” he wrote, “we are developing game AI agents that can be a player’s in-game opponent or collaboration partner.” Reinforcement learning is an area of machine learning in which an AI effectively teaches itself how to act through trial and error. In short, these characters will mimic human players. To some extent, they will think.
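To make that concrete, here is a minimal sketch of the trial-and-error loop that reinforcement learning describes, using a toy tabular Q-learning agent in an invented five-cell corridor world. The environment, parameters, and rewards are illustrative assumptions, not anything Sony has described; production game agents would use far larger, neural-network-based learners.

```python
import random
from collections import defaultdict

# Toy environment (invented for this sketch): an agent walks a five-cell
# corridor and is rewarded only for reaching the right-hand end.
N_CELLS, GOAL = 5, 4
ACTIONS = [-1, +1]  # step left or step right

def step(state, action):
    """Apply an action and return (next_state, reward, done)."""
    nxt = max(0, min(N_CELLS - 1, state + action))
    if nxt == GOAL:
        return nxt, 1.0, True
    return nxt, 0.0, False

# Tabular Q-learning: the agent builds up action-value estimates purely
# through trial and error, with no designer-written behavior rules.
Q = defaultdict(float)           # Q[(state, action)] -> estimated value
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(500):
    state, done = 0, False
    while not done:
        # Explore occasionally; otherwise exploit current estimates.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward, done = step(state, action)
        # Nudge the estimate toward reward plus discounted future value.
        best_next = max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = nxt

# After training, the learned greedy policy heads right from every non-goal cell.
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)])
```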
This is just the latest example of AI’s evolving and expanding role in video game development. As open-world games become more complex and ambitious, with hundreds of characters and multiple intertwined narratives, developers have to build systems capable of generating intelligent, reactive, creative characters and emergent side quests.
For its Middle-earth games, developer Monolith created the acclaimed Nemesis AI system, which lets enemies remember their fights against the player, creating blood feuds that flare up throughout the adventure. The recent Watch Dogs: Legion generates life stories, relationships, and daily routines for every London citizen you interact with – so if you save a character’s life one day, their best mate may well join you the next. The experimental text adventure AI Dungeon uses OpenAI’s natural language model GPT-3 to create new emergent narrative experiences.
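Monolith hasn’t published the Nemesis system’s internals, but its core idea – enemies who carry persistent memories of their run-ins with the player – can be sketched as a simple record kept per enemy. Everything below (the fields, the grudge rules, the promotion logic, the names) is a hypothetical illustration, not the game’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class EnemyMemory:
    """Hypothetical per-enemy record in a Nemesis-style system (not Monolith's code)."""
    name: str
    rank: int = 1
    grudge: int = 0                       # grows with every run-in with the player
    history: list = field(default_factory=list)

    def record_encounter(self, outcome: str) -> None:
        """Remember a fight so it can change how this enemy behaves next time."""
        self.history.append(outcome)
        if outcome == "player_fled":
            self.grudge += 1              # something to taunt the player about later
        elif outcome == "enemy_survived":
            self.grudge += 2
            self.rank += 1                # survivors get promoted and return stronger

    def greeting(self) -> str:
        if not self.history:
            return f"{self.name} eyes you warily."
        return f"{self.name} (rank {self.rank}) remembers your last {len(self.history)} meetings."

orc = EnemyMemory("Ratbag the Scarred")   # invented name for the example
orc.record_encounter("enemy_survived")
print(orc.greeting())                     # -> "Ratbag the Scarred (rank 2) remembers your last 1 meetings."
```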
But the field of AI has a problem with diversity. Research published by New York University in 2019 found that 80% of AI professors speaking at major events were men, while just 15% of AI researchers at Facebook were women, and only 10% at Google. Statistics for people of color in tech are worse: just 2.5% of Google’s workforce is black, and 4% of Facebook’s. The risk of such a homogeneous working culture is that gender and racial biases can feed unchecked into AI algorithms, producing results that replicate entrenched imbalances and prejudices. Over the past five years, there have been numerous examples, from facial recognition systems that discriminate against people of color to AI recruitment tools that favor male applicants.
Now that the games industry is exploring many of the same AI and machine learning systems as academia and the tech giants, is the diversity problem something it should be tackling? We know that video game development has presented similar issues with homogeneity, both in its workforce and in its products – it is something the industry claims it is keen to address. So, if we’re going to see AI-generated characters and stories about diverse backgrounds and experiences, don’t developers need to be thinking about diversifying the teams behind them?
Uma Jayaram, general manager of SEED, the innovation and applied research team at Electronic Arts, certainly thinks so. As a tech entrepreneur, she has worked in cloud computing, VR, and data-at-scale as well as AI, and says she has sought to build her global team – based in Sweden, the UK, Canada, and the US – from people of different genders, ethnicities, and cultures.
“A diverse team allows for multiple points of view to coalesce and creates possibilities for a more representative outcome and product,” she says. “It also enhances opportunities to create awareness, empathy, and respect for individuals who are different from us. A video game is in a way an extension of our physical world and a place where people spend time and form rich experiences that loop back into the collective sense of self and community. As such, it is a great opportunity to bring in diversity in two ways: in the teams designing and architecting these worlds, and in the worlds being created and the denizens that inhabit them.”
Electronic Arts is currently looking into developing systems that can use machine learning to replicate facial expressions, skin types, and body movements from video and photos, rather than having to bring actors into a mo-cap studio. In theory, this should expand the range of genders and ethnicities that can be produced in games, and Jayaram says EA is committed to using diverse data in its R&D projects. The company is also looking at employing user-generated content in games and allowing players to make a unique avatar by capturing their own likeness and expressions on a smartphone or webcam and uploading it into the game.
The emphasis on diverse data is important because it highlights a misconception about AI: that it is somehow objective because it is the result of computation. AI algorithms rely on data, and if that data is coming from a single demographic, it will reflect that group’s biases and blind spots. “We’re used to thinking about AI like physics engines or multiplayer code – something technical that happens behind the scenes,” says AI researcher and game developer Michael Cook. “But AI today is a part of the whole creative work. It controls how little AI people behave and treat each other in The Sims; it generates cultures and religions in games like Caves of Qud and Ultima Ratio Regum; it’s part of political statements in Watch Dogs: Legion. AI engineers have as much responsibility to the players as the writers and designers. They create part of the experience and have a huge capacity to harm. We’ve seen recently how AI Dungeon is generating potentially traumatic stories for the player, without warning.”
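That dependence on data is easy to demonstrate. In the deliberately crude, hypothetical sketch below, a “generator” that simply samples character descriptions at the frequencies found in its training set reproduces whatever demographic skew that set contains – no malice required, just unexamined inputs. The categories and numbers are invented for illustration.

```python
import random
from collections import Counter

# Hypothetical training set: character descriptions that happen to skew
# heavily toward one demographic (numbers invented for the example).
training_characters = (
    ["light-skinned male soldier"] * 90
    + ["dark-skinned female medic"] * 6
    + ["light-skinned female pilot"] * 4
)

# A naive "generative model": sample new characters at observed frequencies.
freqs = Counter(training_characters)
labels, weights = zip(*freqs.items())
generated = random.choices(labels, weights=weights, k=1000)

# The output distribution simply mirrors the skew of the input data.
for label, count in Counter(generated).most_common():
    print(f"{label}: {count / 10:.1f}% of generated characters")
```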
At Microsoft, the company’s AI research team in Cambridge has several ongoing studies into machine learning and games, including Project Paidia, which investigates using reinforcement learning to train in-game AI agents that can collaborate with human players. The company’s recent virtual summit included several talks on ethical considerations in games AI.
“AI agents can be built to develop, grow, and learn over time, and are only as good as what you are putting in,” says Jimmy Bischoff, director of quality at Xbox Game Studios. “Being culturally appropriate in terms of dialogue and content comes down to how it is trained. We want to build games that everyone wants to play and can relate to, so we need to have people that can represent all our players.”
Microsoft also sees potential in player modeling – AI systems that learn how to act and react by observing how human players behave in game worlds. As long as you have a wide player base, this is one way to increase the diversity of data being fed into AI learning systems. “Next will be characters that are trained to provide a more diverse, or more human-like range of opponents,” says Katja Hofmann, a principal researcher at Microsoft Cambridge. “The scenario of agents learning from human players is one of the most challenging – but also one of the most exciting directions.
“At the same time, I want to emphasize that AI technologies will not automatically give rise to diverse game experiences. Technology developers and creators need to make choices on how to use AI technologies, and those choices determine whether and how well the resulting characters and experiences reflect different genders and heritages.”
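Hofmann doesn’t detail how Project Paidia’s agents learn from players, but player modeling of this kind is often approached as behavioral cloning: supervised learning over logged pairs of game state and player action, so the agent imitates whatever mix of play styles the logs contain. A bare-bones, hypothetical sketch, with invented states, actions, and fallback behavior:

```python
from collections import Counter, defaultdict

# Hypothetical logs of (game_state, action) pairs recorded from many human
# players; a broad player base gives the agent a broad range of behaviors.
player_logs = [
    ("low_health,enemy_near", "retreat"),
    ("low_health,enemy_near", "retreat"),
    ("low_health,enemy_near", "attack"),
    ("full_health,enemy_near", "attack"),
    ("full_health,enemy_near", "flank"),
    ("full_health,no_enemy", "explore"),
]

# The simplest form of behavioral cloning: for each state, count the actions
# human players actually took there.
policy = defaultdict(Counter)
for state, action in player_logs:
    policy[state][action] += 1

def act(state: str) -> str:
    """Imitate players by taking the most common human choice for this state."""
    if state not in policy:
        return "idle"                     # unseen state: fall back to a default
    return policy[state].most_common(1)[0][0]

print(act("low_health,enemy_near"))       # -> "retreat", the majority human behavior
```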
Amanda Phillips, the author of Gamer Trouble: Feminist Confrontations in Digital Culture, is similarly cautious about placing the impetus for change solely on diverse people in AI teams. “Having a diverse team is absolutely necessary for ensuring more design angles are being considered, but I think it’s important not to fetishize underrepresented and marginalized individuals as the solutions to problems that often have very deep roots in company and industry practices,” says Phillips. “It puts tremendous pressure on folks who often have less job security, clout, and resources to educate their peers (and supervisors) about issues that can be very personal. This is what is popularly called an ‘add diversity and stir’ approach, where companies bring in ‘diverse’ individuals and expect them to initiate change without any corresponding changes to the workplace.
“Teams need to diversify, but they also need to hire consultants, audit their own practices, make organizational changes, shake up the leadership structure – whatever is necessary to ensure that the folks with the perspectives and the knowledge to understand diversity and equity in a deep way have the voice and the power to influence the product output.”
One of the most fundamental, and largely unexamined, forces shaping game AI is the industry’s inclination to think about video games purely as adversarial systems, where AI’s role is to create allies or enemies that are more effective in combat. But if we look outside the mainstream industry, we do see alternatives. Coder and NYU professor Mitu Khandaker set up her studio Glow Up Games with technologist Latoya Peterson to make social narrative games for diverse audiences. The team is currently working on Insecure: The Come Up Game, a smartphone life sim based around the hit HBO series, which explores the relationships between characters.
“What I’m really interested in as a designer is, how do we build tools that let players construct fun AI systems or AI agents for other people to play with?” says Khandaker. “I’ve been saying this for ages – there’s a broader cultural point around how important it is to create a legibility of AI – creating a way for people to understand how AI even works – and we can do that by exposing them to it in a playful way. It’s effectively just computers doing calculations and trying to predict what it should do. It’s not magic, but certainly what it produces can be delightful.”
The development studio Tru-Luv, which created the hugely successful Self-Care app, is working on AI technologies that reflect the company’s own diverse, progressive, and supportive studio culture. “Our company is currently one-third BIPOC [black, indigenous, and people of color] and two-thirds women,” says studio founder Brie Code. “Our executive team is 100% women, and our board is one-third BIPOC and two-thirds women. We work with consultants and partner organizations from emerging development communities such as those in Pakistan, Tunisia, and Morocco.”
Like Khandaker, Code argues that a diverse workforce won’t just eliminate problematic biases from conventional games but will allow for new interactive experiences. “The games industry has focused on a narrow subset of human psychology for several years,” she says. “It is very good at creating experiences that help people feel a sense of achievement or dominance. Game AI created by a diverse workforce will bring life to NPCs and experiences representing the human experience’s breadth and depth. We’ll see more non-zero-sum experiences, compassion, emotional resonance, insight, and transcendence. We’ll see new forms of play that leverage feelings of creativity, love, and joy more so than triumph or domination.”
As developers begin to understand and exploit the greater computing power of current consoles and high-end PCs, the complexity of AI systems will increase in parallel. Developers will explore elements such as natural language processing, player modeling, and machine learning to build imaginative, reactive AI characters, aided by emerging AI middleware companies such as Spirit AI and Sonantic; worlds will begin to tell their own stories, augmenting those penned by game designers and writers. But right now, development teams need to think about who is coding these algorithms and to what end.
“[We have] a golden opportunity to create a ‘new normal’,” says Jayaram. “We can reject stereotypes in portrayal, source diverse data for the machine learning models, and ensure that the algorithms powering the games promote fairness and respect across gender and ethnicity.”
Cook agrees. “Right now, the field of game AI is overwhelmingly male and white, and that means we’re missing out on the perspectives and ideas of a lot of people,” he says. “Diversity isn’t just about avoiding mistakes or harm – it’s about fresh ideas, different ways of thinking, and hearing new voices. Diversifying game AI means brilliant people get to bring their ideas to life, and that means you’ll see AI applied in ways you haven’t seen before. That might mean inventing new genres of games or supercharging your favorite game series with fresh new ideas.
“But also, diversity is about recognizing that everyone should be given a chance to contribute. If Will Wright were a Black woman, would The Sims have been made? If we don’t open up disciplines like game AI to everyone, then we are missing out on every Black genius, every female genius, and every queer genius; we’re missing out on the amazing ideas they have and the huge changes they might make.”