If you haven’t watched the OpenAI video of AIs playing hide-and-seek, stop reading this and do it now. Back? Good. Then let’s talk about how, once again, AI found a way to cheat.
The video describes research by OpenAI on AI agents playing hide-and-seek. There are hiders, seekers, and an environment. The agents’ reward function is simple. Hiders get rewarded for staying hidden. Seekers get rewarded for finding them. Variables control how they can hide, seek, and interact with the environment.
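The reward structure described above is roughly zero-sum between the two teams. A minimal sketch of that idea, in which the function name and the `hiders_seen` flag are illustrative rather than OpenAI's actual code:

```python
def team_rewards(hiders_seen):
    """Return (hider_reward, seeker_reward) for one timestep.

    hiders_seen: True if any seeker currently has line of sight
    to any hider. Hiders are rewarded as a team for staying hidden;
    seekers receive the exact opposite, making the game zero-sum.
    """
    hider_reward = -1.0 if hiders_seen else 1.0
    seeker_reward = -hider_reward
    return hider_reward, seeker_reward
```

With nothing but this signal, repeated over millions of rounds, the strategies described below emerge on their own.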
After millions of rounds, the agents learn how to hide and seek. They learn how to cooperate. And they learn how to cheat by exploiting a bug. The bug lets seekers "box surf": they climb onto a box and ride it over the walls of the hiders' refuge to find them. (The video linked above shows how this works.)
Such cheating is so common in reinforcement learning that we should expect it. Give an AI a goal, and it will optimize toward it. In doing so, it will exploit anything it can. It will find ways to achieve the goal that we never thought of, and many of them are ways we would consider cheating. The phenomenon is common enough to have a name: specification gaming, or reward hacking. Take a look at this list on Boing Boing. My favourite:
In an artificial life simulation where survival required energy but giving birth had no energy cost, one species evolved a sedentary lifestyle that consisted mostly of mating in order to produce new children which could be eaten (or used as mates to produce more edible children).
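It is worth seeing why this loophole is so attractive to an optimizer. If giving birth is free but an eaten child yields energy, the mate-and-eat loop becomes an energy pump. A hedged sketch of the arithmetic, with all numbers made up for illustration:

```python
def energy_after(rounds, birth_cost=0.0, child_energy=5.0,
                 upkeep=1.0, start=10.0):
    """Energy of a creature that mates and eats its child every round.

    In the buggy simulation, birth_cost is zero, so each round nets
    (child_energy - upkeep) for free and energy grows without bound.
    All parameter values here are illustrative, not from the paper.
    """
    energy = start
    for _ in range(rounds):
        energy -= birth_cost   # produce a child (free in the buggy sim)
        energy += child_energy # eat the child
        energy -= upkeep       # per-round cost of staying alive
    return energy
```

With a zero birth cost, `energy_after(100)` yields 410 units from a starting stock of 10; price births at what the child is worth and the pump disappears. The species found the one free lunch the reward function left on the table.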
This is what people like Nick Bostrom (see his book Superintelligence) fear most with AI. The public may think about Terminators. And we should put limits on autonomous weapons. But a more general and worrying risk of AI is that it will optimize to achieve goals in ways we don’t expect. And some of these may be negative. Bostrom’s famous example? Task an AI with making paperclips and it might turn the world into office supplies.
Seeing AI learn to play complex games, with the emergent behaviour that comes with it, is astounding. This technology has incredible potential. But we must figure out how to ensure AIs' actions uphold human values. Until then, we should be careful about unleashing them beyond video games.