The first real computer, the ENIAC, was built in 1946. The first computer war game appeared two years later. It was built by the Army's Operations Research Office, and it was as rudimentary as you might think. Since then, the relationship between the military and the world of games has gotten endlessly deeper. Veterans help develop popular games, and popular games help veterans recover. The US military uses games to recruit, and critics complain that modern war has grown crueler because it too closely resembles a videogame. In 1997, this magazine ran a cover story about Marines modifying the game Doom for training purposes. This past month, news came of soldiers training with a system called Tactical Augmented Reality.
What if the relationship could be still deeper, though? What if, for example, the best game developers produced tools for the Pentagon? And then what if those tools ended up back in games? What if, instead of videogames copying war, war copied videogames—and the two things became, in a certain way, the same?
The idea comes from Will Roper, a Rhodes scholar in his late 30s with a PhD in mathematics. Roper runs the Defense Department’s secretive Strategic Capabilities Office; his job is to study where war is headed, and to develop the technological tools that help the United States win there. The military services think about today; DARPA thinks about the distant future; Roper thinks about tomorrow.
His office was founded in 2012, but remained classified until last year. Recently, he has started to come out of the shadows. This past January, he appeared on 60 Minutes to demonstrate how the military could use a swarm of tiny drones that his office has built. This March, he sat with this author at South by Southwest. And more recently, he spoke again with WIRED about how the relationship between war and videogames is going, as he says, full circle.
“In the age of the Internet of Things, our senses no longer define the boundaries of our perception,” Roper says. Soldiers in the near future will live in a world with almost infinite available information that could help them. Imagine a man or woman in urban conflict. They’d want a map of all nearby friendly soldiers; they’d want heat signatures of anyone behind a nearby wall, or images of them collected by drone swarms. They’d want a map that constantly changes based on what fellow soldiers learn: If they safely break through a wall and move forward, everyone might want to use that path. If they’re greeted by fire, everyone will want to retreat. They’d want, perhaps, a color-coding system that suggests sniper locations, one that changes from yellow to red when a sniper is confirmed. They’d want a notification if their ammo runs low. Behind all of this, they’d want deep-learning algorithms predicting the enemy’s next move and proposing options for countering it. And, if they were leading a group of soldiers, they’d want strategic advice based on an initial plan for the battle that adjusts as the fight goes on.
Theoretically, the Army could just send machines into this urban conflict. They can process more information; they can take more risks; they’ve got better armor. But there’s a complex moral problem—does any country want to delegate the decision to kill someone to a machine?—and a technical one, too. As Roper says, “The state of current machine learning is that it can make good decisions when presented with things it has seen before and poorly—potentially disastrously—otherwise, which is why we don’t delegate lethal decisions to machines.”
In the near future, at least the one that Roper is concerned about, the challenge is to get soldiers as much information as possible, with as much learning applied to it, in the simplest and clearest way. “If we had to do this in the Pentagon on our own,” Roper continues, “I’d view it as our hardest challenge. But the videogame industry has already cracked the code.”
It is the gaming industry, of course, that has developed the best ways for players to collaborate across countries as though they were sitting in the same room, and that has built interfaces so intuitive that players pick up new, complex games without any training at all.
Recognizing the potential of that collaboration doesn’t mean it will happen. Changing the Pentagon is hard. And the videogame industry might not want to participate. It’s morally complicated enough to build a game that simulates killing people. But do the developers at Activision, who come from all over the world, want to play any part in really killing people?
It’s “Call of Duty for real,” says Roper, which might feel patriotic to some employees and horrifying to others.
Still, the Pentagon does have a $70 billion research and development budget, and Roper’s office seems to be one part of government research that’s growing. According to a recent report, his budget has grown 18-fold since the office was founded five years ago. And so he plans to approach the gaming industry with an idea: You build the Pentagon the systems it wants and give it exclusive access for, say, six months. And then everything can go back into the game. “We don’t own the product, we own the time,” says Roper.
“I could envision calling up a videogame manufacturer and saying you have the latest augmented reality system and I want to install it on a tactical headset. And I’d like to make some improvements. Maybe we want to take someone who would have a local understanding of where they are who can drop icons and see objects behind walls. Maybe we want to also give them a bigger macro view so that they can expand out to the company level.” Roper then adds another potential selling point: “If we had special ops go out with it, there would be a coolness factor.”
Roper knows this isn’t the way the government normally works, or the way the gaming industry does either. And he won’t say if he’s directly approached any gaming companies so far, or how the conversations have gone. “We are going to have to warm this pot slowly,” he says.