The Military Is Funding Ethicists to Keep Its Brain Enhancement Experiments in Check

Efforts involving brain-computer interfaces pose myriad risks to would-be supersoldiers

Illustration of a cyborg from the collarbones up, facing the right side of the image, in a dark space. Its metal face has been slightly detached from the rest of its head to reveal the computer chips and wires inside.

The military has long been interested in what medical ethicist Jonathan Moreno calls “the whole supersoldier business” — using technology to produce bionically or pharmaceutically superior warfighters. Moreno, a professor at the University of Pennsylvania, is interested too. Specifically, in one question that keeps gnawing at him: How much can a soldier’s brain bear?

“You can know that with a backpack — 60, 70 pounds — there is a limit,” he tells Future Human. “But what are the kinds of limits to the neurotechnologies that a soldier can carry around?”

Moreno’s discussions on this topic with his former postdoc Nick Evans, now a professor at the University of Massachusetts Lowell, along with Lowell forensic psychologist Neil Shortland and Michael Gross of the University of Haifa, quickly turned into action. In January, the group, under Evans’ leadership, received funding for a new project to investigate the ethics of soldiers’ participation in experiments with “AI-driven performance enhancements,” like brain-computer interfaces that augment a person’s natural abilities.

Early experiments in this field are already happening, and the possible risks for soldiers who take part in them merit attention. But the technology’s potential is enticing, and military agencies have to weigh those two considerations against each other. The group’s research funding comes from the Air Force Office of Scientific Research via the Minerva Research Initiative, a social science arm of the Department of Defense (DOD), which has a particular interest in human enhancement. And there’s good reason to believe the group’s findings will be relevant for civilians, too.

“AI-driven performance enhancements” encompass many technologies, but those that get the most attention are brain-computer interfaces (BCIs): devices that connect human minds to machines powered by artificially intelligent software. BCIs have the potential to let paralyzed people move, warn people who suffer from epilepsy of oncoming seizures, or give timed-and-targeted electrical boosts to those with depression. But they’re also hypothetically useful beyond therapy — and that’s where things could get complicated.

To develop technology that boosts the typical capabilities of soldiers, scientists must experiment on soldiers. The potential risks those participants face are legion: The technology could alter their identity. It could trample on their privacy. Their devices could get hacked. What if something malfunctions? And what happens when the study is over, and the participant is no longer a proto-supersoldier? Considering these issues when the aim is to treat a health condition or injury is one thing; the waters get muddier when the goal is to artificially augment human beings in the interest of national security, says Moreno.

For the next three years, the Minerva team will examine the ability of warfighters to give informed and uncoerced consent when participating in A.I.-based enhancement research, like that on BCIs. They’ll also look into the strategic value of the technology versus the problems it could cause research participants, and assess what that means for regulations around BCI and other A.I.-based enhancements. Their findings will have applications in society more broadly: Commercial BCI devices, like those proposed by Elon Musk’s company Neuralink, do not always come with clear guidance.

‘98% of these technologies are not going to happen, but some will.’

As part of their work, they’re looking into the ethics of removing neuro-enhancing devices from a soldier. “Not carrying around a 60-pound backpack, that’s fine. You take my weapon away, less fine,” says Moreno. “But if you take this brain booster away, what does that mean?” It may mess with identity — the total loss of a boosted self — and it will definitely mess with ability. “The analogy I might give is if you’ve ever driven in a car with a backup camera, and then you get in a car without one, you can’t reverse,” says Shortland.

Not many federal organizations are interested in funding this kind of research because most scientific agencies focus on therapeutic use cases. But the Department of Defense, “of course, loves thinking about enhancement,” says Evans. Military types have already begun pursuing a future in which people psychically fly drones, or imagery from said drones flows into operators’ optic nerves, or two soldiers connect brain-to-brain on the battlefield. They’ve piloted simulated planes via gray matter and sent video-game commands from one mind to another.

“98% of these technologies are not going to happen,” says Moreno of A.I.-driven enhancements in general, “but some will.” And if that’s the case, it’s worth working out how to do as little harm as possible with these seemingly speculative technologies.

Military researchers — though their ethical history is flawed — may even be more attuned to such philosophical questions than their counterparts in the private sphere. While the latter’s existence is predicated on the desire to turn a profit, the military exists to “provide the military forces needed to deter war and ensure our nation’s security.” And it knows that everything it does is subject to public scrutiny.

“They live in this kind of gray sludge,” says Evans, “where they’re aware that national security is an important thing, but the way national security gets done is often very contentious, very harmful, and needs to be scrutinized very closely.” It’s part of why the Defense Department is funding his project in the first place.

“If you go to a fancy university, maybe you’ll take an ethics course,” says Moreno. “But if you go to the military academies, you will take ethics courses, end of story. I’m not going to say this is the DNA of being in the military, but it’s certainly a core orientation.”

In a 2020 review paper, North Carolina State University’s Veljko Dubljevic reported that 34 papers on the ethics of BCIs had been published between 2016 and 2020, compared with 42 in all prior years. Several categories of concern emerged from these studies: whether BCIs alter identity, and how; whether cyborg abilities “count” as authentically human; whether the devices can be hacked; whether mental privacy is secure; whether access to these devices is fair and just; whether an implanted person is always responsible for or in control of their actions; and how the innovations could be misused.

For now, while much of the technology exists only in embryonic form, the most pressing issues are those concerning consent and safety. These are the ones the Minerva group is most interested in.

‘Imagine devices that can keep you at a reasonably high level of attention but not to the point where you’re frazzled.’

Moreno came to focus on the military side of medical ethics after he realized that all the usual ethical question marks are written in 100-point font. “The scales fell from my eyes,” he says, “and I saw there was a huge area here that nobody was exploring.” He and his colleagues are interested in investigating questions about A.I. enhancement while the technology is more than just a pipe dream but less than an operational reality.

The technology’s role in civilian life is closer to the latter, too. Already there are software-saddled brain implants that, for instance, help people see.

“We know from the history of science that stuff that goes through the military goes into the civilian world one way or another,” says Moreno. Imagine, he continues, “devices that can keep you at a reasonably high level of attention but not to the point where you’re frazzled.” A fighter pilot’s superior would love that. But so would an office boss. In fact, said boss might even find a “developed by DARPA” stamp appealing.

Shortland calls that the “CrossFit Effect,” a reference to the fact that military and law enforcement agencies’ embrace of high-intensity interval training helped make the workout more appealing to the civilians of the world.

The Department of Defense has already invested in a number of projects to which the Minerva research has relevance. The Army Research Laboratory, for example, has funded researchers who captured and transmitted a participant’s thoughts about a character’s movement in a video game, using magnetic stimulation to beam those neural instructions to another person’s brain and cause movement. And it has supported research using deep learning algorithms and EEG readings to predict a person’s “drowsy and alert” states.
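To give a flavor of that second line of research, here is a minimal, hypothetical sketch of an EEG-based alertness classifier. It is not the Army Research Laboratory’s actual pipeline: everything here is assumed for illustration, including the synthetic single-channel “EEG,” the sampling rate, the band definitions, and the choice of a small scikit-learn neural network standing in for the deep models used in practice. It requires only NumPy, SciPy, and scikit-learn.

```python
# Toy sketch: classify simulated EEG epochs as "alert" vs. "drowsy".
# Illustrative only -- not any lab's actual pipeline. Drowsiness is
# simulated as a dominant theta (4-8 Hz) rhythm, a commonly cited EEG
# marker of low alertness; alertness as a dominant alpha (8-13 Hz) rhythm.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

FS = 128          # assumed sampling rate (Hz)
EPOCH_SECS = 2    # assumed length of each EEG epoch
rng = np.random.default_rng(0)

def simulate_epoch(drowsy: bool) -> np.ndarray:
    """One synthetic single-channel EEG epoch: a dominant rhythm plus noise."""
    t = np.arange(FS * EPOCH_SECS) / FS
    freq = 6.0 if drowsy else 10.0  # theta vs. alpha dominance
    return np.sin(2 * np.pi * freq * t) + 0.8 * rng.standard_normal(t.size)

def band_power(epoch: np.ndarray, lo: float, hi: float) -> float:
    """Average spectral power of the epoch in the [lo, hi] Hz band."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)
    return psd[(freqs >= lo) & (freqs <= hi)].mean()

# Build a labeled dataset of theta/alpha/beta band-power feature vectors.
labels = rng.integers(0, 2, size=400)  # 1 = drowsy, 0 = alert
features = np.array([
    [band_power(e, 4, 8), band_power(e, 8, 13), band_power(e, 13, 30)]
    for e in (simulate_epoch(bool(y)) for y in labels)
])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

# A small neural network stands in for the deep models used in practice.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Real systems work from multichannel recordings, messy signals, and far subtler spectral differences, which is exactly why deep learning gets brought in; but the basic shape (featurize brain signals, train a classifier, predict a cognitive state) is the same.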

Evans points to one project funded by the Defense Advanced Research Projects Agency (DARPA): Scientists tested a BCI that allowed a woman with quadriplegia to drive a wheelchair with her mind. Then, “they disconnected the BCI from the wheelchair and connected [it] to a flight simulator,” Evans says, and she brainfully flew a digital F-35. “DARPA has expressed pride that their work can benefit civilians,” says Moreno. “That helps with Congress and with the public so it isn’t just about ‘supersoldiers.’”

Still, this was a civilian participant, in a Defense-funded study, with “fairly explicitly military consequences,” says Evans. And the big question is whether the experiment’s purpose justifies the risks. “There’s no obvious therapeutic reason for learning to fly a fighter jet with a BCI,” he says. “Presumably warfighters have a job that involves, among other things, fighter jets, so there might be a strategic reason to do this experiment. Civilians rarely do.”

It’s worth noting that warfighters are, says Moreno, required to take on more risks than the civilians they protect, and in experiments, military members may similarly be asked to shoulder more risk than civilian participants.

DARPA has also worked on implants that monitor mood and boost the brain back to “normal” if something looks off, created prosthetic limbs animated by thought, and made devices that improve memory. While those programs had therapeutic aims, the applications and follow-on capabilities extend into the enhancement realm: altering mood, building superstrong bionic arms, generating above-par memory.

For both military and civilian applications, and from therapeutic and enhancement perspectives, ethics researchers are interested in how BCIs alter their possessors. Do you feel like your true self when you have an implant attached to your brain? The results from different studies are relevant across the board. One recent project investigated feelings of personhood in six patients with BCIs that alerted them to an impending seizure. While the devices correctly predicted seizures, they didn’t always improve the patients’ well-being. The BCI left one participant, for example, estranged from themselves, constantly aware of their illness.

Another participant, meanwhile, felt freer, because she could do things like shower and not fear for her life. But that success poses its own problems. “She became a de novo person, but that de novo person was dependent on this A.I. system in her brain, but this A.I. system was owned by a company. That wasn’t hers,” says Frederic Gilbert, one of the paper’s authors. “And in a way, the company owned the new person, because as soon as the device was explanted, that person was dead.”

That complication applies to the Minerva team’s persons-of-interest, too: How are researchers weighing the benefits of BCI, and other smart-software-enabled technology, against the consequences for soldiers participating in studies? It’s an especially knotty conundrum, since a device that successfully enhances someone’s abilities, or those of their unit, doesn’t necessarily enhance their quality of life and may, in fact, degrade it. “Is that the kind of benefit you should be able to risk your body — and, in the case of BCI, your identity — for?” asks Evans.

To find out, he and his team will go deep into the world of would-be supersoldiers, before brain-boosters go the way of CrossFit.

