Q&A: Drones Fly Into the Future

Missy Cummings, an associate professor of mechanical engineering and materials science, spent eleven years as a naval officer and military pilot, attaining the rank of lieutenant and flying an F/A-18 Hornet. She’s now director of Duke’s Humans and Autonomy Laboratory.
April 29, 2015

You had to break barriers as one of the first female fighter pilots in the U.S. Navy. Is it easier to work in an academic setting?

Engineering is still a primarily male environment. My research can be a little controversial—human interaction with technology. Some engineers aren’t crazy about it. They think the work belongs more in psychology than in engineering.

So you’re interested in human psychology as well as machines?

Humans can be very good at reasoning through uncertainty, but very bad, for example, at repetitive tasks. Some airline pilots are texting while taxiing. That’s human nature: We get bored, and we get distracted. But we are also critical for helping computers in emergencies, like the Hudson River landing. It makes for very complex design problems.

The name of your lab is a playful reference to the crazed computer HAL in the movie 2001. Are you going to produce something similarly scary?

It’s more ironic than scary. In the movie, HAL didn’t really work well with humans. We’re trying to develop collaborative systems that leverage the strengths of both humans and computers. When I was flying fighters, I could see the really bad designs that were killing my friends. In one year, I lost a friend a month—they were all accidents due to human error, but they were exacerbated by poor design in the cockpit.

You see value in drone warfare because there are many people in the loop. Why would a Navy pilot welcome a lawyer’s presence?

In the end, the pilot of a manned or unmanned aircraft still has the final say as to whether the bomb gets dropped. But if we’re trying to figure out whether we’re following the rules of war—how to factor in, for example, the presence of nearby civilians—I’m not sure we want an independent decision maker. Having military lawyers advise pilots and decision makers as a strike is unfolding prevents far more innocent deaths than having a pilot make this decision on his or her own.

You’ve talked about the psychological damage that pilots face from lethal missions. How does that compare with the impact on drone controllers?

We don’t really know. But fortunately it’s now more acceptable to admit, “I’m really struggling with these mental-health issues.” In my flying days, if you even hinted that you might be having second thoughts, you would be immediately pulled from the cockpit.

Would you be confident flying on a pilotless plane?

It depends on the context. I think commercial cargo airplanes could easily be, and will be, turned into drones. Passenger planes are different, because wherever you have a human presence, you need a leader—someone to manage unruly people, for example. There’s also the “shared fate” concept: We assume a human pilot will do everything to preserve his own life. That’s why the Germanwings crash is so difficult to process.

Will Amazon be dropping off my future book order by drone?

People in areas of China already are getting deliveries by drone. But outside of hard-to-reach rural areas, drones will never be the primary delivery mode. They’ll also never be the biggest threat to privacy. It’s funny to me that we are so willing to give up privacy with our cell phones, which track our whereabouts constantly and are far more invasive than any drone could ever be.

What about driverless cars, which you’ve described as drones on the road?

Completely driverless cars won’t happen soon. Google has done some driving on California roads. Until they take that car to Boston in the winter, these aren’t true-to-life road conditions. There are infrastructure issues, sensor issues, reliability issues. But there are some bridging technologies—for example, where the radar will lock on a car in front of you as you’re inching along. The car will automatically do a slow crawl and allow you to do something else, like texting. And then, when you get to a certain speed, it will signal you to take control.

Will technology eventually allow us to master the workings of the brain?

It’s the decade of the brain, but we hardly know anything. We have no idea how to replicate love, judgment, the moral principles that go into deciding whether to fire a weapon on the battlefield. We’re not even close to translating abstract reasoning into bits and bytes.