Trae Stephens Built AI Weapons and Worked for Donald Trump. As He Sees It, Jesus Would Accept It


When I wrote about Anduril in 2018, the company explicitly said it would not build lethal weapons. Now it builds fighter jets, underwater drones, and other deadly weapons of war. Why did you make that pivot?

We responded to what we saw happening, not only inside our own military but around the world. We want to be aligned with delivering the best capabilities in the most ethical way possible. The alternative is that someone is going to do it anyway, and we believe we can do it best.

Were there any soul-searching conversations internally before you crossed that line?

There is always an internal discussion about what to build and whether it aligns ethically with our mission. I don’t think there is much utility in trying to set our own line when the government is actually setting that line. They have given clear guidance on what the military is and isn’t going to do. We follow the lead of our democratically elected government to tell us its needs and how we can be helpful.

What is the proper role of autonomous AI in war?

Fortunately, the US Department of Defense has done more work on this than perhaps any other organization in the world, outside of the big foundational AI model companies. There are clear rules of engagement that keep humans in the loop. You want to take humans out of the dull, dirty, and dangerous jobs and make decision-making faster and more accurate, while still keeping a person accountable at the end of the day. That is the goal of every policy that has been put in place, regardless of how autonomy develops in the next five or 10 years.

It can be tempting in a conflict not to wait for humans to weigh in, when targets present themselves in an instant, especially with weapons like your autonomous fighter planes.

The autonomous program we are working on for the Fury aircraft [a fighter used by the US Navy and Marine Corps] is called CCA, Collaborative Combat Aircraft. A human in a crewed plane controls and commands the robot fighter planes and decides what they do.

What about the drones you build that hang in the air until they spot a target and then pounce?

There is a class of drones called loitering munitions, which are aircraft that search for targets and then have the ability to go kinetic on those targets, like a kamikaze. Again, there is a human in the loop who is accountable.

War is messy. Isn’t there a real concern that those principles would be set aside once a war starts?

Humans fight wars, and humans are fallible. We make mistakes. Even back when we stood in lines shooting one another, there was a process for adjudicating violations of the laws of engagement. I think that will persist. Do I think there will never be a case where an autonomous system is asked to do something that amounts to a gross ethical violation? No, that could happen, but a human is still accountable for it. Do I believe it is more ethical to prosecute a dangerous, messy conflict with robots that are more precise, more discriminating, and less likely to lead to escalation? Yes. Deciding not to do this is to continue putting people in harm’s way.

Photo: Peyton Fulford

I’m sure you’re familiar with Eisenhower’s farewell address warning about the dangers of a military-industrial complex that serves its own needs. Does that warning affect the way you operate?

That is one of the all-time great speeches—I read it at least once a year. Eisenhower was describing a military-industrial complex in which the government is hardly distinguishable from contractors like Lockheed Martin, Boeing, Northrop Grumman, and General Dynamics. There is a revolving door at the senior levels of these companies, and they become power centers because of that interconnectedness. Anduril has been pushing a more commercial approach that doesn’t rely on that closely tied incentive structure. We say, “Let’s build things at the lowest cost, using off-the-shelf technologies, and do it in a way where we take on a lot of the risk.” That avoids some of the tensions Eisenhower identified.


