Aerospace giant Boeing’s proposal to develop thousands of AI-piloted fighter planes for the U.S. military is raising alarms among critics, who cite the company’s recent safety issues and broader ethical concerns about autonomous weapons.
The proposed fleet would consist of uncrewed aircraft, dubbed MQ-28 Ghost Bats, flown by artificial intelligence. Boeing has already built and flight-tested three prototypes in Australia for the Royal Australian Air Force (RAAF), and at least one has been delivered to the United States for further testing.
Steven Feldstein, a former State Department official and current senior fellow at the Carnegie Endowment for International Peace, expressed doubts about Boeing’s suitability for the project.
“Boeing’s track record doesn’t seem to indicate that it’s necessarily the best one to implement this kind of thing,” Feldstein told DailyMail.com. He pointed to the company’s recent troubles, including issues with its commercial passenger jets and space program.
The Ghost Bat is 38 feet long with a range of more than 2,300 miles, and it is designed to carry a variety of payloads, potentially including tactical nuclear weapons. The RAAF has invested more than US$500 million in the project, with the aim of fielding 10 operational Ghost Bats by 2025.
The U.S. Air Force has even more ambitious plans, requesting nearly $9 billion to develop and maintain these AI-piloted fighter jets, with each drone expected to cost around $30 million.
Critics, including Mary Wareham of Human Rights Watch, question the ethical implications of autonomous weapons. “You’re stepping over a moral line by outsourcing killing to machines,” Wareham told the New York Times.
Robert Gonzalez, a researcher who has studied the intersection of the defense industry and venture capital, warns of potential financial motivations behind the AI hype in warfare. “There’s a real financial interest in generating that kind of euphoria or optimism about AI’s potential to solve all kinds of problems,” he said.
Feldstein also highlighted concerns about AI’s current use in warfare, citing Israel’s AI system Lavender and its reported high civilian casualty rate in Gaza. “It’s sort of like putting a self-driving car on the road without having a clear sense of its failure rates,” he explained.
Despite these concerns, Feldstein believes that AI in warfare is likely inevitable given the competition between nation-states. He advocates for the development of international norms and safeguards, similar to those established for nuclear and chemical weapons during the Cold War.
As Boeing competes with other defense contractors for lucrative military contracts, the debate over AI-controlled weapons continues to intensify. The outcome of this technological race could have far-reaching implications for the future of warfare and international security.
Boeing did not respond to requests for comment on the ethical dimensions of AI-piloted fighter jets or its MQ-28 Ghost Bat program.