
Algorithmic Warfare: Industry Readies AI Technology for Next-Gen Aircraft

Skyborg concept (Air Force Research Laboratory photo)

NATIONAL HARBOR, Maryland — The Air Force wants its sixth-generation fighter jet to have a squad of unmanned systems flying alongside it. Before the autonomous aircraft becomes a program of record, the aerospace industry is eager to take on the challenges of manned-unmanned teaming.

Air Force Secretary Frank Kendall has described the Air Force’s Next-Generation Air Dominance program as a set of manned and unmanned systems. Although the collaborative combat aircraft program is not funded to start until 2024, industry executives said they are readying their autonomous capabilities to expand the potential for manned-unmanned collaboration.

While there is certainty in the service that unmanned aircraft are the future, there are no requirements in place yet, said Gen. Mark Kelly, commander of Air Combat Command. Discussions about how the acquisition process will work are ongoing, he said.

Autonomy is one of the three essential elements of the system, along with resilient communication links and the authority for the system to maneuver freely. Further tests and experiments will fill in the gaps, he said.

“I’m a proponent of iterating our way there because I think there’s so much we don’t know,” he said during a media roundtable at the Air and Space Forces Association’s annual conference in National Harbor, Maryland.

Operational tests of the collaborative combat aircraft will take place in two or three years, he said.

Industry must participate in the experimentation that will shape autonomous capabilities, said Mike Atwood, senior director of the advanced programs group at General Atomics Aeronautical Systems.

One of the areas industry must navigate alongside the Air Force is how it will compete against other artificial intelligence-based systems, he said during a roundtable at the conference. This challenge could shape the ethical boundaries of autonomous systems.

The ADAIR-UX program — which is developing an AI-flown adversary aircraft with General Atomics for fighter jets to train against — will raise awareness of how difficult it is to cope with AI as weapons school students “train against opponents with lightning-fast decision making,” he said.

“I think this may be the Sputnik moment of cultural change, where we realize, when we’ve seen … F-22s and F-35s in the lineup, how hard it is to go against that,” he told a panel at the conference.

A recent advance in autonomous capabilities with potential for future AI-controlled aerial vehicles is reinforcement learning, he said. Using algorithms, an operator can define the world in which the machine is allowed to operate and assign it a set of actions. The machine can then learn all the possible combinations of those actions within the defined environment.
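Atwood’s description maps onto the basic reinforcement-learning loop: an operator defines a state space (the “world”), an allowed action set, and a reward signal, and the agent learns by trying action combinations. The sketch below is an illustrative toy using standard tabular Q-learning — not any vendor’s actual software; the corridor world, action set, and parameters are all invented for illustration:

```python
import random

# Toy reinforcement-learning sketch: the operator defines the "world"
# (a 1-D corridor of cells 0..4), the allowed action set (step left or
# right), and a reward (reaching the goal cell). The agent then learns
# which actions pay off via tabular Q-learning.
N_STATES = 5          # corridor cells 0..4; cell 4 is the goal
ACTIONS = (-1, +1)    # operator-defined action set: left, right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

def train(episodes=500, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != N_STATES - 1:
            if rng.random() < EPS:                 # explore occasionally
                a = rng.choice(ACTIONS)
            else:                                  # otherwise act greedily,
                best = max(q[(s, b)] for b in ACTIONS)
                a = rng.choice([b for b in ACTIONS if q[(s, b)] == best])
            s2 = min(max(s + a, 0), N_STATES - 1)    # stay inside the world
            r = 1.0 if s2 == N_STATES - 1 else 0.0   # reward only at the goal
            # standard one-step Q-learning update
            q[(s, a)] += ALPHA * (r + GAMMA * max(q[(s2, b)] for b in ACTIONS)
                                  - q[(s, a)])
            s = s2
    return q

if __name__ == "__main__":
    q = train()
    # greedy policy per non-goal cell; +1 everywhere means "head to the goal"
    print([max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)])
```

The operator-set boundaries (`N_STATES`, `ACTIONS`) are the point of the example: the machine can only act inside the defined world, yet it discovers the successful behavior on its own — the constrained-but-innovative property described in the next paragraph.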

This kind of learning could reassure those worried about AI, especially as the military begins testing its largest classes of unmanned aerial vehicles, he said. Setting limits on what the machine can do can be comforting while still allowing the system to innovate, Atwood said.

“What we’re seeing now in manned-unmanned teaming is that squadrons are ready to start accepting more degrees of freedom for the system – not just orbiting, but perhaps cueing mission systems, perhaps waging electronic warfare [or] doing communication functions,” he said.

He added that programs like the Skyborg loyal wingman program — for which General Atomics provides core software — are advancing the autonomous capabilities needed for the aircraft of the future.


“I think we’re on the verge of something very, very special with the collaborative combat aircraft,” Atwood said.

Lockheed Martin has been thinking internally about how it could set up an industry collaboration similar to that of the Manhattan Project, said John Clark, vice president and general manager of Skunk Works.

Given an urgent national need, companies could come together to build a capability in 12 to 18 months, he said.

“The environment isn’t quite to that point, but maybe it’s a day or an event away from having that kind of environment,” he told the panel.

Clark said the loss in the AlphaDogfight competition — a series of trials testing manned and unmanned teaming capabilities run by the Defense Advanced Research Projects Agency — prompted a review of the limits placed on Lockheed’s AI. During the competition, Lockheed constrained its AI to follow Air Force doctrine, but the winner — Heron Systems, which was later purchased by software company Shield AI — was more flexible.

The Air Force and industry need to discuss the range of acceptable behaviors for AI and how to build trust within that range, he said.

“We’re going to have to fail many times and learn from those failures and then move forward with ‘This is the right way to get through this,'” he said. “I think that’s the #1 thing that’s holding us back from really leapfrogging with this technology.”

The industry also wants to highlight the importance of science, technology, engineering and math education for future pilots and operators, said Ben Strausser, senior research manager for mosaic autonomy at General Dynamics Mission Systems.

“Other discussions have talked about the importance of STEM education and making sure we have that level of literacy, so when we want to communicate what our unmanned systems are doing, there’s a level of … understanding of what the semantic descriptions of these algorithms mean,” he added during the panel.

Topics: Air power, robotics and autonomous systems, unmanned aerial vehicles
