Industry prepares for next-generation aircraft AI technology

The Air Force wants its sixth-generation fighter jet to have a suite of unmanned systems flying alongside it. Before those autonomous aircraft become a program of record, the aviation industry is working to meet the challenges of manned-unmanned teaming.

Secretary of the Air Force Frank Kendall has presented the Air Force’s Next Generation Air Dominance program as an integrated package of manned and unmanned systems. Although the collaborative combat aircraft program won’t be funded until 2024, industry executives said they are maturing their autonomy technology to expand what manned-unmanned teaming can do.

General Mark Kelly, commander of Air Combat Command, said that while there is certainty within the service that unmanned aircraft are the future, there are no requirements in place yet. Discussions are underway on how the acquisition will work, he said.

Autonomy is one of three indispensable components of the system, along with flexible communications links and the amount of authority given to the system. He said more tests and experiments will fill in the blanks.

“I’m an advocate of iterating our way there because I think there’s a lot we don’t know,” he said during a media roundtable at the Air and Space Forces Association’s annual conference in National Harbor, Maryland.

He said operational tests of the collaborative combat aircraft will take place in two or three years.

Mike Atwood, senior director of the Advanced Programs Group at General Atomics Aeronautical Systems, said industry needs to be involved in the experimentation that will shape those autonomous capabilities.

Speaking during a panel discussion at the conference, he said one of the areas industry must navigate alongside the Air Force is how to counter other AI-based systems, a challenge that could shape the ethical limits placed on autonomous systems.

He said the ADAIR-UX program – under which General Atomics is developing an AI-piloted adversary aircraft for fighter pilots to train against – will build awareness of how difficult it is to counter AI, as weapons school students fly against adversaries that make split-second decisions.

“I think maybe this will be a Sputnik moment of cultural change, where we realize when we’ve seen … the F-22 and the F-35 at the range, how difficult it is to counter that,” he said.

A newer advance in autonomy with potential for future AI-controlled aerial vehicles is reinforcement learning, he said. Using such algorithms, the operator defines the world the machine is allowed to operate in and gives it a set of actions; the machine can then teach itself every viable combination of those actions within that environment.
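In reinforcement-learning terms, that means the operator fixes the state space (the allowed “world”) and the action set up front, and the agent learns by trial and error which action sequences earn reward. The minimal Python sketch below is purely illustrative and assumes nothing about any Air Force or General Atomics system; it uses a toy one-dimensional corridor and a three-action set to show how a bounded environment still lets the machine work out its own behavior.

# Illustrative only: tabular Q-learning in an operator-defined world.
import random

STATES = list(range(6))                    # operator-defined world: positions 0..5
ACTIONS = ["hold", "advance", "retreat"]   # operator-defined, limited action set
GOAL = 5

def step(state, action):
    """Apply one allowed action; return (next_state, reward, done)."""
    if action == "advance":
        state = min(state + 1, max(STATES))
    elif action == "retreat":
        state = max(state - 1, min(STATES))
    # "hold" leaves the state unchanged
    if state == GOAL:
        return state, 1.0, True            # reward only for reaching the goal
    return state, -0.01, False             # small per-step cost encourages progress

q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(500):
    state, done = 0, False
    while not done:
        # epsilon-greedy: mostly exploit what was learned, sometimes explore
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward, done = step(state, action)
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt

# Print the learned policy: the preferred action in each allowed state.
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES})

The point of the sketch is the division of labor: the human chooses the boundaries (states, actions, rewards), and the learning loop fills in the behavior within them.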

He said that kind of bounded learning could be reassuring for those with concerns about artificial intelligence, especially as the military begins testing its largest classes of unmanned aerial vehicles. Setting limits on what a machine can do is practical, Atwood said, but still leaves the system room to innovate.

“What we find now in manned-unmanned teaming is that the squadrons are ready to start accepting more degrees of freedom for the system, not just going into a circle, but perhaps routing mission systems, possibly doing electronic warfare [or] doing communications jobs,” he said.

He added that programs such as the Skyborg loyal wingman program – for which General Atomics provides core software – are developing the autonomous capabilities needed for future aircraft.

The AI-powered system that controls unmanned vehicles is slated to officially become a program of record in 2023. The software, along with the Air Force’s three other Vanguard programs, will feed data into efforts such as the Next Generation Air Dominance family of systems, according to the Air Force.

“I think we’re on the verge of something very special with collaborative combat aircraft,” Atwood said.

John Clark, vice president and general manager of Skunk Works, said Lockheed Martin is considering internally how it could pull off an industry collaboration similar in scope to the Manhattan Project.

Given an urgent national need, he said, companies could band together to create the capability within 12 to 18 months.

“The environment isn’t quite that way, but maybe one day or one event away from having that kind of environment,” he said during the session.

Clark said losing the AlphaDogfight Trials – a Defense Advanced Research Projects Agency competition that pitted AI agents against one another in simulated dogfights – prompted an examination at Lockheed of the limits placed on its AI. During the competition, Lockheed constrained its AI to follow Air Force doctrine, but the winner – Heron Systems, since acquired by software company Shield AI – was given more latitude.

He said the Air Force and industry need to discuss where the bounds of acceptable behavior for AI lie and how to build trust within those bounds.

“We have to fail a few times and learn from those failures and then move on. That’s the right way to get past them,” he said. “I think that’s the number one thing that stops us from really being able to take a leap forward with this technology.”

Industry also wants to emphasize the importance of STEM education for future pilots and operators, said Ben Strauser, principal lead of mosaic autonomy research at General Dynamics Mission Systems.

“Other discussions have talked about the importance of science, technology, engineering and mathematics (STEM) education and making sure that we get that level of literacy and comprehension, so when we want to communicate what our unmanned systems are doing, there is a level of … understanding of what the semantic descriptions of those algorithms mean,” he added during the session.
