Air systems
An F-15E Strike Eagle from the 96th Test Wing’s 40th Flight Test Squadron at Eglin AFB, Florida, flies in formation with an XQ-58A Valkyrie flown by AI.
Italy and Japan are investing in their own automated aerial systems. And with military sources typically tight-lipped about the specifics, getting a grip on what dogfights in 2030 might look like is tough – but Pettyjohn is happy to do some educated guesswork. As far as the Collaborative Combat Aircraft programme is concerned, for example, she envisages drones acting as a gaggle of “unmanned, loyal wingmen” for pilots flying nearby. Exerting some control over their “three-to-five” robotic allies, human pilots would be able to authorise their weapons even as the drones were left to fly and position themselves.
There’s some evidence, meanwhile, that researchers are already making progress here. With a design somewhere between a fighter jet and a speedboat, for instance, is the Kratos XQ-58 Valkyrie, an experimental loyal wingman capable of scouting enemy squadrons, or else absorbing fire when battle commences. In a way, DARPA’s Air Combat Evolution scheme is even more ambitious. Giving AI full command over a real aircraft, a modified F-16, the programme saw a computer fly multiple sorties over several days in December 2022. Among other things, the artificial pilots proved adept at take-offs and landings, and at dealing with a range of simulated enemies.
$11m
The cost of training an F-22 pilot.
Source: RAND
Aside from the technical accomplishments here, at any rate, it’s not hard to see how useful these developments could be. In the first place, that’s clear from a financial perspective. As the Air Force spokesperson puts it: “Utilising automated technology, teamed with crewed aircraft, to project larger force training engagements at a lower cost, is one possible approach.” For her part, Pettyjohn echoes the results of that AlphaDogfight competition, arguing that AI algorithms can help warfighters “make decisions” about when to fire their weapons, or which engagement profile to adopt. All this is doubly urgent, meanwhile, when you remember that it’s not only friendly countries like Italy rushing to build AI-powered aircraft. In March, to give one example, Chinese military researchers announced that an AI drone had bested a human pilot in a real-world setting. With this in mind, it’s no wonder the US Air Force spokesperson is keen, past all the talk of finances and efficiencies, to emphasise the way computers can help their country defeat “adversaries” in the sky.
In the cloud
Given the sophistication of all those ones and zeroes, it’s tempting to imagine that the aerial duels of tomorrow will be fought without any humans at all. But that’s not quite true. In large part, this is a question of ethics. To put it differently: can a computer really be trusted to fire a missile primed to kill? Recent reports don’t offer much reassurance here. In May 2023, for instance, Colonel Tucker ‘Cinco’ Hamilton, the US Air Force’s chief of AI test and operations, claimed that an AI drone, programmed to destroy an enemy air defence system in an exercise, ended up attacking anyone who interfered with the initial order. The US Air Force later denied the initial reporting, suggesting the colonel’s statements had been taken out of context “and were meant to be anecdotal”.
With that in mind, however, it’s unsurprising that the Air Force’s spokesperson should highlight the need for robust oversight whenever robots are used. “AI activities will augment, not eliminate, human intelligence,” they stress. “The Department of Defense requires that all weapons systems, including those that incorporate autonomous elements, must include appropriate levels of human judgement.”
This more subtle approach to AI integration makes sense in other ways, too. In a field so predicated on individual skill – and on deep reliance on human comradeship – Pettyjohn worries that talented pilots would baulk if their wingmen became wing-robots overnight. “Flying is a very human endeavour,” is how she puts it, adding that “developing that trust” between man and machine is a crucial first step. Fair enough: in a situation as intense as a dogfight, the last thing pilots want to worry about is that their partner will shoot them down to fulfil some abstract command. All the same, and especially given the rising prospect of a shooting war in the Pacific over the coming years, the robots look to be here to stay – even if they should probably be viewed as steeds, and not knights all by themselves. ●