While there might not be a human programmer in 5 years, if we are to believe the CEO of Stability AI (from a recent interview with Peter H. Diamandis for the Moonshots and Mindsets Podcast), the aerospace industry still seems stuck in the existing paradigm, even for applications that it considers “low risk” and that, according to EASA’s own recent policy, might bring a “net safety benefit” to aviation.

The newly established EUROCAE working group WG-127, a spin-off from the far too broadly scoped WG-117, aims to create a new ED-xxx guideline, “Software Considerations in Low-Risk Applications Equipment Certifications and Approvals”, for software development in low-risk applications such as UAS (though it may also be deployed/acceptable in some eVTOL and GA applications). The EASA representative there expressed a desire to “lower the burden” to aid new product development, in line with the “Net Safety Benefit” policy already established for CS-23/CS-27 applications, and even spoke of considering recognition of existing automotive standards for such applications. However, he then partially contradicted himself by explaining that what he (and the agency) envisions is a structured standard like ED-12C/DO-178C with objectives somewhere between Development Assurance Levels C and D (he mentioned that DAL C is sometimes overkill for some applications), and that no Single Level of Requirements ideas are welcome here. That does not leave much room to “reduce the burden” in the new standard.

As the ChatGPT/AI code-generation capabilities that have emerged in the last few years seem to reduce the need for human coding to having a well-articulated, complete and correct description of what the code should do (which aligns well with the High Level Requirements as defined in ED-12C/DO-178C), we might deploy this toolset to automate or reduce some of the more time- and budget-consuming activities of the aerospace software development lifecycle (activities that tend to be officially or unofficially heavily automated anyway), such as writing low-level requirements and/or coding. The approach might be extended to verification of component-level design as well. Some aerospace companies with a vast base of existing historical data, in the form of requirements, design/code and verification cases/procedures, might be able to train their own models, which could be better suited and more trustworthy than the existing open-source-based AI systems.
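
To make the idea more concrete, below is a minimal, purely illustrative Python sketch of such a workflow: a high-level requirement is turned into a prompt, and the generated implementation is then exercised against independently written, requirements-based verification cases before any human review. The HighLevelRequirement class, the HLR-042 text and the generate_code/verify helpers are hypothetical placeholders, not the API of any particular tool or model.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Tuple

@dataclass
class HighLevelRequirement:
    """A DO-178C-style high-level requirement: what the software shall do."""
    req_id: str
    text: str

def generate_code(hlr: HighLevelRequirement) -> str:
    """Placeholder for the LLM call (a commercial API or an in-house model
    trained on the company's historic requirements/design/test data).
    The prompt asks for the derived low-level requirements as comments plus
    an implementation function named after the requirement ID."""
    prompt = (
        f"High-level requirement {hlr.req_id}: {hlr.text}\n"
        f"Write the low-level requirements as comments and implement them "
        f"in a Python function named impl_{hlr.req_id.replace('-', '_')}."
    )
    raise NotImplementedError(f"send this prompt to the model of choice: {prompt}")

def verify(impl: Callable, cases: Iterable[Tuple[tuple, object]]) -> bool:
    """Run the independently authored, requirements-based verification cases
    against the generated implementation; a human still reviews the result."""
    return all(impl(*inputs) == expected for inputs, expected in cases)

# Hypothetical requirement and verification data, for illustration only.
hlr = HighLevelRequirement(
    "HLR-042",
    "Raise a terrain alert when height above terrain is below 100 ft "
    "and the aircraft is not in the landing configuration.",
)
cases = [((50.0, False), True), ((50.0, True), False), ((500.0, False), False)]
```

The point of the sketch is that the human effort shifts toward writing the high-level requirement and the verification cases, while the low-level requirements and code in between become largely generated artifacts.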

Such an approach would enable faster deployment of new products and applications that improve aviation safety, both into emerging markets like UAS and eVTOL and into the existing GA market, where many safety features are still prohibitively expensive.

The second part of this equation is Airborne Electronic Hardware (AEH). Here no activity toward “burden reduction” has started yet, but it might be next on the agenda. Some concessions have already been granted for CS-23, e.g. the Development Assurance Level requirements for Class/Level I/II have been lowered to DAL D for the primary system instead of DAL C, with the caveat that the hardware has to contain a monitoring function developed to DAL C. But again, maybe we need to be more ambitious here, as recent examples show that the deployment of LLMs might speed up the design process and lower its cost significantly: a group of Chinese scientists designed a new industrial-scale RISC-V CPU in under 5 hours, about 1000x faster than a human team could have completed a comparable CPU design (and even more time would be required to design such a CPU following the aerospace standards for AEH). More on that here.
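
For illustration, here is a minimal sketch (in Python, just to show the pattern rather than a hardware implementation) of what such a primary/monitor split looks like conceptually: a more complex, lower-assurance primary channel computes the output, while a much simpler, higher-assurance monitor independently sanity-checks it and forces a safe state on disagreement. The function names, limits and the stall-margin computation are invented for the example.

```python
SAFE_STATE = 0.0  # output forced when the monitor rejects the primary result

def primary_channel(airspeed_kt: float, weight_lb: float) -> float:
    """Lower-assurance (e.g. DAL D) primary function: in reality a complex
    computation, stubbed here as a trivial stall-margin estimate."""
    return airspeed_kt - 0.01 * weight_lb

def monitor(airspeed_kt: float, weight_lb: float, output: float) -> bool:
    """Higher-assurance (e.g. DAL C) monitoring function: a much simpler,
    independent reasonableness check on the inputs and the output."""
    inputs_valid = 0.0 <= airspeed_kt <= 400.0 and 0.0 < weight_lb <= 20000.0
    output_valid = -50.0 <= output <= 400.0
    return inputs_valid and output_valid

def system_output(airspeed_kt: float, weight_lb: float) -> float:
    """The architecture in one line: primary computes, monitor gates; on
    rejection the system reverts to a predefined safe state (or flags the crew)."""
    value = primary_channel(airspeed_kt, weight_lb)
    return value if monitor(airspeed_kt, weight_lb, value) else SAFE_STATE
```

The burden reduction comes from the fact that only the small, simple monitoring function has to carry the higher assurance level, while the bulk of the functionality can be developed to DAL D.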