Air travel is the safest form of transportation, but human error and unexpected failures still lead to devastating accidents every year. Dr Paul Robertson of Dynamic Object Language Labs, Inc. (DOLL) is developing a new system that can help pilots detect and respond to problems before they become disasters. This AI-based technology, called Lightweight Interaction and Storytelling Archive (or LISA), continuously monitors aviation conditions and provides support to pilots.
Unlike current warning systems that alert pilots only after something has already gone wrong, LISA can predict and prevent issues before they escalate. Rather than waiting for specific parameters to exceed their limits, it continuously monitors the aircraft’s current state – including airspeed, altitude, and pilot inputs – and compares this information with a database of historical accidents and near-misses. By predicting the trajectory of the aircraft’s state, it can suggest actions to the pilot that head off a catastrophic event.
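The article does not disclose how LISA represents or compares these data, so the following Python sketch is purely illustrative of the general idea of matching a projected aircraft state against precursors of past accidents: the AircraftState fields, the toy precursor ‘database’, the tolerance, and the projection logic are all invented for this example.

```python
from dataclasses import dataclass

@dataclass
class AircraftState:
    airspeed_kt: float   # indicated airspeed in knots
    altitude_ft: float   # pressure altitude in feet
    pitch_input: float   # pilot pitch input, -1.0 (full nose down) to +1.0 (full nose up)

# Toy "database" of precursor states drawn from past accidents and near-misses (invented values).
HISTORICAL_PRECURSORS = [
    AircraftState(airspeed_kt=95.0, altitude_ft=3000.0, pitch_input=0.8),   # stall precursor
    AircraftState(airspeed_kt=250.0, altitude_ft=900.0, pitch_input=-0.6),  # dive precursor
]

def project_state(state: AircraftState, airspeed_trend_kt_s: float, seconds: float) -> AircraftState:
    """Naively extrapolate airspeed forward in time to see where the state is heading."""
    return AircraftState(
        airspeed_kt=state.airspeed_kt + airspeed_trend_kt_s * seconds,
        altitude_ft=state.altitude_ft,
        pitch_input=state.pitch_input,
    )

def resembles_precursor(state: AircraftState, tolerance: float = 0.15) -> bool:
    """Return True if the state is close to any historical precursor state."""
    for p in HISTORICAL_PRECURSORS:
        close_speed = abs(state.airspeed_kt - p.airspeed_kt) <= tolerance * p.airspeed_kt
        close_alt = abs(state.altitude_ft - p.altitude_ft) <= tolerance * p.altitude_ft
        same_input = (state.pitch_input > 0) == (p.pitch_input > 0)
        if close_speed and close_alt and same_input:
            return True
    return False

# The current state is still safe, but the projection 20 seconds ahead matches a stall precursor.
now = AircraftState(airspeed_kt=110.0, altitude_ft=3100.0, pitch_input=0.7)
ahead = project_state(now, airspeed_trend_kt_s=-0.8, seconds=20.0)
if resembles_precursor(ahead):
    print("Advisory: projected state resembles a past stall event - lower the nose and add power.")
```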
Instead of simply issuing alarms, the technology offers succinct, context-aware warnings, helping pilots to correct mistakes or adapt to changing conditions. In a busy cockpit environment, LISA can filter out the noise and highlight only the most relevant information for a given situation.
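As a rough, invented illustration of that filtering idea, the sketch below ranks candidate alerts by relevance to the current phase of flight and surfaces only the most severe applicable one; the phases, alerts, and severity scores are assumptions for the example, not LISA’s actual logic.

```python
from typing import NamedTuple

class Alert(NamedTuple):
    message: str
    relevant_phases: set[str]
    severity: int  # 1 = advisory ... 3 = immediate action required

def most_relevant_alert(candidates: list[Alert], phase: str) -> Alert | None:
    """Keep only alerts that matter in the current flight phase, then pick the most severe one."""
    applicable = [a for a in candidates if phase in a.relevant_phases]
    return max(applicable, key=lambda a: a.severity, default=None)

candidates = [
    Alert("Cabin temperature high", {"cruise"}, 1),
    Alert("Airspeed decaying toward stall", {"climb", "cruise"}, 3),
    Alert("Fuel imbalance developing", {"cruise"}, 2),
]

top = most_relevant_alert(candidates, phase="climb")
print(top.message if top else "No relevant alerts")  # -> Airspeed decaying toward stall
```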
How does it work? Depending on the aircraft, one or more cameras watch the instrument panel and the state of all switches. The system also monitors communications with air traffic control, giving it a complete picture of the aircraft’s situation. Suppose the angle of attack is high and the airspeed is dropping: this combination can lead to a stall, and it can easily arise when flying in clouds with no outside visual reference. LISA determines exactly what the pilot must do and issues that instruction while there is still time to avoid the catastrophic situation.
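The article does not publish LISA’s decision rules, so this sketch only recasts the stall example above in hypothetical terms: the stall speed, critical angle of attack, thresholds, and time-to-stall estimate are all invented values.

```python
STALL_SPEED_KT = 62.0     # assumed clean-configuration stall speed for a light aircraft
CRITICAL_AOA_DEG = 15.0   # assumed angle of attack at which the wing stalls

def seconds_until_stall(airspeed_kt: float, airspeed_trend_kt_s: float) -> float | None:
    """Estimate how long until airspeed decays to stall speed; None if airspeed is not decaying."""
    if airspeed_trend_kt_s >= 0:
        return None
    if airspeed_kt <= STALL_SPEED_KT:
        return 0.0
    return (airspeed_kt - STALL_SPEED_KT) / -airspeed_trend_kt_s

def stall_advisory(aoa_deg: float, airspeed_kt: float, airspeed_trend_kt_s: float) -> str | None:
    """Issue an instruction only when the combination of cues points toward an imminent stall."""
    time_left = seconds_until_stall(airspeed_kt, airspeed_trend_kt_s)
    if aoa_deg > 0.8 * CRITICAL_AOA_DEG and time_left is not None and time_left < 30.0:
        return f"Airspeed decaying, roughly {time_left:.0f} s of margin: lower the nose and add power."
    return None

# Example: nose-high attitude in cloud, airspeed bleeding off at about 1 knot per second.
print(stall_advisory(aoa_deg=13.0, airspeed_kt=80.0, airspeed_trend_kt_s=-1.0))
```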
To evaluate LISA’s effectiveness, Dr Robertson and his colleagues conducted extensive tests in flight simulators. They recruited 23 experienced pilots and recreated conditions similar to two real-world accidents. Some pilots used LISA, while others worked with a ‘baseline AI assistant’ built on a large language model. In one scenario, involving an engine malfunction, 80% of the pilots using LISA avoided an accident, while all of the pilots using the baseline AI assistant experienced a crash.
In a second scenario, involving challenging terrain and high-altitude conditions, all pilots using LISA completed the flight safely, while most of those using the baseline assistant experienced fatal accidents.
One concerning finding was the baseline AI assistant’s tendency to provide dangerously incorrect information: in two cases, its plausible but wrong responses led to accidents. The focused, specialised LISA system consistently outperformed the baseline assistant by grounding its advice in aviation documents specific to the aircraft and the flight plan.
DOLL is now preparing to release a commercial version under the code name “First Officer”. First Officer may soon shape the future of flight safety, helping to save lives by allowing pilots to avoid dangerous situations before they become critical.