Tracking the eye of the pilot

In the cockpit of an A320 flight simulator, an eye-tracking system consisting of cameras and infrared sensors constantly tracks where the pilot is looking (left). (Image: David Rudi / ETH Zurich)


In a collaboration with Swiss International Air Lines, NASA and other partners, researchers at ETH Zurich have developed eye-tracking software for use in pilot training. The software allows instructors to analyse the gaze behaviour of student pilots in the cockpit.

Courtesy of ETH Zurich, by Michael Keller: Anyone who has ever sat in a cockpit will know how mentally challenging it is to pilot an aircraft. During a flight, pilots and copilots have to process an enormous quantity of visual, acoustic and spatial information. Keeping a constant eye on the numerous instruments in the cockpit is a strenuous task, as pilots must check the correct indicators during a manoeuvre – often in a specific order.

This process of “scanning” the flight systems is something that pilots internalise during their training. But even for experienced instructors, it is hard to judge whether a student pilot is looking at the right instruments at the crucial moment. Now, in collaboration with Swiss International Air Lines, researchers led by ETH Zurich Professor Martin Raubal have used eye-tracking technology for the first time to understand how pilots monitor the automatic systems of a modern passenger aircraft.
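
To give a rough idea of what such an assessment involves, the short Python sketch below compares a recorded sequence of fixated instruments against an expected scan order. The instrument names and the expected order are illustrative assumptions, not actual Airbus or Swiss procedure, and the sketch is not part of the iAssyst software.

# Illustrative sketch: does a recorded sequence of fixated instruments
# contain an expected scan order? Instrument names and the expected order
# are hypothetical examples, not real airline or Airbus procedure.

def check_scan_order(fixated, expected):
    """Return (followed_order, missed): whether the expected instruments were
    fixated in the expected relative order, and which were never fixated."""
    missed = [inst for inst in expected if inst not in fixated]
    idx = 0
    for inst in fixated:
        # Advance through the expected order whenever the next expected
        # instrument is seen (a simple subsequence test).
        if idx < len(expected) and inst == expected[idx]:
            idx += 1
    return idx == len(expected), missed


if __name__ == "__main__":
    expected_scan = ["flight_mode_annunciator", "speed_tape", "altitude_tape", "heading"]
    recorded = ["speed_tape", "flight_mode_annunciator", "altitude_tape", "nav_display", "heading"]
    ok, missed = check_scan_order(recorded, expected_scan)
    print("scan followed the expected order:", ok)       # False: speed was read before the FMA
    print("expected instruments never fixated:", missed)  # []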

Seeing what the pilot sees

Camera-based eye-tracking technology allows precise monitoring of a person’s eye movements. “Since eye movements allow conclusions to be drawn about a person’s thought processes, Swiss came to us with the idea of using eye-tracking in pilot training,” says Martin Raubal, Professor of Geoinformation Engineering at ETH Zurich.
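
Raw eye-tracker output is a stream of timestamped gaze coordinates; before any conclusions about thought processes can be drawn, analysis software typically groups those samples into fixations, i.e. periods in which the gaze rests on one spot. The sketch below shows a simple dispersion-threshold (I-DT) fixation detector as a minimal, generic illustration; the thresholds and the sample format are assumptions and are not taken from iAssyst.

# Minimal dispersion-threshold (I-DT) fixation detector, a standard way to
# group raw gaze samples into fixations. Thresholds and the (t, x, y)
# sample format are generic assumptions, not the iAssyst implementation.

def dispersion(window):
    xs = [x for _, x, _ in window]
    ys = [y for _, _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: list of (t_seconds, x_px, y_px), sorted by time.
    Returns fixations as (start_t, end_t, centre_x, centre_y)."""
    fixations = []
    start = 0
    while start < len(samples):
        # Grow a window until it spans at least the minimum duration.
        end = start
        while end < len(samples) and samples[end][0] - samples[start][0] < min_duration:
            end += 1
        if end >= len(samples):
            break
        if dispersion(samples[start:end + 1]) <= max_dispersion:
            # Keep extending the window while the gaze stays on one spot.
            while end + 1 < len(samples) and dispersion(samples[start:end + 2]) <= max_dispersion:
                end += 1
            window = samples[start:end + 1]
            xs = [x for _, x, _ in window]
            ys = [y for _, _, y in window]
            fixations.append((window[0][0], window[-1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            start = end + 1
        else:
            start += 1
    return fixations


if __name__ == "__main__":
    # Fake 60 Hz data: the gaze dwells near (100, 200), then jumps to (400, 250).
    samples = [(i / 60.0, 100 + i % 3, 200) for i in range(30)]
    samples += [(0.5 + i / 60.0, 400 + i % 3, 250) for i in range(30)]
    for t0, t1, cx, cy in detect_fixations(samples):
        print(f"fixation {t0:.2f}-{t1:.2f} s at ({cx:.0f}, {cy:.0f})")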

The idea developed into a multi-year economic partnership that, in addition to ETH Zurich, involved NASA, Lufthansa Aviation Training and the University of Oregon. The partners’ common goal was to improve flight simulator training and thereby cockpit safety. Raubal’s team developed software called “iAssyst” that assists flight instructors as they train budding pilots. The researchers recently described their work in the journal Ergonomics.

Reducing the burden on instructors

“iAssyst” stands for “Instructor Assistant System”. The program integrates video, audio and simulator recordings while simultaneously displaying the pilots’ gaze patterns. To avoid distracting the pilots, an eye-tracking system consisting of fixed cameras and infrared sensors was specially installed in the cockpit of an A320 flight simulator. “Setting up and calibrating the system for each trainee pilot is more laborious than with eye-tracking glasses, but it provided us with better results,” explains David Rudi, who implemented the application as part of his doctorate at the Chair of Geoinformation Engineering’s Geogaze Lab.
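
At its core, replaying gaze on top of the video and simulator recordings means aligning independently timestamped streams on a common clock. The sketch below shows one simple way to pick, for each video frame, the gaze sample closest in time; the sampling rates, field layout and function names are assumptions made for illustration and do not describe the actual iAssyst data model.

# Illustrative sketch: align a gaze stream with video frames by timestamp so
# that gaze can be overlaid on the cockpit video during replay. Sampling
# rates and data layout are assumed for illustration only.
import bisect


def nearest_gaze(gaze_samples, frame_time):
    """gaze_samples: list of (t, x, y) sorted by t. Return the sample whose
    timestamp is closest to frame_time."""
    times = [t for t, _, _ in gaze_samples]
    i = bisect.bisect_left(times, frame_time)
    candidates = gaze_samples[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - frame_time))


def overlay_schedule(gaze_samples, frame_rate=25.0, duration=1.0):
    """Yield (frame_index, frame_time, gaze_x, gaze_y) for each video frame."""
    for k in range(int(duration * frame_rate)):
        t = k / frame_rate
        _, x, y = nearest_gaze(gaze_samples, t)
        yield k, t, x, y


if __name__ == "__main__":
    # Fake 50 Hz gaze data covering two seconds of replay.
    gaze = [(i / 50.0, 640 + i, 360.0) for i in range(100)]
    for frame, t, x, y in overlay_schedule(gaze, frame_rate=25.0, duration=0.2):
        print(f"frame {frame} at {t:.2f} s -> gaze at ({x:.0f}, {y:.0f})")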

The ETH Zurich researchers designed their software in close cooperation with aviation experts from the project partners before evaluating it with the help of seven active instructors from Swiss. During a training flight, the instructor sits in the rear of the cockpit, from where they operate the simulator and play the role of flight controller while also keeping a close eye on the pilot. “As a result, instructors sometimes miss – or misjudge – relevant information that is vital for analysing the pilot’s training session,” says Rudi.

The feedback from the study showed that iAssyst actually allowed the instructors to analyse the pilots’ flying performance more precisely. “The tool helps us to recognise weaknesses in systematic scanning and to spot gaps in perception during certain phases of flight,” confirms Swiss International Air Lines pilot Benedikt Wagner, who is an instructor himself and oversaw the eye-tracking project on behalf of Swiss. Using the software, trainers were able to assess the causes of potential pilot errors better and adapt the training accordingly.

Focused on individual aims

This is the first time that a research project has analysed the gaze-based interactions of pilots in a flight simulator. As part of the collaboration, it was important to Raubal’s team to generate a stand-alone scientific benefit, because optimised pilot training on its own was not a sufficient aim for ETH. “We therefore placed our focus on software development,” says Rudi.

For Swiss, on the other hand, the project was primarily about the scanning process in the cockpit. Thanks to the eye-tracking system, the airline was able to study this aspect separately alongside aviation psychologists from NASA and the University of Oregon. The resulting insights have led to new guidelines for the visual monitoring of automatic flight systems. Lufthansa Aviation Training provided the consortium with technical expertise and the infrastructure in the simulator. Lastly, the Federal Office of Civil Aviation (FOCA) covered around 40 percent of the project costs.

One tool – many possibilities

Raubal and Rudi see an obvious application of iAssyst in the evaluation discussions conducted after training flights in a simulator. In the long term, the program could also be used in real cockpits – although that possibility remains a long way off.

But aviation isn’t the only research field in which eye tracking can contribute to improving the interactions between users and technical systems. According to Raubal and Rudi, their software could conceivably also be used in medical training, for example, where doctors use simulators to practise performing operations on an artificial body.

References

Rudi D, Kiefer P & Raubal M (2019). The Instructor Assistant System (iASSYST) Utilizing Eye Tracking for Aviation Training Purposes. Ergonomics, 8 November 2019. DOI: 10.1080/00140139.2019.1685132

Rudi D, Kiefer P, Giannopoulos I & Raubal M (2019). Gaze-based interactions in the cockpit of the future: a survey. Journal of Multimodal User Interfaces, 19 July 2019. DOI: 10.1007/s12193-019-00309-8
