The touch-based flight control panel during a test flight in a DR400 in unstable weather conditions.

This internship project is part of Airtius, a larger project whose objective is the design of a commercial flight deck based on tangible interaction. It focuses on a single instrument, the Flight Control Unit, which enables the crew to interact with the Flight Guidance system. In this case study, the purpose was to explore tangible design dimensions and to better understand the usability limitations related to the use of touch screens.

Several Finite State Machine (FSM) components were used to implement the gesture recognition algorithm, from a general recognition FSM (idle, pressed, moving) to domain-specific FSMs (selected, managed, etc.). This allowed the student to iterate separately on gesture detection robustness (e.g. adding a waiting state to recover from unintentional touch releases) and on domain-specific test results (e.g. adding a specific behavior for the Altitude parameter).
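The general recognition FSM could be sketched as follows. This is a minimal, hypothetical illustration, not the project's actual code: the state names (idle, pressed, moving, plus the waiting state for unintentional-release recovery) come from the text above, while the class, thresholds, and event handlers are assumptions for the sake of the example.

```python
from enum import Enum, auto
import time

class State(Enum):
    IDLE = auto()
    PRESSED = auto()
    MOVING = auto()
    WAITING = auto()   # recovery state for an unintentional touch release

class TouchFSM:
    """Sketch of a general touch-gesture recognition FSM (illustrative only)."""
    MOVE_THRESHOLD = 5.0   # pixels of travel before a press becomes a drag (assumed)
    RELEASE_GRACE = 0.2    # seconds allowed to recover an accidental release (assumed)

    def __init__(self):
        self.state = State.IDLE
        self.origin = None
        self.release_time = None

    def on_touch_down(self, x, y):
        if self.state == State.WAITING:
            # The finger came back within the grace period:
            # treat it as a continuation of the same gesture.
            self.state = State.MOVING
        elif self.state == State.IDLE:
            self.origin = (x, y)
            self.state = State.PRESSED

    def on_touch_move(self, x, y):
        if self.state == State.PRESSED:
            dx, dy = x - self.origin[0], y - self.origin[1]
            if (dx * dx + dy * dy) ** 0.5 > self.MOVE_THRESHOLD:
                self.state = State.MOVING

    def on_touch_up(self, now=None):
        now = time.monotonic() if now is None else now
        if self.state == State.MOVING:
            # Do not end the gesture immediately: the release may be accidental.
            self.state = State.WAITING
            self.release_time = now
        elif self.state == State.PRESSED:
            self.state = State.IDLE

    def tick(self, now=None):
        """Periodic update: expire the waiting state once the grace period elapses."""
        now = time.monotonic() if now is None else now
        if (self.state == State.WAITING
                and now - self.release_time > self.RELEASE_GRACE):
            self.state = State.IDLE
```

The waiting state is the key robustness addition: instead of ending the gesture on touch-up, the FSM holds the gesture open for a short grace period, so a finger that briefly lifts off the screen (common in turbulence) can rejoin the same drag. Domain-specific FSMs would then consume the recognized gestures downstream.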
