HEMP - Helicopter Ego Motion Perception
The aim of this research is to improve the usable cue environment in degraded visual environments (DVEs). This requires sensors that can "see" through conditions such as fog, cloud cover and darkness. Active sensors such as lasers and radar are already capable of this, but one question remains to be answered: 'How much information is sufficient to achieve safe flight?'
The answer to the question 'How much technology is required?' depends on human perception.
Visual perception is about seeing objects, surfaces and events; light provides the means by which to achieve this. One assumption is that pilots can use the information available to them in the real world effectively and efficiently. Once this information is degraded, e.g. by rain or arctic whiteout, however, reliance on other sources of information becomes necessary. Another assumption is that pilots rely on perspective cues, i.e. they use the relative distances between two objects to control translations and rotations. Little work is available that specifies which visual cues in a graphical scene a pilot uses to control his or her motion. Hence the need to break down the graphical content and quantify it, to determine what information, and how much, is required (a simple example of such quantification is sketched below). A series of flight tests was conducted in Liverpool's flight simulator to attempt to answer this question. The work has been continued in Prospective Skyguides and Helicopter Operations in Degraded Visual Environments.
View inside the simulator's cockpit, using synthetic boxes as a guide.
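To make the idea of quantifying a perspective cue concrete, the sketch below is a minimal, hypothetical illustration (not taken from the HEMP trials; the gap sizes, distances and speeds are assumptions) of how the relative position of two scene objects, such as the synthetic guide boxes above, carries ego-motion information. It computes the visual angle subtended by the gap between the objects and the classical time-to-contact estimate (Lee's tau) obtainable from that angle and its rate of change.

```python
import math

def visual_angle(gap_m: float, distance_m: float) -> float:
    """Visual angle (rad) subtended by a gap of size gap_m viewed from distance_m."""
    return 2.0 * math.atan(gap_m / (2.0 * distance_m))

def tau_from_angle(gap_m: float, distance_m: float, closure_mps: float) -> float:
    """Lee's tau: subtended angle divided by its rate of change (s).

    For small angles this approximates distance / closure speed, i.e.
    time to contact, without the observer knowing distance or speed.
    """
    theta = visual_angle(gap_m, distance_m)
    # d(theta)/dd = -gap / (d^2 + (gap/2)^2); with range shrinking at
    # closure_mps, the angle grows at:
    theta_dot = closure_mps * gap_m / (distance_m**2 + (gap_m / 2.0) ** 2)
    return theta / theta_dot

# Hypothetical example: boxes 2 m apart, 100 m ahead, closing at 20 m/s
# -> tau of roughly 5 s, matching distance / speed for small angles.
print(tau_from_angle(2.0, 100.0, 20.0))
```

The point of the sketch is that a purely optical quantity, angle and its rate of change, suffices to recover a useful control variable; this is one formal sense in which the 'relative distances between two objects' mentioned above can support control of translation in a degraded visual environment.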