Sensor data/cueing continuum for rotorcraft degraded visual environment operations
5 May 2017
Joe Minor, Zachariah Morford, Walter Harrington
Abstract
Degraded Visual Environments (DVE) can significantly restrict rotorcraft operations during their most common mission profiles: terrain flight and off-airfield operations. The user community has been seeking solutions that allow pilotage in DVE and mitigate the additional risks and limitations resulting from the degraded visual scene. Reaching such a solution requires a common understanding of the DVE problem, the history of solutions to this point, and the full range of approaches that may lead to future rotorcraft pilotage in the DVE. Three major technologies contribute to rotorcraft operations in the DVE: flight control, cueing, and sensing; all three must be addressed for an optimal solution. Increasing aircraft stability through flight control improvements reduces pilot workload and facilitates operations in both Degraded Visual Environments and Good Visual Environments (GVE), and therefore must be a major piece of every DVE solution. Sensing and cueing improvements are required to gain a level of situational awareness that permits low-level flight and off-airfield landings while avoiding contact with terrain or obstacles that are not visually detectable by the flight crew. How this sensor information is presented to the pilot is a subject of debate among those working to solve the DVE problem. There are two major philosophies in the field of DVE sensor and cueing implementation. The first is that the sensor should display an image that allows the pilot to perform all pilotage tasks as they would under visual flight rules (VFR). The second is that the pilot should follow an algorithm-derived, sensor-cleared precision flight path, presented as cues for the pilot to fly as they would under instrument flight rules (IFR). There are also combinations of these two methods that offer differing levels of assistance to the pilot, ranging from aircraft flight symbology overlaid on the sensor image, to symbols that augment the displayed image and help the pilot interpret the scene, to a complete virtual reality that presents the sensed world without any “see-through” capability. These options can use two primary means of transmitting a sensor image and cueing information to the pilot: a helmet-mounted display (HMD) or a panel-mounted display (PMD). This paper explores the trade space between DVE systems that depend on an image and those that utilize guidance algorithms, for both the PMD and the HMD, as recently demonstrated during the 2016 and 2017 NATO flight trials in the United States, Germany, and Switzerland.
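The cueing continuum described in the abstract can be made concrete with a small illustrative sketch (not drawn from the paper itself). Under assumed identifiers and simplified two-dimensional geometry, it shows how an IFR-like approach might convert an algorithm-derived, sensor-cleared flight path into a normalized flight-director deflection, in contrast to an image-based approach that would pass the sensor image (with optional symbology overlay) directly to the HMD or PMD. The function names, gain, and sign convention are assumptions for illustration only.

```python
# Illustrative sketch only -- not from the paper. It approximates the "cue-centric"
# end of the DVE display continuum: deviation from a sensor-cleared path is turned
# into a flight-director-style cue instead of (or in addition to) a raw sensor image.
from dataclasses import dataclass
import math

@dataclass
class State:
    x: float   # aircraft east position (m)
    y: float   # aircraft north position (m)

@dataclass
class PathSegment:
    x0: float  # segment start, east (m)
    y0: float  # segment start, north (m)
    x1: float  # segment end, east (m)
    y1: float  # segment end, north (m)

def cross_track_error(state: State, seg: PathSegment) -> float:
    """Signed lateral deviation from the cleared segment (m); positive = right of track."""
    dx, dy = seg.x1 - seg.x0, seg.y1 - seg.y0
    length = math.hypot(dx, dy)
    # 2-D cross product of the segment direction with the vector from segment start to aircraft
    return ((state.x - seg.x0) * dy - (state.y - seg.y0) * dx) / length

def lateral_cue(state: State, seg: PathSegment, gain: float = 0.02) -> float:
    """Map cross-track error to a normalized flight-director bar deflection in [-1, 1]."""
    return max(-1.0, min(1.0, gain * cross_track_error(state, seg)))

# Aircraft 30 m east (right) of a northbound cleared path: positive deflection
# indicates the correction needed to rejoin the path.
cue = lateral_cue(State(x=30.0, y=100.0), PathSegment(x0=0.0, y0=0.0, x1=0.0, y1=500.0))
print(f"lateral cue deflection: {cue:+.2f}")   # -> +0.60
```

In a fielded system the deviation would be computed in three dimensions from the navigation and sensor solutions and shaped by certified guidance laws; the sketch only marks where an image-centric display ends and cue-centric guidance begins on the continuum.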
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Joe Minor, Zachariah Morford, and Walter Harrington "Sensor data/cueing continuum for rotorcraft degraded visual environment operations", Proc. SPIE 10197, Degraded Environments: Sensing, Processing, and Display 2017, 101970Y (5 May 2017); https://doi.org/10.1117/12.2262939
KEYWORDS
Sensors, Visualization, Image sensors, Head-mounted displays, LIDAR, Situational awareness sensors, Control systems