Predictive Heads-Up Display Compensates for Feedback Lag for Predators and Reapers

December 2009
A pilot at Creech Air Force Base who remotely operates an unmanned aircraft such as a Predator or Reaper in close air support has to react quickly and precisely. Tight maneuvers are required when tracking fleeing targets or delivering weapons accurately over Afghanistan. However, there is a two-second round-trip delay between the pilot making a control input and observing the aircraft's reaction: one second for the pilot's input to relay through a satellite to the aircraft, and one second for the return signal from the aircraft's instruments. That is two seconds too slow for precise maneuvers. A pilot must make small control inputs and wait through the satellite lag to see the aircraft's response before continuing with more inputs.

Manually flying an unmanned aircraft therefore takes a good deal of concentration that could otherwise be applied to other mission tasks, and it produces results that only marginally outperform the autopilot. Faced with the large effort and marginal gain, most pilots choose to control the plane exclusively through the autopilot. However, this gives up a valuable part of the aircraft's performance envelope, and relying on the autopilot allows pilot skills to atrophy below the level needed in critical situations.

To improve maneuverability, MITRE responded to a request by Col. Jeffrey W. Eggers, the Air Force's director of Intelligence, Surveillance, and Reconnaissance (ISR) Innovations for unmanned aircraft, to create a proof of concept for a predictive heads-up display that uses a computer gaming engine. (A heads-up display allows the user to look forward at a monitor or screen to see critical information, rather than downward, which can be distracting.)

The display is based on Col. Eggers' concept of predicting where the aircraft is in near-real time. The predictive display overlays the pilot's normal heads-up display; the pilot can see both displays at once or toggle between the two. This gives the pilot more control and the ability to be more aggressive with the aircraft.

A three-person team put the prototype together in three weeks. The fast turnaround was possible in large part because of MITRE's close working relationship with the Air Force's 432d Air Expeditionary Wing at Creech AFB, where Col. Eggers was the deputy group commander at the time. That relationship planted the seeds of the solution several months before the need was apparent.

Solution Emerges from Close Customer Relationship

The solution started as an information exchange. To understand our customers' problems, MITRE gives special briefings about emerging technologies that may become useful one day. A few years ago, MITRE's Robert Bahnij invited Col. Eggers to our campus in Bedford, Mass., to talk about improving unmanned aircraft systems (UASs). Bahnij, MITRE's technical advisor for Nellis and Creech AFB, Nev., is also a former fighter pilot instructor and flight examiner for the F-4 Phantom II, the F-104 Starfighter, and the F-16 A/C Falcon/Viper, so he readily understands combat flying issues.
After his presentation to a wide variety of engineers and scientists, Col. Eggers was introduced to Dave deMoulpied, who manages the MITRE group working on human-systems integration, visualization, and decision-support projects. DeMoulpied's group described the latest advances in decision support and ISR management for unmanned aircraft. They also discussed the use of computer game technology that simulates flying unmanned aircraft.

Later, when Col. Eggers was working on the satellite time-lag problem, he remembered the game-technology demonstration that simulated unmanned flight. He wondered: Could a predictive simulation of an unmanned aircraft's real-time flight be overlaid on a video feed from a Predator? Exercising his mathematical bent, he wrote a technical paper describing a concept for predicting the aircraft's real-time position and accounting for external disturbances such as wind gusts and loss of communications.

Using Col. Eggers' paper as a guide, the MITRE team developed a demonstration program using a computer game engine. The team included Matthew Patron, a principal software systems engineer; Jon Homer, a lead software systems engineer; and Matthew Durgavich, a senior software application development engineer. "At the end of three weeks we sent the software to the test users to install on their own computers, along with a compatible joystick and throttle, so they could plug in and go," says Patron, the team lead. "They were thrilled."

Since then, the predictive heads-up display has been shown to the Air Force Research Lab, which is now building a real version of the simulator for testing. "We're targeting it for deployment in an advanced control system in the next few years," says Col. Eggers.
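The core idea behind the display lends itself to a short illustration. Because the round-trip delay is known, software can take the last aircraft state received over the satellite link and dead-reckon it forward through the control inputs that are still in transit, then draw that predicted position for the pilot instead of the second-old downlinked one. The following is a minimal sketch of that kind of prediction in Python; the names, the simple kinematic model, and the parameters are hypothetical illustrations, not the actual formulation in Col. Eggers' paper or MITRE's prototype (which also accounts for disturbances such as wind gusts and loss of communications).

```python
"""Minimal latency-compensation sketch (hypothetical illustration only).

It propagates the last downlinked aircraft state forward through the
commands already sent but not yet reflected in telemetry, covering the
roughly two-second round-trip satellite delay."""

from dataclasses import dataclass
import math


@dataclass
class State:
    x: float        # east position, meters
    y: float        # north position, meters
    heading: float  # radians, 0 = north, clockwise positive
    speed: float    # meters per second


def predict(last_telemetry: State,
            pending_turn_rates: list[float],
            dt: float = 0.1) -> State:
    """Dead-reckon the aircraft forward through the commanded turn rates
    (rad/s) whose effect the pilot has not yet seen on the display."""
    s = State(**vars(last_telemetry))
    for turn_rate in pending_turn_rates:
        s.heading += turn_rate * dt
        s.x += s.speed * math.sin(s.heading) * dt
        s.y += s.speed * math.cos(s.heading) * dt
    return s


# Example: telemetry is about 1 s old and another 1 s of commands are in
# flight, so propagate through 2 s of buffered commands (20 steps of 0.1 s).
if __name__ == "__main__":
    latest = State(x=0.0, y=0.0, heading=0.0, speed=70.0)
    commands = [math.radians(3.0)] * 20   # a steady 3 deg/s right turn
    print(predict(latest, commands))
```

In a system like the one described, a predicted state of this kind would drive the overlay rendered by the game engine on top of the pilot's normal heads-up display.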
—by David A. Van Cleave