War, Drones, and Videotape: A New Tool for Analyzing Video

May 2010
Topics: Sensor Technology, Multimedia Information Systems
Sorting through video from remotely piloted aircraft in Iraq and Afghanistan requires a new system, the Video and Image Retrieval and Analysis Tool, or VIRAT.

Flying at an altitude of 25,000 feet, the MQ-9 Reaper is equipped with sensors and a thermal camera that play a key role in intelligence, surveillance, and reconnaissance (ISR). Since the introduction of remotely piloted aircraft (RPAs) like the Reaper, their impact has been significant.

"Between 2001 and 2008, ISR hours increased by 1,900 percent, with more than 160,000 hours of video collected in 2008 alone," says Michael Fine, a senior artificial intelligence engineer at MITRE.

To help intelligence analysts sort through this vast amount of footage, the Defense Advanced Research Projects Agency (DARPA) is working to develop a Video and Image Retrieval and Analysis Tool (VIRAT). The goal of VIRAT is to enable an analyst to rapidly find an action (in which direction was the convoy heading?) or event (where did an explosion occur?) either from a video archive or in a real-time stream during live operations. Says Fine, "VIRAT is about getting critical video content to the right analyst at the right time."

DARPA has invited three vendors to propose solutions for developing VIRAT, which will eventually transition to a military sponsor. To evaluate the proposed solutions, DARPA asked MITRE for support.

"Because we're an FFRDC [federally funded research and development center], and we don't compete with them, the developers are willing to share their code," Fine says. "We have all three systems running here at MITRE, which allows DARPA to better assess their performance."

Test Footage

The MITRE team, led by Fine, is defining key metrics for the program. The team's initial work involved creating its own test footage. While the MITRE team recorded the action from RPAs overhead, another team on the ground at Creech Air Force Base in Indian Springs, Nevada, acted out scenarios similar to those analysts might see. The video footage was then processed and annotated—what Fine calls "developing an answer key"—so the MITRE team can determine whether a vendor's solution actually identifies where a specific event occurs.
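
An answer key of this kind can be thought of as a list of labeled, time-stamped events against which a system's detections are checked. The sketch below shows one way such a check might look in Python; the event labels, time units, and simple temporal-overlap rule are illustrative assumptions, not details of the actual VIRAT annotation format.

```python
# A minimal sketch of an annotated "answer key" and a temporal-overlap check.
# Labels, time units, and the overlap rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AnnotatedEvent:
    label: str        # e.g., "vehicle_uturn" (hypothetical label)
    start_sec: float  # annotated event start time within the clip
    end_sec: float    # annotated event end time within the clip

def overlaps(event: AnnotatedEvent, det_start: float, det_end: float) -> bool:
    """Return True if a system detection overlaps the annotated event in time."""
    return det_start < event.end_sec and det_end > event.start_sec

# Example: check one system detection against the answer key for a clip.
answer_key = [AnnotatedEvent("vehicle_uturn", 12.0, 18.5)]
print(any(overlaps(e, 15.0, 20.0) for e in answer_key))  # True
```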

The MITRE team is also developing a test environment that will interface with each vendor's system to automate the evaluation. "We'll test whether the system can alert an operator in real time, and whether it can look through an archive of video to find similar events," Fine says. "We'll also assess how the system improves with user feedback, and how quickly it can learn to detect a new, never-before-seen event. The test environment will calculate key program metrics, such as the probability of detecting an activity and the false-alarm rate per hour."
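
For illustration, the sketch below shows how two such metrics might be computed from annotated events and system detections: probability of detection as the fraction of annotated events matched by at least one detection, and false alarms per hour as unmatched detections divided by hours of video scored. The matching rule and data layout are assumptions for illustration, not the program's actual scoring method.

```python
# A minimal sketch of scoring detections against ground truth for two metrics:
# probability of detection (Pd) and false alarms per hour. The temporal-overlap
# matching rule and data layout are assumptions, not VIRAT's actual definitions.
from typing import List, Tuple

Interval = Tuple[float, float]  # (start_sec, end_sec)

def temporally_overlaps(a: Interval, b: Interval) -> bool:
    return a[0] < b[1] and a[1] > b[0]

def score(ground_truth: List[Interval],
          detections: List[Interval],
          video_hours: float) -> Tuple[float, float]:
    """Return (probability_of_detection, false_alarms_per_hour)."""
    # An annotated event counts as detected if any detection overlaps it.
    detected = sum(
        1 for gt in ground_truth
        if any(temporally_overlaps(gt, det) for det in detections)
    )
    # A detection that overlaps no annotated event counts as a false alarm.
    false_alarms = sum(
        1 for det in detections
        if not any(temporally_overlaps(det, gt) for gt in ground_truth)
    )
    pd = detected / len(ground_truth) if ground_truth else 0.0
    far = false_alarms / video_hours if video_hours else 0.0
    return pd, far

# Example: two of three annotated events found, one spurious detection, 2 hours of video.
pd, far = score([(10, 20), (50, 60), (90, 100)],
                [(12, 18), (55, 58), (200, 210)], 2.0)
print(f"Pd = {pd:.2f}, false alarms/hr = {far:.1f}")  # Pd = 0.67, false alarms/hr = 0.5
```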

As the first phase nears completion, preparations are already underway for phase two, which is expected to begin this spring. "In phase two, DARPA will down-select to two vendor teams," Fine says. "We'll test those solutions with additional data and expect improvement."

In the next two years, the MITRE team hopes to work with more operational data. According to Fine, the work may also transition to platforms other than RPAs, possibly including ground cameras. "MITRE's evaluation will directly influence the development of this technology," he says.

—by Tricia C. Bailey
