The MITRE Digest


Kaleidoscope Keeps Pace with Battle Events


April 2006

[Image: Kaleidoscope imagery and fighter]

Remember the toy kaleidoscope you played with as a child? Its dazzling optics created constantly changing patterns and colors. At MITRE, the word "kaleidoscope" has a slightly different, if parallel, meaning. It describes a system of models that examines and correlates spatial and temporal data associated with moving targets. The sum of the data renders a simpler, more complete picture of a complex, dynamic battlespace.

"We chose the name 'Kaleidoscope' for our project because we wanted to take a different view of the world," explains Stephen Matechik, lead information systems engineer in MITRE's Interoperability and Site Technology organization. "Our research uses simple data fusion techniques to model a human's cognitive process." By fusing facts that are developing in real time—such as altitude and speed of aerial platforms, movements of convoys, and potential insurgency threats—a more complete scene emerges. This evolutionary view improves a warfighter's ability to make targeting decisions throughout the time-sensitive find-fix-track-target-engage-assess process, known as F2T2EA, that is vital to operators in an Air and Space Operations Center (AOC).

Currently, various systems display different pieces of the puzzle, and it's up to an individual to employ "swivel chair mechanics," as Matechik calls it, manually merging bits and pieces of information from one system or another to draw conclusions about what is happening in real time on the battlefield.

"If you were to sit in the time-critical target [TCT] cell of the AOC, you would notice that many systems are needed to decipher and describe the changing battlespace," says Matechik. "One monitor might show video of a UAV [unmanned aerial vehicle] in real time. Another system might be a Joint Services Workstation, which relays Joint STARS ground moving target data." (Joint STARS—short for Joint Surveillance Target Attack Radar System—is a cooperative Army-Air Force effort for long-range surveillance that locates and tracks targets.)

A Crucial Mission

Kaleidoscope, a U.S. Air Force-supported research project now in its second year, attempts to solve a big challenge facing military combat commanders at the operational level of war. Typically, the military's AOC is flooded with data, but starved for methods that transform data into actionable information for warfighters on the ground or in the air.

Matechik, the project's principal investigator, says that rather than reinvent a complex technical architecture, the Kaleidoscope team worked backward from the results end of the equation. "What messages and reports are available?" he asks. "What intelligence collections are at one's disposal? What products does the TCT require to determine if a moving threat is, in fact, a red target, whose engagement is necessary to meet commanders' objectives?"

This project has three main objectives: introduce and improve UAV video moving target tracking; create a common data model that integrates critical information derived from intelligence, surveillance, and reconnaissance (ISR) platforms, sensors, and associated processing; and develop a computationally efficient, reason-based multi-sensor fusion engine.
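
The article does not describe the fusion engine's reasoning rules. Purely as an illustration of what a reason-based decision over fused track attributes could look like—every field name and threshold below is invented for the example—such logic might be as simple as:

```python
def is_candidate_red_target(track: dict) -> bool:
    """Illustrative only: hypothetical rules standing in for a
    reason-based decision over a fused track record."""
    return (
        track.get("speed_kt", 0) > 20                # moving, not parked
        and track.get("sources_seen", 0) >= 2        # corroborated by more than one sensor
        and track.get("in_engagement_zone", False)   # spatial criterion
    )

print(is_candidate_red_target(
    {"speed_kt": 35, "sources_seen": 2, "in_engagement_zone": True}))  # True
```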

The Kaleidoscope project is part of the corporation's Department of Defense (DOD) Command, Control, Communications, and Intelligence (C3I) Federally Funded Research and Development Center (FFRDC), which supports a broad and diverse set of sponsors within the DOD and intelligence community.

Kaleidoscope offers Joint STARS and other tactical-edge users a low-bandwidth option to disseminate information captured in UAV video scenes. It also offers a means to visually identify Joint STARS moving target indication (MTI) detections.

Getting it Done for Sponsors

"When humans look at video, they can easily recognize people, places, and objects—machines cannot reliably do that yet," Matechik points out. "Therefore, people are still required to visually confirm, or identify, threats and targets. But the problem of associating what is seen in video with sensor detections from other systems has been, until now, a manual, time-consuming process."

Video also uses a large share of transmission bandwidth, and its pixel-space representation is not conducive to automated, machine-to-machine (M2M) communications.

Kaleidoscope solves the problem by detecting and precisely geo-locating moving targets observed in UAV video. It computes positional information such as latitude, longitude, velocity, and heading, applies time stamps, and extracts sensor-unique information, such as color. All of this data is derived from high-bandwidth pixel space and conveyed in an efficient common data model representation. "In this low-bandwidth, alternative representation, we can now think about efficiently moving the information M2M," Matechik says.
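
The geo-location and encoding details are not given in the article. The sketch below assumes the pixel-to-coordinate registration has already happened upstream and shows one plausible way to derive speed and heading from two successive geo-located detections and pack them, with a sensor-unique attribute, into a compact message; the field names are hypothetical:

```python
import json
import math

def geo_track_message(p1, p2):
    """Build a compact track message from two successive geo-located
    detections p = (time_s, lat_deg, lon_deg). Uses a flat-earth
    approximation, reasonable over the short distance a vehicle moves
    between video frames."""
    (t1, lat1, lon1), (t2, lat2, lon2) = p1, p2
    m_per_deg = 111320.0                                 # meters per degree of latitude
    dn = (lat2 - lat1) * m_per_deg                       # northward displacement, m
    de = (lon2 - lon1) * m_per_deg * math.cos(math.radians(lat1))  # eastward, m
    dt = t2 - t1
    return {
        "time": t2, "lat": lat2, "lon": lon2,
        "speed_mps": round(math.hypot(dn, de) / dt, 1),
        "heading_deg": round(math.degrees(math.atan2(de, dn)) % 360.0, 1),
        "color": "tan",                                  # sensor-unique attribute
    }

msg = geo_track_message((0.0, 30.0000, -82.0000), (1.0, 30.0001, -82.0001))
encoded = json.dumps(msg)
print(encoded)       # a message of roughly a hundred bytes, versus the
print(len(encoded))  # megabits per second a raw video feed consumes
```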

Pulling Together the Big Picture

The researchers who work on this project have accomplished a great deal during the past year. They successfully introduced video MTI track generation as a viable real-time information source to warfighting decision makers. They also demonstrated the value of using a data model that captures critical, real-time information that wasn't previously available to the warfighter. To validate their work, with willing help from the Air Force Command and Control Battle Lab and the 116th Air Control Wing, they coordinated a live Joint STARS MTI-UAV video coincident data collection last spring at the Camp Blanding Joint Readiness Training Center in Florida, a site where Air Force security forces train prior to their operational deployments. "Coincident" data collections operate in real time with synchronized feeds from different sources.
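
The article does not say how the coincident detections from the two sensors were associated. A common baseline approach—sketched here with invented gate values and detections expressed in a shared local coordinate frame—is nearest-neighbor matching inside time and distance gates:

```python
import math

def associate(mti, video, time_gate_s=2.0, dist_gate_m=100.0):
    """Pair each MTI detection with the nearest video track point that
    coincides in time and space. Detections are (time_s, x_m, y_m)
    tuples; the gate values are made up for illustration."""
    pairs = []
    for det in mti:
        tm, xm, ym = det
        candidates = [v for v in video if abs(v[0] - tm) <= time_gate_s]
        best = min(candidates,
                   key=lambda v: math.hypot(v[1] - xm, v[2] - ym),
                   default=None)
        if best and math.hypot(best[1] - xm, best[2] - ym) <= dist_gate_m:
            pairs.append((det, best))
    return pairs

print(associate(mti=[(100.0, 500.0, 250.0)],
                video=[(100.5, 530.0, 240.0), (300.0, 0.0, 0.0)]))
# [((100.0, 500.0, 250.0), (100.5, 530.0, 240.0))]
```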

"The results were one of the first coincident Joint STARS-UAV 'ground-truthed' data collections for the Air Force research communities," Matechik says. "That was a huge step for us—and others."

This year, the Kaleidoscope team is striving to develop an initial content-based video indexing and retrieval capability. This would enable content captured in the video to be automatically indexed and cataloged, so it can be easily retrieved for later analysis using natural-language queries.
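
No indexing scheme is described in the article. One simple building block for such a capability—shown here with hypothetical segment tags and plain keyword matching standing in for true natural-language queries—is an inverted index over descriptive track attributes:

```python
from collections import defaultdict

def build_index(segments):
    """segments: {segment_id: list of descriptive tags} (hypothetical)."""
    index = defaultdict(set)
    for seg_id, tags in segments.items():
        for tag in tags:
            index[tag.lower()].add(seg_id)
    return index

def query(index, text):
    """Return the segments matching every keyword in a free-text query."""
    hits = [index.get(w.lower(), set()) for w in text.split()]
    return set.intersection(*hits) if hits else set()

idx = build_index({"clip-041": ["tan", "convoy", "westbound"],
                   "clip-042": ["white", "sedan", "stationary"]})
print(query(idx, "tan convoy"))   # {'clip-041'}
```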

—by Cheryl Balian


