The MITRE Digest

Immersive Vision Gives the Best Control to Military Robots


January 2009

Inspection of suspected roadside bombs is a dangerous task for soldiers, but robots and unmanned ground vehicles provide a way to examine them from a safe distance. Sometimes, however, navigating the robot to the suspicious object is difficult because the route is too complex or the robot's vision system is cumbersome to use. Now, MITRE principal investigator Kyle D. Fawcett has overcome those challenges by developing an easier way to guide robots through complex environments such as litter-strewn streets with many intersections.

MITRE Immersive Vision System (MIVS)

This video shows the MIVS in action and demonstrates how it could enhance ground force operations. (.wmv also available [56MB])

Fawcett's solution is a more natural way of seeing through a robot's eyes by building on a process called "visual telepresence." Typically, when a robot works beyond your visual range—say, around a corner out of harm's way—it uses an electronic vision system that sends back real-time video that you use to guide the robot through its environment. "The vision system is supposed to give you the spatial clues to accurately perceive the robot's environment," says Fawcett. "This spatial perception should help you understand where the robot is, how it got there, what's around it, and how to navigate back to home base."

Normally, as you travel through an environment, your own internal vision system helps you subconsciously build a mental image, or spatial model, of what's around you. It's when you use an artificial vision system along with your own that problems can occur.

Video Vision Systems

Vision systems used on most robots broadcast video from a camera mounted on the robot to a local display—either a goggle-like display mounted on your head or a laptop computer screen. Some cameras are fixed to the robot and aimed by steering the robot in the direction you want to see. "More frequently, the camera is put on gimbals—a mechanical support with bearings—so that it can pan and tilt, allowing you to scan the robot's environment independently from steering it," explains Fawcett.

"The panning and tilting is controlled by a joystick. But when you use a joystick-controlled vision system, this spatial modeling process becomes a very active process that requires a lot of concentration. The joystick control may seem like a simple interface, but in reality, you must simultaneously maneuver the robot with yet another joystick. And if the robot is equipped with a robotic arm, you have still another joystick to deal with."


Most robot operators have trouble with just one joystick, and even experienced video gamers have trouble with two. "The combined result is that police crews and soldiers on explosive ordnance disposal teams need constant training to maintain the required dexterity to use robots," he adds. "And even with all that experience, building and maintaining a mental map of the robot's environment is still a difficult task."

Tricking the Brain with Visual Telepresence

For decades, visual telepresence has been proposed as a solution to both problems: mental mapping and interface complexity. Telepresence technologies use interfaces and sensory input to mimic interaction with a remote environment, tricking your brain into thinking you're actually there. Visual telepresence tricks your eyes into thinking they've been transplanted into the remote environment.

"With remote robot interaction, we can use visual telepresence to trick users' brains into thinking their eyes have been transported onto the robot," says Fawcett. "This unlocks the brain's natural ability to build a mental image of the remote environment. The better our vision system mimics interaction with the remote environment, the more we tap into our brain's natural spatial mapping abilities."

Fawcett's unique system, called the MITRE Immersive Vision System (MIVS), provides visual telepresence that tricks your eyes into believing you're in the middle of a remote environment (see sidebar). A MITRE-sponsored research project, MIVS combines commercial off-the-shelf (COTS) components with integrating software created by Fawcett. It uses a COTS hemispherical digital camera system and a head-mounted display. Attached to the display is an orientation sensor that tracks the direction of your head as you turn it left or right, up or down. A wireless signal sent from the head tracker to the robot steers the camera view to match.

What you see in the head-mounted display is the remote environment as if you were standing in the midst of it. Turn your head left, and the display shows what the robot sees to its left; the same holds when you look right, up, or down, or even tilt your head to the side. It's as if your eyes were transplanted onto the robot.
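That head-to-view mapping is simple to sketch. The following is an illustrative sketch only, not MIVS code; it assumes the tracker reports yaw and pitch in degrees and that the camera covers the upper hemisphere, from the horizon to straight up:

```python
def head_to_camera(yaw_deg, pitch_deg):
    """Map head-tracker angles to a view command for the camera.

    Pan wraps freely around the full 360-degree ring of lenses;
    tilt is clamped to the upper hemisphere the camera covers
    (0 = horizon, 90 = straight up). Assumed conventions, for
    illustration only.
    """
    pan = yaw_deg % 360.0                    # look left or right without limit
    tilt = max(0.0, min(90.0, pitch_deg))    # can't look below the horizon
    return pan, tilt
```

A fielded system would likely also smooth the tracker readings and account for the robot's own motion; the sketch keeps only the core mapping.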

Near-Zero Latency with Virtual Gimbals

This vision system is unique because it has no moving parts and thus near-zero mechanical delay, or latency. The hemispherical camera system looks like a small cylinder with five cameras spaced equally around it; a sixth camera points straight up from the top. Software stitches the separate camera images into a continuous 360-degree hemispherical image of the area surrounding the robot, which an onboard graphics processor projects onto a virtual sphere. Fawcett's software places a virtual camera at the center of this sphere and captures whatever falls within the virtual camera's field of view, so any segment of the sphere can be viewed at any time.
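The stitching-and-reprojection pipeline can be sketched in miniature. Assuming the stitched output is stored as an equirectangular image (a common convention for spherical video; the article does not specify MIVS's internal format), a virtual-camera view is rendered by casting a ray through each output pixel, rotating it by the camera's yaw and pitch, and reading the panorama pixel that ray hits:

```python
import math

def render_view(pano, pan_w, pan_h, yaw, pitch, fov, out_w, out_h):
    """Render a perspective view from an equirectangular panorama.

    pano is a row-major list of pan_w * pan_h pixels; yaw, pitch, and
    the horizontal fov are in radians. Nearest-neighbour sampling
    keeps the sketch short; a real renderer would interpolate.
    """
    f = (out_w / 2) / math.tan(fov / 2)          # focal length in pixels
    view = []
    for j in range(out_h):
        row = []
        for i in range(out_w):
            # ray through pixel (i, j); camera looks down +z
            dx = i - (out_w - 1) / 2
            dy = (out_h - 1) / 2 - j             # image y grows downward
            dz = f
            # tilt the ray up or down by the camera pitch (about x)
            dy, dz = (dy * math.cos(pitch) + dz * math.sin(pitch),
                      dz * math.cos(pitch) - dy * math.sin(pitch))
            # swing it left or right by the camera yaw (about the vertical)
            dx, dz = (dx * math.cos(yaw) + dz * math.sin(yaw),
                      dz * math.cos(yaw) - dx * math.sin(yaw))
            # ray direction -> longitude and latitude on the virtual sphere
            lon = math.atan2(dx, dz)                  # [-pi, pi]
            lat = math.atan2(dy, math.hypot(dx, dz))  # [-pi/2, pi/2]
            # latitude and longitude -> panorama pixel
            u = int((lon / (2 * math.pi) + 0.5) * pan_w) % pan_w
            v = min(pan_h - 1, int((0.5 - lat / math.pi) * pan_h))
            row.append(pano[v * pan_w + u])
        view.append(row)
    return view
```

Because the "gimbal" is just this change of coordinates, re-aiming the view costs one render pass rather than a mechanical pan-tilt move, which is where the near-zero latency comes from.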

"I can place the virtual camera in the sphere anywhere I want and adjust the field of view of the camera as well," Fawcett says. "That allows me to get different views for different types of situations. For example, if I want to identify the license plate of a car driving by, I can zoom in on the plate. Zooming permits me to use the maximum number of camera pixels for detail. Or, I can dynamically widen my field of view to upwards of 200 degrees, which is useful for navigating and searching for the next object of interest."

Applications—Looking High and Low

The immersive vision system can be used in a number of life-saving ways. "If you're disarming a bomb or improvised explosive device, you can put yourself out of the robot's line of sight and still have the same situational awareness as if you were actually there," he notes. "This 'second-nature' experience also lets you move and manipulate the robot more easily, without having to think about it too much."


Another application is to mount the system atop armored vehicles such as tanks. Tank drivers and commanders often open the hatch and look outside to navigate and scan for approaching threats, exposing the cabin and crew. MIVS could provide a virtual open-hatch experience while leaving the actual hatch closed for safety. And installing MIVS on both an armored vehicle and a robot could give the commander an extra edge: the vehicle could deploy a robot and switch to the robot's vision system at the touch of a button, enabling rapid inspection of suspicious objects on the road from the safety of a closed armored cockpit.

Support for MIVS

Support for MIVS came from two internal MITRE research projects: Advanced Perception and 3-Dimensional Simultaneous Localization and Mapping (3D-SLAM). The Advanced Perception project is developing new methods for combining information from multiple sensors on unmanned ground vehicles to improve maneuvering, navigation, safety, and awareness.

The 3D-SLAM project is developing technology that enables soldiers to build a three-dimensional, real-time view of an unknown environment before entering it. Designed for urban situation awareness, 3D-SLAM uses video sensors and visual odometry on unmanned ground vehicles and hovering unmanned aerial vehicles.

As Fawcett continues his research, he expects to find more innovative uses for MIVS. "Future work might include a robotic arm that mimics your arm and becomes a second-nature interface much like the MIVS system," Fawcett explains. "Ideally, operators will wear a special glove that remotely controls the robot arm. Combining the two systems would virtually put our eyes and hands in the remote environment. That will allow us to operate in more complex environments while reducing our exposure to hazardous situations."


Kyle Fawcett (l), principal investigator, and Tom Gannon, chief engineer, check out the wireless communication system for the hemispherical camera atop the research robot, Little Red.


Wireless Immersive Vision System

Compared to the real camera mounted on the robot, the virtual camera can show any segment of the virtual sphere from just about any position within the virtual sphere.


—by David A. Van Cleave



Page last updated: March 24, 2009