Immersive Vision Gives the Best Control to Military Robots

January 2009
Inspection of suspected roadside bombs is a dangerous task for soldiers, but robots and unmanned ground vehicles provide a way to inspect them from a safe distance. Sometimes, however, navigating the robot to the suspicious object or bomb is difficult because the route is too complex, or the robot's vision system is cumbersome to use. Now, MITRE principal investigator Kyle D. Fawcett has overcome those challenges by developing an easier way to guide robots in complex environments such as litter-strewn streets with many intersections.
Fawcett's solution is a more natural way of seeing through a robot's eyes by building on a process called "visual telepresence." Typically, when a robot works beyond your visual range—say, around a corner out of harm's way—it uses an electronic vision system that sends back real-time video that you use to guide the robot through its environment. "The vision system is supposed to give you the spatial clues to accurately perceive the robot's environment," says Fawcett. "This spatial perception should help you understand where the robot is, how it got there, what's around it, and how to navigate back to home base." Normally, as you travel through an environment, your own internal vision system helps you subconsciously build a mental image, or spatial model, of what's around you. It's when you use an artificial vision system along with your own that problems can occur.

Video Vision Systems

Vision systems used on most robots broadcast video from a camera mounted on the robot to a local display—either a goggle-like display mounted on your head or a laptop computer screen. Some cameras are fixed to the robot and aimed by steering the robot in the direction the user wants to see. "More frequently, the camera is put on gimbals—a mechanical support with bearings—so that it can pan and tilt, allowing you to scan the robot's environment independently from steering it," explains Fawcett. "The panning and tilting is controlled by a joystick. But when you use a joystick-controlled vision system, this spatial modeling process becomes a very active process that requires a lot of concentration. The joystick control may seem like a simple interface, but in reality, you must simultaneously maneuver the robot with yet another joystick. And if the robot is equipped with a robotic arm, you have still another joystick to deal with."
Most robot operators have trouble with just one joystick, but even experienced video gamers have trouble with two. "The combined result is that police crews and soldiers on explosive ordnance disposal teams need constant training to maintain the required dexterity to use robots," he adds. "And even with all that experience, building and maintaining a mental map of the robot's environment is still a difficult task."

Fawcett's unique system, called the MITRE Immersive Vision System (MIVS), provides visual telepresence that tricks your eyes into believing you're in the middle of a remote environment. A MITRE-sponsored research project, MIVS combines commercial-off-the-shelf (COTS) components with integrating software created by Fawcett. It uses a COTS hemispherical digital camera system and a head-mounted display. Attached to the display is an orientation sensor that tracks the position of your head as you move it left or right, up or down. A wireless signal is sent from the head-tracking device to the hemispherical camera system on the robot to steer the view. What you see in the head-mounted display is the remote environment as if you were in the midst of it. Turn your head left, and the display shows what the robot sees to its left. The same holds when you turn your head right, up, or down, or even tilt it to the side. It's as if your eyes were transplanted onto the robot.

Near Zero Latency with Virtual Gimbals

This vision system is unique because it has no moving parts and thus near-zero mechanical delay, or latency. The hemispherical camera system looks like a small cylinder with five cameras equally spaced around it; a sixth camera points straight up from the top. Software stitches the separate camera images together to form a continuous 360-degree hemispherical image of the area surrounding the robot. The hemispherical image is projected onto a virtual sphere using an onboard graphics processor.
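The virtual-gimbal idea can be sketched in a few lines. This is an illustrative sketch under assumed conventions, not MITRE's code: a hypothetical `ViewState` accumulates relative head-tracker samples, and because the gimbal is purely virtual, yaw simply wraps around the full 360 degrees with no mechanical end stops, while pitch is clamped to whatever the camera rig actually covers.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    """Orientation of the virtual camera, in degrees (hypothetical names)."""
    yaw: float = 0.0    # 0 = robot's forward direction
    pitch: float = 0.0  # positive = looking up

def apply_head_sample(state: ViewState, d_yaw: float, d_pitch: float,
                      pitch_limit: float = 90.0) -> ViewState:
    """Fold one relative head-tracker sample into the view state.

    Yaw wraps modulo 360 (a virtual gimbal has no travel limit);
    pitch is clamped to the assumed coverage of the camera rig.
    """
    state.yaw = (state.yaw + d_yaw) % 360.0
    state.pitch = max(-pitch_limit, min(pitch_limit, state.pitch + d_pitch))
    return state

# Turning your head 100 degrees left four times spins the view through a
# full revolution and 40 degrees beyond -- something a physical pan/tilt
# unit with end stops could not do without rewinding.
view = ViewState()
for _ in range(4):
    apply_head_sample(view, 100.0, 0.0)
```

A real tracker would report absolute orientation at a high sample rate, but the wrap-and-clamp behavior is the point of the sketch: no moving parts means no mechanical limits and no mechanical lag.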
Fawcett's software places a virtual camera at the center of this sphere and captures what's in the virtual camera's field of view. Once the images seen by the six cameras have been melded onto a virtual sphere, any segment of the sphere video can be viewed through the virtual camera.
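The virtual-camera geometry can be illustrated roughly as follows (an assumption-laden sketch, not the MIVS implementation): each pixel on the viewer's display defines a ray from the sphere's center; rotating that ray by the head's yaw and pitch and converting it to spherical coordinates tells the renderer which part of the stitched panorama to sample. Narrowing the field of view concentrates the same display pixels on fewer degrees of the scene, which is exactly the zoom effect Fawcett describes next.

```python
import math

def view_direction(px, py, width, height, fov_deg, yaw_deg, pitch_deg):
    """Map a display pixel to a direction on the surrounding sphere.

    (px, py) is a pixel on a width x height virtual-camera image with a
    horizontal field of view of fov_deg; yaw_deg/pitch_deg is the head
    orientation. Returns (azimuth, elevation) in degrees.
    """
    # Focal length (in pixels) that makes the image plane span fov_deg.
    f = (width / 2) / math.tan(math.radians(fov_deg) / 2)
    # Ray through the pixel, with the camera looking down +z.
    x, y, z = px - width / 2, height / 2 - py, f
    # Tilt the ray up or down (rotation about the x-axis)...
    p = math.radians(pitch_deg)
    y, z = y * math.cos(p) + z * math.sin(p), z * math.cos(p) - y * math.sin(p)
    # ...then pan it left or right (rotation about the vertical axis).
    w = math.radians(yaw_deg)
    x, z = x * math.cos(w) + z * math.sin(w), z * math.cos(w) - x * math.sin(w)
    az = math.degrees(math.atan2(x, z))
    el = math.degrees(math.asin(y / math.sqrt(x * x + y * y + z * z)))
    return az, el

def panorama_pixel(az_deg, el_deg, pano_w, pano_h):
    """Look up that direction in an equirectangular stitched panorama."""
    u = (az_deg % 360.0) / 360.0 * pano_w
    v = (90.0 - el_deg) / 180.0 * pano_h
    return u, v
```

With a 90-degree field of view, the right edge of the display maps to a ray 45 degrees off-axis; shrinking `fov_deg` maps the same pixels to a narrower slice of the panorama, using more source pixels per degree of scene.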
"I can place the virtual camera in the sphere anywhere I want and adjust the field of view of the camera as well," Fawcett says. "That allows me to get different views for different types of situations. For example, if I want to identify the license plate of a car driving by, I can zoom in on the plate. Zooming permits me to use the maximum number of camera pixels for detail. Or, I can dynamically widen my field of view to upwards of 200 degrees, which is useful for navigating and searching for the next object of interest."

Applications—Looking High and Low
The immersive vision system can be used in a number of life-saving ways. "If you're disarming a bomb or improvised explosive device, you can put yourself out of the robot's line of sight and still have the same situational awareness as if you were actually there," he notes. "This 'second-nature' experience also lets you more easily move and manipulate the robot without having to think about it too much."
Another application is to place the system atop armored vehicles such as tanks. Tank drivers and commanders often open the tank's hatch and look outside to navigate and scan the area for approaching threats, but open-hatch operation exposes the cabin and crew to those threats. MIVS could provide a virtual open-hatch experience while leaving the actual hatch closed for safety. And installing MIVS on both an armored vehicle and a robot could give the commander an extra edge: the vehicle could deploy a robot and switch to the robot's vision system at the touch of a button, enabling rapid inspection of suspicious objects on the road from the safety of a closed armored cockpit.

As Fawcett continues his research, he expects to find more innovative uses for MIVS. "Future work might include a robotic arm that mimics your arm and becomes a second-nature interface much like MIVS," Fawcett explains. "Ideally, operators will wear a special glove that remotely controls the robot arm. Combining the two systems would virtually put our eyes and hands in the remote environment. That will allow us to operate in more complex environments while reducing our exposure to hazardous situations."
—by David A. Van Cleave
Page last updated: March 24, 2009