
Terrain Sensing for Unmanned Vehicles Avoids Rocky Roads


October 2008


One of the downsides of mobile robots such as unmanned ground vehicles (UGVs) is that they can't travel safely over rough, unknown terrain. If UGVs could drive over uneven and vegetated ground, the Department of Defense (DoD) could use them for transporting supplies to troops or as driverless ambulances. They could operate autonomously, so humans wouldn't have to drive them by wire or remote control. And they could be used farther from a command post.

The crux of the problem for an off-road UGV is getting it to recognize the terrain far enough ahead so it can avoid impassable swamps or rough, rocky areas. The major research question is: What visual features at a distance will predict how a vehicle will fare, mechanically, on terrain? This information is required for downstream path planning, navigation, and motor control functions.


The Dune Buggy as a Stereo Camera

For the terrain sensing research, the team chose a dune buggy because it's rugged and can travel over unpaved terrain. The buggy's tubular frame also makes it easy to attach a variety of sensors plus a computer and uninterruptible power supply. About 150 pounds were added to the buggy's base weight of 1,100 pounds. It's powered by a 16.8 horsepower, 250cc engine and has a top speed of 45 mph.

"The dune buggy has several cameras onboard," explains MITRE's Jeff Colombe. "We're using a wide-baseline stereo camera with 'eyes' about 10 inches apart to see stereo disparities into the far field. This lets us estimate depths, distances, and 3D shapes of terrain from far away. We also have near-field cameras over each front wheel to record what the terrain looks like while the vehicle is driving over it. We can track features from the far field into the near field and correlate how the buggy will 'ride' once the vehicle gets to the far field."
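The depth-from-disparity relationship Colombe describes can be sketched in a few lines. The ~10-inch (about 0.254 m) baseline comes from the article; the focal length in pixels is a hypothetical value for illustration only.

```python
# Minimal sketch of depth estimation for a wide-baseline stereo rig.
# The 0.254 m baseline is from the article; the focal length is an
# assumed, illustrative value.

def depth_from_disparity(disparity_px, focal_px=1400.0, baseline_m=0.254):
    """Estimate distance to a terrain patch from its stereo disparity.

    depth = f * B / d -- a wider baseline B produces larger disparities
    for distant patches, which is why it helps in the far field.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A patch showing 4 px of disparity lies roughly 89 m away with this rig.
print(round(depth_from_disparity(4.0), 1))
```

The same formula explains the design choice: with a narrow (eye-width) baseline, disparities for far-field terrain shrink below a pixel and depth becomes unmeasurable, so widening the baseline extends the usable sensing range.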

Meanwhile, a hemispherical camera gives a panoramic view of the environment without any distortion. It shows the scene as you would see it looking out in all directions from the center of a hemisphere. "A fisheye lens would give you a panoramic view, but it would also give a distorted scene," notes Colombe.

Several sensor devices feed data at more than 350 megabytes per second to the onboard computer. The sensors include a forward laser rangefinder, a stereo camera for 3D pictures of the terrain, and three panoramic cameras. Rotation sensors monitor each of the four wheels individually. The brake pedal and steering mechanism each have a sensor. "These sensors allow us to model the vehicle to see if it's moving as intended," he explains. "We can tell when the brakes are locking, or if the wheels are spinning." Other sensors include linear and rotational accelerometers, a global positioning system (GPS) receiver, and a compass.
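The lock/spin detection Colombe mentions can be sketched as a slip-ratio check per wheel: compare the speed implied by each wheel's rotation against the vehicle's actual ground speed. The wheel radius and threshold below are illustrative assumptions, not values from the article.

```python
# Hedged sketch: flagging wheel spin or lock by comparing each wheel's
# rolling speed to the vehicle's measured ground speed.
# Wheel radius and slip threshold are assumed for illustration.

WHEEL_RADIUS_M = 0.28  # assumed

def wheel_slip_ratio(wheel_omega_rad_s, ground_speed_m_s):
    """Slip ratio: ~0 = pure rolling, >0 = spinning, <0 = locking/skidding."""
    rolling_speed = wheel_omega_rad_s * WHEEL_RADIUS_M
    if ground_speed_m_s < 1e-6:
        return 0.0 if rolling_speed < 1e-6 else 1.0
    return (rolling_speed - ground_speed_m_s) / ground_speed_m_s

# Four wheels while traveling at 10 m/s; the third wheel is spinning.
speeds = [35.7, 35.7, 60.0, 35.7]  # measured wheel rates, rad/s
for i, omega in enumerate(speeds):
    slip = wheel_slip_ratio(omega, 10.0)
    if abs(slip) > 0.2:  # assumed threshold
        print(f"wheel {i}: slip ratio {slip:+.2f}")
```

Because the article's rig measures all four wheels individually plus brake and steering state, a model like this can distinguish one spinning wheel (loose ground) from all four locking (braking on a slick surface).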

Because some sensors such as the GPS sensor and the compass can give deceptive information, visual odometry is used to get a true estimate of the vehicle's motion. Visual odometry is the process of determining the position and orientation of a robot by analyzing associated camera images.
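The core of visual odometry can be sketched as a rigid alignment problem: given feature points tracked between two frames, recover the rotation and translation that best explains their motion. The 2D least-squares (Kabsch/Procrustes) toy below shows the principle; a real system works in 3D with calibrated cameras, and all values here are illustrative.

```python
import numpy as np

# Hedged sketch of visual odometry's core step: recover planar vehicle
# motion (rotation R, translation t) from feature points tracked between
# two camera frames, via least-squares rigid alignment (Kabsch method).

def estimate_motion(pts_prev, pts_curr):
    """Return (R, t) such that pts_curr ~= pts_prev @ R.T + t."""
    a = np.asarray(pts_prev, dtype=float)
    b = np.asarray(pts_curr, dtype=float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

# Features displaced by a pure 1 m forward translation between frames.
prev = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 3.0]])
curr = prev + np.array([1.0, 0.0])
R, t = estimate_motion(prev, curr)
print(np.round(t, 3))  # translation recovered from the images alone
```

This is why visual odometry can correct a GPS unit confused by multipath or a compass thrown off by nearby metal: the motion estimate comes directly from how the world appears to move past the cameras.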

Monocular cameras capture the terrain right under the vehicle and mechanical sensors measure how the vehicle reacts to the terrain surface. "You want to pair the visuals with the mechanical sensors," explains Colombe. "Since we're recording all the images on the disk, we can go back to each patch of terrain the vehicle actually rolled over and track it backward."
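The backward-tracking step Colombe describes amounts to a log lookup: for each terrain patch the vehicle actually rolled over, retrieve its earlier far-field sighting and pair it with the mechanical response measured at the moment of contact. The log formats and patch identifiers below are invented purely for illustration.

```python
# Hedged sketch of pairing far-field views with mechanical outcomes.
# Timestamps, patch labels, and the vibration metric are illustrative.

video_log = [(0.0, "patch_A_far"), (1.0, "patch_B_far"),
             (5.0, "patch_A_near"), (6.0, "patch_B_near")]
mech_log = {5.0: {"vibration": 0.8}, 6.0: {"vibration": 0.1}}

def pair_far_view_with_ride(patch_id):
    """Match a patch's earliest (far-field) sighting with the mechanical
    response recorded when the vehicle actually drove over it."""
    sightings = [(t, f) for t, f in video_log if f.startswith(patch_id)]
    far_t, far_frame = sightings[0]      # first sighting = far-field view
    near_t, _ = sightings[-1]            # last sighting = rolled over it
    return far_frame, mech_log[near_t]

far_view, ride = pair_far_view_with_ride("patch_A")
print(far_view, ride)  # far-field appearance paired with measured ride
```

Running this pairing over every logged patch yields exactly the kind of labeled dataset the team needs: distant appearance as input, measured ride quality as the training target.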

 

To help solve the problem, a MITRE team is working on a research project for the U.S. Army to extend a UGV's range over unpaved terrain by more than 10 times, while also improving mechanical performance. The team, headed by Jeff Colombe, a lead artificial intelligence engineer, and Todd Hay, a lead software systems engineer, has been collecting a database of terrain visual cues with a dune buggy. (See the sidebar, "The Dune Buggy as a Stereo Camera," on this page.)

"Long-range path planning is difficult because current methods don't use an understanding of a vehicle's mechanical properties on terrain to analyze a UGV's traversability," says Colombe. "Our approach differs from previous ones because we're extracting estimates of the three-dimensional shapes of terrain surfaces. The pigmentation properties of those surfaces indicate what materials they are made of—such as snow, mud, sand, pavement, and so on. We'll use these properties to predict how the vehicle is likely to behave mechanically on that terrain. The ability to traverse terrain will be estimated in several different ways that reflect important properties of vehicle mechanics, including controllability, safety, smoothness of ride, obstacle avoidance, and not getting stuck."
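The prediction step Colombe outlines — mapping 3D shape and pigmentation cues to expected mechanical behavior — can be sketched as a small supervised regression. The team's actual features and model are not specified in the article; the slope, vegetation, and brightness descriptors below, and the ordinary-least-squares fit, are assumptions for illustration.

```python
import numpy as np

# Hedged sketch: learn a linear map from far-field visual features to a
# measured mechanical outcome (ride vibration). Features, values, and
# the choice of model are illustrative assumptions.

# rows: [estimated slope, green-ness (vegetation cue), brightness]
X = np.array([[0.05, 0.8, 0.4],   # grassy flat
              [0.30, 0.1, 0.6],   # rocky slope
              [0.02, 0.0, 0.9]])  # pavement
y = np.array([0.2, 0.9, 0.05])    # logged vibration for each patch

w, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit weights to the log data

def predict_vibration(features):
    """Predict ride roughness for a distant patch before driving on it."""
    return float(np.asarray(features, dtype=float) @ w)

# A path planner could now rank candidate routes by predicted smoothness.
print(round(predict_vibration([0.30, 0.1, 0.6]), 2))
```

The point of the sketch is the direction of inference: once trained, the model scores terrain the vehicle has never touched, using only what the cameras can see at a distance.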

Matching Far-Views with Near-Views

The trick in path finding is to correlate the terrain the dune buggy's video cameras see in the distance with how the vehicle behaves mechanically when it travels over that same ground a few minutes later. Along the way, it will have gathered multiple far-field views of those patches of terrain (visited later in the near field) with stereo and/or motion parallax information between video frames. (Motion parallax makes objects close to the camera appear to move faster than things farther away.) "We'll use reverse-correlation techniques to discover visual features at various distances that predict the mechanical performance of the vehicle if it were to visit each patch of terrain," says Colombe.
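The motion-parallax cue in parentheses above has a simple geometric form: for a camera translating at speed v, a point at depth Z sweeps across the image at an angular rate of roughly v / Z, so nearby terrain streams past faster than distant terrain. The speeds and depths below are illustrative.

```python
# Hedged sketch of the motion-parallax depth cue: apparent angular motion
# is inversely proportional to depth. Values are illustrative.

def apparent_angular_speed(camera_speed_m_s, depth_m):
    """Angular image motion (rad/s) of a point at the given depth,
    for a camera translating perpendicular to the line of sight."""
    return camera_speed_m_s / depth_m

near = apparent_angular_speed(5.0, 10.0)    # rocks 10 m away
far = apparent_angular_speed(5.0, 100.0)    # ridge 100 m away
print(near / far)  # nearby features move ~10x faster across the frame
```

This inverse relationship is what lets successive video frames from a single moving camera serve as a second depth source alongside the stereo rig.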

When the data gathering process is completed, the team will "train" the vehicle to drive like a human. "When you drive, you use your vision to estimate how your vehicle is likely to perform mechanically on various patches of terrain around you," says Colombe. "This information then determines what kind of path you'll choose to take, and how to drive along it. You want to maintain good control to minimize a bumpy ride and to keep the vehicle upright and away from collisions."

Making an Impact

The methods developed under this research project could be used for on-board processing of scene information such as rural farmland or a dry riverbed. The information could then be transmitted to remote human tele-operators to provide enhanced situational awareness during, for instance, a search and rescue operation.

"This year, we're seeking to develop a good predictive model," says Colombe. "Later, we'll exploit these models for autonomous path planning and navigation. If the approach proves successful, the Army's Future Combat Systems, Defense Advanced Research Projects Agency, and the Navy and Marines could use it to migrate from tele-operation of UGVs to autonomous operation as visual perception and path planning skills mature."

In addition to military uses, the terrain-sensing technique could be applied to civilian vehicles in the future. Instead of driving to work yourself, for instance, you could let a driverless car avoid potholes and road debris, providing a smooth, safe ride to your office.

—by David A. Van Cleave



A variety of sensors record far-range terrain, near-range terrain, and the mechanical movements (wheel spin, bumps, sideways slippage, etc.) of the dune buggy as it travels over the ground.



Jeff Colombe (r) prepares for a test run of the terrain-sensing dune buggy. The gray boxes over the rear tires hold batteries for the on-board computer's uninterruptible power supply; the computer draws 300 watts. Stereo cameras and hemispheric panoramic cameras are mounted on the roof. The blue box on the buggy's front is a forward-looking laser rangefinder.




Page last updated: October 27, 2008
