The reality is that we are able to review only a fraction of what is available, and with several new commercial satellites going up in the next few years, the problem will only get worse. We can use imagery, especially satellite imagery, to confirm what we suspect (e.g., training activities are underway in a remote area of the world) or to monitor activity at specific sites (e.g., production at a foreign weapons factory is increasing), but we simply don't have the manpower to look through all this imagery to find something new.

Automated Analysis of Overhead Imagery

The first step is to factor out expected, normal changes, such as seasonal differences or lighting intensity. We use data mining techniques to identify the normal changes over time between images. Given a pair of images, we train an Artificial Neural Network (ANN) to recognize what should be in one image given what is in the other. That is, we feed the intensities of the various colors at a given point in one image as input to the neural network, and the intensities at that point in the other image as the expected output. ANNs perform a weighted combination of inputs to generate outputs; training automatically adjusts the weights until the resulting input-to-output function is correct for all (or most) of the training examples. Each pair of corresponding points in the two images constitutes one training example, and we use all points in the image as training examples. This enables the trained ANN to recognize that, for example, the light brown of a dirt road should stay light brown, while the bright green of newly sprouted wheat in the spring should become the brown of earth in a picture taken after harvest. The trained ANN is then used to compute expected values for each point in the new image: for a given point, the intensity values from the old image are fed to the ANN, which produces the expected values for the new image.
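The per-pixel training scheme described above can be sketched as follows. This is a minimal illustration, not the authors' actual system: it trains a tiny one-hidden-layer network in pure NumPy to map each pixel's color in the old image to its color in the new image, using synthetic stand-in images (RGB values assumed scaled to [0, 1]) whose "normal" change is a fixed color shift.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for a co-registered image pair, shape (H, W, 3), values in [0, 1].
H, W = 32, 32
old_img = rng.random((H, W, 3))
# Hypothetical "normal" seasonal change: every color shifts by the same affine transform.
new_img = np.clip(0.6 * old_img + 0.2, 0.0, 1.0)

# Each pixel is one training example: old RGB in, new RGB out.
X = old_img.reshape(-1, 3)
Y = new_img.reshape(-1, 3)

# One-hidden-layer network: 3 inputs -> 16 tanh units -> 3 outputs.
W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 3)); b2 = np.zeros(3)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

# Plain batch gradient descent on mean-squared error.
lr = 0.2
for _ in range(3000):
    h, pred = forward(X)
    err = pred - Y                       # (N, 3)
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)     # backpropagate through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Expected "after" colors for every pixel, given the "before" colors.
_, expected = forward(X)
expected_img = expected.reshape(H, W, 3)
```

On this toy pair the network recovers the color shift closely, so the expected image matches the actual new image almost everywhere; in a real pair, the pixels where it does not match are the interesting ones.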
The expected values are compared with the actual values in the new image; if they are substantially different, that point is flagged as an unusual change. Defining what constitutes a substantial difference between the expected and actual values involves statistical analysis of the entire set of predictions, as well as correlation of predictions across the various colors. Note that something staying the same could also count as unusual: for example, a bright green spot that stays bright green would be identified as unusual if the ANN predicts that it should turn brown.

Below are before and after images taken near Portsmouth, New Hampshire, at the same time of year but several years apart. Dark green in the old image (top) generally becomes light green in the new image (bottom). Given dark green as input, the ANN trained on this pair of images produces light green as the expected output. Generally, this matches the actual values in the new image, but in the circled area near the center the actual value is grayish brown, not the expected light green. Other detected changes are a boat in the water, appearing as a white speck on the right of the lower image only, and a new parking lot on the left of the lower image.
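One simple way to make "substantially different" concrete is to flag pixels whose prediction error is an outlier relative to the image-wide error distribution. The article only says that statistical analysis of the whole set of predictions is involved, so the mean-plus-k-standard-deviations rule below is an illustrative assumption, not the authors' actual criterion.

```python
import numpy as np

def flag_unusual(expected_img, actual_img, k=3.0):
    """Flag pixels whose prediction error exceeds the image-wide
    mean error by more than k standard deviations."""
    # Per-pixel error, summed across color channels.
    err = np.abs(expected_img - actual_img).sum(axis=-1)
    threshold = err.mean() + k * err.std()
    return err > threshold  # boolean mask of "unusual change" pixels

# Toy example: predictions match actuals except at one anomalous pixel.
rng = np.random.default_rng(1)
expected = rng.random((16, 16, 3))
actual = expected + rng.normal(0, 0.01, expected.shape)
actual[8, 8] += 0.9   # e.g. the bright green that failed to turn brown
mask = flag_unusual(expected, actual)
```

Here the small prediction noise everywhere else keeps the threshold low, so only the single anomalous pixel is flagged.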
Images taken near Portsmouth, New Hampshire several years apart, with circles indicating automatically detected changes. The key to this process is that a new ANN is trained for each pair of images. The ANN doesnt need to know the concepts of earth, wheat, or road. It learns what the common color changes are for that pair of images only. A different neural network is trained for the next set of images, and so on. However, the training for each ANN is done automatically from the images--no human effort or a prior knowledge is required. The same basic technique has also been applied to grayscale imagery, and to detecting changes occurring between a set of before and a set of after images. An example of this is shown below. The left images are the before set, taken within an hour of each other. The right images are the after set, taken two days later. The image mining algorithm learns to ignore the appearance and disappearance of cars in the parking lot as a normal change. However, the appearance of trucks in the lower left of the after images is detected as an unusual change.
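Extending the scheme from a single pair to sets of images mainly changes how the training examples are built: for each pixel, the input becomes its value in every "before" image and the target its value in every "after" image, so variation within a set (such as cars coming and going) is presented to the network as normal. A minimal sketch of that data preparation, using synthetic grayscale stand-ins, might look like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-ins: 3 "before" and 2 "after" grayscale images, shape (H, W).
H, W = 32, 32
before_set = [rng.random((H, W)) for _ in range(3)]
base = 0.5 * sum(before_set) / 3  # pretend the scene darkens uniformly
after_set = [np.clip(base + rng.normal(0, 0.02, (H, W)), 0, 1)
             for _ in range(2)]

# One training example per pixel: its value in every "before" image as input,
# its value in every "after" image as the target. Variation within a set
# (e.g. cars appearing and disappearing) is thereby learned as normal.
X = np.stack([img.ravel() for img in before_set], axis=1)  # (H*W, 3)
Y = np.stack([img.ravel() for img in after_set], axis=1)   # (H*W, 2)
```

From here, training and anomaly flagging proceed exactly as in the single-pair case, just with wider input and output layers.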
Detecting changes between sets of grayscale images.

Significantly Reducing ANN Training Time

For more information, please contact Chris Clifton using the employee directory.
Solutions That Make a Difference.®