Lab 4: Remote Sensing of the Environment

Introduction:

The purpose of this lab is to further familiarize students with ERDAS image analysis capabilities, including image pre-processing, stitching, and optimization. In the following exercise, students work through several new operations:

     1. Image Subsetting
     2. Image Fusion
     3. Radiometric Enhancement
     4. Syncing Image Viewer with Google Earth
     5. Resampling
     6. Image Mosaicking, and
     7. Binary Change Detection (or Image Differencing)


Methods & Results:

Part One: Image Subsetting of a Study Area-
There are two different methods that can be used for subsetting satellite images:

     1. The use of an Inquiry Box
     2. Delineating an area of interest (AOI) with the use of a shapefile

Figure 1: Inquiry Box Method of Image Subsetting
Figure 1: Inquiry Box Method of Image Subsetting
Each method can be executed with a few simple clicks in the Raster toolset. Using the Chip and Subset tool, a highlighted area can be extracted and saved as a new image.

The first method of subsetting is the simplest to perform, but it is also somewhat limited, as it only allows for a square or rectangular subset image. In many instances the study area is irregular in shape, so this method is not always ideal. An example of a subset image of the Chippewa and Eau Claire city area created with the Inquiry Box method can be seen in Figure 1 to the right.
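Conceptually, the Inquiry Box subset is just a rectangular crop of the pixel grid. A minimal numpy sketch, assuming the scene has already been read into a bands-first array (the array size and the row/column bounds below are purely illustrative):

```python
import numpy as np

# Stand-in for a 7-band scene already read into memory (values are illustrative).
scene = np.zeros((7, 4000, 4000), dtype=np.uint16)

# Inquiry Box corners expressed as pixel row/column bounds (illustrative values).
row_min, row_max = 1200, 2200
col_min, col_max = 800, 1900

# A rectangular subset is simply an array slice applied across all bands.
subset = scene[:, row_min:row_max, col_min:col_max]
print(subset.shape)  # (7, 1000, 1100)
```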


Figure 2: Subset Image using a Shapefile Method

The second method uses a shapefile to delineate the study area. The resulting image is shaped to follow the outline of the overlaid, selected shapefile. While this method requires a few extra steps, it is the more practical of the two, as it allows the analysis of irregularly shaped study areas, which most study areas are. An example of this method can be found in Figure 2, pictured left, and is a more accurate representation of the Chippewa and Eau Claire area.
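Outside of ERDAS, the same shapefile-based subset can be approximated with the open-source rasterio and fiona libraries; a rough sketch, where the file names are placeholders and the shapefile is assumed to share the raster's coordinate system:

```python
import fiona
import rasterio
import rasterio.mask

# Read the AOI polygon(s) from the shapefile (path is a placeholder).
with fiona.open("study_area.shp") as shp:
    shapes = [feature["geometry"] for feature in shp]

# Clip the raster to the polygon outline and crop to its bounding box;
# pixels outside the AOI are filled with nodata.
with rasterio.open("landsat_scene.img") as src:
    out_image, out_transform = rasterio.mask.mask(src, shapes, crop=True)
    out_meta = src.meta.copy()

out_meta.update(driver="GTiff", height=out_image.shape[1],
                width=out_image.shape[2], transform=out_transform)

with rasterio.open("subset_aoi.tif", "w", **out_meta) as dst:
    dst.write(out_image)
```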





Part Two: Image Fusion-
In Part Two of the lab, students demonstrate how a coarse, pixelated image can be improved by fusing it with a band of higher spatial resolution. This is executed using the Pan Sharpen tool's Resolution Merge option under the Raster toolset. In the Resolution Merge window, the settings were configured so that the fusion was performed using the multiplicative method with nearest neighbor resampling. The resulting image has a spatial resolution of 15 meters, as opposed to the 30 meter resolution of the original photo. A comparison of a portion of the two images is pictured in Figure 3 below.

Figure 3: A comparison of the original reflective image and the new, pansharpened image.

The image on the left shows the original photo, while the black-and-white picture on the right is the new, pan-sharpened image. Due to its higher spatial resolution, the pan-sharpened image appears clearer, more detailed, and less pixelated than the original reflective image.
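The multiplicative merge itself is straightforward to express outside the software: each multispectral band is resampled to the panchromatic grid, multiplied by the pan band, and rescaled for display. A simplified numpy sketch, assuming a single 30 m band and a co-registered 15 m pan band (the random arrays only stand in for real data):

```python
import numpy as np

ms_band = np.random.randint(0, 255, (100, 100)).astype(np.float64)  # 30 m band
pan     = np.random.randint(0, 255, (200, 200)).astype(np.float64)  # 15 m pan band

# Nearest neighbor resample of the 30 m band onto the 15 m grid (factor of 2).
ms_up = np.repeat(np.repeat(ms_band, 2, axis=0), 2, axis=1)

# Multiplicative fusion: multiply by the pan band, then stretch back to 8 bits.
fused = ms_up * pan
fused = (255 * (fused - fused.min()) / (fused.max() - fused.min())).astype(np.uint8)
```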


Part Three: Simple Radiometric Enhancement Techniques-
In this section of the lab, students demonstrate an image enhancement technique using the Radiometric function under the Raster toolbar. For the purposes of the demonstration, students focus on the haze reduction option, since haze is a common issue in aerial imagery. Figure 4 below shows a side-by-side comparison of an image before and after the haze reduction tool has been applied:

Figure 4: A before and after of the same photograph, resampled using the haze reduction tool.

In comparing the two images, the colors in the haze-reduced image are clearer, more vibrant, and higher in contrast. After using the tool, the haze that was clearly visible in the lower southeastern portion of the original input image, pictured left, is no longer visible in the resulting haze-reduced image, pictured right. Overall, this tool helps to increase visibility in aerial imagery for further analysis.
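The exact algorithm behind the ERDAS haze reduction option is not covered in this lab; a much simpler technique in the same spirit is dark-object subtraction, which treats the darkest values in each band as an atmospheric offset and removes them. A purely illustrative numpy sketch:

```python
import numpy as np

def dark_object_subtraction(band, percentile=0.1):
    """Subtract the near-minimum value of a band, treating it as a haze offset."""
    dark_value = np.percentile(band, percentile)
    return np.clip(band.astype(np.float64) - dark_value, 0, None)

# Apply the correction band by band to a bands-first image stack (stand-in data).
scene = np.random.randint(30, 255, (6, 512, 512)).astype(np.float64)
dehazed = np.stack([dark_object_subtraction(band) for band in scene])
```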


Part Four: Linking Image Viewer to Google Earth-


Figure 5: Photo in Erdas Viewer

This section of the lab introduces an extremely helpful tool that helps users find out more about the location shown in their aerial photo. ERDAS's new Google Earth toolset allows Google Earth to be synced to the image displayed in the Viewer, matching the Google Earth view to the ERDAS Viewer extent. Figure 5 shows the original view of the photo in ERDAS, while Figure 6 showcases the same spatial extent in Google Earth.


Figure 6: Photo in Google Earth



In this scenario, Google Earth serves as an example of a selective interpretation key, because the Google Earth viewer includes labels, imagery, and text in addition to the map, which provides the user with an array of extra information about the area in view.






Part Five: Resampling-
During resampling, the size of the pixels within an image changes, either by scaling up or scaling down. In most scenarios, resampling is performed to scale up, reducing the size of each pixel and thereby making the image appear clearer.

The resampling tool can be found under Raster > Spatial > Resample Pixel Size. Again, the resolution is changed from 30 meters to 15 meters, first using the nearest neighbor method and then using the bilinear interpolation method. Figure 7 shows the original image, pictured large on the left, and the resampled images on the right. The top right shows resampling using the nearest neighbor method, and the bottom right shows resampling using bilinear interpolation.


Figure 7: Comparative Resampling Photo using two different methods

The resampled images are of better spatial quality, appearing less pixelated and leaving a much smoother transition between objects in the image. One difference that can be noted between the two resampled images is that the bilinear interpolated photo has a hazier appearance where objects meet than the nearest neighbor image does. Additionally, its colors are slightly less vibrant and the lines in the image are not as harshly defined.
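The two resampling methods compared above can be reproduced with scipy's ndimage module; a sketch that assumes a single 30 m band stored in a numpy array (the random data is only a stand-in):

```python
import numpy as np
from scipy import ndimage

band_30m = np.random.randint(0, 255, (400, 400)).astype(np.float64)

# Halving the pixel size (30 m -> 15 m) doubles the number of rows and columns.
nearest  = ndimage.zoom(band_30m, 2, order=0)  # order=0: nearest neighbor
bilinear = ndimage.zoom(band_30m, 2, order=1)  # order=1: bilinear interpolation

# Nearest neighbor copies original values (crisp but blocky edges); bilinear
# averages neighbors, which smooths transitions and softens lines and colors.
print(nearest.shape, bilinear.shape)  # (800, 800) (800, 800)
```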


Part Six: Image Mosaicking-
In Part Six of the lab, students explore the process of image mosaicking. This method is used whenever a study area exceeds the extent of a single satellite image, or continues partway into a second image. The process takes two adjacent satellite images, captured within the same time frame and at the same spatial resolution, and overlaps and stitches them together. For the purposes of this lab, image mosaicking is performed using two different tool options: Mosaic Express and MosaicPro.

Figure 8: Mosaic Express Output
As the name suggests, Mosaic Express is the quicker and simpler of the two mosaicking options. It is a cut-and-paste operation that places the two photos adjacent to and somewhat overlapping one another, and creates a single output image from the original two. This method is nice because it preserves the original quality of each photo; however, the color and image transition is left far from seamless, as can be seen in Figure 8. It is obvious where the first image ends and the second begins.
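Conceptually, this kind of quick mosaic places both scenes onto one shared output grid and lets the second image overwrite the overlap. A simplified numpy sketch, assuming two single-band scenes whose pixel offsets on the common grid are already known (the sizes and offsets below are made up):

```python
import numpy as np

left_scene  = np.random.randint(0, 255, (500, 500)).astype(np.float64)
right_scene = np.random.randint(0, 255, (500, 500)).astype(np.float64)

# Assumed placement of each scene on the shared output grid (row, col offsets).
left_offset, right_offset = (0, 0), (20, 450)   # roughly a 50-column overlap

canvas = np.zeros((520, 950))
r, c = left_offset
canvas[r:r + 500, c:c + 500] = left_scene
r, c = right_offset
canvas[r:r + 500, c:c + 500] = right_scene      # overwrites the overlap region
```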


Figure 9: MosaicPro Output
Figure 9: MosaicPro Output
The second method, MosaicPro, takes care of the color mismatch problem by correcting the image color schemes so that they match. This makes photo analysis and interpretation easier because it keeps the descriptive hues consistent between photos. Additionally, the image seen in Figure 9 is more aesthetically pleasing because the transition between the two images is much smoother, and nearly seamless.
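The color correction applied here can be thought of as matching the histogram of one image to the other so their gray levels agree across the seam; the sketch below is a generic per-band histogram match in numpy, a stand-in for whatever balancing MosaicPro actually performs:

```python
import numpy as np

def match_histogram(source, reference):
    """Remap source values so their distribution matches the reference band."""
    src_values, src_idx, src_counts = np.unique(source.ravel(),
                                                return_inverse=True,
                                                return_counts=True)
    ref_values, ref_counts = np.unique(reference.ravel(), return_counts=True)

    # Cumulative distributions of both bands, scaled to [0, 1].
    src_cdf = np.cumsum(src_counts) / source.size
    ref_cdf = np.cumsum(ref_counts) / reference.size

    # For each source value, find the reference value at the same CDF position.
    matched = np.interp(src_cdf, ref_cdf, ref_values)
    return matched[src_idx].reshape(source.shape)
```

For example, match_histogram(right_band, left_band) would shift the right scene's values toward the left scene's color distribution before the two are stitched.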



A drawback of this method is that, in correcting the color of one satellite image to match the other, much of the detail from the original photo is lost in the process. This can be noted in Figure 10 by looking at the differences in detail seen in the green portion of the visible spectrum.

Figure 10: Detail, Seams, and Color Differences between Mosaic Express and MosaicPro



Part Seven: Binary Change Detection-
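Binary change detection, or image differencing, subtracts one date's band values from a co-registered band of another date and flags pixels whose difference falls outside a chosen threshold, often defined from the mean and standard deviation of the difference image. A minimal numpy sketch, assuming two co-registered single-band arrays and a 1.5-standard-deviation threshold (both choices are illustrative):

```python
import numpy as np

# Stand-ins for co-registered bands of the same area from two dates.
date1 = np.random.randint(0, 255, (512, 512)).astype(np.float64)
date2 = np.random.randint(0, 255, (512, 512)).astype(np.float64)

# Difference image, then a mean +/- 1.5 standard deviation threshold.
diff = date2 - date1
change_mask = np.abs(diff - diff.mean()) > 1.5 * diff.std()   # True = change

print(change_mask.mean())  # fraction of pixels flagged as changed
```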























Conclusions:




Sources:

Satellite Images from: 
Earth Resources Observation and Science Center, United States Geological Survey. 

Shapefile from: 
Mastering ArcGIS 6th edition Dataset by Maribeth Price, McGraw Hill. 2014. 
