Lab 7: Photogrammetry
Introduction:
The purpose of this lab is to provide students with a foundation in performing photogrammetry tasks, including calculating image scale and relief displacement and measuring the areas and perimeters of features on aerial imagery. Additionally, the lab introduces students to stereoscopy and orthorectification tasks for satellite imagery.
Methods & Results:
This lab ultimately consisted of three parts:
Part 1: Scales, Measurements, and Relief Displacement required students to calculate photo scale and relief displacement using formulas provided in lecture. A limited set of real-world measurements was given, and corresponding distances could then be measured directly on the aerial photographs with something as simple as a ruler, providing the comparison needed to complete the scale formula. Figure 1 shows an example of the relief displacement observed between two photos. On the left, the relief displacement of some taller objects in the photograph is clearly visible, in this case the smokestack and the Towers buildings on upper campus. The image on the right shows what the photograph should look like when relief displacement is corrected. In this image, only the top views of these objects are visible, as they have been put back into their planimetrically correct places, reflective of their actual locations in the real world.
| Figure 1: Example of relief displacement |
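To make the Part 1 calculations concrete, below is a minimal sketch of the standard scale and relief-displacement formulas; the numbers used are illustrative placeholders, not the measurements from this lab.

```python
# Sketch of the Part 1 calculations using the standard photogrammetry formulas.
# The numeric values below are illustrative placeholders, not the lab's actual data.

def photo_scale(photo_distance_m, ground_distance_m):
    """Scale as a representative fraction: photo distance / ground distance."""
    return photo_distance_m / ground_distance_m

def relief_displacement(object_height_m, radial_distance_photo_m, flying_height_m):
    """Relief displacement d = (h * r) / H, where
    h = object height above the datum,
    r = radial distance from the principal point to the object's top on the photo,
    H = flying height of the camera above the datum."""
    return (object_height_m * radial_distance_photo_m) / flying_height_m

# Example: a 0.05 m photo distance covering a 300 m ground distance
scale = photo_scale(0.05, 300)
print(f"Scale ~ 1:{round(1 / scale):,}")          # ~1:6,000

# Example: a 40 m smokestack, 0.08 m from the principal point, 1,200 m flying height
print(f"Displacement on the photo: {relief_displacement(40, 0.08, 1200):.4f} m")
```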
Students were also introduced to some new tools during Part 1 of the lab. Using the Polygon Perimeter and Area tool, students could outline a feature on the image and obtain the feature polygon's real-world measurements. Figure 2 shows the outline of an Eau Claire lagoon, with its real-world measurements highlighted in blue at the bottom of the program window. The real-world perimeter of the lagoon is recorded as 421,062 meters and its area at about 37.9 hectares.
| Figure 2: Polygon Perimeter and Area Tool in use |
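For readers curious how such a tool arrives at its numbers, the following rough sketch shows how perimeter and area can be computed from a polygon's digitized vertices in a projected (meter-based) coordinate system; the vertex list is made up for illustration and is not the lagoon's actual outline.

```python
import math

def perimeter_and_area(vertices):
    """Return (perimeter in meters, area in hectares) for a closed polygon.
    vertices: list of (x, y) tuples in meters; the polygon is closed automatically."""
    perimeter = 0.0
    shoelace = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        perimeter += math.hypot(x2 - x1, y2 - y1)   # edge length
        shoelace += x1 * y2 - x2 * y1               # shoelace formula term
    area_m2 = abs(shoelace) / 2.0
    return perimeter, area_m2 / 10_000              # 1 hectare = 10,000 m^2

outline = [(0, 0), (600, 0), (700, 400), (100, 500)]   # hypothetical vertices
p, a = perimeter_and_area(outline)
print(f"Perimeter: {p:,.1f} m, Area: {a:.1f} ha")
```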
Part 2: Stereoscopy had students experiment with stereoscopic image outputs and observe the elevation changes in those outputs directly using Polaroid glasses.
In the first image, the elevation changes visualized were somewhat blocky. The features showed minimal variation, and where changes did occur, they were not true to real life.
In the second image, by contrast, the elevation changes were much more pronounced and natural-looking. Hills that seemed choppy or overly gradual in the first rendered image now reflected the real world more accurately, with exaggerated yet smooth changes visible across the Earth's surface.
This result was most likely due to the use of LiDAR data, whose ground returns provide an accurate representation of the surface terrain in the photograph.
Finally, Part 3: Orthorectification walked students through tying two images together so that the directional and elevational differences originally seen between them are corrected. Figure 3 shows the layout of the tool that was used. Students collected GCPs (ground control points) from the original photo and its reference photo to match up and align the positioning of the two photographs.
| Figure 3: Tool Platform Layout |
Once GCPs and tie points were collected for both horizontal (x, y) and vertical (z) coordinates, students ran the triangulation tool to lock them together and account for their positional correctness. Optimal points were used to tie the two photos together. Figures 4 and 5 showcase the Triangulation Summary and Report generated from the results of this lab.
| Figure 4: Triangulation Summary |
| Figure 5: Triangulation Summary Report |
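As a simplified illustration of the idea behind tying images with GCPs, the sketch below fits a planar affine transform from image coordinates to map coordinates by least squares and reports the RMSE of the fit; true orthorectification in Erdas Imagine additionally uses the sensor model and a DEM to remove relief displacement, and the GCP values shown here are invented for demonstration only.

```python
import numpy as np

# Each GCP: (column, row) in the raw image and (x, y) in the reference map (meters).
# These point pairs are hypothetical, for demonstration only.
image_pts = np.array([[120, 430], [980, 410], [510, 1020], [900, 950], [200, 880]], float)
map_pts   = np.array([[5000, 8000], [6720, 8040], [5780, 6790], [6560, 6930], [5160, 7100]], float)

# Design matrix for the affine model: x' = a*col + b*row + c,  y' = d*col + e*row + f
A = np.column_stack([image_pts, np.ones(len(image_pts))])
coeffs_x, *_ = np.linalg.lstsq(A, map_pts[:, 0], rcond=None)
coeffs_y, *_ = np.linalg.lstsq(A, map_pts[:, 1], rcond=None)

# Check how well the fitted transform reproduces the control points (RMSE in meters).
predicted = np.column_stack([A @ coeffs_x, A @ coeffs_y])
rmse = np.sqrt(np.mean(np.sum((predicted - map_pts) ** 2, axis=1)))
print(f"RMSE of the affine fit: {rmse:.2f} m")
```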
Finally, Figure 6 showcases the series' final product. It brings the two images into alignment with one another and provides onlookers with a realistic view of features in the landscape. The river seams between the two images are perfectly aligned, as is the shading seen in the mountain ranges across the two images. These details demonstrate a well-rectified image produced through the photogrammetric process of orthorectification.
| Figure 6: Orthorectification Final Product |
Conclusion:
Principles and tools utilized in photogrammetry are extremely useful toolsets to know and become familiar with in the practice of remote sensing. Its processes prove useful in a wide range of areas, from observing differences in elevation and calculating scale to correcting image alignment, orientation, and the planimetric positioning of in-photo features. Knowing these processes will better prepare students for executing these kinds of tasks on aerial imagery in the future.
Sources:
National Agriculture Imagery Program (NAIP) images are from United States Department of Agriculture, 2005.
Digital Elevation Model (DEM) for Eau Claire, WI is from United States Department of Agriculture Natural Resources Conservation Service, 2010.
Lidar-derived surface model (DSM) for sections of Eau Claire and Chippewa are from Eau Claire County and Chippewa County governments respectively.
SPOT satellite images are from Erdas Imagine, 2009.
Digital elevation model (DEM) for Palm Springs, CA is from Erdas Imagine, 2009.
National Aerial Photography Program (NAPP) 2 meter images are from Erdas Imagine, 2009.