GEOG 480
Exploring Imagery and Elevation Data in GIS Applications

Lesson 2 Introduction


Remote sensing can be done from space (using satellite platforms), from the air (using aircraft), and from the ground (using static and vehicle-based systems). The same type of sensor, such as a multispectral digital frame camera, may be deployed on all three types of platforms for different applications. Each type of platform has unique advantages and disadvantages in terms of spatial coverage, access, and flexibility. A student who completes this course should be able to identify the appropriate sensor/platform combination for a variety of common GIS applications.

Lesson 2 introduces the most common types of sensors used for mapping and image analysis. These include aerial cameras, both film and digital, as well as sensors flown on commercial satellites. New cameras and sensors are introduced every year as the remote sensing industry grows and technology advances. The principles of sensor design introduced in this lesson apply to new as well as older instruments used for image data capture. This course focuses on optical sensors: those that passively record reflected and radiant energy in the visible and near-visible wavelength bands of the electromagnetic spectrum. Other courses in this curriculum delve into both active sensors (such as lidar and radar) and passive sensors that operate outside the optical portion of the spectrum (thermal and passive microwave).

Digital images are clearly very useful in many applications (a picture is worth a thousand words), but their usefulness is greatly enhanced when the image is accurately georeferenced. The ability to locate objects and make measurements makes almost every remotely sensed image far more useful. Georeferencing of images is accomplished using photogrammetric methods, such as aerotriangulation (A/T) or Structure from Motion (SfM). Geometric distortions due to the sensor optics, the atmosphere, earth curvature, perspective, and terrain displacement must all be taken into account. Furthermore, a reference system must be established in order to assign real-world coordinates to pixels or features in the image. Georeferencing is relatively simple in concept, but it quickly becomes more complex in practice due to the intricacies of both the technology and coordinate systems.
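To make the idea of assigning real-world coordinates to pixels concrete, the minimal sketch below applies a six-parameter affine geotransform (the model stored in ESRI world files and GDAL-style geotransforms; parameter ordering here follows GDAL) to convert a pixel's column and row into projected map coordinates. The image corner, pixel size, and coordinates are invented purely for illustration.

    # Minimal sketch: mapping a pixel (col, row) to map coordinates with a
    # six-parameter affine geotransform. All numbers are illustrative,
    # not taken from a real image.

    def pixel_to_map(col, row, gt):
        """gt = (x_origin, pixel_width, row_rotation,
                 y_origin, col_rotation, pixel_height)."""
        x = gt[0] + col * gt[1] + row * gt[2]
        y = gt[3] + col * gt[4] + row * gt[5]
        return x, y

    # Hypothetical orthoimage: upper-left corner at (500000, 4650000) in a
    # UTM zone, 1-meter pixels, north-up (no rotation terms).
    gt = (500000.0, 1.0, 0.0, 4650000.0, 0.0, -1.0)

    print(pixel_to_map(250, 100, gt))   # -> (500250.0, 4649900.0)

Note that a single affine transform like this is only valid for an image that has already been orthorectified; the photogrammetric methods mentioned above (A/T, SfM) are what remove perspective and terrain displacement so that such a simple pixel-to-map relationship holds.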

Lesson Objectives

At the end of this lesson, you will be able to:

  • describe various types of remote sensing instruments used to create base map imagery and elevation data, including film cameras, digital multispectral and hyperspectral sensors, lidar, and radar;
  • describe common platforms for deployment of sensors, including fixed-wing and rotary-wing aircraft, satellites, and ground-based vehicles;
  • identify appropriate sensor/platform combinations for a variety of geospatial applications;
  • describe technologies and methods used to georeference remotely sensed data;
  • explain the difference between a datum, a coordinate system, and a map projection (a short illustrative sketch follows this list);
  • identify primary coordinate systems used for imagery and elevation data in the conterminous United States;
  • identify metadata fields that describe georeferencing in a variety of image and elevation data sets acquired from public domain sources;
  • import imagery and elevation data into ArcGIS in the correct geographic location, identifying and compensating for missing or incorrect information in the provided metadata.
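As a concrete illustration of the datum / coordinate system / map projection distinction, the sketch below expresses the same ground point in a geographic coordinate system (latitude and longitude on the NAD83 datum) and in a projected coordinate system (NAD83 / UTM zone 18N, in meters). It uses the open-source pyproj library purely for illustration; the hands-on exercises in this course use ArcGIS, and the example point is an arbitrary location in central Pennsylvania.

    # Illustrative sketch (not part of the course exercises):
    # the same point on the ground expressed in a geographic CRS (NAD83,
    # latitude/longitude) and a projected CRS (NAD83 / UTM zone 18N, meters).
    # Requires the third-party pyproj package.
    from pyproj import Transformer

    # EPSG:4269 = NAD83 geographic; EPSG:26918 = NAD83 / UTM zone 18N
    transformer = Transformer.from_crs("EPSG:4269", "EPSG:26918", always_xy=True)

    lon, lat = -77.86, 40.79          # arbitrary point in central Pennsylvania
    easting, northing = transformer.transform(lon, lat)
    print(f"{easting:.1f} m E, {northing:.1f} m N")

In this example, the datum (NAD83) fixes the reference ellipsoid and its attachment to the earth, the coordinate system determines how positions are expressed (degrees of latitude and longitude versus eastings and northings in meters), and the map projection (transverse Mercator, in the UTM case) defines the mathematical mapping from the curved earth to the flat grid.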

Questions?

If you have any questions now or at any point during this week, please feel free to post them to the Lesson 2 Questions and Comments Discussion Forum in Canvas.