Remote Sensing Techniques

Table of Contents
Remote sensing basics
Aerial photography
Manned-space photography
Landsat satellite imagery

Remote Sensing Basics

Remote sensing is the observation and measurement of objects from a distance, i.e., instruments or recorders are not in direct contact with the objects under investigation. Remote sensing depends upon measuring some kind of energy that is emitted, transmitted, or reflected from an object in order to determine certain physical properties of the object. One of the most common types of remote sensing is photography, which, along with many other techniques, is utilized for the images in this Lewis and Clark atlas.

These techniques are based on sensing electromagnetic energy emitted or reflected from the Earth's surface and detected at some altitude above the ground. The electromagnetic spectrum is, thus, the starting point for understanding remote sensing. Passive remote sensing is based on detecting available (background) electromagnetic energy from natural sources, such as sunlight. Active remote sensing, in contrast, depends on an artificial "light" source, such as radar, to illuminate the scene.


Taken from the U.S. Geological Survey EROS Data Center--see EDC.

Sunlight is the main source of energy at the Earth's surface, with most of its energy in the ultraviolet, visible, and short infrared portions of the spectrum. The Earth itself is a much weaker source of energy at the longer wavelengths of thermal infrared and microwaves. All passive remote sensing is based on these two energy sources.
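Wien's displacement law makes these two energy sources concrete: the wavelength of peak blackbody emission is inversely proportional to temperature. Below is a minimal Python sketch, assuming round-number temperatures of about 5800 K for the Sun and 288 K for the Earth's surface (common textbook approximations, not values from this atlas).

```python
# Wien's displacement law: wavelength of peak blackbody emission.
# Illustrates why sunlight peaks in the visible while the Earth's own
# emission peaks in the thermal infrared.

WIEN_B = 2898.0  # Wien's displacement constant, in micrometer-kelvins

def peak_wavelength_um(temperature_k: float) -> float:
    """Wavelength of maximum blackbody emission, in micrometers."""
    return WIEN_B / temperature_k

print(f"Sun   (~5800 K): peak near {peak_wavelength_um(5800):.2f} um (visible light)")
print(f"Earth (~288 K):  peak near {peak_wavelength_um(288):.1f} um (thermal infrared)")
```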

Aerial Photography

Photography is one of the oldest and most versatile forms of remote sensing of the Earth's surface. All photographic cameras have certain basic components: lens, diaphragm, shutter, viewfinder, and image plane. The geometry of the lens and the film format determine the scene area focused onto the image plane. The diaphragm and shutter control the amount of light that exposes each photograph.

The spectral sensitivity of photography ranges from about 0.3 µm (near-ultraviolet) to 0.9 µm (near-infrared). Different parts of the spectrum may be photographed by using various combinations of films and filters. Photographs are routinely taken in black-and-white panchromatic, black-and-white infrared, color-visible, color-infrared, and multiband types. For example, color-infrared film is exposed to green, red, and near-infrared wavelengths, which are depicted as blue, green, and red in the photograph. This shifting of bands to visible colors is called false-color.
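The band shift is easy to see as an array operation. The following Python sketch uses made-up reflectance values, not data from any photograph here, to show how the near-infrared, red, and green bands are loaded into the red, green, and blue display channels.

```python
import numpy as np

# False-color band shift: near-infrared is displayed as red, red as
# green, and green as blue. The tiny 2x2 arrays below are synthetic
# reflectance values (0-1) for illustration only.

green = np.array([[0.10, 0.12], [0.30, 0.11]])
red   = np.array([[0.08, 0.10], [0.28, 0.09]])
nir   = np.array([[0.55, 0.60], [0.20, 0.58]])  # vegetation is bright in NIR

# Stack bands into an RGB display image: R <- NIR, G <- red, B <- green.
false_color = np.dstack([nir, red, green])

print(false_color[0, 0])  # [0.55 0.08 0.10] -> a strong red pixel
```

Because healthy vegetation reflects strongly in the near-infrared, the shifted pixel is dominated by the red display channel, which is why active vegetation appears red and pink in color-infrared photographs.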

Color photograph in visible light. Cottonwood River at Cottonwood Falls, Kansas. Note normal appearance of vegetation, water, and other objects in the view. Photo date 5/98; © J.S. Aber.
Color-infrared photograph of Cottonwood River at Cottonwood Falls, Kansas. Active vegetation appears red and pink. Photo date 5/98; © J.S. Aber.

Basic cameras and film.
Glossary of photographic terms.

Aerial photographs may be taken in vertical, low-oblique, or high-oblique positions; standard air photos are vertical views of the ground. Vertical photographs are normally acquired in an overlapping pattern, so as to create a stereoscopic effect when adjacent pairs are viewed together. The overlapping views of the same ground area produce a parallax effect, which is also the basis for depth perception in human vision. This ability to perceive depth is quite useful for visual interpretation of air photos.
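The link between parallax and height can be expressed with the standard photogrammetric relation h = (H × dp) / (pb + dp), where H is the flying height above ground, pb is the absolute stereoscopic parallax at the base of the object, and dp is the differential parallax between the object's top and base. A minimal Python sketch with hypothetical measurements (none of the numbers come from this atlas):

```python
# Object height from parallax measurements on an overlapping stereo pair:
#     h = (H * dp) / (pb + dp)

def object_height(flying_height_m: float, base_parallax_mm: float,
                  differential_parallax_mm: float) -> float:
    """Object height from parallax measured on a vertical stereo pair."""
    return (flying_height_m * differential_parallax_mm) / (
        base_parallax_mm + differential_parallax_mm)

# Hypothetical example: 1500 m flying height, 88.0 mm base parallax,
# and 1.5 mm of differential parallax measured for a building.
print(f"{object_height(1500, 88.0, 1.5):.1f} m")  # about 25 m
```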

Examples of aerial photographs.
See Aero-Metric.

Aerial photography is typically done from specially equipped airplanes or helicopters nowadays. However, many other manned or unmanned platforms may be utilized to hold the camera above the ground, including balloons, tethered blimps, kites, radio-controlled model planes, and rockets--see Project Corona.

Low-Height Unmanned Aerial Platforms

Great Plains kite aerial photography.
Sensoar for low-height remote sensing.
Kite and blimp aerial photography.
Unmanned aerial vehicles--UAV.

Aerial photographs are routinely employed for all manner of mapping and evaluation of natural and cultural resources, including agriculture (crops and soils), archeology, biology (habitat, wildlife census), forestry, geology, geomorphology, engineering, hydrology, industrial development, military (camouflage detection, espionage, terrain models), mineral and oil prospecting, pollution (air, land, water), reclamation, transportation, urban planning, etc.

Manned-Space Photography

U.S. manned space photographs of the Earth began with the Mercury, Gemini, and Apollo missions of the 1960s. Many spectacular images of the lands, oceans, and atmosphere were obtained, and these demonstrated the potential of synoptic space photography for various earth science investigations. With few exceptions, these photos were taken at the discretion of astronauts, who had little formal background in either photography or earth science.

Systematic manned space photography was undertaken during the Skylab missions of 1973 and 1974. Skylab was an orbiting space station that was utilized for extended Earth observations. Skylab 4 was the most successful; about 2000 photographs were obtained of more than 850 features and phenomena (Wilmarth et al. 1977). The lessons learned during the Skylab missions formed the basis for the program of space-shuttle photography in the 1980s and '90s.

South Island of New Zealand.
Chicago and Lake Michigan, Illinois.

Astronauts on space-shuttle missions have taken more than 250,000 photographs with hand-held cameras. A large portion of these are Earth-looking images that provide unique views of the world's surface features: deserts, volcanoes, mountains, coasts, oceans, glaciers, sea ice, rivers, lakes, dust, fires, clouds, and human settlement. Space-shuttle photographs are taken in all possible orientations--near vertical, low oblique, and high oblique--for a greater range of perspectives than is possible with any unmanned satellite instruments.

Space-shuttle photography is the result of systematic Earth and environmental science training given to each astronaut crew prior to flight. Much of the photography consists of revisits to targets previously photographed. Each day during flight, astronauts are given instructions for photographs based on cloud cover and orbital conditions. Astronauts are also free to take photographs of any interesting or attractive scenes. Special advantages of astronaut photography include the following (Lulla et al. 1993).

  1. Images are taken at various sun angles, ranging from negative angles (below horizon) to nearly vertical. This provides for unique views under various lighting conditions that are not available with other remote sensing systems.

  2. Sequential (overlapping) near-vertical photographs with different look angles provide for stereoscopic viewing.

  3. Little-known tropical areas are well represented in the photography database. These regions are undergoing very rapid human-induced environmental transformations.

Various cameras, lenses, and film types are employed for space-shuttle photography. Hasselblad cameras are most commonly used; these take pictures on 70-mm (2¾-inch) format film. Lenses vary from 40 mm to 250 mm focal length. The Linhof camera is also utilized on most missions; it takes 127-mm (5-inch) format film. Other still cameras that have been used include Rolleiflex (70 mm) and Nikon (35 mm). IMAX and 16-mm movie footage are taken on some flights, along with video imagery. The electronic still camera (ESC) is a charge-coupled device that produces near-film-quality images in digital format; it has been utilized on several missions since 1991. Its current spectral response range is 0.4 to 1.1 µm, which is broader than that of films now in use (Lulla and Holland 1993).
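Focal length and film format together set the ground footprint of each frame through the simple scale relation: ground coverage = altitude × frame size ÷ focal length. A rough Python sketch, assuming a 300-km orbital altitude and a 55-mm usable frame on 70-mm film (both are illustrative assumptions, not figures from this atlas):

```python
# Ground footprint of a frame from orbit, using the scale relation
#     ground coverage = altitude * frame_size / focal_length

ALTITUDE_M = 300_000.0   # assumed space-shuttle altitude
FRAME_M    = 0.055       # assumed usable frame width of 70-mm film

for focal_length_mm in (40, 100, 250):
    coverage_km = ALTITUDE_M * FRAME_M / (focal_length_mm / 1000.0) / 1000.0
    print(f"{focal_length_mm:3d}-mm lens: about {coverage_km:4.0f} km on a side")
```

This is why the same camera body can return both synoptic regional views (short lens) and detailed looks at single features (long lens).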

Film selection for space-shuttle photography is relatively conservative. Most photographs are exposed on color-visible films of various types. Color-infrared film is used occasionally, and other special film types are employed rarely. Most photographs have been digitized and converted to video format. "Digitization, rectification, multi-layering (GIS), classification, and mensuration of these digitized analog images is now fairly routine ..." (Lulla et al. 1993).

Florida, Gulf of Mexico, and Atlantic Ocean.
Smoke from forest fires in Queensland, Australia.
Color-infrared view of Bangkok, Thailand.
Color-infrared view, Mt. St. Helens, Washington.

Time of launch and crew sleeping schedules affect the times of daylight when photographs may be acquired, as very few pictures are taken on the Earth's night side. Thus, some missions return pictures mainly from the northern hemisphere, whereas other missions take photos mostly in the southern hemisphere. Another important factor is cloud cover, which often obscures ground features of interest; for certain equatorial sites, few or no cloud-free photographs exist. Photography is not the primary objective on most space-shuttle flights, so pictures are taken when astronauts are not busy with other duties. The result of these factors is very uneven coverage of the world.

The presence of human photographers in space provides some important capabilities: to observe and respond quickly to unusual ground events, to preselect and acquire scientifically useful images, and to photograph phenomena from different look angles. "These qualities demonstrate the value of a human observer in orbit in adding to understanding of our planet Earth, and interacting with ground scientists in real time to acquire data and confirm or deny events." (Lulla and Helfert 1991)

More information

NASA Johnson Space Center--Imagery services.
NASA Johnson Space Center--Earth from space.

Landsat Satellite Imagery

Remotely sensed satellite observations from space have fundamentally changed the way in which scientists study the atmosphere, oceans, land, vegetation, glaciers, sea ice, and other environmental aspects of the Earth's surface. In 1951, science-fiction writer Arthur C. Clarke proposed that a satellite in polar (N-S) orbit could allow humans to systematically view the whole planet, as the Earth rotated beneath the satellite (Lauer et al. 1997). The first photographs taken by astronauts and early satellite images began to change our concept of the Earth in the 1960s, and remote sensing of the Earth's surface has been continuous ever since. Viewing the Earth from space leads to a powerful impression.

The Earth is an integrated, complex, dynamic
system that is unique so far as we know.

A great many satellite systems have been designed and operated during the past three decades. NASA's Goddard Space Flight Center in Maryland has taken the lead in developing experimental satellite technology for scientific observations of the Earth. The new methodology of earth science is based on satellite data, which allow a "whole earth" approach to studying the environment. Remotely sensed satellite data and images of the Earth have four important advantages compared to ground observations.

  1. Synoptic view: Satellite images are "big-picture" views of large areas of the surface. The positions, distribution, and spatial relationships of features are clearly evident; megapatterns within landscapes, seascapes, and icescapes are emphasized. Major biologic, tectonic, hydrologic, and geomorphic factors stand out distinctly.

  2. Repetitive coverage: Repeated images of the same regions, taken at regular intervals over periods of days, years, and decades, provide databases for recognizing and measuring environmental changes. This is crucial for understanding where, when, and how the modern environment is changing.

  3. Multispectral data: Satellite sensors are designed to operate in many different portions of the electromagnetic spectrum. Ultraviolet, visible, infrared, and microwave energy coming from the Earth's surface or atmosphere contain a wealth of information about material composition and physical conditions.

  4. Low-cost data: Near-global, repetitive data collection is far cheaper with satellite sensors than it would be through conventional ground surveys of the same type and quantity.

The satellite series now known as Landsat was conceived in the 1960s, and Landsat 1 went into operation in 1972. The images generated by this satellite created a sensation in the scientific community and the popular imagination. More Landsat satellites followed--Landsat 5 is now in operation, and Landsat 7 was launched in 1999 (see below). Landsat imagery is based on a technique called scanning, in which a picture is built up from rows and columns of picture elements (pixels) that represent the scene. Two instruments have been utilized on the Landsat series in the 1970s, '80s, and '90s.

  1. Multispectral Scanner (MSS) -- A moderate-resolution scanner that collected data in four spectral bands: green, red, and two near-infrared channels. Pixel size in processed datasets is 57 m by 57 m. For more information--see Landsat MSS.

  2. Thematic Mapper (TM) -- An advanced high-resolution scanner that collects spectral data in seven bands: blue, green, red, near-infrared, two mid-infrared, and thermal-infrared. Pixel size in processed datasets is 28½ m by 28½ m. For more information--see Landsat TM. A short worked example of both pixel scales follows this list.
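To put those pixel sizes in perspective, the following Python sketch converts them into pixel counts across the standard 185-km Landsat swath and per square kilometer. The swath width is a widely published figure; the rest is simple arithmetic.

```python
# Convert MSS and TM pixel sizes into counts of pixels on the ground.

SWATH_KM = 185.0  # standard Landsat swath width

for name, pixel_m in (("MSS", 57.0), ("TM", 28.5)):
    pixels_across = SWATH_KM * 1000.0 / pixel_m
    per_km2 = (1000.0 / pixel_m) ** 2
    print(f"{name}: {pixels_across:,.0f} pixels across the swath, "
          f"about {per_km2:,.0f} pixels per square km")
```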

In 1999 Landsat 7 was successfully launched and began to collect imagery of the Earth's land areas. It carries the Enhanced Thematic Mapper Plus (ETM+), which has several improvements compared to the older Thematic Mapper. Landsat 7 continues the legacy of moderate-resolution satellite imagery of the Earth's environmental conditions.

The following examples show the Missouri River valley and adjacent terrain at Leavenworth, Kansas. The scene covers approximately 9 by 12 miles (14 by 20 km). Each image is derived from the same dataset, acquired on 28 Sept. 1994 by the Landsat 5 satellite. Autumn imagery is particularly good for general display of the landscape, as there are many variations in vegetation and water bodies, and the lower sun position creates some shadowing of the terrain. The different appearances of these examples reflect the spectral bands and color coding used to create the composite images. Both natural-color and false-color composites are utilized throughout the atlas.

Natural-color composite. Image consists of blue, green, and red visible light portrayed in a natural manner. Active vegetation appears green, bare soil and fallow fields are brown, and urban structures are white. Clean water bodies appear black, whereas the Missouri River displays a muddy brown color. Landsat TM bands 1, 2, 3; image processing by J.S. Aber ©.
Standard false-color composite. Image consists of green, red, and near-infrared light portrayed in a false-color manner. Active vegetation appears red-pink, bare soil and fallow fields are green, and urban structures are bluish-white. Clean water bodies appear black, whereas the Missouri River displays a green-brown color. Landsat TM bands 2, 3, 4; image processing by J.S. Aber ©.
Special-color composite. Image consists of red, near-infrared and mid-infrared light portrayed in a false-color manner. Active vegetation appears green, bare soil and fallow fields are red-brown, and urban structures are purple. Most water bodies are black, and the Missouri River appears dark blue. Landsat TM bands 3, 4, 5; image processing by J.S. Aber ©.
Special-color composite. Image consists of green, near-infrared and mid-infrared light portrayed in a false-color manner. Active vegetation appears green, bare soil and fallow fields are red-brown, and urban structures are purple. Most water bodies are black, and the Missouri River appears dark blue. Landsat TM bands 2, 4, 7; image processing by J.S. Aber ©.
Vegetation image based on red and near-infrared bands. Active vegetation is green; dormant or dry plants are yellow; plowed fields, pavement, and bare ground are brown. All water bodies appear black. Landsat TM bands 3 and 4 are utilized to create the normalized difference vegetation index--see NDVI; a short worked example follows these captions. Image processing by J.S. Aber ©.
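As noted in the last example, NDVI is computed from the red and near-infrared bands as (NIR - red) / (NIR + red). A minimal Python sketch using made-up reflectance values rather than the Leavenworth data shown above:

```python
import numpy as np

# Normalized difference vegetation index from Landsat TM band 3 (red)
# and band 4 (near-infrared). The 2x2 arrays are synthetic values.

red = np.array([[0.08, 0.25], [0.10, 0.30]])   # TM band 3
nir = np.array([[0.50, 0.27], [0.55, 0.32]])   # TM band 4

ndvi = (nir - red) / (nir + red + 1e-9)  # small term avoids divide-by-zero

print(ndvi.round(2))
# Active vegetation gives values near +0.7; bare soil, pavement, and
# water fall near zero or below.
```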

More Landsat Information

Landsat program overview.
USGS Landsat program.

Last update Feb. 2005.