Satellite imagery forms one of the basic
tools for remote sensing, which includes a wide variety of
methods to determine properties of a material from a distance.
Remote sensing can include subjects as diverse as seismology,
satellite image interpretation, and magnetics. The types of
satellite images available to the geologist are expanding rapidly,
and only those most commonly used are discussed here.
The Earth Resources Technology Satellite (ERTS-1), the first unmanned digital imaging satellite, was launched on July 23, 1972. Four other satellites from the same series,
later named Landsat, were launched at intervals of a few
years. The Landsat spacecraft carried a Multi-Spectral Scanner
(MSS), a Return Beam Vidicon (RBV), and later, a Thematic
Mapper (TM) imaging system.
Landsat Multi-Spectral Scanners produce images representing
four different bands of the electromagnetic spectrum.
The four bands are designated band 4 for the green spectral
region (0.5 to 0.6 micron); band 5 for the red spectral region
(0.6 to 0.7 micron); band 6 for the near-infrared region (0.7
to 0.8 micron); and band 7 for another near-infrared region
(0.8 to 1.1 micron).
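For work with MSS scenes in software, this band-to-wavelength mapping is conveniently kept in a small lookup table. The following Python sketch is purely illustrative; the dictionary and its layout are our own convention, not part of any Landsat toolkit:

    # MSS spectral bands as listed above (wavelengths in microns).
    MSS_BANDS = {
        4: ("green", 0.5, 0.6),
        5: ("red", 0.6, 0.7),
        6: ("near-infrared", 0.7, 0.8),
        7: ("near-infrared", 0.8, 1.1),
    }

    for number, (region, low, high) in MSS_BANDS.items():
        print(f"Band {number}: {region} region, {low}-{high} micron")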
Radiation reflectance data from the four scanner channels
are converted first into electrical signals, then into digital
form for transmission to receiving stations on Earth. The
recorded digital data are reformatted into what we know as
computer-compatible tapes (CCTs) and/or converted at special
processing laboratories to black-and-white images. These
images are recorded on four black-and-white films, from
which photographic prints are made in the usual manner.
The black-and-white images of each band provide different
sorts of information because each of the four bands
records a different range of radiation. For example, the green
band (band 4) most clearly shows underwater features,
because of the ability of “green” radiation to penetrate shallow
water, and is therefore useful in coastal studies. The two
near-infrared bands, which measure the reflectance of the
Sun’s rays outside the sensitivity of the human eye (the visible range), are useful in the study of vegetation cover.
When these black-and-white bands are combined, false-color images are produced. For example, in the most popular combination of bands 4, 5, and 7, red is assigned to the near-infrared band 7, green to the red band 5, and blue to the green band 4. Vegetation appears red because
plant tissue is one of the most highly reflective materials in
the infrared portion of the spectrum, and thus, the healthier
the vegetation, the redder the color of the image. Also,
because water absorbs nearly all infrared rays, clear water
appears black on band 7. Therefore, this band cannot be
used to study features beneath water even in the very shallow
coastal zones. However, it is very useful in delineating the
contact between water bodies and land areas.
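The compositing procedure described above amounts to stacking three co-registered band images into the red, green, and blue channels of a color image. The sketch below, using NumPy, is a minimal illustration under our own naming, assuming the band arrays have already been read from a CCT or image file and co-registered; the linear contrast stretch is a simplification of operational photo processing:

    import numpy as np

    def stretch(band):
        # Linear contrast stretch to the 0-1 display range.
        low, high = band.min(), band.max()
        return (band - low) / (high - low + 1e-9)

    def false_color_composite(nir, red, green):
        # For the MSS 4/5/7 combination described above, pass
        # band 7 as nir, band 5 as red, and band 4 as green.
        # Healthy vegetation (bright in band 7) renders red;
        # clear water (dark in band 7) renders nearly black.
        return np.dstack([stretch(nir), stretch(red), stretch(green)])

    # Example with random stand-in data of the same shape:
    rgb = false_color_composite(*np.random.rand(3, 512, 512))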
The Return Beam Vidicon (RBV) was originally flown in
the interest of the mapping community. It offered better geometric
accuracy and ground resolution (130 feet; 40 m) than was available from the Multi-Spectral Scanner (260 feet; 80 m) with which the RBV shared space on Landsats 1,
2, and 3. The RBV system contained three cameras that operated
in different spectral bands: blue-green, green-yellow, and
red-infrared. Each camera contained an optical lens, a shutter,
the RBV sensor, a thermoelectric cooler, deflection and
focus coils, erase lamps, and the sensor electronics. The three
RBV cameras were aligned in the spacecraft to view the same 115-by-115-mile (185 × 185 km) ground scene as the Landsat MSS.
Although the RBV is no longer in operation, its images remain available and can be used in mapping.
The Thematic Mapper (TM) is a sensor first carried on Landsats 4 and 5, with seven spectral bands covering the visible, near-infrared, and thermal infrared regions of the spectrum. Drawing on experience gained in operating the MSS, it was designed to satisfy more demanding performance parameters, and it offers a ground resolution of 100 feet (30 m).
The seven spectral bands were selected for their band
passes and radiometric resolutions. For example, band 1 of
the Thematic Mapper coincides with the maximum transmissivity
of water and demonstrates coastal water-mapping
capabilities superior to those of the MSS. It also has beneficial
features for the differentiation of coniferous and deciduous
vegetation. Bands 2–4 cover the spectral region that is
most significant for the characterization of vegetation. Vegetation
and soil moisture may be estimated from band 5 readings,
and plant transpiration rates may be estimated from the
thermal mapping in band 6. Band 7 is primarily motivated by
geological applications, including the identification of rocks
altered by percolating fluids during mineralization. The band
profiles, which are narrower than those of the MSS, are specified
with stringent tolerances, including steep slopes in spectral
response and minimal out-of-band sensitivity.
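Because band selection drives application, the assignments above are often easiest to consult as a table. The Python sketch below is illustrative only; wavelength ranges are included solely where this entry quotes them (bands 2, 4, 6, and 7, given below), and the structure is our own:

    # TM band applications as described above; wavelength ranges in
    # micrometers, included only where quoted in this entry.
    TM_BANDS = {
        1: (None, "coastal water mapping; conifer/deciduous separation"),
        2: ((0.50, 0.60), "vegetation characterization"),
        3: (None, "vegetation characterization"),
        4: ((0.76, 0.90), "vegetation characterization"),
        5: (None, "vegetation and soil moisture estimation"),
        6: ((10.4, 12.5), "thermal mapping; plant transpiration; hydrogeology"),
        7: ((2.08, 2.35), "geology; rocks altered during mineralization"),
    }

    for number, (band_range, use) in TM_BANDS.items():
        span = f"{band_range[0]}-{band_range[1]} um" if band_range else "not quoted here"
        print(f"Band {number} ({span}): {use}")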
The TM band combination of 7 (2.08–2.35 μm), 4 (0.76–0.90 μm), and 2 (0.50–0.60 μm) is commonly used for geological studies because of its ability to discriminate
features of interest, such as soil moisture anomalies, lithological
variations, and to some extent, mineralogical composition
of rocks and sediments. Band 7 is typically assigned to the red
channel, band 4 to green, and band 2 to blue. This procedure
results in a color composite image; the color of any given
pixel represents a combination of brightness values of the
three bands. With the full dynamic range of the sensors (8 bits, or 256 levels, per band), there are 256³ = 16.77 × 10⁶ possible colors. By convention, this false-color
combination is referred to as TM 742 (RGB). In addition to
the TM 742 band combination, the thermal band (TM band
6; 10.4–12.5 μm) is sometimes used in geology because it contains
useful information potentially relevant to hydrogeology.
The French Système pour l’Observation de la Terre
(SPOT) obtains data from a series of satellites in a sun-synchronous
500-mile (830-km) high orbit, with an inclination
of 98.7°. The SPOT system was designed by the Centre National d’Études Spatiales (CNES) and built by French industry in association with partners in Belgium and Sweden. Like the American Landsat, it consists of remote sensing satellites
and ground receiving stations. The imaging is accomplished
by two High-Resolution Visible (HRV) instruments
that operate in either a panchromatic (black-and-white) mode
for observation over a broad spectrum, or a multispectral
(color) mode for sensing in narrow spectral bands. The
ground resolutions are 33 and 66 feet (10 and 20 m) respectively.
For viewing directly beneath the spacecraft, the two
instruments can be pointed to cover adjacent areas. By pointing
a mirror that directs ground radiation to the sensors, it is
possible to observe any region within 280 miles (450 km)
from the nadir, thus allowing the acquisition of stereo pairs for three-dimensional viewing and the imaging of a given scene as frequently as every four days.
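The 450-km off-nadir reach implies only a modest mirror deflection, which a one-line calculation makes clear. The sketch below uses a flat-Earth approximation, which slightly overstates the angle at these distances; the function name is ours:

    import math

    def pointing_angle_deg(altitude_km, ground_offset_km):
        # Off-nadir look angle to a point a given distance from
        # the ground track (flat-Earth approximation).
        return math.degrees(math.atan(ground_offset_km / altitude_km))

    # SPOT figures from the text: ~830-km orbit, up to 450 km off nadir.
    print(f"{pointing_angle_deg(830, 450):.1f} degrees")  # about 28.5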
Radar is an active form of remote sensing, where the system
provides a source of electromagnetic energy to “illuminate”
the terrain. The energy returned from the terrain is
detected by the same system and is recorded as images. Radar
systems can be operated independently of light conditions
and can penetrate cloud cover. A special characteristic of
radar is the ability to illuminate the terrain from an optimum
position to enhance features of interest.
Airborne radar imaging has been extensively used to
reveal land surface features. However, until recently it has not
been suitable for use on satellites because: (1) power requirements
were excessive; and (2) for real-aperture systems, the
azimuth resolution at the long slant ranges of spacecraft
would be too poor for imaging purposes. The development of
new power systems and radar techniques has overcome the
first problem and synthetic-aperture radar systems have
remedied the second.
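The azimuth-resolution problem, and the synthetic-aperture remedy, can be made concrete with the standard first-order formulas: a real aperture resolves roughly λR/D in azimuth (wavelength λ, slant range R, antenna length D), while a fully focused synthetic aperture resolves roughly D/2 independent of range. The numbers below are illustrative assumptions; the L-band wavelength is the one quoted for SIR-C below, while the slant range and antenna length are representative values, not figures from this entry:

    # Real-aperture vs. synthetic-aperture azimuth resolution.
    wavelength = 0.235       # m, L band (quoted below for SIR-C)
    slant_range = 800e3      # m, representative spacecraft slant range (assumed)
    antenna_length = 10.0    # m, physical antenna length (assumed)

    real_res = wavelength * slant_range / antenna_length   # ~18,800 m
    sar_res = antenna_length / 2.0                         # 5 m

    print(f"Real aperture:      {real_res / 1000:.1f} km")
    print(f"Synthetic aperture: {sar_res:.1f} m")

At spacecraft ranges the real-aperture figure is measured in kilometers, which is why satellite radar imaging was impractical until SAR processing synthesized the effect of a much longer antenna.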
The first flight of the Shuttle Imaging Radar (SIR-A) in
November of 1981 acquired images of a variety of features
including faults, folds, outcrops, and dunes. Among the
revealed features are the sand-buried channels of ancient river
and stream courses in the Western Desert of Egypt. The second
flight, SIR-B, had a short life; however, the more
advanced and higher resolution SIR-C was flown in April 1994 (and again in October 1994). The SIR-C
system acquired data simultaneously at two wavelengths: L
band (23.5 cm) and C band (5.8 cm). At each wavelength, both horizontal and vertical polarizations were measured, providing dual-frequency, dual-polarization data with a
swath width between 18 and 42 miles (30 and 70 km), yielding
precise data with large ground coverage.
Different combinations of polarizations are used to produce
images showing much more detail about surface geometric
structure and subsurface discontinuities than a
single-polarization-mode image. Similarly, different wavelengths
are used to produce images showing different roughness
levels since radar brightness is most strongly influenced by
objects comparable in size to the radar wavelength; hence, the
shorter wavelength C band increases the perceived roughness.
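A common rule of thumb for "comparable in size" is the Rayleigh criterion, under which a surface appears radar-rough when its height variation exceeds about λ/(8 cos θ) at incidence angle θ. The criterion is not stated in this entry, so treat the sketch below as an assumption-laden illustration using the SIR-C wavelengths quoted above:

    import math

    def rough_threshold_cm(wavelength_cm, incidence_deg):
        # Rayleigh criterion: height variation above which a
        # surface scatters diffusely and appears "rough".
        return wavelength_cm / (8 * math.cos(math.radians(incidence_deg)))

    for name, wl in [("C band", 5.8), ("L band", 23.5)]:
        h = rough_threshold_cm(wl, incidence_deg=35)
        print(f"{name} ({wl} cm): rough above ~{h:.2f} cm of relief")

At a 35° incidence angle the C band threshold is near 0.9 cm of relief against roughly 3.6 cm for the L band, which is why the same surface looks rougher, and hence brighter, at the shorter wavelength.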
Interpretation of a radar image is not intuitive. The
mechanics of imaging and the measured characteristics of the
target are significantly different for microwave wavelengths
than at the more familiar optical wavelengths. Hence, the possible geometric and electromagnetic interactions of the radar waves with the anticipated surface types have to be assessed before the images are examined. In decreasing order of effect, the controlling qualities
are surface slope, incidence angle, surface roughness, and
the dielectric constant of the surface material.
Radar is uniquely able to map the geology at the surface and, in dry desert environments, to a maximum of about 30 feet (10 m) below the surface. Radar images are most useful
in mapping structural and morphological features, especially
fractures and drainage patterns, as well as the texture of rock
types, in addition to revealing sand-covered paleochannels.
The information contained in the radar images complements
that in the Thematic Mapper (TM) images. Radar also overcomes a limitation of Landsat, which can make only sporadic measurements where clouds or darkness intervene: because they are active rather than passive sensors, radar systems can “see” at night and through thick cloud cover.
Radarsat is an Earth observation satellite developed by Canada to support research on both environmental change and resource development. It was
launched in 1995 on a Delta II rocket with an expected life
span of five years. Radarsat operates with an advanced radar
sensor called Synthetic Aperture Radar (SAR). The synthetic aperture increases the effective resolution of the imaged area: the spatial resolution of a large antenna is synthesized by combining the multiple samples taken by a small antenna as it moves along its orbit. Radarsat’s SAR provides its own microwave illumination and thus can operate day or night, regardless of weather conditions. As such, the resulting images
are not affected by the presence of clouds, fog, smoke, or
darkness. This provides significant advantages in viewing
under conditions that preclude observation by optical satellites.
Using a single frequency (5.6-cm wavelength, horizontally polarized C band), the Radarsat SAR can shape and steer its radar beam
to image swaths between 20 and 300 miles (35 km to 500
km), with resolutions from 33 feet to 330 feet (10 m to 100
m), respectively. Incidence angles can range from less than
20° to more than 50°.
The Space Shuttle orbiters have the capability of reaching
various altitudes, which allows the selection of the required
photographic coverage. A camera specifically designed for mapping the Earth from space using stereo photographs was first flown in October 1984 on Space Shuttle Challenger mission 41-G. This advanced, purpose-built system for obtaining mapping-quality photographs from Earth orbit consisted of the Large Format Camera (LFC) and the supporting Attitude Reference System (ARS). The LFC derives its name from the size of its individual
frames, which are 18 inches (46 cm) in length and 9 inches (23 cm) in width. The 992-pound (450-kg) camera has a 12-
inch (305-mm) f/6 lens with a 40° × 74° field of view. The
film, which is three-fourths of a mile (1,200 m) in length, is
driven by a forward motion compensation mechanism as it is
exposed on a vacuum plate, which keeps it perfectly flat
(Doyle, 1985). The spectral range of the LFC is 400 to 900
nanometers, and its system resolution is 100 lines per millimeter
at 1,000:1 contrast and 88 lines per millimeter at 2:1 contrast.
This adds up to a photo-optical ground resolution of 33–66 feet (10–20 m) from an altitude of 135 miles (225 km) over the 22,000-square-mile (57,000-km²) area covered by each photograph. The uniformity of illumination, to within 10 percent, minimizes vignetting. The framing interval, adjustable from 5 to 45 seconds, allows operation at various spacecraft altitudes.
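The quoted figures can be cross-checked with simple photogrammetric arithmetic: the photo scale is altitude divided by focal length, and the resolvable ground distance is roughly the scale divided by the film resolution in lines per millimeter. The sketch below, under our own naming, uses only numbers quoted above; the laboratory resolution figures give a best case of roughly 7–8 m, so the operational 10–20 m figure already allows for real scene contrast and platform motion:

    import math

    # Large Format Camera figures quoted above.
    focal_length_m = 0.305
    altitude_m = 225_000.0
    fov_deg = (40.0, 74.0)

    scale = altitude_m / focal_length_m          # ~737,700:1 photo scale

    for contrast, lines_per_mm in [("1,000:1", 100.0), ("2:1", 88.0)]:
        ground_res_m = scale / lines_per_mm / 1000.0
        print(f"{contrast} contrast: ~{ground_res_m:.1f} m per line")

    # Footprint: 2 * altitude * tan(half field angle) on each side.
    sides_km = [2 * altitude_m / 1000 * math.tan(math.radians(a / 2))
                for a in fov_deg]
    area = sides_km[0] * sides_km[1]             # ~55,500 km^2, close to
    print(f"Footprint: {area:,.0f} km^2")        # the 57,000 km^2 quoted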
The ARS is composed of two cameras, their axes normal to that of the LFC, that take 35-millimeter photographs of star fields at the same
instant as the LFC takes a photograph of the Earth’s surface.
The precisely known positions of the stars allow the calculation
of the exact orientation of the Shuttle orbiter, and particularly
of the LFC in the Shuttle cargo bay. These accurate orientation data, together with the LFC characteristics, allow each frame to be located to within about two-thirds of a mile (1 km) and topographic maps of the photographed areas to be made at scales of up to 1:50,000.
See also REMOTE SENSING.