National Aeronautics and Space Administration
Goddard Space Flight Center
This image of the San Francisco Bay region was acquired
on March 3, 2000 by the Advanced Spaceborne Thermal Emission and Reflection
Radiometer (ASTER) on NASA's Terra satellite. With its 14 spectral bands from
the visible to the thermal infrared wavelength region, and its high spatial
resolution of 15 to 90 meters (approximately 50 to 300 feet), ASTER will image
Earth for the next 6 years to map and monitor the changing surface of our
planet.
This image covers an area 60 kilometers (37 miles) wide
and 75 kilometers (47 miles) long in three bands of the reflected visible
and infrared wavelength region. The combination of bands portrays vegetation
in red, and urban areas in gray. Sediment in the Suisun Bay, San Pablo Bay,
San Francisco Bay, and the Pacific Ocean shows up as lighter shades of blue
and green. Along the West Coast of the San Francisco Peninsula, strong surf
can be seen as a white fringe along the shoreline. A powerful rip tide is
visible extending westward from Daly City into the Pacific Ocean. In the lower
right corner, the wetlands of the South San Francisco Bay National Wildlife
Refuge appear as large dark blue, brown, purple, and orange polygons.
The high spatial resolution of ASTER allows fine detail
to be observed in the scene. The main bridges of the area (San Mateo, San
Francisco-Oakland Bay, Golden Gate, Richmond-San Rafael, Benicia-Martinez,
and Carquinez) are clearly visible, connecting the different communities
in the Bay area.
About ASTER
The ASTER instrument, provided by Japan's Ministry of
International Trade and Industry and built by NEC, Mitsubishi Electronics
Company, and Fujitsu, Ltd., measures cloud properties, vegetation index, surface
mineralogy, soil properties, surface temperature, and surface topography for
selected regions of the Earth. The international ASTER Science Team is led
by Dr. H. Tsu at the Shikoku National Industrial Research Institute and Dr.
A. Kahle at the Jet Propulsion Laboratory. http://asterweb.jpl.nasa.gov/
When viewing images such as the image of San Francisco
on the front of this lithograph, scientists are often asked why certain parts
of the image are red and other parts blue, green, gray, or purple. The answer
is that scientists have chosen to display three different wavelengths, and each
part of the image is bright wherever the surface is highly reflective at those
wavelengths. The colors are the result of using instruments that are sensitive
to parts of the spectrum different from the part that our eyes are sensitive
to (the visible portion of the spectrum).
The ASTER instrument on board the Terra spacecraft does
not "see" in color. Every image is obtained in a gray scale from
black to white based on the brightness of radiation at a precise wavelength (between
0.52 and 11.65 microns). These electronic cameras only collect digital signal
levels that are displayed as gray scale images from black to white, but they
can obtain many images at the same time in different parts of the spectrum.
If we look at the diagram of the spectrum below, we see
several broad regions that include the ultraviolet (wavelengths between 0.3
and 0.4 microns), visible (0.4 to 0.7 microns), near infrared (0.7 to 1.2 microns),
the solar reflected infrared (1.2 to 3.2 microns), the mid-infrared (3.2 to
15 microns), and the far infrared (longer than 15 microns).
Notice that the vertical axis (called % atmospheric transmission)
shows where Earth’s atmosphere allows a lot of the radiation, including the
sun's rays, to reach the ground. It is easy to see why our eyes work in the
visible part of the spectrum, because it's here that almost 100% of the sun's
energy reaches the surface. In contrast, due to water vapor in the atmosphere,
almost no energy (roughly 0% transmission) reaches the ground at 1.4 and
1.9 microns.
This image of the San Francisco Bay area was obtained
using ASTER Bands 1, 2 and 3 (representing a spectral range of 0.52 to 0.86
microns) as a set of three individual, yet simultaneous black and white images.
In the diagram above, all of ASTER's 14 "bands" are represented,
along with the wavelength of the band. Note that not all bands have the same
wavelength range; for example, bands 3 and 4 have a range of 0.1 microns (wide) while
bands 5 and 6 have a range of 0.04 microns (narrow).
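The bandwidth comparison above can be checked with a short calculation. The band edges for Bands 1 through 3 come from the text below; the Band 5 edges are the published ASTER values, included here only as an assumption to illustrate a "narrow" band:

```python
# ASTER band edges in microns. Bands 1-3 come from the text; the Band 5
# edges (2.145 - 2.185) are published ASTER values, included here as an
# assumption to illustrate a narrow band.
band_edges = {
    1: (0.52, 0.60),
    2: (0.63, 0.69),
    3: (0.76, 0.86),
    5: (2.145, 2.185),
}

for band, (low, high) in band_edges.items():
    # Bandwidth is simply the difference between the band's edges.
    print(f"Band {band}: {high - low:.2f} microns wide")
```

Band 3 works out to 0.10 microns wide and Band 5 to 0.04 microns wide, matching the wide/narrow contrast described above.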
The value in obtaining multiple images at different wavelengths
can be seen in the grayscale images on the front. Here we show the individual
images that were obtained by Band 1 (0.52 - 0.60 microns), Band 2 (0.63 -
0.69 microns) and Band 3 (0.76 - 0.86 microns). Careful inspection of these
three images shows that different parts of San Francisco have different brightnesses
at different wavelengths.
For example, at point "A" in Bands 1 and 2,
sediment is revealed in the bays, but is barely noticeable in Band 3. At the
shorter wavelengths of Bands 1 and 2, light penetrates the deepest into water,
and the sediment reflects the light. At point "B" in Bands 1 and
2 vegetated areas are quite dark, but at the same place in Band 3, vegetation
appears quite bright. Vegetation reflects light very strongly in the near
infrared wavelength region, and so appears bright in Band 3. This is due to
the reflective nature of chlorophyll in the leaves at these wavelengths. Additionally,
light is almost completely absorbed by water at these wavelengths, so very
little detail is visible. Finally, Bands 1 and 2 display a great contrast between
vegetated and urban areas at point "C", while the difference is
barely noticeable in Band 3.
Material Reflectance Spectra

Brightness differences can be further understood by examining
the figure above. The figure shows three curves, each called a reflectance
spectrum, for three different materials: water, concrete (urban areas), and
vegetation. In each spectrum, the percentage of light that is reflected is
shown as a function of wavelength. Notice that "vegetation" is very
bright around 0.8 microns compared to concrete. The relative contrast is reversed
at wavelengths around 1.5 microns. Water, in comparison, is brightest at short
wavelengths, roughly 0.7 microns, and we see no reflection at a wavelength
longer than about 0.9 microns. Using these spectra, we can understand the
relative contrast of the water, vegetation, and urban areas. If we continued
these spectra to thermal wavelengths, we would also have seen that temperature
becomes important, with hot objects being brighter than cooler objects.
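The contrast reversals described by the reflectance spectra can be sketched numerically. The reflectance percentages below are assumptions chosen only to mirror the qualitative behavior of the three curves (vegetation bright in the near infrared, water dark there, and brighter at short wavelengths), not measured values:

```python
# Illustrative reflectance percentages at a visible (0.55 micron) and a
# near-infrared (0.80 micron) wavelength. These numbers are made up for
# illustration; they only mimic the shape of the spectra in the figure.
reflectance = {
    "water":      {0.55: 8.0,  0.80: 1.0},
    "vegetation": {0.55: 10.0, 0.80: 50.0},
    "concrete":   {0.55: 25.0, 0.80: 30.0},
}

for wavelength in (0.55, 0.80):
    # Rank the materials from brightest to darkest at this wavelength.
    ranking = sorted(reflectance,
                     key=lambda m: reflectance[m][wavelength],
                     reverse=True)
    print(f"At {wavelength} microns, brightest to darkest: {ranking}")
```

At the near-infrared wavelength the ranking puts vegetation on top and water at the bottom, which is exactly the contrast reversal the figure shows.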
Having seen that different bands (or wavelengths) have
a different contrast, we can now understand how the computer can produce a
false color image from a remote sensing data set. To do this, we start with
the three black and white images corresponding to Bands 1, 2 and 3. Then,
just like a color television set, our computer screen can display three different
images using blue light, green light and red light. The combination of these
three wavelengths of light will generate the color image that our eyes can
see. So, if we display Band 1 in blue light, Band 2 in green light, and Band
3 in red light, we get the relative contrast between the three images. More
importantly, when these three colors are combined we get our color image (the
image on the front of this lithograph), which we call a "false color image"
because its colors have nothing to do with the colors we see with our eyes.
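The band-to-color assignment described above can be sketched in a few lines of Python using NumPy. The function name and the simple per-channel contrast stretch are our own choices, not part of the actual ASTER processing:

```python
import numpy as np

def false_color_composite(band1, band2, band3):
    """Combine three grayscale band images into one RGB false-color image.

    Following the assignment in the text: Band 3 (near infrared) is shown
    in red, Band 2 in green, and Band 1 in blue, so vegetation (bright in
    Band 3) comes out red in the composite.
    """
    # Stack into a (rows, cols, 3) array with the near-infrared band in
    # the red channel.
    rgb = np.dstack([band3, band2, band1]).astype(float)
    # Stretch each channel independently to the 0-1 display range.
    rgb -= rgb.min(axis=(0, 1))
    rgb /= rgb.max(axis=(0, 1))
    return rgb
```

Displaying the resulting array with any image viewer reproduces the idea of the color television set: three grayscale images shown simultaneously in blue, green, and red light.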
Adapted from Virtually Hawaii, An Introduction to Remote
Sensing, by Peter Mouginis-Mark, University of Hawaii. http://hawaii.ivv.nasa.gov/
Image credit: NASA/GSFC/MITI/ERSDAC/JAROS, and U.S./Japan
ASTER Science Team