


SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR REMOTE SENSING

By:

Erich Hernandez-Baquero

B.S. Physics, U.S. Air Force Academy

Rochester Institute of Technology

(1997)

Table of Contents

Table of Contents
List of Figures and Tables
Abstract
1.0 Introduction
1.1 Applications
1.1.1 Environmental Science
1.1.2 Military
1.1.3 Commercial
1.2 The Imaging Chain
2.0 Approach
2.1 Field of View
2.2 Data Transfer and Storage
2.3 Quality of Imagery
2.3.1 Atmospheric Distortions
2.3.2 Sensor Performance
2.3.3 Platform-Induced Distortions
2.3.4 Image Processing Algorithms
3.0 Results
3.1 Multispectral Imaging
3.2 Hyperspectral Imaging
3.3 Synthetic Aperture Radar
4.0 Conclusions
Appendix
References

List of Figures and Tables

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.
Figure 2. Comparison of CORONA and SPOT satellite imagery.
Figure 3. Atmospheric correction of Landsat-TM data by image processing.
Figure 4. MSS scanning geometry and image projection.
Figure 5. Typical hyperspectral image cube.
Figure 6. 3-D topographical SAR imagery.
Table 1. UVISI sensor suite parameters.
Table 2. Spaceborne Hyperspectral Imagers.
Table 3. Airborne Hyperspectral Imagers.

Abstract

Various technologies in imaging science used for remote sensing are described. Although this report is not all-inclusive, the technologies presented are diverse and represent the most prominent fields in remote sensing imaging. Strengths and weaknesses are evaluated as they pertain to specific applications (either airborne or spaceborne). A brief description of the theory behind each technique is also given, and the report closes with a vision for the future of remote sensing.

SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR

REMOTE SENSING

1.0 Introduction

Remote sensing is a natural extension of the human need to explore and understand our environment. Through advances in technology, we have been able to extend the way we see the world to a perspective never before possible. Using airborne and spaceborne platforms, complex imaging systems that surpass the limitations of the human eye are used to observe the Earth. Through these systems, we can now see in spectral regions that were previously invisible to the unaided eye.

The ability to extract information about our world and present it in ways that our visual perception can comprehend is the ultimate goal of imaging science in remote sensing. In all applications--from environmental monitoring to intelligence data gathering--the need to obtain more accurate information in a timely and efficient manner continues to grow exponentially. It is precisely because of this rapid growth that a broad range of technologies is presented in this report.

1.1 Applications

1.1.1 Environmental Science

Clearly one of the largest and most prominent applications is the study of the Earth’s ecosystem through the use of remote sensing. The synoptic view obtained from airborne and spaceborne imaging platforms provides an opportunity to understand weather systems, climate changes, geological phenomena, etc. from a global perspective. Not only are we able to view the Earth as a single ecosystem, but the amount and quality of information that we can gather are much greater than what other methods of observation can provide.

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.

Figure 1 is an example of the kind of imagery that is available to anyone almost instantaneously over the Internet. The scale describes the height of the ocean surface (which is directly correlated to temperature) compared to the previous year’s measurements. El Niño is seen as a mass of “red water” accumulating along the Eastern Pacific. Collecting the data contained in this single image from space would otherwise have required hundreds of boats and instruments, and the results could have taken considerably longer to process and distribute1.

1.1.2 Military

Perhaps the area in which the greatest advances in imaging technology have occurred is the field of intelligence data gathering in support of military operations and national security. The need for accurate and timely data cannot be overemphasized here, since the lives of military personnel can be saved by a better understanding of the location and activities of enemy forces. In addition, international treaties involving nuclear disarmament and biological/chemical warfare can be enforced without actually having to send in a team of inspectors. High-flying aircraft such as the SR-71 and U-2 and satellite platforms such as the recently declassified CORONA provide this type of information. The resolution available from these systems is far greater than that of their civilian counterparts. The CORONA satellite, for example, could obtain images with resolutions of approximately 6 feet! This technology, although dating to the 1960s, is still better than most currently operating civilian/commercial spaceborne imaging systems such as the French SPOT (see Figure 2).

1.1.3 Commercial

During the 1980s, the federal government decided that private industry should operate satellite space systems and manage the data generated by them. As a result, many companies began to sponsor or even develop remote sensing capabilities of their own. Their customers were the scientific community and the government itself. Other customers included local utility companies that would provide their own customers with information about their energy use. For example, through the use of thermal infrared sensors, information about how efficiently a home or building uses its electricity can be determined3. Much of this commercial imagery is available for sale over the Internet, making it very accessible to the public.

Figure 2. Comparison of CORONA and SPOT satellite imagery2.

1.2 The Imaging Chain

Before we can begin analyzing and comparing remote sensing systems, we must first establish a framework for viewing these imaging systems. At first glance, one might assume that the caliber and performance of an imaging system rest solely on the quality of the optical system in terms of resolution and accuracy. However, to fully characterize a system we must look at it from an end-to-end perspective. A satellite, for example, could be equipped with the most advanced hardware available, but if the images generated by that system cannot be processed (or interpreted), then the system is useless. We therefore look at systems from an imaging chain approach.

The imaging chain simply consists of all the steps (which can be thought of as links in a chain) required to bring an image to an end-user. At this point it is important to note that the end-user may not only be a human looking at a picture or movie; it may also be a control system used in an automated process. The imaging chain takes us through the steps of capturing a scene; storing, manipulating, and transmitting the data; and finally displaying the image. Clearly, a system may generate good data that can be processed to yield accurate information, but without a good system to display it the whole process suffers. The analogy to a chain applies here as we think of the whole chain being only as strong as its weakest link3.

The scope of this report is to analyze emerging technologies in imaging science for remote sensing using the imaging chain approach. However, no discussion on display systems is provided. It is assumed that the systems presented in this report can generate processed imagery that can be properly digitized and displayed on a moderate resolution CRT or incorporated into automated systems. Thus, using the imaging chain approach, a system can be evaluated by the scene that it can capture, the data transfer and storage capability, and the quality of the produced imagery.

2.0 Approach

The following parameters will be used to evaluate the performance of the imaging systems presented in this report:

2.1 Field of View

The scene a system can capture is driven mainly by its Field Of View (FOV). In particular, we are interested in an imaging sensor’s instantaneous FOV (IFOV), the ground IFOV (GIFOV), and the height of the imaging sensor platform. The relationship is given by

GIFOV = H·IFOV    [1]

where H is the height of the platform and IFOV is the size of the detector element at the image plane divided by the effective focal length of the optical system. Clearly, how much total ground coverage is achieved depends on the ground-projected FOV (which depends on the orbital parameters of a satellite platform or the flying altitude of an aircraft) and the dwell time on a particular scene. Depending on the sensor configuration, the total FOV may range from only 15° to 120°3,4. Unfortunately, a larger FOV is not necessarily the best solution, since it is more susceptible to geometric distortions and often results in poor spatial resolution. These parameters continue to improve as new electro-optics technologies develop.
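As a concrete illustration of Equation [1], the short Python sketch below converts an assumed platform altitude and IFOV into a ground-projected pixel size, and an assumed total FOV into a swath width. The numbers are only loosely modeled on the Landsat MSS values cited in Section 3.1 and should not be read as specifications of any sensor.

```python
import math

def gifov(altitude_m, ifov_rad):
    """Ground-projected instantaneous field of view (Eq. 1): GIFOV = H * IFOV."""
    return altitude_m * ifov_rad

def swath_width(altitude_m, total_fov_deg):
    """Approximate ground swath for a nadir-centered scanner with the given total FOV (flat-Earth geometry)."""
    return 2.0 * altitude_m * math.tan(math.radians(total_fov_deg) / 2.0)

# Illustrative values only (roughly MSS-like): 86 urad IFOV viewed from an assumed 918 km altitude.
H = 918e3      # platform altitude in meters (assumed)
IFOV = 86e-6   # instantaneous field of view in radians (assumed)
print(f"GIFOV ~ {gifov(H, IFOV):.0f} m per pixel")
print(f"Swath ~ {swath_width(H, 11.56) / 1e3:.0f} km for an assumed 11.56-deg total FOV")
```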

2.2 Data Transfer and Storage

The data transfer and storage capability of a system depends on the electronic configuration of the imaging platform. Ultimately, the photons emitted or reflected by a scene reach the sensor, and the energy is turned into an electrical signal (or recorded on film). Since emerging technologies involve the use of electro-optical imaging sensors, we will look at the requirements for storing and transferring the electrical data. Because of weight limitations, satellite systems usually send the data in real time or near real time via telemetry to ground stations, which then record the data on optical drives, CDs, or serial tape drives. Airborne platforms, however, may be able to carry the data storage hardware onboard. In many cases, the distribution and storage of data is handled by government organizations or by private industry. This is another area where technology continues to improve and become more affordable, allowing real-time delivery of large volumes of imagery data with minimal loss or distortion.

2.3 Quality of Imagery

The quality of the imagery mainly depends on the atmospheric distortions, sensor performance, platform-induced distortions, and the effectiveness of image processing algorithms. Of all of these, the atmosphere is the most dynamic, and consequently, the most difficult source of image degradation to compensate for.

2.3.1 Atmospheric Distortions

Slight variations in the atmosphere change the effective index of refraction of the atmospheric medium for any given optical path between the scene and the sensor. The effect of the atmosphere is typically seen as a blurring and loss of contrast in an image. When looking at spectral data, the atmosphere affects the spectral profile that a sensor “sees” by blocking or introducing different frequency bands in the spectra, thus generating inaccuracies in the image segmentation and classification process. In the extreme case, heavy cloud cover may completely obscure a scene from a remote sensing system. The degradation process is difficult to characterize because of the large number of physical processes occurring in the atmosphere that affect the transmission of light through it. In general, atmospheric effects are compensated for through a complex model of the atmosphere. The U.S. Air Force Phillips Laboratory Geophysics Directorate has developed a widely accepted database of atmospheric constituents that allows the user to estimate the atmospheric effects on the image acquisition process. Other approaches for atmospheric compensation include speckle imaging, range-angle interferometry, and adaptive optics5. Interestingly, many of these developments in atmospheric compensation originated within the astronomical community.
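As a simplified illustration of model-based compensation, the sketch below inverts the single-band approximation L_sensor = τ·L_ground + L_path to recover ground-leaving radiance. In practice, the transmission τ and path radiance L_path would come from an atmospheric model; the function name and the values shown here are placeholders for illustration, not output from any real atmospheric code.

```python
import numpy as np

def correct_radiance(l_sensor, transmission, path_radiance):
    """
    Invert the simplified single-band radiative transfer relation
        L_sensor = tau * L_ground + L_path
    to estimate ground-leaving radiance. 'transmission' (tau) and
    'path_radiance' (L_path) are assumed to be supplied by an
    atmospheric model; the numbers below are placeholders.
    """
    return (np.asarray(l_sensor, dtype=float) - path_radiance) / transmission

# Placeholder numbers for illustration only.
observed = np.array([52.0, 61.5, 48.2])   # at-sensor radiance, arbitrary units
tau, l_path = 0.78, 9.3                   # assumed model-derived values
print(correct_radiance(observed, tau, l_path))
```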

2.3.2 Sensor Performance

There are so many aspects of sensor design and operation that contribute to a sensor’s overall performance that it would be beyond the scope of this paper to discuss all of them. A more complete treatment of sensor design and performance is found in Chapter 8 of the Manual of Remote Sensing6 and Volumes 3 and 4 of The Infrared & Electro-Optical Systems Handbook7. For our discussion, it is sufficient to mention that sensor performance is mainly driven by the signal-to-noise ratio (SNR), spectral response, throughput, and ease of calibration.

The SNR is simply how well the sensor can distinguish a signal of interest from the electronic or thermal noise associated with the hardware. Improving the SNR typically requires a longer dwell time so that enough photons reach the detector to produce a signal above the sensor noise. Although the detector usually drives the noise, it is possible to have a system where the signal conditioning electronics are the major source of noise.

The spectral response describes how well the sensor can “see” in a specific spectral band. If the wavelength we are interested in falls in the near-infrared, for example, and our sensor has no spectral response in this region, then it will not register a signal. Another example is the spectral response of the eye, which can only see in the visible portion of the spectrum.

The throughput is a measure of how well the incoming flux of radiation propagates through the sensor optics. Clearly, poor mirror coatings and lens aberrations will cause the light to be scattered or attenuated as it propagates through the sensor system, thus limiting the number of photons reaching the detector.

Finally, no sensor can provide high-quality imagery without proper calibration. A well-calibrated system increases confidence in the accuracy of the data: temperature readings from an airborne radiometer may well be useless if they cannot be compared to some absolute measurement. Calibration is a far greater issue for spaceborne sensors. A system can be well calibrated in the laboratory, but once it reaches space, the zero-gravity environment can change the hardware to the point where the instrument operates differently than it did on the ground. Advances in remote calibration techniques include the use of ground-based radar and lasers8.
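To make the relationship between SNR and dwell time concrete, the sketch below estimates a shot-noise-limited SNR from an assumed photon arrival rate, dark-current rate, and read noise. The function and every numerical value are illustrative assumptions, not parameters of any sensor discussed in this report.

```python
import math

def detector_snr(photon_rate_e_per_s, dwell_time_s, dark_rate_e_per_s=100.0, read_noise_e=30.0):
    """
    Rough shot-noise-limited SNR estimate:
        signal = photon_rate * dwell_time
        noise  = sqrt(signal + dark + read_noise^2)
    All parameters are assumed values for illustration only.
    """
    signal = photon_rate_e_per_s * dwell_time_s
    dark = dark_rate_e_per_s * dwell_time_s
    noise = math.sqrt(signal + dark + read_noise_e ** 2)
    return signal / noise

# Doubling the dwell time raises the SNR (roughly by sqrt(2) once shot-noise limited).
for dwell in (1e-4, 2e-4, 4e-4):
    print(f"dwell {dwell * 1e6:5.0f} us -> SNR {detector_snr(5e7, dwell):6.1f}")
```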

2.3.3 Platform-Induced Distortions

Another source of distortions in an image is the imaging platform itself. This is particularly true of non-stabilized platforms such as aircraft. As the airplane pitches, yaws, and rolls, the direction in which the sensor is pointing changes, causing the FOV of sequential frames to be different3. This creates geometric distortions in the image. Although this distortion is not extremely complex, it can lead to loss of data and must be taken into consideration. One of the major advantages of spaceborne over airborne sensors is the inherent stability of spaceborne platforms.
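A minimal sketch of the geometry involved: for a nominally nadir-pointing sensor, small roll and pitch angles displace the ground footprint roughly as H·tan(angle) under a flat-Earth assumption. The function and numbers below are assumed for illustration only.

```python
import math

def footprint_offset(altitude_m, roll_deg, pitch_deg):
    """
    Ground displacement of a nominally nadir-pointing line of sight caused by
    platform roll (cross-track) and pitch (along-track), using simple flat-Earth
    geometry: offset = H * tan(angle). Yaw rotates the scan line rather than
    shifting it, so it is omitted from this sketch.
    """
    dx_cross = altitude_m * math.tan(math.radians(roll_deg))
    dy_along = altitude_m * math.tan(math.radians(pitch_deg))
    return dx_cross, dy_along

# Assumed example: at a 3 km aircraft altitude, a 1-degree roll already
# displaces the footprint by roughly 52 m on the ground.
print(footprint_offset(3000.0, 1.0, 0.5))
```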

Figure 3. Atmospheric correction of Landsat-TM data by image processing.

2.3.4 Image Processing Algorithms

As computer technology continues to improve dramatically, the use of image processing algorithms is becoming more viable. Through image processing, sensor data can be represented in a meaningful format that allows the extraction of vital information. Image processing can also help fill in the “gaps” caused by missing data through the use of application-dependent interpolation or extrapolation techniques. Also, processing allows the user to compensate for atmospheric distortions (assuming that model predictions can be correlated to the observed distortions) and to display an image free of atmospheric effects. Figure 3 shows the dramatic improvement that can be accomplished through the use of image processing techniques for atmospheric correction. The image on the right is the corrected version of the image shown on the left9. As new sensor systems are developed, image processing algorithms must be identified in order to generate high-quality imagery.
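As a minimal example of the gap-filling idea mentioned above, the sketch below fills missing samples in a single scan line by linear interpolation between the nearest valid neighbors. Operational systems use application-dependent (often two-dimensional) schemes; the scan line and dropout pattern here are synthetic.

```python
import numpy as np

def fill_line_gaps(scan_line, valid_mask):
    """
    Fill missing samples in one scan line by linear interpolation between
    the nearest valid neighbors. A 1-D illustration of the general idea only.
    """
    line = np.asarray(scan_line, dtype=float)
    idx = np.arange(line.size)
    return np.interp(idx, idx[valid_mask], line[valid_mask])

# Synthetic scan line with two dropped samples (NaN marks missing data).
line = np.array([10.0, 11.0, np.nan, 13.0, np.nan, 15.0])
mask = ~np.isnan(line)
print(fill_line_gaps(line, mask))   # -> [10. 11. 12. 13. 14. 15.]
```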

3.0 Results

3.1 Multispectral Imaging

One of the biggest advances in remote sensing was the launch of the Landsat series of satellites beginning in 1972. These satellites were equipped with multispectral sensors that provided repetitive global coverage. The first three satellites carried the Multispectral Scanner (MSS), which collected images in four broad spectral bands from the visible green to the near-infrared. The sensor consisted of an oscillating mirror that scanned the ground in the cross-track direction using six simultaneous line scans (one line scan per detector per spectral band). Figure 4 shows the image projection on the ground and the scanning arrangement for the MSS. Landsat 3 carried an MSS with a fifth band (designated “Band 8”) in the thermal-infrared. These MSS instruments had an IFOV of 86 μrad (258 μrad for Band 8), a GIFOV of approximately 79 m (237 m for Band 8), and SNRs ranging from 72 to 123. Landsat 4 and 5 were equipped with an MSS and a

Figure 4. MSS scanning geometry and image projection4.

Thematic Mapper (TM). The TM covered the MSS bands and added bands in the blue, shortwave-infrared, and thermal-infrared; it provided 16 detectors per reflective spectral band (as opposed to six in the MSS) and 4 detectors for the thermal-IR band, at spatial resolutions of 30 m (IFOV of 42 μrad) and 120 m (IFOV of 170 μrad), respectively. The TM sensor also provides more radiometric information than the MSS. The next generation of multispectral sensors comparable to these is the ETM+ (Enhanced Thematic Mapper Plus), which will be carried on Landsat 7 (scheduled for launch in July 1998). This sensor will acquire images over 7 spectral bands with the same IFOV and spatial resolution as the TM except for the thermal-IR band, which will have an improved resolution of 30 m. The on-board radiometric calibration of the sensor is improved and a panchromatic band is included10.

Multispectral sensors continue to improve. The French SPOT satellite series consistently provides 20-m resolution multispectral and 10-m resolution panchromatic imagery that is available over the Internet. While the currently operational SPOT satellites have three spectral bands, SPOT-5 will add a mid-infrared band, an improved multispectral resolution of 10 m, and a panchromatic resolution of 5 m. The data rate required to transmit all of these data is on the order of 150 Mbit/sec11!

3.2 Hyperspectral Imaging

While multispectral sensors continue to improve in spatial resolution, hyperspectral sensors are beginning to gain more popularity in remote sensing. These sensors are capable of providing data in very narrow spectral bands (on the order of nanometers). The advantage of the hyperspectral sensor is that we can now determine not only what an object is, but what it is made of. Since the reflective and emissive characteristics of an object vary with wavelength and with the object’s composition, obtaining a spectrum of reflected and self-emitted photons should provide some information as to what the object’s constituents are. Because each spectral band is narrow, longer dwell times are needed in order to obtain a high enough SNR. Also, narrowing the spectral bandwidth has the adverse effect of decreasing the achievable spatial resolution. Because of this, image processing algorithms are being developed to fuse hyperspectral data with high-resolution multispectral or panchromatic images. The resulting product is an image cube that is made up of voxels (volume elements) instead of pixels (picture elements)12. Figure 5 is an example of a typical image cube generated from hyperspectral imaging. Notice that the spectral information is in the vertical axis of the image and that there are gaps where no detector exists for the corresponding spectral band.

Figure 5. Typical hyperspectral image cube.
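The notion of telling what an object is made of rests on comparing a measured pixel spectrum against reference spectra. One common similarity measure is the spectral angle, sketched below; the two “library” spectra are invented for illustration and are not drawn from any real spectral library.

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Angle (radians) between a pixel spectrum and a reference spectrum;
    smaller angles mean more similar spectral shapes, independent of overall brightness."""
    p, r = np.asarray(pixel, float), np.asarray(reference, float)
    cos = np.dot(p, r) / (np.linalg.norm(p) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Invented 5-band reflectance spectra for illustration only.
library = {
    "vegetation": np.array([0.05, 0.08, 0.06, 0.45, 0.50]),
    "bare soil":  np.array([0.12, 0.18, 0.22, 0.28, 0.30]),
}
pixel = np.array([0.06, 0.09, 0.07, 0.40, 0.47])
best = min(library, key=lambda name: spectral_angle(pixel, library[name]))
print(best)   # -> "vegetation"
```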

Currently, civilian operational hyperspectral imagers are limited to airborne systems. These systems, however, provide very high resolution spectral data at the expense of a wide synoptic view. The AISA Airborne Imaging Spectrometer, for example, can resolve spectral differences of less than 2 nm in the visible to near-infrared range of 430-990 nm while obtaining spatial resolutions of about one meter for an aircraft flying at an altitude of 1000 m. The FOV is 21° cross-track and 0.055° along-track13. This is all accomplished with a lightweight, portable package of about 35 pounds. Other airborne sensors include the NASA AVIRIS (with a spectral range of 400-2450 nm) and the ITRES CASI (430-870 nm). Rochester Institute of Technology (RIT) is currently working on the MISI sensor, which has 69 bands ranging from the visible to the far-infrared. The spectral bandwidth of this instrument in the visible range is on the order of 10 nm, which is considered state-of-the-art technology14. The Appendix contains a more extensive list of airborne and spaceborne hyperspectral imagers15.

The Department of Defense (DOD) recently launched the Midcourse Space Experiment (MSX) satellite. This satellite carries a hyperspectral imaging suite of sensors built by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) called the Ultraviolet and Visible Imagers and Spectrographic Imagers (UVISI). Table 1 shows the characteristics of this sensor suite.

Over the next decade, several hyperspectral sensors will be placed in orbit. This is largely due to the Mission to Planet Earth (MTPE) initiative headed by NASA. The next spaceborne hyperspectral sensor is the Moderate Resolution Imaging Spectroradiometer (MODIS), which will be on board the EOS AM-1 platform of the Earth Observing System (EOS). The sensor will image the whole globe every 1 to 2 days with a spectral resolution of 10-15 nm in the visible and near-infrared and 0.1-0.3 μm in the mid-infrared to far-infrared16.

Table 1. UVISI sensor suite parameters17.

|Characteristic |UV-VIS Imager |Spectrographic Imager |
|Wavelength coverage (nm) |110-300 UV, 300-900 VIS |110-900 |
|Field of view (°) |1.6 x 1.3 NFOV; 13.1 x 10.5 WFOV |1.0 x 1.0 |
|Optical collecting area (cm²) |130 NFOV, 25 WFOV |110 |
|Filters |5-position selectable |3 |
|Spatial resolution (°) |0.01, 0.10 |0.025 |
|Spectral resolution (nm) |- |0.5-4.3 |
|Sensitivity (photons/cm²·s) |1-5 |1-6 |
|Total weight: 456 lb | | |
|Total power: 105 W | | |

Perhaps the most critical advance in technology that has made hyperspectral imaging from space a reality is the advent of highly sensitive HgCdTe focal plane arrays (FPAs). These allow the narrow-band, “photon-starved” detectors to achieve a sufficiently high SNR and a sufficiently low NEΔT (noise-equivalent temperature difference) to register a signal. Also, improvements in the readout electronics result in a much higher detector quantum efficiency (a measure of the number of signal electrons generated per incident photon). There is a host of other technologies involved in the proper operation of a hyperspectral imager (i.e., calibration modules, cooling systems, precise dispersion elements, optical coatings, etc.) that must be considered but are omitted here due to the limited scope of this report16.

Recall from Figure 5 that the image product of a hyperspectral sensor is an image cube. The large volume of data inherent in this product is of concern for spaceborne sensors, since on-board recording systems are discouraged by the weight limitations of payloads. Therefore, advances in telemetry must keep up with the increasing demands for real-time and near-real-time display of imagery. A promising technology appears to be laser communications, where data rates of gigabytes per second are possible. Other areas of current development are the incorporation of hyperspectral images from space into current imaging technologies and the use of hyperspectral data for automated target identification18.
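A back-of-the-envelope calculation makes the data-volume concern concrete: the sketch below computes the raw size of one image cube and the time needed to downlink it at a given telemetry rate. The cube dimensions, bit depth, and link rate are assumed values chosen only for illustration.

```python
def cube_size_bits(lines, samples, bands, bits_per_sample):
    """Raw (uncompressed) size of one hyperspectral image cube in bits."""
    return lines * samples * bands * bits_per_sample

def downlink_time_s(size_bits, link_rate_bit_per_s):
    """Time to transmit the cube at a given telemetry rate, ignoring overhead and compression."""
    return size_bits / link_rate_bit_per_s

# Assumed cube: 2000 x 2000 pixels, 200 bands, 12 bits per sample.
size = cube_size_bits(2000, 2000, 200, 12)
print(f"Cube size  : {size / 8 / 1e9:.1f} GB")
print(f"At 150 Mb/s: {downlink_time_s(size, 150e6) / 60:.1f} minutes")
```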

3.3 Synthetic Aperture Radar

One of the major limitations of multispectral and hyperspectral sensors is that they cannot “see” through clouds, heavy fog, or haze. Synthetic Aperture Radar (SAR) overcomes this limitation because it operates in the microwave region of the electromagnetic spectrum (1-10 GHz), where absorption by water molecules is negligible. At frequencies lower than 1 GHz, the characteristics of the returned signal are dominated by ground interference and ionospheric disturbances, while molecular absorption bands dominate the frequencies above 100 GHz, which yield information about the atmosphere but not about the Earth’s surface. Because of its all-weather capability, and because it provides its own active source of illumination (and is therefore independent of the sun), SAR technology is very appealing for the continuous observation of global patterns19.

Although SAR technology has existed since 1951, it has not been seriously considered as a high-resolution imaging system from space until recently. The major limitation of SAR was the inherently large number of computations needed to analyze the data. Because of the limited computational power available until recently, SAR imagery suffered from poor quality and a short dynamic range. Also, SAR systems are particularly susceptible to geometric distortions generated by uneven terrain and to large radiometric distortions inherent in the system design. The development of accurate calibration techniques and their operational implementation has made SAR technology more viable19.

The ingenuity behind SAR imaging systems is that the aperture is formed synthetically. Normally, operation in the microwave region would require large antennae with dimensions on the order of hundreds of meters. With SAR, the moving platform emits ranging pulses and collects the returns as it travels. The distance the aircraft or spacecraft travels over the time of transmission is the synthetic aperture20. Range resolution in the cross-track direction (the direction in which the imaging system is looking) is determined by the time it takes for the radar pulse to return. Azimuthal (or along-track) resolution is obtained from the Doppler shift associated with a target return. Consider a point target located at range R in the cross-track direction of an aircraft and at coordinate x in the along-track direction; the Doppler frequency associated with that point is

fD = 2·Vst·sin θ / λ = 2·Vst·x / (λ·R)    [2]

where θ is the angle between the cross-track direction and the target, Vst is the relative velocity of the platform, and λ is the wavelength of the SAR pulse. Thus, each along-track location x has an associated Doppler frequency that allows the point to be resolved19.
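Equation [2] can be evaluated directly. The sketch below uses the small-angle form to show how each along-track offset x maps to a distinct Doppler frequency; the platform velocity, wavelength (roughly corresponding to a 5.3 GHz C-band system), and range are assumed values, not parameters of any system described in this report.

```python
def doppler_shift(v_platform_m_s, x_along_track_m, wavelength_m, range_m):
    """Doppler frequency of a point target (Eq. 2), small-angle form: f_D = 2 V x / (lambda R)."""
    return 2.0 * v_platform_m_s * x_along_track_m / (wavelength_m * range_m)

# Assumed airborne geometry: ~5.66 cm wavelength, 200 m/s platform speed, 10 km range.
for x in (0.0, 5.0, 10.0):
    fd = doppler_shift(200.0, x, 0.0566, 10e3)
    print(f"x = {x:4.1f} m -> f_D = {fd:6.2f} Hz")
```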

The number of spaceborne SAR systems is very limited. In the United States, the Seasat-A satellite (which was operational for only 100 days) and the Shuttle Imaging Radar (SIR) are the only spaceborne SAR systems. Because of the limited amount of spaceborne SAR data, the next generation of EOS satellites, which will carry SAR systems similar to the SIR-C and X-SAR, will generate interesting results. In 1991, the European Space Agency (ESA) and the National Space Development Agency (NASDA) of Japan launched the European Remote Sensing satellite (ERS-1), which carries the Active Microwave Instrument (AMI) SAR system. The AMI operates at 5.3 GHz (C-band) and has a spatial resolution of 30 m over a 99-km swath21. This is comparable to the Landsat TM and ETM+ sensors. This system is a prototype for the Advanced SAR (ASAR) sensor that will be on board ENVISAT, the environmental monitoring and atmospheric chemistry satellite. The Canadian Radarsat will also employ a C-band SAR22.

Figure 6. 3-D topographical SAR imagery.

Airborne SAR systems are more common and have provided the necessary tools to make spaceborne SAR possible. Sandia National Laboratories routinely obtains 3-D topographic imagery from interferometric airborne SAR operations. SAR interferometry simply uses two airborne passes that are correlated to generate a synthetic interferometer. Figure 6 is an example of the type of imagery possible with these systems, even through heavy clouds20. The NASA Jet Propulsion Laboratory (JPL) uses the AIRSAR system, which is flown on a DC-8 and operates at P-band (438.75 MHz), L-band (1237.4 MHz), and C-band (5287.5 MHz)23.

As can be seen, SAR systems provide a unique form of imagery that is weather- and sun-independent. This is clearly an advantage over optical systems such as Landsat, which provides an average of only 6 to 10 images per year for any non-desert land area on Earth because of weather limitations24. Furthermore, the microwave nature of the active illumination allows SAR systems to characterize surface properties (e.g., soil moisture). SAR technology also serves as the basis for emerging technologies, including, but not limited to, passive interferometric range-angle imaging and optical range-Doppler imaging5.

4.0 Conclusions

The imaging technologies covered in this report are by no means all-encompassing, but they represent the major efforts in remote sensing underway today. The tradeoffs among these systems are numerous. Multispectral sensors provide high spatial resolution but are limited in spectral information. On the other hand, hyperspectral sensors can generate nearly continuous spectra of objects but have poor spatial resolution. Finally, SAR systems can overcome weather limitations but are difficult to calibrate and to maintain operationally. Since no single system provides an end-all solution, the answer clearly lies in taking the information that can be obtained from each of these systems and putting it all together in a single product. This is where remote sensing is headed. The real technological advance will occur when information technology catches up with all the available sources of satellite and airborne imagery and can fuse them together in a way that is available to the user in real time. Advances in data transmission and computational speed will have to occur, along with display technologies that can accurately represent the fused imagery. The knowledge-power associated with this capability is far-reaching across all applications. The future of remote sensing is just beginning, and the reward for our efforts is a better understanding of our own home: the Earth.

Appendix

Table 2. Spaceborne Hyperspectral Imagers

|Sensor (Agency) |Number of Bands |Spectral Coverage (nm) |Band Width at FWHM (nm) |GIFOV (mrad) [(m)] |FOV (deg) [(km)] |Data Product |Tentative Launch Date |

|NIMS |504 |700-5100 |10 |0.5 |20 pixels |Full Cube |flown |

|(NASA/JPL) | | | | | | |(extra-terrestrial mission) |

|VIMS |320 |400-5000 |15 |0.5 |70 pixels |Full Cube |flown |

|(NASA/JPL) | | | | | | |(extra-terrestrial mission) |

|UVISI |> 200 |380 - 900 |1 - 3 |(100 - 1000) |(25) |Full Cube |MSX |

|(US MILITARY) | |110-900 | | | | |spacecraft |

| | | | | | | |(1994) |

|Min Map |192 |350-2400 |12.5 |.45 |5.8 |Full Cube |1996? |

|(?) | | | | | | | |

|MODIS |36 |415-2130 |10 - 500 |(250 - 1000) |(2330) |Sub-Cube |EOS AM platform (1998) |

|(NASA/EOS) | |3750 - 4570 | | | | |EOS PM platform (2000) |

| | |6720 -14240 | | | | | |

|MERIS |15 |400 - 1050 |2.5 - 10 |(300) |(1450) |Sub-Cube |ESA-POEM 1 |

|(ESA/EOS) |(selectable) | |(selectable) | | | |AM platform |

| | | | | | | |(1998) |

|PRISM |~ 150 - 200 |450-2350 |10 - 12 |(50) |(50) |Full Cube |Design stage |

|(ESA/EOS) |1 |3800 |600 | | | | |

| |3 |8000 - 12300 |1000 | | | | |

|CIS |30 |VNIR |20 |(402) |90 |Full Cube |Design stage |

|(China) |6 |SWIR/MWIR/TIR | | | | | |

|HSI |128 |400 - 1000 |5.00 |(30) |(7.7) |Full Cube |LEO s/c platform |

|(TRW) |256 |900 - 2500 |6.38 | | | |(1996) |

Table 3. Airborne Hyperspectral Imagers

|Sensor (Agency/Company) |Number of Bands |Spectral Coverage (nm) |Band Width at FWHM (nm) [(wave number)] |IFOV (mrad) [GIFOV (m)] |FOV (°) [(km)] |Data Product |Period of Operation |

|AAHIS |288 |433-832 |6.0 |1.0 x 0.5 |11.4 |Image Cube |since 1994 |

|(SAIC) | | | | | | | |

|AHS |48 |440-12700 |20 - 1500 |2.5 |86 |Image Cube |since 1994 |

|(Daedalus) | | | | | | | |

|AIS-1 |128 |900-2100 |9.3 |1.91 |3.7 |Image Cube |1982-1985 |

|(NASA/JPL) |128 |1200-2400 |10.6 |2.05 |7.3 |Image Cube |1985-1987 |

|AIS-2 | |800-1600 | | | | | |

|(NASA/JPL) | |1200-2400 | | | | | |

|AISA |286 |450-900 |1.56 - 9.36 |1.0 |21.0 |Image Cube |since 1993 |

|(Karelsilva Oy) | | | | | | | |

|AMSS |32 |490-1090 |20.0 - 71.0 |2.1 x 3.0 |92.0 |Image Cube |since 1985 |

|(GEOSCAN) |8 |2020-2370 |60.0 | | | | |

| |6 |8500-12000 |550 - 590 | | | | |

|ARES |75 |2000 - 6300 |25.0 - 70.0 |1.17 |3 x 3 |Image Cube |since 1985 |

|(Lockheed) | | | | | | | |

|ASAS |29 |455 - 873 |15.0 |.80 |25.0 |Image Cube |1987 - 1991 |

|(NASA/GSFC) |62 |400 - 1060 |11.5 |.80 |25.0 |(7 viewing angles) |since 1992 |

|upgraded ASAS | | | | | |+45(deg)/-45(deg) | |

| | | | | | |Image Cube | |

| | | | | | |up to 10 viewing angles) | |

| | | | | | |+75(deg)/-55(deg) | |

|ASTER Simulator |1 |700 - 1000 |300.0 |1.0, 2.5 |28.8, 65.0 |Image Cube |since 1992 |

|(DAIS 2815) |3 |3000 - 5000 |600 - 700 |or 5.0 |or 104.0 | | |

|(GER) |20 |8000 - 12000 |200 | | | | |

|AVIRIS |224 |400 - 2450 |9.4 - 16.0 |1.0 (20) |30.0 (12) |Image Cube |since 1987 |

|(NASA/JPL) | | | | | | | |

|CASI |288 |430 - 870 |2.9 |1.2 |35.0 |Profiles |since 1989 |

|(Itres Research) |up to 15 |(nominal) | | | |Image | |

|CAMODIS |64 |400 - 1040 |10.0 |1.2 x 3.6 |80.0 |Image Cube |since 1993 |

|(China) |24 |2000 - 2480 |20.0 |1.2 x 1.8 | | | |

| |1 |3530 - 3940 |410.0 |1.2 x 1.2 | | | |

| |2 |10500 - 12500 |1000.0 |1.2 x 1.2 | | | |

|DAIS - 7915 |32 |498 - 1010 |16.0 |3.3, 2.2 |78.0 |Full Cube |since 1994 |

|(GER/DLR/JRC) |8 |1000 - 1800 |100.0 |or 1.1 | | | |

| |32 |70 - 2450 |15.0 | | | | |

| |1 |3000 - 5000 |2000.0 | | | | |

| |6 |8700 - 12300 |600.0 | | | | |

|DAIS - 16115 |76 |400 - 1000 |8.0 |3 |78.0 |Full Cube |since 1994 |

|(GER) |32 |1000 - 1800 |25.0 | | |Stereo | |

| |32 |2000 - 2500 |16.0 | | | | |

| |6 |3000 - 5000 |333.0 | | | | |

| |12 |8000 - 12000 |333.0 | | | | |

| |2 |400 - 1000 | | | | | |

|DAIS - 3715 |32 |360 - 1000 |20 |5.0 |90.0 |Full Cube |since 1994 |

|(GER) |1 |1000 - 2000 |1000 | | | | |

| |2 |2175 - 2350 |50 | | | | |

| |1 |3000 - 5000 |2000 | | | | |

| |1 |8000 - 12000 |4000 | | | | |

|FLI/PMI |288 |430 - 805 |2.5 |.66/.80 |70.0 |Full Cube (Profiles) |1984 - 1990 |

|(MONITEQ) |8 | | | | |Sub-Cube | |

|GERIS |24 |400 - 1000 |25.4 |2.5, 3.3 |90.0 |Full Cube |since 1986 |

|(GER) |7 |1000 - 2000 |120.0 |or 4.5 | | | |

| |32 |2000 - 2500 |16.5 | | | | |

|HSI |128 |400 - 900 |4.3 |0.14 x 1.0 |8.0 |Full Cube |until 1994 |

|(SAIC) | | | | | | | |

|HYDICE |206 |400 - 2500 |7.6 - 14.9 |0.5 |8.94 |Full Cube |since 1995 |

|(Naval Research | | | | | | | |

|Laboratory) | | | | | | | |

|ISM |64 |800 - 1700 |12.5 |3.3/11.7 |40.0 |Full Cube |since 1991 |

|(DES/IAS/OPS) |64 |1500 - 3000 |25.0 | |(selectable) | | |

|MAS |9 |529 - 969 |31 - 55 |2.5 |85.92 |Full Cube |since 1993 |

|(Daedalus) |16 |1395 - 2405 |47 - 57 | | | | |

| |16 |2925 - 5325 |142 - 151 | | | | |

| |9 |8342 - 14521 |352 - 517 | | | | |

|MAIS |32 |450 - 1100 |20 |3 |90 |Full Cube |1990 |

|(China) |32 |1400 - 2500 |30 |4.5 | | | |

| |7 |8200 - 12200 |400 - 800 |3 | | | |

|MEIS |> 200 |350 - 900 |2.5 |2.5 |- |Full Cube |since 1992 |

|(McDonnell Douglas) | | | | | | | |

|MISI |60 |400 - 1000 |10 |1 |±45 |Full Cube |from 1996 |

|(RIT) |1 |1700 |50 |1 or 2 | | | |

| |1 |2200 |50 | | | | |

| |3 |3000-5000 |2000 | | | | |

| |4 |8000-14000 |2000 | | | | |

|MIVIS |20 |433 - 833 |20.0 |2.0 |70.0 |Full Cube |since 1993 |

|(Daedalus) |8 |1150 - 1550 |50.0 | | | | |

| |64 |2000 - 2500 |8.0 | | | | |

| |10 |8200 - 12700 |400.0/500.0 | | | | |

|MUSIC |90 |2500 - 7000 |25 - 70 |0.5 |1.3 |Full Cube |since 1989 |

|(Lockheed) |90 |6000 - 14500 |60 - 1400 | | | | |

|ROSIS |84 |430 - 850 |4.0/12.0 |0.56 |16.0 |Full Cube |since 1993 |

|(MBB/GKSS/DLR) |30 | | | | |Sub-Cube | |

|RTISR |20 or 30 |400 - 700 (900) |7.0 - 14.0 (19.0) |0.2 - 2.0 |29.0 x 22.0 |Full Cube |since 1994 |

|(Surface Optics Corp.) | | | | | | | |

|SFSI (CCRS) |120 |1200 - 2400 |10.0 |0.33 |9.4 |Full Cube |since 1994 |

| | | | |(≈ 0.5) | |Sub-Cube | |

|SMIFTS |75 |1000 - 5200 |(100 cm-1) |0.6 |6.0 |Full Cube |since 1993 |

|(U. of Hawaii) |35 |3200 - 5200 |(50 cm-1) | | | | |

|TRWIS-A |128 |430 - 850 |3.3 |1.0 |13.8 |Full Cube |since 1991 |

|TRWIS-B |90 |430 - 850 |4.8 |1.0 |13.8 |Full Cube |since 1991 |

|TRWIS-II |99 |1500 - 2500 |11.7 |0.5/1.0 |6.9/13.8 |Full Cube |since 1991 |

|TRWIS-III |396 |400 - 2500 |5.0/6.25 |0.9 |13.2 |Full Cube |since 1991 |

|(TRW) | | | | | | | |

|Hybrid VIFIS |30 |440 - 640 |10 - 14 |1.0 |31.5 |Full Cube |since 1994 |

|(U. of Dundee) |30 |620 - 890 |14 - 18 |1.0 |31.5 | | |

|WIS-FDU |64 |400 - 1030 |10.3 |1.36 |10.0 & 15.0 |Full Cube |1992 |

|(Hughes SBRC) | | | | | | | |

|WIS-VNIR |17 |400 - 600 |9.6 - 14.4 |0.66 |19.1 |Full Cube |1995 |

|(Hughes SBRC) |67 |600 - 1000 |5.4 - 8.6 | | | | |

|WIS-SWIR |41 |1000 - 1800 |20.0 - 37.8 |0.66 |12.0 |Full Cube |1995 |

|(Hughes SBRC) |45 |1950 - 2500 |18.0 - 25.0 | | | | |

References

1. NASA/JPL. “TOPEX/Poseidon.” Internet. 14 Nov. 1997.
2. Vick, C.P. “FAS Intelligence Resource Program: CORONA Products.” 17 Mar. 1997. Internet. 14 Nov. 1997.
3. Schott, J.R. Remote Sensing: The Image Chain Approach. New York: Oxford UP, 1997.
4. “Multispectral Scanner Landsat Data.” U.S. Geological Survey’s EROS Data Center. Internet. 14 Nov. 1997.
5. Robinson, S.R., ed. Emerging Systems and Technologies. Vol. 8 of The Infrared and Electro-Optical Systems Handbook. Eds. J.S. Accetta and D.L. Shumaker. 8 vols. SPIE Optical Engineering Press and Environmental Research Institute of Michigan, 1993.
6. Colwell, R.N. Manual of Remote Sensing. Vol. 1. Virginia: American Society of Photogrammetry, 1983.
7. Accetta, J.S., and D.L. Shumaker, eds. The Infrared and Electro-Optical Systems Handbook. 8 vols. SPIE Optical Engineering Press and ERIM, 1993.
8. Schott, J.R. Personal interview. October 1997.
9. Vandle, J.R., et al. “LTER/NASA Collaboration on Atmospheric Correction of Remotely Sensed Data: Draft Workshop Report.” 16-18 Aug. 1996. 14 Nov. 1997.
10. NASA. “ETM+ Image Formation.” Internet. 13 Nov. 1997.
11. “The SPOT Satellites.” Internet. 13 Nov. 1997.
12. “Image Cube.” Digital Imaging and Remote Sensing Group, Center for Imaging Science, Rochester Institute of Technology.
13. “AISA Airborne Imaging Spectrometer.” Internet. 9 Oct. 1997.
14. Schott, J.R. Personal interview. October 1997.
15. Staenz, K. “Airborne and Spaceborne Imaging Spectrometers.” Canadian Activities in Terrestrial Imaging Spectroscopy. Internet. 13 Nov. 1997.
16. Ward, K. “MODIS Instrument.” NASA/MTPE. Internet. 9 Oct. 1997.
17. “UVISI.” U.S. Naval Research Laboratory / Johns Hopkins University Applied Physics Laboratory. Internet. 13 Nov. 1997.
18. “Leveraging the Infosphere: Surveillance and Reconnaissance in 2020.” Air University. Vol. 1 of Spacecast 2020. June 1994.
19. Curlander, J.C., and R.N. McDonough. Synthetic Aperture Radar: Systems and Signal Processing. New York: John Wiley & Sons, 1991.
20. Walker, B. “How SAR Works.” Sandia National Laboratories. 8 Jan. 1996. Internet. 13 Nov. 1997.
21. “AMI Sensor.” ESA/NASDA. Internet. 13 Nov. 1997.
22. Asrar, G., and R. Greenstone, eds. 1995 MTPE EOS Reference Handbook. NASA Goddard Space Flight Center.
23. Maldonado, L., Jr., curator. “AIRSAR General Reference Manual.” 18 Jul. 1997. AIRSAR, Jet Propulsion Laboratory. Internet. 13 Nov. 1997.
24. Sellers, P.J., et al. “Earth Science, Landsat, and the Earth Observing System.” Land Satellite Information in the Next Decade: Conference Proceedings of the American Society of Photogrammetry and Remote Sensing, Vienna, Virginia, 25-28 September 1995. Maryland: American Society of Photogrammetry and Remote Sensing, 1995.
