9 Active and Passive Remote Sensing

Dr. Sandeep Gupta

 

1. Learning Objective

 

The objective of this module is to understand the basic concept of active and passive remote sensing with a note on microwave, LiDAR and thermal remote sensing techniques and their applications.

 

This chapter provides an overview of the two broad types of sensing, classified by the source of energy used for sensing: active and passive. It highlights some of the important active and passive sensing techniques and sensors operating in the optical and microwave regions, namely microwave, LiDAR and thermal systems. The underlying principles, components, technologies and their development are introduced to familiarize the audience with the domain of this module. The applications are so important and widespread that readers should not underestimate the capability of remote sensing for Earth observation.

 

2. Introduction

 

“Many myths are still to be solved through the progress of remote sensing.”

 

Observing objects across the dimensions of space and time has led to many discoveries of stars and theories of planetary motion. Remote sensing (English), Fernerkundung (German), Télédétection (French), Percepción Remota (Spanish), Telerilevamento (Italian) and Sensoriamento Remoto (Portuguese) emerged as part of the slowly progressing space science of modern times. The early telescopic observations of Galileo Galilei (AD 1564-1642) on the Earth revolving around the rotating Sun were a profound scientific example set against the contradictory Roman Catholic philosophy of the day. Such past discoveries have played a pivotal role in advancing today’s remote sensing science beyond planetary boundaries. Literally, remote sensing means remotely obtaining qualitative and quantitative information about a site and its environment, combining methods and techniques for subsequent elaboration and interpretation. This sensing is, in effect, a sensing of energy, since every substance and phenomenon, with or without form, is an expression of energy. Based on the source of energy used to record the information, sensing and sensors are broadly classified into two types: ‘active’ and ‘passive’. In this module, our focus is on the various aspects of active and passive remote sensing systems.

 

3. Active and Passive Remote Sensing

 

3.1 Passive Sensing and Sensors

 

The sun is the ultimate source of energy for life on the planet, and the same is largely true for remote sensing. Solar energy is either reflected, as in the VIS-NIR wavelengths, or absorbed and then re-emitted, as in the thermal IR wavelengths. Remote sensing systems that measure this naturally available energy are called passive sensors. For this reason, passive sensors can detect reflected solar energy only while the sun illuminates the Earth’s surface; there is no reflected solar energy at night. However, naturally emitted energy such as thermal IR can be sensed both day and night. Examples of passive sensors are the Landsat series, IRS series, SPOT series, IKONOS, QuickBird, etc.

 

3.2 Active Sensing and Sensors

 

Active sensors carry their own energy source to illuminate the features under investigation. The energy reflected from the target feature is recorded by the sensor. Therefore, one of the main advantages of active sensors is their ability to record energy at any time, regardless of the time of day or season. Flash photography, for instance, is active remote sensing, in contrast to available-light photography, which is passive. Active sensors can also operate at wavelengths that the sun does not supply in sufficient quantity at the Earth’s surface, such as microwaves, and the illumination can be controlled in the way a target is exposed for sensing. Examples of active sensors are synthetic aperture radar (SAR) and LiDAR.

 

4. Microwave or RADAR Sensing, Sensors and Applications

 

Microwave remote sensing includes both active and passive microwave (MW) sensors. The MW spectrum spans the long wavelengths from 1 cm to 1 m. In contrast to the shorter optical wavelengths, whose imaging of the Earth is affected by the atmosphere, MW sensing is not vulnerable to atmospheric scattering. It can penetrate cloud cover, haze, dust and the like, except for heavy rainfall. MW energy can be detected under almost all weather and environmental conditions, so MW data can be collected at any time of day throughout the year.
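The quoted wavelength window maps directly onto a frequency window through f = c/λ. A minimal sketch (the function name is ours, for illustration only):

```python
C = 3.0e8  # speed of light in vacuum, m/s


def frequency_hz(wavelength_m):
    """Convert a wavelength in metres to frequency in hertz: f = c / lambda."""
    return C / wavelength_m


# The 1 cm to 1 m microwave window corresponds to roughly 30 GHz down to 0.3 GHz.
f_short = frequency_hz(0.01)  # 1 cm -> ~30 GHz
f_long = frequency_hz(1.0)    # 1 m  -> ~0.3 GHz
```

This is why MW bands are described interchangeably by wavelength (e.g. C-band, ~5.6 cm) or frequency (~5.3 GHz).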

 

4.1 Passive Microwave Sensing

 

All objects emit microwave energy of some magnitude, but the amounts are generally very small. A passive MW sensor (a radiometer or scanner) detects, through its antenna, the naturally emitted microwave energy within its field of view. This emitted energy is related to the temperature and moisture properties of the emitting object or surface.

 

Generally, after MWs strike a surface, part of the MW energy is scattered back to the sensor. The amount of backscattered energy depends on several factors: a) physical factors such as the dielectric constant of the surface materials, which in turn depends strongly on moisture content; b) geometric factors such as surface roughness, slope and the orientation of objects relative to the radar beam direction; c) the type of land cover (soil, vegetation or man-made objects); d) MW frequency; e) polarisation; and f) incidence angle.

 

The broader areas of passive MW application include meteorology, hydrology and oceanography. By sensing within or through the atmosphere, meteorologists use passive MWs to retrieve atmospheric profiles and to measure water and ozone content in the atmosphere. Hydrologists use passive MWs to measure soil moisture content, since moisture content influences the MW emission properties of the soil. In oceanography, surface winds, sea ice, currents, and pollutants such as oil slicks can be detected by recording the emitted MWs.

 

4.2 Active Microwave or RADAR Sensing

 

In RADAR (RAdio Detection And Ranging) sensing, an active radar sensor emits MWs as a series of pulses from an antenna, looking obliquely at the surface, perpendicular to the direction of motion. When an outgoing MW pulse hits the target feature(s), some of the MW energy is backscattered to the sensor, where it is detected, measured and timed. The distance, or range, from the target to the sensor is determined from the time delay. As the sensor tracks along its path, recording the range and magnitude of the incoming MWs from the targets, a two-dimensional image of the surface is formed. Active MW sensors are divided into two categories: imaging and non-imaging. RADAR is the most common imaging active MW sensor. Altimeters and scatterometers are non-imaging MW sensors (profiling devices) that measure energy in only one linear dimension, in contrast to the two-dimensional mapping of active RADAR imaging sensors. Altimeters transmit short, nadir-looking MW pulses towards the target and measure the time delay, which in turn gives the target-sensor distance. Common applications of RADAR altimetry are topographic mapping, altitude finding and sea-surface height estimation. Scatterometers measure the amount of MW energy returned after hitting the targets; this amount depends on the roughness of the surface and the angle at which the MWs hit the target. Scatterometry is used to measure wind speeds over ocean surfaces from the sea-surface roughness. Terrestrial scatterometers are used for the characterisation of different materials and surface types.

 

4.3 Synthetic Aperture RADAR (SAR)

 

Synthetic aperture radar, or SAR, is a MW imaging system that produces a high-resolution image of the Earth: MW pulses are transmitted by an antenna towards the Earth’s surface, and the MW energy backscattered to the spacecraft is recorded. The amount of MWs backscattered by the target feature and received by the SAR antenna determines the intensity of a SAR image. The term ‘synthetic aperture radar’ derives from the fact that the motion of the aircraft is used to artificially create, or synthesize, a very long linear array antenna. The reason for creating a long antenna is the ability to resolve targets that are closely spaced in angle, or cross-range. This requirement stems from one of the main uses of SARs: mapping the ground or targets. In both cases the radar needs to resolve very closely spaced scatterers, i.e. to achieve very high resolution, on the order of a few feet. To attain such resolution in the range coordinate, the radar uses wide-bandwidth waveforms. To attain such resolution in cross-range, a very long antenna is a prerequisite. For a spacecraft it is very difficult to carry the very long antenna required for high-resolution MW imaging. To overcome this limitation, SAR capitalises on the motion of the spacecraft to imitate a large antenna (about 4 km for the ERS SAR) using the small antenna (10 m on the ERS satellite) it actually carries on board.
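The “about 4 km” figure quoted for the ERS SAR can be sanity-checked with the standard beamwidth approximation: a target stays inside the real antenna beam over a ground-track length of roughly λR/D, and that length is the synthetic aperture. The ~5.6 cm C-band wavelength and ~850 km slant range used below are assumed illustrative values, not figures from the text:

```python
def synthetic_aperture_length(wavelength_m, slant_range_m, antenna_m):
    """Ground-track length over which a target stays in the real beam
    (beamwidth ~ lambda/D), i.e. the aperture the platform motion synthesizes."""
    return wavelength_m * slant_range_m / antenna_m


def sar_azimuth_resolution(antenna_m):
    """Classical strip-map SAR azimuth resolution ~ D/2, independent of range."""
    return antenna_m / 2.0


# ERS-like numbers: C-band (~5.6 cm), ~850 km slant range, 10 m antenna.
L_synth = synthetic_aperture_length(0.056, 850e3, 10.0)  # ~4.8 km
res_az = sar_azimuth_resolution(10.0)                    # ~5 m
```

The result (~4.8 km) is consistent with the order of magnitude stated above; the exact value depends on the actual slant range.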

 

The physical mechanisms responsible for this MW backscattering differ from those of VIS/IR radiation, and therefore the interpretation of SAR images also differs. SAR uses the radar principle to form an image, utilising the time delay of the backscattered signals.

 

The outgoing MW pulse from the radar antenna ‘sees’ an area on the ground known as the antenna’s ‘footprint’.

Figure 2. MW pulse footprint and antenna size

In real aperture radar imaging, the footprint, or ground resolution, is limited by the width of the MW beam illuminating from the antenna. A narrower MW beam resolves finer details on the ground than a wide one. It is important to note that the beam width, and hence the footprint size, is inversely proportional to the size of the antenna: the narrower the required MW beam, the longer the antenna must be (figure 2).
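The inverse relation between antenna size and footprint can be sketched with the same beamwidth approximation, θ ≈ λ/D (the numbers below are illustrative assumptions):

```python
def beamwidth_rad(wavelength_m, antenna_length_m):
    """Approximate beamwidth of a real aperture: theta ~ lambda / D (radians)."""
    return wavelength_m / antenna_length_m


def footprint_m(wavelength_m, antenna_length_m, range_m):
    """Along-track footprint on the ground: beamwidth times range.
    Doubling the antenna length halves the footprint."""
    return beamwidth_rad(wavelength_m, antenna_length_m) * range_m


# C-band (~5.6 cm) at 700 km range: a 5 m antenna gives a ~7.8 km footprint,
# a 10 m antenna a ~3.9 km footprint.
f_small_antenna = footprint_m(0.056, 5.0, 700e3)
f_large_antenna = footprint_m(0.056, 10.0, 700e3)
```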

 

SAR comes in two kinds: spotlight and strip map. In strip-map SAR, the actual antenna is kept pointed at a constant angle during the motion of the aircraft, while in spotlight SAR the actual antenna is steered to point constantly towards the area being mapped. The term ‘strip map’ derives from the fact that this type of SAR continually maps strips of the ground as the aircraft flies by. The term ‘spotlight’ derives from the fact that the actual antenna constantly illuminates, or spotlights, the region being mapped.

 

Figure 3. A typical strip-map SAR imaging system. Note that during imaging the antenna’s footprint sweeps out a strip parallel to the direction of the satellite’s ground track.

 

Some of the advantages and disadvantages of microwave sensors are summarized below:

 

Table 1: Advantages and disadvantages of microwave sensors

5. LiDAR Sensing, Sensors and Applications

 

We live in a three-dimensional (3-D) spatial world, and many applications require an accurate 3-D representation of data. In recent decades, remote sensing has witnessed multifold growth in spatio-temporal applications with the addition of a third dimension, height or depth, to the location of an object or phenomenon of interest. This growth has been achieved using laser (an acronym for ‘light amplification by stimulated emission of radiation’) data, under the name LiDAR (Light Detection and Ranging). LiDAR is an active 3-D sensing technology that records remotely sensed information in the optical wavelength spectrum using lasers. Surveying, mapping and analysis through LiDAR have provided new insight to the users of this technology. LiDAR measures the distance to, or other properties of, a target by illuminating it with outgoing pulses from a laser and recording the backscattered signal as an incoming, or returning, pulse. A pulse is a signal of very short duration that travels as a beam. LiDAR thus resembles radar, but it differs in using the ultraviolet, visible or infrared region of the electromagnetic (EM) spectrum instead of microwaves, at least until the recent work on a microwave laser published in Science on 3 March 2017 by Cassidy et al.

 

Albert Einstein can be considered the father of lasers, although he did not invent them. In 1917 Einstein postulated photons and stimulated emission in an eight-page paper in the journal Physikalische Zeitschrift titled ‘Zur Quantentheorie der Strahlung’ (On the Quantum Theory of Radiation). In 1960, Theodore Maiman at Hughes Research Laboratories in Malibu, USA developed a device to amplify light at a wavelength of 694 nm, thus building the first functional laser. In 2017, a team of researchers led by Leo P. Kouwenhoven, including M. C. Cassidy, at QuTech, TU Delft (Netherlands) demonstrated an on-chip microwave laser based on a fundamental property of superconductivity, the ac Josephson effect: if a very short barrier interrupts a piece of superconductor, the electrical carriers tunnel through this non-superconducting material by the laws of quantum mechanics.

 

The first application of lasers in remote sensing was in meteorology, where the National Center for Atmospheric Research (USA) used them to measure clouds. The general public became aware of the accuracy and usefulness of LiDAR systems in 1971 during the Apollo 15 mission, when astronauts used a laser altimeter to map the surface of the moon. Since then, the application of lasers has become widespread.

 

5.1 Components of a LiDAR System

 

5.1.1 Laser and Laser scanner

 

A laser emits a beam of monochromatic light. The radiation is, in fact, not of a single wavelength, but its spectral band is very narrow, less than 10 nm. Lasers are of very high intensity. They can damage cells by boiling their water content and are therefore considered a potential hazard to the eyes. A 1550 nm laser beam is considered eye-safe.

 

The laser scanner is a device used for opto-mechanical scanning. Because daytime light intensity is almost constant, the signals originating from laser pulses illuminating an object can easily be distinguished from the sunlight-induced background, so LiDAR can be used equally well by day and night.

 

Laser scanning is performed from space by satellites such as ICESat, from the air using manned and unmanned vehicles, or terrestrially using both mobile and fixed terrestrial laser scanners.

Figure 4. A Typical Aerial LiDAR System

 

In the present generation of airborne laser scanning (ALS) systems, semiconductor diode lasers and Nd:YAG (neodymium-doped yttrium aluminium garnet; Nd:Y3Al5O12) lasers pumped by semiconductor lasers are used, covering the optical bandwidth of 0.8-1.6 µm. ALS generally uses 1064 nm diode-pumped YAG lasers. Bathymetric laser scanning systems use 532 nm frequency-doubled diode-pumped YAG lasers, because 532 nm penetrates water with much less attenuation than 1064 nm. Laser settings include the pulse repetition rate, which controls the data collection speed. Pulse length is generally an attribute of the laser cavity length, the number of passes required through the gain material (YAG, or YLF (LiYF4, yttrium lithium fluoride, 1047 and 1053 nm)), and the Q-switch (giant pulse) formation speed. Q-switching allows the production of light pulses with extremely high (gigawatt) peak power, much higher than the same laser would produce operating in continuous-wave (constant output) mode. Better target resolution is achieved with shorter pulses, provided the LiDAR receiver detectors and electronics have sufficient bandwidth. Since current laser scanners work on the direct energy detection principle, a coherent light source is not required for ranging. In summary, the following physical laser properties are exploited in laser scanning: high power, short pulses, high collimation, and the narrow optical spectrum in which lasers emit. A collimator is a device that narrows a beam of particles or waves: to ‘narrow’ can mean either to make the directions of motion more aligned in a specific direction (i.e. collimated or parallel) or to make the spatial cross-section of the beam smaller. A variety of scanning methods are available for different purposes, such as azimuth-and-elevation scanning, dual oscillating plane mirrors, dual-axis scanners and polygonal mirrors. The type of optic determines the resolution and range that a system can detect.

 

The earlier generation of laser rangefinders could record one return echo for every emitted pulse; the current generation can record multiple return echoes per emitted pulse. A further development in scanning devices is the full-waveform laser scanner, which continuously digitizes the entire returned echo for each emitted laser pulse whenever its intensity is above a certain threshold, resulting in high point-density data. Discrete-return and full-waveform laser data have made a significant contribution to identifying objects at different height levels, for example a tree crown or a building, or at different layers, for example trees at different height levels (upper-tier, lower-tier) or buildings of different heights.

 

5.1.2 Photodetector and Receiver Electronics

 

A photodetector detects the incoming photon energy of the returning echo pulse once it hits the detector. The energy is then converted to an equivalent photocurrent at the input of the receiver channel. The resulting electrical current, known as the primary photocurrent Is, is directly proportional to the incident optical power Ps,

Is = RPs,———-(1)

 

where R is the responsivity of the photodetector, measured in amperes/watt.

 

The other fundamental quantity is the quantum efficiency (η), the photon-to-electron conversion efficiency of the photodetector, defined as the ratio of the electron-generation rate to the photon-incidence rate. Typical quantum efficiencies are 60-70%. The commonly used optical receivers are positive-intrinsic-negative (PIN) photodiodes and avalanche photodiodes (APDs), in which the photon-to-electron conversion is multiplied by an avalanche mechanism.
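Responsivity and quantum efficiency are linked by the standard relation R = ηqλ/(hc), which is not derived in the text but connects equation 1 to the quantum efficiency just defined. A minimal sketch:

```python
PLANCK_H = 6.626e-34  # Planck constant, J*s
Q_ELECTRON = 1.602e-19  # elementary charge, C
C_LIGHT = 3.0e8  # speed of light, m/s


def responsivity_a_per_w(quantum_efficiency, wavelength_m):
    """Responsivity R = eta * q * lambda / (h * c), in amperes/watt."""
    return quantum_efficiency * Q_ELECTRON * wavelength_m / (PLANCK_H * C_LIGHT)


def photocurrent_a(responsivity, optical_power_w):
    """Primary photocurrent Is = R * Ps (equation 1)."""
    return responsivity * optical_power_w


# At the typical 65% quantum efficiency and the 1064 nm ALS wavelength,
# the responsivity comes out at roughly 0.56 A/W.
R_1064 = responsivity_a_per_w(0.65, 1064e-9)
```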

 

5.1.3 Global Positioning System (GPS) and Inertial Measurement Unit (IMU)

 

A GPS receiver and an IMU are used to determine the absolute position and orientation of LiDAR sensors on board aircraft or satellites. Airborne GPS provides positioning information along the trajectory of the sensor. However, correctly determining the pointing direction of each laser pulse requires accurate measurements of the roll, pitch and yaw of the platform. This is essential for an accurate transformation from the local sensor reference frame to the Earth-centred reference frame. In practice, the orientation of the platform is recorded by an on-board IMU hard-mounted to the LiDAR sensor. Accuracies of 0.005º in pitch/roll (some systems achieve 0.0025º) and 0.008º in yaw are common in commercial LiDAR sensors.
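The practical meaning of those angular accuracies can be sketched by projecting the pointing error onto the ground; the 1000 m flying height below is an assumed example value:

```python
import math


def pointing_error_m(angle_deg, altitude_m):
    """Horizontal ground displacement caused by an angular pointing error."""
    return math.tan(math.radians(angle_deg)) * altitude_m


# A 0.005 deg pitch/roll error at 1000 m flying height shifts the laser
# footprint on the ground by roughly 9 cm.
err = pointing_error_m(0.005, 1000.0)
```

This is why IMU attitude accuracy, not just GPS position accuracy, limits the absolute accuracy of airborne LiDAR points.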

 

5.1.4 Data Storage and Management Systems

 

Because of their dense point coverage, LiDAR data sets are huge and continuous, and therefore require tile-based storage. Nowadays, most commercial LiDAR data are delivered in the LASer (LAS) file format.

 

5.2 Principle of LiDAR Scanning

 

The narrow beam width of the laser pulse defines the instantaneous field of view (IFOV), which is a function of the transmitting aperture and the wavelength of the laser light. Generally, the IFOV ranges from 0.3 mrad to 2 mrad. The IFOV of the receiving optics must not be smaller than that of the transmitted laser beam. The narrow IFOV of the laser requires the beam to be swept across the flight direction in order to obtain full coverage of the survey area; the second dimension is obtained by the forward motion of the aircraft. Present laser ranging systems mostly use pulsed lasers. Ranging is done by measuring the travel time (ΔtL) between the emitted and received pulse.
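For small angles, the laser footprint diameter on the ground is simply the IFOV (in radians) times the range; the 1000 m flying height below is an assumed example:

```python
def laser_footprint_m(ifov_mrad, range_m):
    """Footprint diameter of a laser beam with divergence ifov (mrad) at a
    given range, using the small-angle approximation d = theta * R."""
    return ifov_mrad * 1e-3 * range_m


# The 0.3-2 mrad IFOV range quoted above gives 0.3-2 m footprints
# from 1000 m flying height.
d_narrow = laser_footprint_m(0.3, 1000.0)
d_wide = laser_footprint_m(2.0, 1000.0)
```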

 

ΔtL = 2R/C————(2)

Or,

R = (1/2) ∗ C ∗ ΔtL————(3)

 

where R is the distance between the ranging unit and the object surface, and C is the speed of light. Note that the factor 2 arises because the light pulse travels the distance R twice, outgoing and incoming. This is the time-lapse measurement of a laser pulse.

 

5.3 Laser Scanning

 

Laser scanning can be explained in the following four major steps.

 

i)   The measurement begins by noting the precise time as a short laser pulse, with a half-pulse width of about 3 to 10 ns, is emitted from the semiconductor laser diode of a LiDAR instrument towards the target surface. Since the return distance of the laser pulse is expected to be many hundreds of metres in pulsed time-of-flight (TOF) applications, high-power laser diodes with peak powers in the range of 10-100 W are required in the laser transmitter.

 

ii)   The backscattered return echo of the emitted pulse is detected and the precise time is recorded. As the echo pulse hits the photodetector (either PIN or APD), it is converted to an equivalent photocurrent at the input of the receiver channel. The received optical power Pr(R) at the receiver lens, for a Lambertian (non-cooperative, equally bright in all directions) target, is given as a function of the measurement range R by the radar equation

Pr(R) = PT τT ρ AR / (πR²)———-(4)

where, PT = peak pulsed output power of the laser diode,

τT = transmission of the transmitter optics,

ρ = reflection coefficient of the object, and

AR = area of the receiver lens. The resulting signal current Isignal(R) is then

 

Isignal(R) = Pr(R) ∗ R0

where R0 = responsivity of the photo detector.
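Equation 4 and the signal-current relation can be sketched numerically; all input values below (20 W peak power, 90% optics transmission, 30% reflectance, a 50 cm² lens, 500 m range, 0.5 A/W responsivity) are assumed illustrative numbers, not figures from the text:

```python
import math


def received_power_w(p_t, tau_t, rho, a_r, range_m):
    """Equation 4: Pr(R) = PT * tauT * rho * AR / (pi * R^2),
    for a Lambertian target."""
    return p_t * tau_t * rho * a_r / (math.pi * range_m ** 2)


def signal_current_a(p_r, responsivity_r0):
    """Isignal(R) = Pr(R) * R0."""
    return p_r * responsivity_r0


# Illustrative link budget: only tens of nanowatts reach the receiver,
# which is why high peak powers and sensitive detectors are needed.
p_r = received_power_w(20.0, 0.9, 0.3, 50e-4, 500.0)
i_s = signal_current_a(p_r, 0.5)
```

Note the 1/R² fall-off: doubling the range cuts the received power by a factor of four.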

 

iii) An accurate electrical timing signal is generated by the receiver channel using a time-interval measurement unit, which determines the time difference (Δt) between the start and stop pulses. Using equation 3, the distance from the sensor to the target is obtained from the time delay and the constant speed of light, and repeating the measurement creates a 3-D map of the surveyed surface. During a pulse width of 3 ns, for example, light travels about 1 m at its constant speed of 3 × 10^8 m/s.
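The 3 ns figure can be checked directly, along with the related two-way range resolution (the range-resolution formula cτ/2 is a standard relation, not stated in the text):

```python
C = 3.0e8  # speed of light, m/s


def pulse_extent_m(pulse_width_s):
    """Distance light travels during one pulse width."""
    return C * pulse_width_s


def range_resolution_m(pulse_width_s):
    """Two surfaces closer in range than c*tau/2 return overlapping echoes."""
    return C * pulse_width_s / 2.0


extent = pulse_extent_m(3e-9)        # ~0.9 m, i.e. roughly the 1 m quoted
resolution = range_resolution_m(3e-9)  # ~0.45 m separable in range
```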

 

iv)   As the sensor moves, the height, location and orientation of the instrument are calculated to determine the position of the laser pulse at the times of emission and return. Using the known position and orientation of the sensor, the 3-D coordinates of the laser-hit surface are calculated. In addition to height, some scanners also measure the intensity (amplitude) and the echo width.

Figure 5. LiDAR – Discrete-return vs. waveform-digitizing (Adapted from Harding et al. 2000)

 

 

5.4 Frequency and Wavelength of Lasers Used

 

The pulse repetition frequencies used are in the range of 50,000-200,000 pulses/second (Hz), though for bathymetry applications the rates are slower. The wavelengths used are: i) IR (1500 to 2000 nm) for meteorology (Doppler LiDAR), ii) NIR (1040 to 1060 nm) for terrestrial mapping, iii) blue-green (500 to 600 nm) for bathymetry, and iv) UV (250 nm) for meteorology.

 

The selection of the optical wavelength of the laser depends on the overall laser scanning system design for a particular application. The most sensitive detectors are available in the range of 800 to 1000 nm. However, eye safety is a concern in this wavelength range. It should be noted that the reflectivity of a target at a given wavelength also influences the maximum range.

 

5.5 Applications of LiDAR

 

The opportunities for using LiDAR data are numerous. It is used for mapping a wide range of targets, including aerosols, clouds, rain, rocks, non-metallic objects, chemical compounds and even single molecules. A narrow laser beam with a small IFOV is used to map high-resolution physical features. LiDAR digitally measures elevations with very high precision, producing high-resolution Digital Elevation Models (DEMs) depicting ground and vegetation surfaces (digital surface model, or DSM) and bare-ground terrain (digital terrain model, or DTM). Some of the important applications of LiDAR include mapping of corridors (for example, roads, railway tracks, pipelines, waterway landscapes); mapping of electrical transmission lines and towers, including ground/tree clearance; DTM generation, especially in forested areas, and also for road and path planning and the study of drainage patterns; measurement of coastal areas, including dunes and tidal flats, and determination of coastal change and erosion; high-accuracy and very dense measurement applications, e.g., flood mapping, DTM generation and volume calculation in open-pit mines, road design and modelling; DTM and DSM generation in urban areas, automated building extraction, and generation of 3-D city models for planning of relay antenna locations in wireless telecommunication, urban planning, microclimate models, and propagation of noise and pollutants; rapid mapping and damage assessment after natural disasters, e.g., hurricanes, earthquakes and landslides; measurement of snow- and ice-covered areas, including glacier monitoring; measurement of wetlands; and derivation of vegetation parameters (for example, tree height, crown diameter, tree density, biomass estimation, and determination of forest borders).

 

 

6. Thermal Remote Sensing, Sensors and Applications

 

Imagine a wildfire outbreak in a hot, dry summer. The obvious picture that comes to mind is rapidly advancing fire, with smoke and the sound of burning dry forest biomass. The scenario becomes more alarming where there is a lack of technical manpower and proper awareness to deal with it. For such fire-prone regions, a combat plan against these highly damaging risks necessarily includes forest-fire risk zonation, which involves satellite images acquired in the thermal infrared (TIR) EM spectrum (3-14 µm). The use of pre- and post-fire images is in turn helpful in delivering burn-severity maps for property-damage assessment. This is an effective environmental application of thermal remote sensing.

 

Any object with a temperature above absolute zero (0 Kelvin or -273 ºC) emits radiation. The material type and the temperature of the object under investigation determine the intensity and spectral composition of the emitted radiation. A blackbody is an idealized object that totally absorbs (a perfect absorber) and re-emits (a perfect emitter) all EM radiation incident upon it. A blackbody’s emittance is a function of its kinetic temperature (the amount of kinetic heat the object contains). The blackbody is thus a theoretical ideal of the radiation principle. Real materials, however, are not blackbodies; they emit only a fraction of the incident EM radiation, and such real materials are called gray bodies.

 

The emitting ability, or emissivity, of a gray body depends on the wavelength of the radiant energy; each material therefore has both an emissivity spectrum and a reflectance spectrum. In remote sensing, we use the TIR region to record this emitted radiant energy. Apart from wavelength, emissivity is also influenced by other factors: the material itself (water, soil, rock, concrete, vegetation, etc.); the field of view and viewing angle of the sensor; surface geometry or roughness (a surface that is rough relative to the wavelength presents a greater surface area and therefore has greater absorption and re-emission potential); the tone of the object (darker objects are better absorbers and better emitters); and moisture content (the higher the moisture content, the greater the potential to be a good emitter).

 

Kinetic heat, or kinetic temperature, is the energy of particles of matter in random motion. When these moving particles collide, they change their energy state and emit EM radiation, termed radiant flux and measured in watts. The radiant temperature of an object (seen as a kind of glow in the visible spectrum) is the concentration of its exiting, or emitted, radiant flux. We can therefore use radiometers placed remotely from an object to measure its radiant temperature, which is positively correlated with the object’s kinetic temperature; this forms the basis of thermal remote sensing. It is emissivity that largely accounts for the remotely measured radiant temperature being always slightly less than the true kinetic temperature of the object. Planck’s law, Wien’s displacement law, the Stefan-Boltzmann law and Lambert’s law are the commonly used laws governing the interrelation of these parameters; elaborating them is beyond the scope of this module.
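Without elaborating those laws, a one-line application of Wien’s displacement law (λ_max = b/T) already explains why terrestrial thermal sensing concentrates on the 8-14 μm window:

```python
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K


def peak_wavelength_um(temperature_k):
    """Wien's displacement law: lambda_max = b / T, returned in micrometres."""
    return WIEN_B / temperature_k * 1e6


# Ambient Earth surfaces (~300 K) peak near 9.7 um, and a human body
# (~310 K) near 9.3 um -- both well inside the 8-14 um thermal window.
earth_peak = peak_wavelength_um(300.0)
body_peak = peak_wavelength_um(310.0)
```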

 

The TIR includes the incoming thermal-spectral solar radiation (such as incoming short-wave infrared, or SWIR, radiation), outgoing or re-radiated longer-IR radiation, and the thermal emissions of objects. Thus, apart from emissivity, TIR measurements include land and ocean surface temperature, atmospheric temperature and humidity, atmospheric trace-gas concentrations and the radiation balance. Specifically designed sensors sensitive to thermal measurements are carried on board satellites. For example, in Landsat 8, bands 6 and 7 are sensitive to two SWIR regions, 1.56-1.66 µm and 2.1-2.3 µm, respectively, whereas bands 10 and 11 of the same mission cover two TIR bands in the 10.6-11.2 µm and 11.5-12.5 µm ranges.

 

Some of the important applications by thermal spectral range are: i) 3-5 μm: monitoring hot phenomena such as geothermal activity and forest fires; ii) 8-14 μm: monitoring vegetation, soil and rock; iii) 9.2-10.2 μm: monitoring ozone, which strongly absorbs in this range (usable by airborne systems but not by space-borne sensors); and iv) 9.7 μm: monitoring energy emitted from the Earth’s surface.

 

The TIR is widely used for civilian and military applications. Military applications include surveillance, night vision, target identification and tracking. Humans at normal body temperature radiate chiefly at wavelengths around 10 μm, so this wavelength band can be used to distinguish humans from other objects. Civilian applications include environmental monitoring (for example, soil moisture content, coastal zones, hydrology), industrial facility inspection, surface temperature monitoring, remote temperature sensing, thermal efficiency analysis, weather forecasting, etc.

 

7. Summary

 

In remote sensing, we sense the reflected or emitted energy remotely and convert it to a usable digital form, such as pixels, which is then interpreted and analyzed for different applications. A remote sensor either uses reflected solar energy as its source of information about an object (passive sensing) or uses its own outgoing energy and records what returns after hitting the target (active sensing). With their choice of wavelength range and day-and-night illumination, active sensors are the more flexible. Active and passive sensors have been developed to record information in both the optical and microwave (MW) wavelength regions. As part of passive and active remote sensing, microwave, laser and thermal remote sensing have been widely used to measure information about targets. In MW sensing, the MW energy backscattered after hitting a target is recorded and used. The strength of the recorded MW energy is related to the temperature and moisture content of the objects. The applicability of passive MW sensing is limited by factors such as the small quantity of naturally available MW energy. RAdio Detection And Ranging (RADAR) is the common term for active imaging MW remote sensing, in which the time delay is used to record the range between the sensor and the target. Synthetic aperture radar (SAR) is a MW imaging system that produces high-resolution images of the Earth. One of the biggest advantages of MW sensors is that they can be used under almost all weather and environmental conditions. Altimeters and scatterometers are non-imaging MW sensors that measure energy in only one linear dimension. MW radiation is used to measure water and ozone content in the atmosphere; soil moisture; and surface winds, sea ice, currents and oil slicks in the ocean.
Unlike MW sensing, Light Detection and Ranging (LiDAR) is an entirely active 3-D sensing technique in which targets are illuminated with outgoing laser pulses and the backscattered echo pulses are recorded to generate 3-D information about an area. A LiDAR system essentially uses a laser beam, a laser scanner (an opto-mechanical device), photodetectors and GPS. Laser scanning facilities are space-borne, aerial and terrestrial, and LiDAR works on the same principle as RADAR. The most sensitive laser detectors are available in the wavelength range of 800 to 1000 nm. LiDAR-generated data are used to create 3-D models for a wide range of applications, such as corridor mapping, mapping of electrical transmission lines, vegetation mapping, building extraction, 3-D city modelling, etc. Unlike MW and LiDAR sensing, thermal remote sensing uses energy re-emitted after absorption. In satellite remote sensing, short-wave infrared (SWIR) and thermal IR are the commonly used wavelength regions. Thermal radiation is used to study land and ocean surface temperature, atmospheric temperature and humidity, atmospheric trace-gas (e.g., ozone) concentrations, the radiation balance, geothermal activities and forest fires, and for monitoring vegetation, soil and rocks.

  8. References
  • Campbell, J. B. and Wynne, R. H. (2011). Introduction to remote sensing, 5th ed., The Guilford Press, USA.
  • Cassidy, M.C., Bruno, A., Rubbert, S., Irfan, M., Kammhuber, J., Schouten, R. N., Akhmerov, A. R., Kouwenhoven, L. P. (2017). Demonstration of an ac Josephson junction laser, Science, 355(6328), 939-942.
  • CCRS (2017). Canada Centre for Mapping and Earth Observation tutorial. Fundamental of remote sensing, accessed at http://www.nrcan.gc.ca/node/9309/ on 10 August 2017.
  • Einstein, A. (1917). Zur Quantentheorie der Strahlung, Physikalische Zeitschrift, Vol. 18, 121-128.
  • Harding, D.J., Blair, J.B., Rabine, D.L. and Still, K.L. (2000). SLICER airborne laser altimeter characterization of canopy structure and sub-canopy topography for the BOREAS Northern and Southern Study Regions: Instrument and Data Product Description, Technical Report Series on the Boreal Ecosystem-Atmosphere Study (BOREAS), F.G. Hall and J. Nickeson, Eds., NASA/TM-2000-209891, Vol. 93, 45.
  • Levin, N. (Ed.) (1999). Fundamentals of Remote Sensing. 1st Hydrographic Data Management Course, IMO—International Maritime Academy, Trieste, Italy.
  • Lillesand, T. M., Kiefer, R. W. and Chipman, J. W. (2008). Remote sensing and image interpretation, 6th ed., John Wiley & Sons, USA.
  • Tempfli, K., Kerle, N., Huurneman, G. C., and Janssen, L. L. F. (Eds.) (2009). Principles of remote sensing, ITC Educational Textbook Series 2, ITC, 4th ed., The Netherlands.