2 Remote Sensing Systems: Platforms and Sensors

Dr. Saif Said

epgp books

 

 

 

 

Structure

 

1.1        Introduction

 

1.2        Historical Development of Remote Sensing

 

1.2.1     History of India’s Space Programme

 

1.2.2    History of International Space Programme

 

1.3        Platforms and Sensors

 

1.4        Classification of Sensors

 

1.4.1    Optical Sensors

 

1.4.2     Microwave Sensors

 

1.4.3     Thermal Sensors

 

1.5        Satellite Characteristics: Swaths and Orbits

 

1.6        Weather and Communication Satellites

 

Terminal Questions

 

 

Objectives

 

At the end of this unit, the student will be able to briefly explain:

  • The historical evolution of remote sensing and detailed information related to India’s space programme.
  • The two important types of satellite orbits required for obtaining full image coverage of the Earth.
  • The classification of sensors into active and passive remote sensing sensors.
  • The essential features of various important weather and communication satellites.

    1.1 Introduction

 

The underlying principles and basic concepts of remote sensing required for understanding the processes involved in the technology have been explained in detail in Module 1. The electromagnetic energy emitted by the sun or reflected back from the target is recorded by sensors that may be mounted on a truck or carried onboard an aircraft or space satellite. The sensor-platform combination determines the characteristics of the resulting data or image, and the resulting data is then used to interpret surface feature characteristics. The following module takes a closer look at sensor and sensing platform combinations as well as the type of data collected by different sensors. This module also addresses the historical development and evolution of remote sensing technology in the context of Indian as well as foreign space missions.

 

1.2  Historical Development of Remote Sensing

 

The evolution of remote sensing technology began more than 120 years ago with the invention of the primitive camera, capable of capturing still photographs of the earth. Towards the mid-1800s, the desire to look down at the earth’s surface from higher altitudes triggered the installation of cameras on hot air balloons to capture perspective pictures of the ground surface, which aided in the preparation of topographic and navigational maps. Perhaps the most novel platform for camera installation in the early twentieth century was the pigeon fleet that operated as a novelty in Europe (Fig. 1).

 

                                      Fig. 1   The pigeon fleet installed with cameras.

 

During the First World War, cameras installed on aircraft provided perspective photographs of fairly large ground areas that proved fruitful during military reconnaissance and operations. Since then, aerial photographs have primarily been used for observing the earth’s surface from a vertical or oblique perspective (Fig. 2). Aircraft-mounted cameras provided more reliable and stable photographs than pigeons or balloons. It was after World War II that aerial photographs were made available for civilian use, with applications mainly focussed on geology, forestry, agriculture, land use practice and topography. The availability of photographs for various applications led to better camera resolutions and photographic film quality. During World War II, notable developments took place in the field of aerial photography and photo interpretation. During this period, imaging systems were developed that were capable of recording energy in the infrared and microwave regions of the electromagnetic spectrum. Near-infrared and thermal-infrared imaging proved very valuable for monitoring the health of vegetation. Radar or microwave imaging, which enabled data recording at night, was used for military operations and hence proved valuable for night-time bombing.

 

Fig. 2 Cameras installed on airplanes in the early 1900s.

 

 

In the years following World War II, both Russia and America started their space missions to image land surfaces using several types of sensors installed onboard spacecraft. Thereafter, satellite remote sensing advanced to a global scale and escalated during the Cold War, until satellite data was made available for civilian and research purposes in the early 1980s. Cameras were attached to V-2 rockets launched to high altitudes from White Sands, New Mexico, in 1946 after World War II. These rockets, while never attaining orbit, took pictures as the vehicle ascended. With the advancement towards more extensive space programs in the mid-1960s, Earth-orbiting cosmonauts and astronauts acquired photographs of the earth and moon from their spacecraft.

 

The world’s first artificial satellite, Sputnik 1, was launched on 4 October 1957 by the Soviet Union (Fig. 3). Since then, improvements in sensor configurations aboard various earth-observing satellites such as Landsat and Nimbus, and more recent missions such as RADARSAT and UARS, have provided global images in different regions of the electromagnetic spectrum that have proved useful for various civil, research, and military purposes. Space probes to other planets have also provided an opportunity to carry out studies in extraterrestrial environments. Synthetic aperture radar (SAR) aboard the Magellan spacecraft provided detailed topographic maps of Venus, while sensors aboard SOHO provided a platform for studies of the Sun and the solar wind, to name a few examples.

 

Fig. 3(a) The first artificial satellite, launched by the Soviet Union in 1957; (b) NASA’s space shuttle ‘Columbia’, used to monitor Earth resources.

 

Extensive studies were carried out in various parts of the globe towards the development of satellite image processing and interpretation tools. Research groups at NASA research centers developed Fourier transform techniques that enabled notable enhancement of satellite data. The improved availability of satellite images, integrated with image processing tools, has led to numerous applications of remote sensing in various domains. Remote sensing has significantly proven its crucial role in the better understanding of earth processes in detail at both regional and global scales.

 

Self-assessment exercise

 

1. Briefly discuss the historical evolution of remote sensing technology.

 

2. Discuss some of the civilian applications of remote sensing technique.

 

 

1.2.1 History of India’s Space Programme

 

Indian scientists before independence had only limited exposure to rocket science and space technology, even though the technology was being used during the world wars. It was only after India achieved independence that the process of exploring space actually accelerated. Dr. Vikram Sarabhai founded the Physical Research Laboratory (PRL) in Ahmedabad on November 11, 1947, which proved to be the first step India took towards becoming one of the leading space powers. In 1969, the Indian Space Research Organisation (ISRO) was established, superseding the erstwhile Indian National Committee for Space Research (INCOSPAR), which had been established in 1962 by Dr. Vikram Sarabhai with the support of independent India’s first Prime Minister, Jawaharlal Nehru. ISRO is the space agency of the Indian government, headquartered in the city of Bengaluru. Its vision is to “harness space technology for national development, while pursuing space science research and planetary exploration”. The establishment of ISRO institutionalised the space activities of India; it is managed by the Department of Space, which reports to the Prime Minister of India.

 

India’s first major success was achieved on April 19, 1975 with the launch of its first indigenous satellite, Aryabhata. It was launched by the Soviet Union from Kapustin Yar using a Cosmos-3M launch vehicle. Aryabhata was named after the 5th-century Indian mathematician and astronomer who worked on the concept of the numerical value zero and many astronomical calculations around 500 AD.

 

India has launched a total of 81 indigenous satellites (as of January 2016) offering a number of applications since its first launch in 1975. The series of satellites launched includes Apple (1981), Bhaskara-I (1979) and Bhaskara-II (1981), the INSAT-1 series (1A, 1B, 1C and 1D), the INSAT-2 series (2A, 2B, 2C and 2D), the IRS series (1A, 1B, 1E, P2, 1C, P3, 1D, P6) and Rohini (1A, 1B, 2 and 3), to name a few. India has also developed various launch vehicles, which make a space programme independent and are among the most important technological measures of its advancement. Prominent among them are the Satellite Launch Vehicle (SLV), Augmented Satellite Launch Vehicle (ASLV), Polar Satellite Launch Vehicle (PSLV) and Geosynchronous Satellite Launch Vehicle (GSLV). Table 1 shows some important dates in the development and advancement of the Indian remote sensing programme.

 

Table 1. Historical developments and advancements of Indian space mission

 

                                              Source: https://en.wikipedia.org/wiki/List_of_Indian_satellites

 

 

1.2.2  History of International Space Programme

 

 

Numerous celestial studies had been carried out from the earth using optical telescopes until the Second World War, when the development of powerful rockets made direct space exploration a technological possibility. The first artificial satellite, Sputnik 1, was launched by the USSR (now Russia) on October 4, 1957, followed by Explorer 1, the first satellite launched by the United States, in 1958. America’s immediate response was popularly projected as international competition and envisioned as the “space race”. The first Moon landing, by the American Apollo 11 spacecraft in 1969, is often considered a landmark of the initial space exploration period. The Soviet space program achieved many milestones, including the first living being in orbit in 1957, the first human spaceflight (Yuri Gagarin aboard Vostok 1) in 1961, the first spacewalk (by Aleksei Leonov) in 1965, the first automatic landing on another celestial body in 1966, and the launch of the first space station (Salyut 1) in 1971. Since then, the People’s Republic of China, the European Union, Russia, Japan, Canada, the United States, France and India have undertaken successful space missions, launching a multitude of satellites for defence as well as research purposes. Although earth-orbiting satellites have so far accounted for the majority of launches, manned space missions to the Moon and Mars have been explicitly advocated by the above-mentioned countries during the 21st century. Table 2 illustrates some of the historical developments and advancements associated with international space programmes.

 

Table 2 Historical developments and advancements associated with international space programme

 

                                          Source: The U.S. National Archives and Records Administration

 

1.3  Platforms and Sensors

 

A device capable of measuring and recording electromagnetic energy is referred to as a sensor. For a sensor to collect and record electromagnetic energy reflected or emitted from a target or feature of interest, it must be installed on a stable platform, which may be ground based, aircraft or balloon based (or some other platform within the Earth’s atmosphere), or a spacecraft or satellite above the Earth’s atmosphere.

 

Sensors installed on ground based platforms

 

 

Sensors installed on ground-based platforms record detailed information about a feature or surface area of limited extent (i.e. 200 to 400 sq m), such as a crop field or road intersection, which may be compared with information collected from aircraft or satellite sensors as per the requirements of the researcher. In some cases, data collected from ground-based sensors can also be utilised to characterize and interpret a target feature that is also being imaged by other sensors, thereby integrating the information in the imagery for better analysis. Sensors may be installed or mounted on a ladder, tall building, crane, etc., as illustrated in Fig. 4.

 

                                             Fig. 4  Ground based sensor mounted on a truck crane.

 

 

Sensors mounted on air based platforms

Airborne remote sensing is carried out using specially designed aircraft or hot air balloons, depending on the operational requirements and the available budget. Airborne platforms are employed owing to their mobilization flexibility and their capability of recording data covering large spatial areas as compared to ground-based sensors. The speed, altitude and orientation of the aircraft must be carefully chosen so as to have minimum influence on the scale, resolution and geometric characteristics of the recorded images. Airborne remote sensing is deployed when study areas are inaccessible to ground-based platforms, such as hilly regions or dense forest cover. Remote sensing aircraft can be of many types, from very small, slow, low-flying aircraft to twin-engine turboprops (see Fig. 5) and small jets capable of flying at very high altitudes. Unmanned platforms (UAVs) are becoming increasingly important, particularly in military and emergency response applications, both international and domestic. Modifications to the fuselage and power system to accommodate a remote sensing instrument and data storage system are often far more expensive than the cost of the aircraft itself. While the planes themselves are fairly common, choosing the right aircraft to invest in requires a firm understanding of the applications for which that aircraft is likely to be used over its lifetime. Aerial platforms are primarily fixed-wing aircraft, although helicopters are occasionally used. Aircraft are often used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth’s surface at any time.

 

                   Fig. 5  Aircraft used as platform to record data pertaining to earth surface.

 

 

Sensors mounted on space based platforms

 

Space-borne remote sensing is carried out from outer space, at an altitude higher than the earth’s atmosphere, and utilizes the space shuttle or, more commonly, satellites as platforms. Satellites are man-made objects that revolve around the earth, and the sensors installed onboard capture data of the earth’s surface covering areas of hundreds of square kilometers or more. Because of their orbits, satellites permit repetitive coverage of the earth’s surface on a continuing basis. Cost is often a significant factor in choosing among the various platform options. Ever since the launch of the first earth resources satellite (i.e., Landsat in 1972), satellite-based remote sensing has continuously served the betterment of science and technology. Interestingly, even with an increasing number of satellites being launched, the demand for data acquired from airborne platforms continues to grow. One obvious advantage satellites have over aircraft is global accessibility; there are numerous governmental restrictions that deny access to airspace over sensitive areas or over foreign countries. Satellite orbits are not subject to these restrictions, although there may well be legal agreements to limit the distribution of imagery over particular areas.

 

                            Fig. 6 Space based sensor mounted on a space shuttle.

 

 

Different types of orbits are required to achieve continuous monitoring (meteorology), global mapping (land cover mapping), or selective imaging (urban areas). The following orbit types are the most common for remote sensing missions:

 

Polar orbit: has an inclination angle between 80° and 100°; orbits with inclination greater than 90° move in a westward (retrograde) direction. A polar orbit enables observation of the whole globe, including the poles. Satellites in polar orbit are launched to altitudes of 600 km to 1000 km.

 

Sun-synchronous orbit: also referred to as a near-polar orbit, having an inclination angle between 98° and 99° relative to a line running between the North and South poles, thereby enabling the satellite to always pass overhead at the same local solar time. The platforms are designed to adopt an orbit in the north-south direction which, in conjunction with the Earth’s west-east rotation, allows them to cover most of the Earth’s surface over a certain period of time. Most sun-synchronous orbits cross the equator in mid-morning, at around 10:30 local sun time. At that moment the sun angle is low and the resultant shadows reveal terrain relief. Examples of sun-synchronous satellites are Landsat, IRS and SPOT.
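The narrow 98°-99° inclination band is not arbitrary: it is the inclination at which the Earth's oblateness (the J2 term) precesses the orbital plane eastward by roughly 1° per day, keeping step with the Sun's apparent motion. A rough numerical sketch, assuming a circular orbit and standard Earth constants (a back-of-envelope check, not mission software):

```python
import math

# Standard orbital constants (assumed values for a circular-orbit approximation)
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_E = 6_378_137.0        # Earth's equatorial radius, m
J2 = 1.08263e-3          # Earth's oblateness coefficient

def sun_sync_inclination_deg(altitude_km: float) -> float:
    """Inclination whose J2 nodal precession matches 360 deg/year (tracking the Sun)."""
    a = R_E + altitude_km * 1000
    n = math.sqrt(GM / a**3)                      # mean motion, rad/s
    omega_dot = 2 * math.pi / (365.2422 * 86400)  # required precession rate, rad/s
    cos_i = -omega_dot / (1.5 * J2 * (R_E / a) ** 2 * n)
    return math.degrees(math.acos(cos_i))

print(f"{sun_sync_inclination_deg(800):.1f} deg")  # ≈ 98.6 deg for an 800 km orbit
```

The negative cosine confirms the orbit must be slightly retrograde (inclination above 90°), in line with the 98°-99° figure in the text.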

 

Geostationary orbit: has a 0° (zero degree) inclination angle, i.e. the satellite is placed above the equator at an altitude of about 36,000 km. The orbital period equals the rotational period of the earth, which results in a fixed position of the satellite relative to the earth (i.e. the satellite observes and collects information continuously over a specific area and always views the same portion of the earth). Due to their high altitude, geostationary weather satellites monitor weather and cloud patterns covering an entire hemisphere of the earth. Geostationary orbits are commonly used for meteorological and telecommunication satellites.
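The 36,000 km figure follows from requiring the orbital period to match one sidereal day; a quick back-of-envelope check via Kepler's third law, assuming standard Earth constants:

```python
import math

# Physical constants (standard values; a rough check, not mission software)
GM_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_378_137.0       # Earth's equatorial radius, m
T_SIDEREAL = 86_164.1       # one sidereal day, s (Earth's rotation period)

# Kepler's third law: T^2 = 4*pi^2 * a^3 / GM  ->  a = (GM * T^2 / (4*pi^2))^(1/3)
a = (GM_EARTH * T_SIDEREAL**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (a - R_EARTH) / 1000.0

print(f"Geostationary altitude ≈ {altitude_km:,.0f} km")  # ≈ 35,786 km
```

The exact value, about 35,786 km above the equator, is conventionally rounded to 36,000 km in textbooks.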

 

Summary

 

• Ground-based platforms: ground, vehicles, ladders and/or buildings: up to 50 m

 

• Airborne platforms: airplanes, helicopters, high-altitude aircraft, balloons: up to 50 km

 

• Space-borne platforms: rockets, satellites, shuttles: from about 100 km to 36,000 km

 

Space shuttle: 250 to 300 km

 

Space station: 300 to 400 km

 

Low-level satellites: 700 to 1500 km

 

High-level satellites: about 36,000 km

 

 

Self-assessment exercise:

 

1. What advantages do sensors carried onboard satellites have over those carried on aircraft and building tops? Are there any disadvantages that you can think of for space-borne platforms?

 

2. Can “remote sensing” employ anything other than electromagnetic radiation?

 

1.4 Classification of Sensors

 

Sensors are the devices used to record the electromagnetic radiation emitted or reflected from target features and to acquire the images used in a variety of remote sensing applications. In remote sensing, sensors are capable of acquiring information about a target feature that the human eye cannot recognise, especially radiation in parts of the electromagnetic spectrum other than the visible portion. Sensors are broadly classified into three categories: optical sensors, microwave sensors and thermal sensors.

 

1.4.1  Optical sensors

 

Optical sensors utilise energy from the sun, which is the source of illumination, by recording the energy reflected or emitted from the target feature. Optical sensors record reflected or emitted energy in the visible, near-infrared and short-wave infrared regions of the electromagnetic spectrum. The amount of energy recorded by the sensor depends upon the spectral reflectance characteristics of the target features, since different materials reflect and absorb differently at different wavelengths. Optical sensors are sensitive mainly to electromagnetic radiation in the range of 0.4 μm to 0.76 μm (visible band) as well as 0.76 μm to 0.9 μm (near-infrared band), the latter being wavelengths to which the human eye is insensitive. Images acquired from optical sensors find use in a variety of applications such as post-earthquake damage assessment, landslide damage assessment, oil spill mapping, vegetation monitoring, flood assessment and relief measures, land use and land cover classification, temporal change detection analysis and many more. Optical remote sensing systems are classified into the following types, depending on the number of spectral bands used in the imaging process.

 

Panchromatic imaging system (PAN): The imaging sensor is a single-channel detector sensitive to a wide range of wavelengths of light, typically covering the entire visible band of the spectrum or a large portion of it, thus resulting in a greyscale image (i.e. an image containing different shades of black and white). The PAN sensor records or measures reflectance in terms of the apparent brightness of the targets. The spectral information or “colour” of the targets is completely absent from the resulting image (Fig. 7). Examples of panchromatic imaging systems are:

  • IKONOS PAN
  • SPOT PAN
  • IRS PAN
  • Quickbird PAN

          Fig. 7 Panchromatic image extracted from SPOT PAN sensor with ground resolution 10 m.

 

(Source:http://www.crisp.nus.edu.sg/~research/tutorial/opt_int.htm).

 

Multispectral imaging system: the Multispectral Scanner (MSS) is one of the important Earth observing sensors introduced in the Landsat series of satellites that uses an oscillating mirror to continuously scan the earth surface perpendicular to the spacecraft velocity. Every mirror sweep scans six lines simultaneously in each of the four spectral bands and images of the objects are recorded across the field of view. The resulting image is a multilayer image which contains both the brightness and spectral (colour) information of the targets being recorded. Examples of multispectral systems are:

 

  • LANDSAT MSS
  • IRS LISS
  • SPOT XS
  • IKONOS MS

    The Multi-Spectral Scanners are further divided into the following two types:

 

(i) Whiskbroom Scanners

(ii) Pushbroom Scanners

 

(i) Whiskbroom Scanners: The whiskbroom scanner is an optical-mechanical device, also known as an across-track scanner. These scanners use a rotating mirror and a single detector to scan the scene along a long and narrow band. The orientation of the mirror is such that, on completing one rotation, the detector scans across a field of view of between 90° and 120° to obtain images in narrow spectral bands ranging from the visible to the middle infrared regions of the spectrum. The angle subtended by the mirror in one complete scan is known as the Total Field of View (TFOV) of the scanner, whereas the solid angle subtended from a detector to the area on the ground it measures at any instant is termed the Instantaneous Field of View (IFOV); refer to Fig. 8. Figure 9(a) depicts the scanning mechanism of whiskbroom scanners.

 

Fig. 8   Instantaneous Field of View (IFOV) for a typical scanner.
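The IFOV fixes the ground resolution of such a scanner: under the small-angle approximation, the diameter of the ground cell seen by the detector is roughly the platform altitude multiplied by the IFOV in radians. A minimal sketch with illustrative, roughly Landsat-MSS-like numbers (the exact figures are assumptions for the example):

```python
def ground_ifov_size(altitude_m: float, ifov_rad: float) -> float:
    """Diameter of the ground cell seen by one detector (small-angle approximation)."""
    return altitude_m * ifov_rad

# Illustrative values: ~0.086 mrad IFOV viewed from ~919 km altitude
cell = ground_ifov_size(919_000, 0.086e-3)
print(f"Ground cell ≈ {cell:.0f} m")  # ≈ 79 m
```

The same relation explains why a lower platform or a narrower IFOV yields finer spatial resolution.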

 

(ii) Pushbroom Scanners: also termed along-track scanners, these do not have a mirror looking off at varying angles. Instead, there is a line of small sensitive detectors stacked side by side, each of tiny dimension on its plate surface; these may number several thousand, and each detector is a charge-coupled device (CCD). In other words, these scanners consist of a number of detectors equal to the swath of the sensor divided by the spatial resolution or pixel size (Fig. 9 b). For example, the swath of the High Resolution Visible (HRV-1) sensor of the French remote sensing satellite SPOT is 60 km and the spatial resolution is 20 metres. Dividing 60 km × 1000 metres/km by 20 metres gives 3000, the number of detectors deployed in the SPOT HRV-1 sensor. The detectors are placed linearly in an array, and each detector collects the energy reflected by its ground cell (pixel).
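The detector-count arithmetic above can be sketched directly. The 60 km / 20 m figures are the SPOT example from the text; the 10 m case assumes SPOT's panchromatic mode for comparison:

```python
def detectors_needed(swath_km: float, pixel_size_m: float) -> int:
    """Number of CCD detectors in a pushbroom line array: swath width / pixel size."""
    return int(swath_km * 1000 / pixel_size_m)

# SPOT HRV multispectral mode: 60 km swath at 20 m resolution
print(detectors_needed(60, 20))   # 3000 detectors
# The same swath at 10 m (panchromatic mode) needs twice as many
print(detectors_needed(60, 10))   # 6000 detectors
```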

 

Hyperspectral imaging system: Hyperspectral sensors, also referred to as imaging spectrometers, acquire images in a large number of very narrow, contiguous spectral bands (typically 200 or more bands of data) throughout the visible, near-IR, mid-IR, and thermal-IR regions of the electromagnetic spectrum (Fig. 10). Hyperspectral sensors employ either across-track or along-track scanning, and the images acquired are used for the characterisation and interpretation of surface features with high accuracy and detail on account of their spectral richness. Every individual pixel in a hyperspectral image contains a continuous reflectance spectrum across hundreds of spectral bands. Landsat TM records an integrated response from a data point in 7 spectral bands, each approximately 0.26 µm wide, whereas a hyperspectral sensor records the spectral response of the same point in a large number of bands on the order of 0.01 µm wide. Owing to the large number of narrow bands in a hyperspectral image, expensive ground surveys and laboratory testing have been widely replaced, and hyperspectral data of very fine spectral resolution therefore finds use in studies related to mineral exploration, the water requirements of plants, vegetation health monitoring, soil type classification, etc.

Fig. 10 Images acquired simultaneously in 200 or more spectral bands and the wavelength range depicted by the pixels; Source: Basics of remote sensing, Unit I; ESA website.

 

Hyperspectral imaging sensors produce complete coverage of the spectral signatures of features with minimal wavelength omissions. Ground-based or hand-held versions of these sensors also exist and are used mainly for accuracy assessment and for the calibration of satellite data.

 

1.4.2 Microwave sensors

 

The microwave region of the electromagnetic spectrum extends from wavelengths of approximately 1 cm to 1 m. The longer microwave wavelengths have the special characteristic of penetrating cloud cover, heavy precipitation, haze and dust, and are not susceptible to the atmospheric scattering to which the shorter optical wavelengths are prone. This property allows microwave remote sensing in almost all weather and environmental conditions and at any time of day. In addition, microwave remote sensing also provides useful information on sea wind and wave direction, derived from frequency characteristics, the Doppler effect, polarization, backscattering etc., that may not be obtainable from visible and infrared sensors.

    The sensors operating in the microwave region are broadly classified as active and passive sensors.

 

A passive sensor records natural radiation at a particular frequency or range of frequencies. A passive microwave sensor records the intensity of microwave radiation in the frequency range of 5 to 100 GHz emanating from the surface of the earth within the antenna’s field of view. The passive sensors that measure this emitted energy are microwave radiometers. The signal recorded through the antenna is represented as an equivalent temperature, in terms of a black body source that would generate an equal amount of energy within the bandwidth of the system. The microwave energy recorded by a passive sensor is quite low compared to optical wavelengths because the wavelengths are so long, and the field of view of the antenna is therefore kept large to record sufficient energy. Most passive microwave sensors are consequently characterized by low spatial resolution. The important applications of passive microwave remote sensing include hydrology, oceanography and meteorology. In meteorology, passive microwave sensors measure the water and ozone content of the atmosphere. Soil moisture studies measure surface soil emissions influenced by the moisture content within the soil medium. Oceanographers utilise passive microwave sensors to monitor ocean currents and waves, sea surface levels, surface wind currents and oil slicks.

 

                                                    Fig. 11 Microwave energy recorded by a passive sensor

 

Figure 11 illustrates the working principle of passive microwave sensors which records the energy emitted by the atmosphere (A), reflected from the terrain features (B), emitted from the surface features (C), or transmitted from the subsurface (D).

 

Active remote sensors emit their own electromagnetic energy in the microwave region toward the surface feature; after interaction with the target this produces a backscatter of energy, which is finally recorded by the active sensor’s receiver (Fig. 12). Active microwave sensors are further categorised into imaging and non-imaging types. The most common form of imaging active microwave sensor is RADAR, an acronym for RAdio Detection And Ranging, which transmits long-wavelength microwave radiation from 3 cm to 25 cm and records the energy reflected from the target feature in the form of backscatter. Altimeters and scatterometers come under the non-imaging microwave sensor category. LIDAR, an acronym for LIght Detection And Ranging, is based on the emission and transmission of relatively short-wavelength laser pulses through the atmosphere and the recording of the backscattered radiation. SONAR (SOund Navigation And Ranging) is based on the emission and transmission of sound waves through a water medium and the recording of the backscattered radiation from the bottom of the water medium.

 

                                                                     Fig. 12 Active microwave sensor.

 

 

Radar, which is basically a ranging device (Fig. 13), is composed of a transmitter, a receiver, an antenna, and an electronics system to generate and process the recorded data. The function of the transmitter is to emit successive short pulses of microwave radiation (I) at regular intervals, which are converged into a beam (II) by the antenna. The microwave radiation emitted by the radar illuminates the surface features obliquely, at a right angle to the direction of the platform’s motion. The antenna receives and records the reflected, or backscattered, energy from the target features illuminated by the radar beam (III). By measuring the time delay between the emitted radiation and the backscattered radiation from target features, their distance from the radar, and hence their location, is calculated. The forward motion of the platform and the simultaneous recording of the backscattered signal generate a two-dimensional image of the surface (Fig. 14).
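The time-delay ranging described above is the classic two-way travel-time relation R = c·t/2 (the pulse travels to the target and back, so the one-way distance is half the round trip). A minimal sketch; the 5 ms delay is a hypothetical value, not taken from any particular system:

```python
C = 299_792_458.0  # speed of light, m/s

def slant_range(delay_s: float) -> float:
    """Target distance from the two-way travel time of a radar pulse: R = c * t / 2."""
    return C * delay_s / 2

# A pulse echo received 5 ms after transmission (hypothetical timing)
print(f"{slant_range(5e-3) / 1000:.0f} km")  # ≈ 749 km
```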

 

Fig. 13 Principle of the operation of the Radar sensor.

 

            Fig. 14 ERS-2 C-band image of mountain area.

 

The microwave region of the spectrum (Fig. 15) is quite large, in contrast to the visible and infrared, and broad classification of several wavelength ranges or bands commonly used are mentioned below.

 

X-band: used extensively for military reconnaissance and terrain mapping.

 

C-band: sensors onboard NASA airborne systems and spaceborne systems including Envisat, ERS-1 and -2, and RADARSAT. Mainly used for surface soil moisture mapping.

 

S-band: sensor onboard the Russian ALMAZ satellite and used for biomass modelling.

 

L-band: sensor onboard American SEASAT and Japanese JERS-1 satellites and used for subsurface explorations.

 

P-band: sensor onboard NASA airborne research systems used for archeological explorations.

Fig. 15  The microwave region of the electromagnetic spectrum; source: Fundamentals of Remote Sensing by CCRS, Canada.

 

 

1.4.3  Thermal sensors

 

All objects having a temperature above absolute zero (0 Kelvin) emit electromagnetic energy in the wavelength range between 3 μm and 100 μm. All objects selectively absorb short-wavelength solar energy and radiate thermal infrared energy. In a thermal image, the tone of an object is a function of its surface temperature and its emissivity, i.e. all objects emit infrared radiation and the amount of emitted radiation is a function of surface temperature. Hot objects appear in lighter tones and cooler objects appear darker in an infrared image. In other words, the energy recorded by the radiometer is proportional to the product of the absolute physical temperature (T) and the emissivity (ε); this product is referred to as the brightness temperature. All natural surface features emit radiation so as to maintain thermal equilibrium; this radiation is measured by a radiometer and represented in terms of a black body.

 

The concept of a perfect black body refers to an ideal material that completely absorbs all incident radiation, converting it to internal energy that gives rise to a characteristic temperature profile. The radiant temperature of any object depends on two major factors: its kinetic temperature and its emissivity. Infrared sensors detect remote objects by recording the emitted infrared energy, traditionally as a continuous-tone image on thermally sensitive photographic film.
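The dependence of radiant temperature on kinetic temperature and emissivity can be made concrete with the Stefan-Boltzmann law: a grey body's radiant exitance εσT⁴ equals that of a blackbody at the radiant temperature T_rad, giving T_rad = ε^(1/4)·T. A minimal sketch, where the emissivity values are typical illustrative figures and not from the text:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4 (not needed below,
                  # since sigma cancels when equating exitances)

def radiant_temperature(kinetic_temp_k, emissivity):
    """Radiant temperature: eps * sigma * T^4 = sigma * Trad^4  =>  Trad = eps**0.25 * T."""
    return emissivity ** 0.25 * kinetic_temp_k

# Two surfaces at the same 300 K kinetic temperature but different emissivities
# (illustrative values: water ~0.99, dry soil ~0.92) appear at different
# radiant temperatures to a thermal sensor.
print(radiant_temperature(300.0, 0.99))  # just below 300 K
print(radiant_temperature(300.0, 0.92))  # noticeably cooler-looking
```

This is why two materials at identical kinetic temperatures can show different tones in a thermal image.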

 

Emissivity is the ratio of the radiance spectrum of a non-perfect emitter to that of a perfect emitter (black body) at the same temperature, and is a measure of the ability of a material to radiate and to absorb incident radiation. Radiant energy striking the surface of a material is partly reflected, partly absorbed and partly transmitted. A black body absorbs all radiant energy striking it, so its absorptivity equals 1. The spectral curves in Fig. 16 illustrate the underlying principle of thermal infrared remote sensing, showing relative intensities of radiation (radiances) as a function of wavelength for materials with different intrinsic temperatures.

Fig. 16 Spectral curves showing relative intensities of radiation (radiances) as a function of wavelength for materials at different temperatures.

 

As illustrated in the above figure, all the curves have similar shapes, with hotter radiating objects showing higher intensities of emittance. Also, the peaks of the curves shift toward shorter wavelengths (to the left) with increasing kinetic temperature of the radiating object, in accordance with Wien's displacement law. Strictly, the curves in the figure represent blackbodies at different temperatures; natural materials are referred to as grey bodies, and their radiant temperatures fall below those of perfect blackbodies at the same kinetic temperature. Objects also emit radiation at longer wavelengths (the right-hand portion of the curves and beyond) extending into the microwave region. These emissions are generally quite low in intensity but are not much attenuated by the atmosphere. The temperatures measured by such instruments are brightness temperatures (Tb), also referred to as radio-brightness, characterised by the product of the object's emissivity and its physical temperature (in kelvin).
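Wien's displacement law gives the peak emission wavelength as λ_max = 2898/T μm (with T in kelvin), which is why hotter objects peak at shorter wavelengths. A quick sketch with illustrative temperatures:

```python
WIEN_B = 2898.0  # Wien's displacement constant, um*K

def peak_wavelength_um(temp_k):
    """Wavelength of peak spectral emittance (Wien's displacement law)."""
    return WIEN_B / temp_k

# Hotter sources peak at shorter wavelengths: the Sun peaks in the visible,
# a fire in the shortwave infrared, and the ambient Earth surface around 10 um.
for label, temp in [("Sun", 6000.0), ("forest fire", 1000.0), ("Earth surface", 300.0)]:
    print(f"{label} ({temp:.0f} K): peak at {peak_wavelength_um(temp):.2f} um")
```

The ~9.7 μm peak for an ambient surface is why thermal sensors are commonly designed around the 8-14 μm atmospheric window.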

 

 

Applications of thermal infrared remote sensing can be broadly classified into two categories: those in which surface temperature is governed by man-made sources of heat, and those in which it is governed by solar radiation. In the former case, the technique has been used from airborne platforms to determine heat losses from buildings and other engineering structures. In the latter case, thermal infrared remote sensing has been used for identifying crop types and detecting crop diseases, mapping surface soil moisture, monitoring forest fires, and supporting military operations.

 

1.5 Satellite Characteristics: Swaths and Orbits

 

The path followed by a satellite is referred to as its orbit. Satellites are launched into their desired orbits based on the capabilities of the onboard sensors as well as the objectives of the launch mission. During the early 1960s, satellites were launched primarily to monitor the Earth and its environment, and sensors were designed mainly to acquire data for meteorological purposes. However, with the launch of the first satellite of the US Landsat series in July 1972, the era of Earth resources satellites began, with the primary objective of mapping surface features. Currently, more than a dozen orbiting satellites provide a variety of data types that play a significant role in enhancing our knowledge of the Earth's atmosphere, oceans, glaciers and land.

 

As discussed in the earlier section, remote sensing sensors can be installed on a variety of platforms to acquire data about target features. Although images from ground-based and aircraft platforms are widely used, satellite-based sensors provide images with unique characteristics that make them particularly useful for remote sensing of the Earth's surface.

 

Selection of a satellite orbit takes into account the altitude (height above the Earth's surface) and the sensor's orientation and movement relative to the Earth. Satellites at very high altitudes that view the same portion of the Earth's surface at all times have geostationary orbits (Fig. 17a). These geostationary satellites, at altitudes of approximately 36,000 kilometres, revolve at speeds that match the rotation of the Earth, so they appear stationary with respect to a specific portion of the Earth's surface. This allows the satellites to observe and collect information continuously over specific areas. Weather and communications satellites commonly have geostationary orbits. Due to their high altitude, some geostationary weather satellites can monitor weather and cloud patterns covering an entire hemisphere of the Earth.
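The ~36,000 km geostationary altitude follows directly from Kepler's third law: the radius of a circular orbit whose period equals one sidereal day is a = (μT²/4π²)^(1/3). A minimal sketch:

```python
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH_KM = 6378.0        # Earth's equatorial radius, km
SIDEREAL_DAY_S = 86164.1   # one sidereal day, seconds

def orbit_radius_km(period_s):
    """Radius of a circular orbit with the given period (Kepler's third law)."""
    return (MU_EARTH * period_s**2 / (4 * math.pi**2)) ** (1 / 3) / 1000.0

# A satellite must orbit once per sidereal day (not the 86,400 s solar day)
# to stay fixed over one spot on the rotating Earth.
altitude_km = orbit_radius_km(SIDEREAL_DAY_S) - R_EARTH_KM
print(f"Geostationary altitude: {altitude_km:.0f} km")
```

The result, roughly 35,800 km, matches the "approximately 36,000 kilometres" quoted above.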

 

                             Fig. 17 (a) Geostationary satellite; (b) Polar orbiting satellite

     Many remote sensing platforms are designed to follow a north-south orbit which, in conjunction with the Earth's west-to-east rotation, allows them to cover most of the Earth's surface over a certain period of time. These near-polar orbits are inclined relative to a line running between the North and South Poles. Many are also sun-synchronous (Fig. 17b): they cover each area of the world at roughly the same local time each day, referred to as the local sun time. Sun-synchronous satellites are launched into altitudes of 700 to 800 km. At any location, the position of the sun in the sky is consistent at the time of satellite overpass, ensuring similar illumination conditions when images are acquired in a specific season over successive years.

 

A typical sun-synchronous satellite completes 14 orbits a day, and each successive orbit is shifted over the Earth's surface by around 2875 km at the equator. The satellite's path also shifts westward in longitude by 1.17° (approximately 130.54 km at the equator) every day, as shown in Fig. 18.
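The quoted per-orbit shift can be checked with a rough estimate: with 14 orbits a day, the Earth rotates through 1/14 of its equatorial circumference beneath the satellite during each orbit. A simple sketch (this ignores orbital precession, so it lands slightly below the quoted 2875 km):

```python
EQUATOR_KM = 40075.0  # Earth's equatorial circumference, km

def shift_per_orbit_km(orbits_per_day):
    """Westward shift of the ground track per orbit due to Earth's rotation."""
    return EQUATOR_KM / orbits_per_day

# With 14 orbits per day, successive ground tracks are ~2860 km apart
# at the equator, close to the ~2875 km figure quoted in the text.
print(f"{shift_per_orbit_km(14):.0f} km per orbit")
```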

Fig. 18 Orbital shift of a typical sun-synchronous satellite

 

 

Landsat and IRS satellites are typical examples of sun-synchronous, near-polar satellites. Fig. 19 shows the orbits of the Landsat satellites (1, 2 and 3) on each successive pass and on successive days. The repeat cycle of these satellites was 18 days, with 14 orbits completed each day.
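These nominal figures imply the spacing of adjacent ground tracks at the equator: 18 days × 14 orbits per day gives 252 tracks around the Earth. A quick sketch using the figures from the text:

```python
EQUATOR_KM = 40075.0  # Earth's equatorial circumference, km

repeat_days = 18      # Landsat 1-3 repeat cycle (from the text)
orbits_per_day = 14   # nominal orbits per day (from the text)

total_tracks = repeat_days * orbits_per_day       # ground tracks in one cycle
track_spacing = EQUATOR_KM / total_tracks         # spacing between adjacent tracks
print(f"{total_tracks} tracks, ~{track_spacing:.0f} km apart at the equator")
```

The resulting spacing of roughly 159 km is narrower than the Landsat MSS swath of about 185 km, so images from adjacent tracks overlapped slightly at the equator.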

 

                                                                 Fig. 19 Orbit of a sun-synchronous satellite.

 

 

While revolving around the Earth, the satellite scans a certain portion of the land surface; the width of the area scanned during a single pass is called the swath (Fig. 20). Swath widths for spaceborne sensors generally vary from tens to hundreds of kilometres. For example, the swath width of the IRS-1C LISS-III sensor is 141 km in the visible bands and 148 km in the shortwave infrared band.

                                                           Fig. 20 Swath traced by a polar orbiting satellite; source: Canada Centre for Remote Sensing.

 

 

1.6  Weather and Communication Satellites

 

Any communication that makes use of a man-made satellite in its propagation path is referred to as satellite communication, and it plays a dominant role in technological advancement for the betterment of mankind. Numerous artificial satellites are in operation for traditional point-to-point communications, mobile applications, and TV and radio broadcasting. Satellite communications use high-frequency signals in the Ultra High Frequency (UHF) band, ranging from 300 MHz to 3 GHz, and the Super High Frequency (SHF) band, ranging from 3 GHz to 30 GHz.

 

Refer to http://www.britannica.com/EBchecked/topic/524891/satellite-communication and http://en.wikipedia.org/wiki/Communications_satellite for a brief history and details of the orbits used for satellite communications.

 

Satellites used primarily to monitor the weather and climate of the Earth are referred to as weather satellites. These satellites can be polar orbiting, scanning the entire globe in successive passes, or geostationary, stationed over the same spot on the equator. Several geostationary meteorological satellites are in operation: GOES-12, GOES-13 and GOES-15 of the United States; Elektro-L 1, Russia's new-generation weather satellite operating at 76°E over the Indian Ocean; the Japanese MTSAT-2 located over the mid-Pacific at 145°E and Himawari 8 at 140°E; the European Meteosat-8 (3.5°W) and Meteosat-9 (0°) over the Atlantic Ocean and Meteosat-6 (63°E) and Meteosat-7 (57.5°E) over the Indian Ocean; India's INSAT series; and the Chinese Fengyun geostationary satellites.

 

The INSAT series of satellites carries the Very High Resolution Radiometer (VHRR), which provides data for generating cloud motion vectors, cloud temperature and water vapour content, used in modelling and forecasting rainfall and the movement of thunderstorms and cyclones. These satellites also carry Data Relay Transponders (DRT) to facilitate the reception and dissemination of meteorological data from in-situ instruments located in inaccessible areas. ISRO has also designed and developed ground-based observation systems, such as the Automatic Weather Station (AWS), the Agro-meteorological (AGROMET) Tower and the Doppler Weather Radar (DWR), as well as vertical atmospheric observation systems such as the GPS Sonde and the Boundary Layer LIDAR (BLL), to augment space-based observations and validate events directly associated with various natural phenomena.

 

Summary note:

Images shown on weather forecast news are products acquired from geostationary satellites because of their broad coverage of weather and cloud patterns on continental scales. These images are useful for determining the movement of weather patterns over time. The repeat coverage capability of geostationary satellites allows several images to be collected each day to monitor wind and cloud patterns closely.

 

 


 

References

  1. Campbell, J.B. 1996. Introduction to Remote Sensing. Taylor & Francis, London.
  2. Colwell, R.N. (Ed.) 1983. Manual of Remote Sensing. Second Edition. Vol. I: Theory, Instruments and Techniques. American Society of Photogrammetry and Remote Sensing (ASPRS), Falls Church.
  3. Curran, P.J. 1985. Principles of Remote Sensing. Longman Group Limited, London.
  4. Elachi, C. 1987. Introduction to the Physics and Techniques of Remote Sensing. Wiley Series in Remote Sensing, New York.
  5. http://www.ccrs.nrcan.gc.ca/ccrs/learn/tutorials/fundam/chapter1/chapter1_1_e.html
  6. Joseph, G. 1996. Imaging Sensors. Remote Sensing Reviews, 13: 257-342.
  7. Lillesand, T.M. and Kiefer, R.W. 1993. Remote Sensing and Image Interpretation. Third Edition. John Wiley, New York.
  8. Manual of Remote Sensing. Third Edition. American Society of Photogrammetry and Remote Sensing.
  9. Sabins, F.F. 1997. Remote Sensing: Principles and Interpretation. W.H. Freeman, New York.
  10. https://en.wikipedia.org/wiki/Main_Page
  11. http://www.isro.gov.in/applications/weather-forecasting
  12. http://www.hyspex.no/hyperspectral_imaging/
  13. http://satellites.spacesim.org/english/function/weather/index.html
  14. http://www.uprm.edu/biology/profs/chinea/gis/g06/NRC2_1_2_9.pdf
  15. https://en.wikipedia.org/wiki/Weather_satellite