Most UAS platforms come equipped with electro-optical (EO) or thermal infrared (IR) sensors that support the acquisition of Full Motion Video (FMV). Although the raw image data derived from EO or thermal FMV can meet some of DOI’s remote sensing data needs, meeting the full range of needs requires access to a much wider variety of light-weight, UAS-mountable sensors, both passive and active.
A passive sensor, or optical sensor, requires solar illumination or emitted energy to operate, which means that its use can be limited by cloud cover and available daylight. Optical sensors use filters, prisms, or diffraction gratings to measure incident radiation in various spectral ranges, making each type of optical sensor sensitive to a specific part of the electromagnetic spectrum. EO, thermal IR, and multispectral are all types of passive sensors. Active sensors, by contrast, such as light detection and ranging (LiDAR) and radar, provide their own illumination in the form of pulses of energy, which allows them to operate in cloudy conditions or at night.
EO sensors, also known as natural color or RGB cameras, are passive sensors that operate in the visible light range of the electromagnetic spectrum, i.e., the wavelengths that the human eye can detect. EO cameras are available in a wide range of light-weight, inexpensive FMV or single-lens reflex models that can be easily mounted on a sUAS platform. Imagery from these cameras, combined with ground control points (GCPs), can be processed using structure from motion (SfM) photogrammetry software to produce high spatial resolution orthophotos with ground sample distances of less than 5 cm, as well as 3D point clouds and digital elevation models.
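The sub-5 cm ground sample distance (GSD) figure above follows directly from camera geometry. A minimal sketch of the standard nadir GSD calculation, using illustrative (not vendor-specific) sensor values:

```python
# Sketch: estimating ground sample distance (GSD) for a nadir-pointing EO camera.
# GSD = (flight altitude * pixel pitch) / focal length. The altitude, pixel
# pitch, and focal length below are illustrative assumptions, not values from
# a specific camera datasheet.

def ground_sample_distance(altitude_m: float, pixel_pitch_um: float,
                           focal_length_mm: float) -> float:
    """Return GSD in centimeters per pixel for a nadir image."""
    pixel_pitch_m = pixel_pitch_um * 1e-6      # micrometers -> meters
    focal_length_m = focal_length_mm * 1e-3    # millimeters -> meters
    return altitude_m * pixel_pitch_m / focal_length_m * 100.0  # meters -> cm

# Example: 120 m above ground, 3.3 um pixel pitch, 8.8 mm lens
gsd = ground_sample_distance(120.0, 3.3, 8.8)  # ~4.5 cm/pixel
```

At a typical 120 m flight altitude, this hypothetical camera yields roughly 4.5 cm per pixel, consistent with the sub-5 cm figure cited above; flying lower or using a longer lens shrinks the GSD further.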
Thermal IR sensors (heat sensors) are passive sensors that detect radiation in the IR part of the spectrum and are used to measure temperature differences. By flying thermal sensors on UAS, non-contact temperature measurements of surfaces can be gathered as photographs and used to generate absolute and relative temperature orthophotos. Absolute temperature orthophotos are generated from raw 16-bit radiometrically calibrated thermal imagery in which each pixel location has an associated absolute surface temperature. Relative temperature orthophotos, which can be used when absolute temperature is not required, can be generated from histogram-stretched JPGs rendered with various color palettes such as WhiteHot and BlackHot.
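As a sketch of how raw 16-bit radiometric pixels map to absolute temperature, the example below assumes the common vendor convention of storing values in centikelvin (kelvin × 100); the actual scale and offset vary by sensor, so check the manufacturer's documentation:

```python
import numpy as np

# Sketch: converting raw 16-bit radiometric thermal pixels to degrees Celsius.
# ASSUMPTION: pixel values are stored in centikelvin (kelvin * 100), a common
# but not universal convention; real sensors may use a different scale/offset.

def raw_to_celsius(raw: np.ndarray) -> np.ndarray:
    kelvin = raw.astype(np.float64) / 100.0   # centikelvin -> kelvin
    return kelvin - 273.15                    # kelvin -> degrees Celsius

# Two illustrative pixels: 29815 -> 25.0 C, 30315 -> 30.0 C
frame = np.array([[29815, 30315]], dtype=np.uint16)
temps = raw_to_celsius(frame)
```

Histogram-stretched JPGs skip this step entirely: their 8-bit values encode only relative contrast for a chosen palette, which is why they cannot be used where absolute temperature is required.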
Multispectral cameras, another type of passive sensor, collect image data in multiple discrete bands, usually fewer than 12, targeting distinct regions of the electromagnetic spectrum. Most commercially available UAS-mountable multispectral cameras collect images in the visible and near infrared (VNIR) wavelengths (400–1000 nm). Image data collected in this range is commonly used for land surface classification and vegetation monitoring. One of the most popular uses for VNIR imagery is calculation of the normalized difference vegetation index (NDVI). NDVI compares the surface reflectance in the red band to that in the near infrared (NIR) band using the following equation: NDVI = (NIR – Red) / (NIR + Red). NDVI values correlate strongly with the health of the plants imaged, and if radiometric calibration is performed correctly, NDVI maps can be compared across different times of day, months, and years.
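The NDVI equation above translates directly into a per-pixel raster operation. A minimal sketch, assuming co-registered red and NIR surface-reflectance arrays scaled to [0, 1]:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red), straight from the equation above.
# ASSUMPTION: the red and NIR bands are co-registered, radiometrically
# calibrated surface-reflectance arrays with values in [0, 1].

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    num = nir - red
    den = nir + red
    # Guard against division by zero where both bands are zero.
    return np.divide(num, den,
                     out=np.zeros_like(num, dtype=np.float64),
                     where=den != 0)

nir = np.array([0.60, 0.40, 0.30])  # illustrative pixel values
red = np.array([0.10, 0.30, 0.30])
values = ndvi(nir, red)
```

By construction the index is bounded to [-1, 1]; dense healthy vegetation (high NIR, low red reflectance) pushes values toward +1, while bare soil and senescent vegetation fall near zero.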
LiDAR (Light Detection and Ranging) is an active sensor used to obtain highly accurate and precise three-dimensional (3D) measurements of surface locations in the form of point clouds, i.e., collections of thousands of points with associated locations in x,y,z space. Any LiDAR sensor is composed of three main parts: 1) a Global Navigation Satellite System (GNSS) receiver for determining location, 2) a laser scanner for sending and receiving signals, and 3) an inertial navigation system (INS) for measuring the pitch, roll, and yaw of the system. LiDAR receivers measure the time it takes for a laser pulse to leave and return to the system after being reflected by the surface, and a single laser pulse may be reflected multiple times depending on the complexity of the surface (e.g., if there is vegetation or there are structures). From this time measurement, distance is calculated using the known speed of light. LiDAR point clouds provide critical information about the surface of the Earth such as light intensity/reflectivity, canopy or structural heights, and bare ground elevation. LiDAR intensity values differ primarily due to changes in surface composition and partly due to incidence angle, which makes them useful for surface classification and feature extraction. LiDAR point clouds allow for the calculation of average canopy height and vegetation density, and since LiDAR sensors record multiple surface returns, digital terrain models (DTMs) can be generated by removing the elevation signals of features such as vegetation and buildings, leaving only the elevation of the terrain (bare earth).
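The ranging step described above reduces to a one-line time-of-flight calculation. A minimal sketch, with an illustrative round-trip time chosen for the example:

```python
# Sketch of the LiDAR ranging calculation described above: one-way distance
# from the measured round-trip travel time of a laser pulse.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def range_from_time(round_trip_s: float) -> float:
    """One-way distance in meters; the pulse travels out and back, so divide by 2."""
    return C * round_trip_s / 2.0

# Illustrative example: a return arriving 800 ns after emission
# corresponds to a target roughly 120 m from the scanner.
d = range_from_time(800e-9)
```

When a single pulse produces multiple returns (e.g., canopy top, mid-canopy, ground), each later arrival time yields a greater range, which is what allows bare-earth DTMs to be separated from vegetation.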
LiDAR sensors mounted on a low-altitude UAS can collect surface elevations for ground positions sampled at regularly spaced horizontal intervals at very high ground resolution (cm scale). These measurements can then be used to generate very high-resolution point clouds, DEMs, DTMs, and digital surface models (DSMs), a type of DEM that contains the elevations of natural terrain features as well as vegetation and cultural features such as buildings and roads.
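Because the DSM records first-surface elevations and the DTM records bare earth, differencing the two grids yields above-ground feature heights (a canopy height model). A minimal sketch with illustrative elevation rasters on a shared grid:

```python
import numpy as np

# Sketch: deriving a canopy height model (normalized DSM) by differencing
# the DSM and DTM grids described above. The 2x2 elevation values are
# illustrative, in meters, on the same grid.

dsm = np.array([[105.0, 112.0],
                [103.0, 101.0]])  # first-surface elevations (tops of features)
dtm = np.array([[100.0, 100.0],
                [101.0, 101.0]])  # bare-earth elevations

chm = dsm - dtm          # per-cell heights above ground
chm[chm < 0] = 0.0       # clamp small negative noise from interpolation
```

Cells where the DSM and DTM coincide (height 0) are bare ground; the remaining values are the heights of vegetation or structures above it.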
Radar is an active sensor that transmits and receives polarized pulses; dual- and quad-polarized systems also record phase, allowing measurement of the full polarization response of an object. Radar systems operate at radio wavelengths (on the order of 10³ m), while LiDAR systems use light at much shorter wavelengths in the visible, near-infrared, and ultraviolet ranges (10⁻⁵ to 10⁻⁸ m).