National Unmanned Aircraft Systems (UAS) Project Office

Sensors

Most UAS platforms come equipped with electro-optical (EO) or thermal infrared (IR) sensors that support the acquisition of Full Motion Video (FMV). Although the raw image data derived from either EO or thermal FMV can meet some of DOI’s remote sensing data needs, meeting the full range of needs requires access to a much wider variety of passive and active light-weight UAS-mountable sensors.

A passive sensor, or optical sensor, requires solar illumination or emitted energy to operate, which means its use can be limited by cloud cover and/or available daylight. Optical sensors use filters, prisms, or diffraction gratings to measure incident radiation in various spectral ranges, making each type of optical sensor sensitive to a specific part of the electromagnetic spectrum. EO, thermal IR, and multispectral are all types of passive sensors. Active sensors, on the other hand, such as light detection and ranging (LiDAR) and radar, provide their own illumination or pulses of energy, which allows them to operate in cloudy conditions or at night.


Electro-Optical (EO)

EO sensors, also known as natural color or RGB cameras, are passive sensors that operate in the visible light range of the electromagnetic spectrum, i.e., the wavelengths that the human eye can detect. EO cameras are available in a wide range of light-weight, inexpensive FMV or single-lens reflex models that can be easily mounted on a sUAS platform. Imagery from these cameras, combined with ground control points (GCPs), can be processed using structure from motion (SfM) photogrammetry software to produce high spatial resolution orthophotos with ground sample distances of less than 5 cm, as well as 3D point clouds and digital elevation models.
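The ground sample distance achievable at a given flight altitude follows directly from camera geometry. The sketch below is a minimal illustration of that relationship; the 18.3 mm focal length and 4.8 µm pixel pitch are assumed example values, not official specifications for any sensor listed here.

```python
# Ground sample distance (GSD) for a nadir-pointing frame camera:
# GSD = (pixel pitch * altitude above ground) / focal length.
# The camera parameters below are illustrative assumptions.

def gsd_cm(altitude_m: float, focal_length_mm: float, pixel_pitch_um: float) -> float:
    """Return GSD in cm/pixel for a camera flown at altitude_m AGL."""
    return (pixel_pitch_um * 1e-6 * altitude_m) / (focal_length_mm * 1e-3) * 100.0

# e.g. an 18.3 mm lens with 4.8 micron pixels flown at 130 m AGL
print(round(gsd_cm(130, 18.3, 4.8), 2))  # → 3.41 cm/pixel
```

Flying lower or using a longer focal length shrinks the GSD, at the cost of ground coverage per image.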

Pentax Ricoh GR
Specifications
Pentax Ricoh GR specifications
Imagery Examples
Orthophoto of Cooper Island at Palmyra Atoll generated from Ricoh GR II imagery taken at 130 m AGL (resolution 2.74 cm/pixel).
Pentax Ricoh GR image over an undeveloped basin in the Stinking Water Gulch near Rangely, Colorado.
Sony A5100 with Voigtlander lens
Specifications
Sony A5100 with Voigtlander lens specifications
Imagery Examples
Natural color image collected using the Sony A5100 with Voigtlander lens.

Thermal

Thermal IR sensors (heat sensors) are passive sensors that detect radiation within the IR part of the spectrum and are used to measure temperature differences. Thermal sensors mounted on UAS can gather non-contact temperature measurements of surfaces in the form of photographs, which are used to generate absolute and relative temperature orthophotos. Absolute temperature orthophotos are generated from raw 16-bit radiometrically calibrated thermal imagery in which each pixel location has an associated absolute surface temperature. Relative temperature orthophotos, which suffice when absolute temperature is not required, can be generated from histogram-stretched JPGs with various color palettes such as WhiteHot and BlackHot.
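To give a concrete sense of how absolute temperatures come out of raw 16-bit radiometric pixels, here is a minimal sketch assuming the centi-Kelvin encoding used by some radiometric thermal TIFFs (raw value = Kelvin × 100); the actual encoding is sensor-specific, so check the camera's documentation before applying this to real data.

```python
# Convert a raw 16-bit radiometric thermal pixel value to degrees Celsius.
# ASSUMPTION: the pixel encodes temperature in centi-Kelvin (K * 100),
# a convention used by some radiometric TIFF outputs. Verify per sensor.

def raw_to_celsius(raw: int) -> float:
    """Centi-Kelvin raw value -> degrees Celsius."""
    return raw / 100.0 - 273.15

# A raw value of 29815 corresponds to 298.15 K, i.e. room temperature:
print(round(raw_to_celsius(29815), 2))  # → 25.0
```

Relative temperature products skip this step entirely: the 8-bit JPGs are already histogram-stretched for display and carry no physical units.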

FLIR Vue Pro R (thermal)
Specifications
FLIR Vue Pro R with 13mm lens specifications
Imagery Examples
Absolute temperature orthomosaic of the Denver Federal Center created with images from the FLIR Vue Pro R.
Relative temperature orthomosaic over willow trees in Arizona (white is hottest). Generated with FLIR Vue Pro R 8-bit JPEG images taken at 200 ft AGL.

Multispectral

Multispectral cameras, another type of passive sensor, collect image data in multiple discrete bands, usually fewer than 12 distinct regions of the electromagnetic spectrum. Most commercially available UAS-mountable multispectral cameras collect images in the visible and near infrared (VNIR) wavelengths (400–1000 nm). Image data collected in this range is commonly used for land surface classification and vegetation monitoring. One of the most popular uses for VNIR imagery is calculation of the normalized difference vegetation index (NDVI), which compares the surface reflectance in the red band to that in the near infrared (NIR) band using the following equation: NDVI = (NIR – Red) / (NIR + Red). NDVI values correlate strongly with the health of the imaged plants, and if radiometric calibration is performed correctly, NDVI maps can be compared across different times of day, months, and years.
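The NDVI equation above can be sketched per pixel in a few lines; the reflectance values in the example are illustrative, not measurements from any sensor described here.

```python
# NDVI for a single pixel, per the equation NDVI = (NIR - Red) / (NIR + Red).
# Inputs are surface reflectance values in [0, 1] from radiometrically
# calibrated red and near-infrared bands.

def ndvi(nir: float, red: float) -> float:
    """Return NDVI in [-1, 1]; 0.0 for no-data pixels where both bands are 0."""
    denom = nir + red
    return 0.0 if denom == 0 else (nir - red) / denom

# Healthy vegetation reflects strongly in NIR and absorbs red:
print(round(ndvi(0.50, 0.08), 3))  # → 0.724
```

Bare soil and water typically yield NDVI near zero or below, which is what makes the index useful for separating vegetation from other cover types.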

MicaSense RedEdge 3 (multispectral)
Specifications
MicaSense RedEdge 3 specifications
Imagery Examples
Images of the exact same area from each of the five MicaSense RedEdge sensors showing relative reflectance of the surface for a specific wavelength band.
False color orthomosaic generated from radiometrically calibrated MicaSense RedEdge imagery taken at Rocky Flats National Wildlife Refuge in Colorado.

LiDAR

LiDAR (Light Detection and Ranging) is an active sensor used to obtain highly accurate and precise three-dimensional (3D) measurements of surface locations in the form of point clouds, i.e., collections of thousands of points with associated locations in x,y,z space. Any LiDAR sensor is composed of three main parts: 1) a Global Navigation Satellite System (GNSS) receiver for determining location, 2) a laser scanner for sending and receiving signals, and 3) an inertial navigation system (INS) for measuring the pitch, roll, and yaw of the system. LiDAR receivers measure the time it takes for a laser pulse to leave and return to the system after being reflected by the surface, and a single laser pulse may be reflected multiple times depending on the complexity of the surface (e.g., if there is vegetation or there are structures). From this time measurement, distance is calculated using the known speed of light. LiDAR point clouds provide critical information about the surface of the Earth such as light intensity/reflectivity, canopy or structural heights, and bare ground elevation. LiDAR intensity values differ primarily due to changes in surface composition and partly due to incidence angle, which makes them useful for surface classification and feature extraction. LiDAR point clouds allow for the calculation of average canopy height and vegetation density, and since LiDAR sensors record multiple surface returns, digital terrain models (DTMs) can be generated by removing the elevation signals of features such as vegetation and buildings, leaving only the elevation of the terrain (bare earth).
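The time-to-distance conversion described above is simple: the pulse travels out and back, so the one-way range is half the round-trip travel time multiplied by the speed of light. A minimal sketch, with an assumed example travel time:

```python
# Range from a LiDAR time-of-flight measurement. The pulse travels to the
# surface and back, so distance = (c * round-trip time) / 2.

C = 299_792_458.0  # speed of light, m/s

def range_m(round_trip_s: float) -> float:
    """One-way distance in meters from a round-trip travel time in seconds."""
    return C * round_trip_s / 2.0

# A return arriving ~667 ns after emission corresponds to roughly 100 m:
print(round(range_m(667e-9), 1))  # → 100.0
```

Multiple returns from one pulse simply mean several round-trip times were recorded: a short time for the canopy top, a longer one for the ground beneath it.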

LiDAR sensors mounted on a low-altitude UAS can collect surface elevations for ground positions sampled at regularly spaced horizontal intervals at very high ground resolution (cm scale), which can then be used to generate very high-resolution point clouds, DEMs, DTMs, and digital surface models (DSMs), a type of DEM that contains elevations of natural terrain features in addition to vegetation and cultural features such as buildings and roads.
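The relationship between the DSM and DTM products described above also yields a canopy height model (CHM): subtracting bare-earth elevation from surface elevation cell by cell leaves only feature heights. A minimal sketch over small hypothetical grids (the elevation values are invented for illustration):

```python
# Canopy height model (CHM) = DSM - DTM, computed per grid cell.
# DSM holds top-of-surface elevations (canopy, buildings); DTM holds
# bare-earth elevations. Negative differences are clamped to zero.

def canopy_height(dsm: list[list[float]], dtm: list[list[float]]) -> list[list[float]]:
    """Per-cell height of features above bare earth, in the DSM's units."""
    return [[max(s - t, 0.0) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

dsm = [[105.2, 101.0], [103.7, 100.4]]  # hypothetical elevations, m
dtm = [[100.0, 100.9], [100.1, 100.5]]
print([[round(h, 1) for h in row] for row in canopy_height(dsm, dtm)])
# → [[5.2, 0.1], [3.6, 0.0]]
```

In practice both rasters come from classifying the point cloud (ground vs. non-ground returns) and gridding each class separately.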

YellowScan Surveyor (LiDAR)
Specifications
YellowScan LiDAR Surveyor specifications
Imagery Examples
LiDAR point cloud data at Fountain Creek, Colorado, colored by height above ground.
LiDAR point cloud data of Coal Creek Canyon in Colorado, colored by elevation.

Radar

Radar is an active sensor that transmits polarized pulses, at dual or quad polarization, and measures phase in order to capture the full polarization response of an object. Radar systems operate at radio wavelengths (on the order of 10^3 m), while LiDAR systems utilize light at much shorter wavelengths in the visible, near-infrared, and ultraviolet ranges (10^-5 to 10^-8 m).


Retired Sensors

Sony A5100 (electro-optical)
Specifications
Sony A5100 specifications
Canon PowerShot S100 (electro-optical)
Specifications
Canon PowerShot S100 specifications
Canon PowerShot SX260 (electro-optical)
Specifications
Canon PowerShot SX260 specifications
Canon PowerShot SX230 (electro-optical)
Specifications
Canon PowerShot SX230 specifications
GoPro Hero3 (electro-optical)
Specifications
GoPro Hero3 specifications
GoPro Hero2 2mm (electro-optical)
Specifications
GoPro Hero2 (2mm) specifications
GoPro Hero2 5.4mm (electro-optical)
Specifications
GoPro Hero2 (5.4mm) specifications