The surface of a body of water rippling with waves can make it difficult to get a clear view of submerged objects from above. A new technology known as fluid lensing removes the distortions, providing sharp images through rough seas. The remote sensing technology is capable of imaging through ocean waves at sub-centimeter resolutions and can even generate 3D pictures. But how does it work?
In a normal aquatic environment, light interacting with surface waves produces optical irregularities that continuously change over time. These include areas of the seafloor that appear enlarged or shrunken due to refractive lensing, as well as bands of light formed where refracted rays from the curved water surface intersect (a phenomenon known as caustic focusing). Both aberrations stem from refraction, the change in direction of a ray of light as it passes into a different medium (from air to water, for instance).
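The bending in question follows Snell’s law, n₁ sin θ₁ = n₂ sin θ₂. As a quick back-of-the-envelope sketch (the function name and index values below are illustrative, not part of any fluid lensing code), this computes how much a ray bends on entering water:

```python
import math

def refract_angle(theta_incidence_deg, n1=1.0, n2=1.33):
    """Angle of the refracted ray via Snell's law: n1*sin(t1) = n2*sin(t2)."""
    t1 = math.radians(theta_incidence_deg)
    return math.degrees(math.asin(n1 * math.sin(t1) / n2))

# A ray hitting the surface 30 degrees from vertical bends toward the
# normal as it enters the denser water (n ~ 1.33).
print(round(refract_angle(30.0), 1))   # → 22.1
```

Because every point on a rippling surface presents a different tilt, each ray bends by a different amount from moment to moment, which is exactly the distortion fluid lensing must undo.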
Fluid lensing corrects for the refractive abnormalities, providing a clear view of underwater objects. Developed by Ved Chirayath for his Ph.D. thesis at Stanford University, the technique utilizes a general fluid lensing algorithm that characterizes the aquatic wave field, models caustic phenomena and even incorporates a 3D airborne fluid lensing algorithm to generate three-dimensional images of underwater environments.
The technique also recruits waves as optical “lenslets,” exploiting them as magnifying elements to improve resolution and clarity compared to ordinary remote imagers.
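To see how wave crests act as lenslets and produce caustics, one can trace parallel rays through an idealized sinusoidal surface. The sketch below is a toy model with assumed wave dimensions, not the fluid lensing algorithm itself: rays refract at the tilted surface and pile up on some patches of the seafloor while thinning out over others.

```python
import math

N_AIR, N_WATER = 1.0, 1.33   # refractive indices (illustrative values)
AMP, WAVELEN = 0.05, 1.0     # 5 cm wave amplitude, 1 m wavelength (assumed)
DEPTH = 2.0                  # flat seafloor 2 m below the mean surface

def seafloor_hit(x):
    """Where a vertical ray entering the surface at x lands on the seafloor."""
    k = 2.0 * math.pi / WAVELEN
    slope = AMP * k * math.cos(k * x)                  # surface slope at x
    tilt = math.atan(slope)                            # normal's tilt from vertical
    t2 = math.asin(N_AIR * math.sin(tilt) / N_WATER)   # Snell's law at the surface
    # The refracted ray deviates from vertical by (tilt - t2), toward the
    # side the surface normal leans.
    return x + DEPTH * math.tan(tilt - t2)

# Trace 1000 parallel rays across one wavelength and compare how many land
# under a focusing part of the wave versus a defocusing part.
hits = [seafloor_hit(i / 1000.0) for i in range(1000)]
bright = sum(1 for h in hits if 0.2 <= h < 0.3)   # band near a caustic
dim = sum(1 for h in hits if 0.7 <= h < 0.8)      # spread-out band
print(bright, dim)   # many more rays pile up in the bright band
```

The bright bands are the shimmering caustics familiar from any swimming pool floor; fluid lensing turns the same focusing effect to advantage, treating the momentarily magnified patches as free optical gain.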
Fluid lensing allows scientists to get an unobstructed look at shallow marine habitats like coral reefs, which support a diverse range of aquatic life. As the rising threats of ocean acidification and global warming destroy these vulnerable ecosystems, assessing their health is important. Fluid lensing offers centimeter-scale resolution, providing images with enough detail to monitor reef accretion (reefs typically grow by around 1 cm per year) and quantify the damage reefs have suffered.
Early fluid lensing demonstration campaigns captured 3D multispectral imagery of shallow marine systems from unmanned aerial vehicles, including coral reefs in American Samoa and stromatolite reefs in Shark Bay, Australia, in 2013 and 2014, respectively.
Realizing fluid lensing’s potential, NASA funded its development in 2014 with a grant to build a portable imaging system that harnesses the technology for remote sensing applications. Two instruments, known as FluidCam 1 and 2, were created with a design goal of automatically mapping underwater coastal targets in three dimensions. Packaged in a 1.5U CubeSat form factor and weighing just 1.5 kg each, the imagers were designed to be small enough to mount on the gimbals of small unmanned aerial systems (sUAS).
The instruments pack substantial processing and data-handling capabilities. A 1.5 GHz quad-core Intel NUC Core i5 CPU capable of 100 billion floating-point operations per second (100 gigaFLOPS) is supported by an Intel HD Graphics 5000 GPU with 0.3 teraFLOPS of processing power. The imagers also have 8 GB of DDR RAM and two 250 GB SATA II solid-state drives in RAID0 configuration.
These computational resources enable high-resolution, high-bandwidth imaging: the instruments solve the fluid lensing algorithm onboard, generating clear underwater images while maintaining a high frame rate. The instruments’ focal plane array captures 10-bit, uncompressed, 4-megapixel images at 90 frames per second. FluidCam 1 takes pictures in visible color from 380 to 720 nm, while FluidCam 2 extends coverage into the near infrared, spanning 300 to 1100 nm.
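Those sensor figures imply a demanding raw data rate, which is worth a quick back-of-the-envelope check (assuming a 2048-by-2048-pixel frame, which matches the resolution quoted for the MiDAR receiver built from FluidCam hardware):

```python
# Back-of-the-envelope data rate for the quoted sensor settings:
# 4-megapixel frames (assumed 2048 x 2048), 10 bits per pixel,
# uncompressed, at 90 frames per second.
pixels = 2048 * 2048
bits_per_pixel = 10
fps = 90

bits_per_second = pixels * bits_per_pixel * fps
megabytes_per_second = bits_per_second / 8 / 1e6
print(round(megabytes_per_second))   # → 472
```

At roughly 470 MB/s of raw imagery, the stream exceeds what a single SATA II link (about 300 MB/s) can sustain, which is presumably why the design stripes two solid-state drives in RAID0.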
Since 2015, NASA has continued to fund enhancements to the technology developed for FluidCam. MiDAR (multispectral imaging, detection and active reflectance) combines an array of multispectral, high-intensity light-emitting diodes (the MiDAR transmitter) with the FluidCam imaging system (the MiDAR receiver). The instrument offers high-resolution (2048 by 2048 pixels at 30 fps) multispectral imaging in seven channels from 405 nm to 940 nm.
MiDAR improves on existing passive remote sensing systems aboard aircraft and satellites, achieving a signal-to-noise ratio several orders of magnitude higher. MiDAR is being deployed on aircraft and underwater remotely operated vehicles (ROVs) to enable a new method for remote sensing of living and nonliving structures in extreme environments. The technology also has potential applications in mineral identification, UV and fluorescent imaging and three-dimensional reconstruction using structure from motion (SfM) techniques.
NASA is continuing to support development of the instrument with a 32-channel version in the works. Meanwhile, MiDAR’s spectral range was recently improved to include five ultraviolet bands from 280 to 400 nm, enabling remote identification of mineral types and measurement of organic proteins and compounds through stimulated fluorescence.