Optical Components

Fluid Lensing Generates Clear Underwater Images Through Ocean Waves

07 May 2018

The surface of a body of water rippling with waves can make it difficult to get a clear view of submerged objects from above. A new technology known as fluid lensing removes the distortions, providing sharp images through rough seas. The remote sensing technology is capable of imaging through ocean waves at sub-centimeter resolutions and can even generate 3D pictures. But how does it work?

Eliminating Distortion

In a normal aquatic environment, light interacting with surface waves produces optical irregularities that continuously change over time. These include areas of the seafloor that appear enlarged or shrunken due to refractive lensing, as well as bands of light caused by the intersection of refracted rays from the curved surface of the water (a phenomenon known as caustic focusing). Both aberrations stem from refraction, the change in direction of a ray of light as it travels into a different medium (from air to water, for instance).
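Refraction is governed by Snell's law, n1·sin(θ1) = n2·sin(θ2). As a minimal illustration (not part of the fluid lensing algorithm itself), the bending of a ray at an air-water interface can be computed directly:

```python
import math

def refract_angle(theta1_deg, n1=1.00, n2=1.33):
    """Snell's law: n1*sin(theta1) = n2*sin(theta2).
    Returns the refraction angle (degrees) for a ray entering
    water (n2 = 1.33) from air (n1 = 1.00)."""
    sin_theta2 = n1 * math.sin(math.radians(theta1_deg)) / n2
    return math.degrees(math.asin(sin_theta2))

# A ray striking the surface 30 degrees from the normal bends toward it:
print(round(refract_angle(30.0), 1))  # ~22.1 degrees
```

When the surface is flat this bending is uniform, but a wavy surface refracts neighboring rays in different directions, producing the shifting distortions and caustics described above.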

A 22 by 15 in. test target positioned at a depth of 12 ft. from the surface of a swimming pool, imaged from a height of 7 ft. above the surface. Raw image (left) and after post-processing with fluid lensing (right). Source: NASA

Fluid lensing corrects for the refractive abnormalities, providing a clear view of underwater objects. Developed by Ved Chirayath for his Ph.D. thesis at Stanford University, the technique utilizes a general fluid lensing algorithm that characterizes the aquatic wave field, models caustic phenomena and even incorporates a 3D airborne fluid lensing algorithm to generate three-dimensional images of underwater environments.

The technique also recruits waves as optical “lenslets,” exploiting them as magnifying elements to improve resolution and clarity compared to ordinary remote imagers.
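The lenslet idea can be sketched with first-order optics. Treating a wave crest as a single curved refracting surface between air and water, roughly parallel rays (sunlight) converge at a depth f = n2·R/(n2 − n1) below the crest. This is only an illustrative thin-optics approximation, not the model the fluid lensing algorithm actually uses:

```python
N_AIR, N_WATER = 1.00, 1.33

def crest_focus_depth(radius_m):
    """Approximate depth at which a wave crest of curvature radius R
    focuses parallel rays, using the single spherical refracting
    surface formula f = n2 * R / (n2 - n1)."""
    return N_WATER * radius_m / (N_WATER - N_AIR)

# A crest with a 0.5 m radius of curvature focuses light roughly 2 m down:
print(round(crest_focus_depth(0.5), 2))  # ~2.02 m
```

A submerged object near that focal region is momentarily magnified, which is the effect fluid lensing exploits to boost resolution.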

Two images of the same section of a coral reef near Ofu Island in American Samoa. The top image was captured by the Pleiades 1A satellite while the bottom image is from an airborne fluid-lensed dataset. Source: NASA

Reef Health

Fluid lensing allows scientists to get an unobstructed look at shallow marine habitats like coral reefs, which support a diverse range of aquatic life. As the rising threats of ocean acidification and global warming destroy these vulnerable ecosystems, assessing their health is increasingly important. Fluid lensing offers centimeter-scale resolution, detailed enough to monitor reef accretion, which typically proceeds at around 1 cm per year, enabling researchers to quantify the damage reefs have suffered.

Early fluid lensing demonstration campaigns captured 3D multispectral imagery of shallow marine systems from unmanned aerial vehicles, including coral reefs in American Samoa and stromatolite reefs in Shark Bay, Australia, in 2013 and 2014, respectively.

FluidCam’s onboard computational capabilities enable high-resolution underwater mapping. Source: NASA

FluidCam

Realizing fluid lensing’s potential, NASA awarded a grant in 2014 to develop a portable imaging system that harnesses the technology for remote sensing applications. Two instruments, known as FluidCam 1 and 2, were created with the design goal of automatically mapping underwater coastal targets in three dimensions. Packaged in a 1.5U CubeSat form factor and weighing just 1.5 kg each, the imagers are small enough to mount on the gimbals of small unmanned aerial systems (sUAS).

The instruments pack substantial storage and processing capability. A 1.5 GHz quad-core Intel Core i5 CPU capable of 100 giga-floating-point operations per second (gigaFLOPS) is paired with an Intel HD Graphics 5000 GPU delivering 0.3 teraFLOPS. The imagers also carry 8 GB of DDR RAM and two 250 GB SATA II solid-state drives in a RAID 0 configuration.

These computational resources enable high-resolution, high-bandwidth imaging, solving the fluid lensing algorithm onboard to generate clear underwater images while maintaining a high frame rate. The instruments’ focal plane array captures 10-bit, uncompressed, 4-megapixel images at 90 frames per second. FluidCam 1 takes pictures in visible color from 380 to 720 nm, while FluidCam 2 extends coverage into the near infrared, spanning 300 to 1100 nm.
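Those capture figures imply a substantial raw data rate, which helps explain the need for onboard processing and RAID-striped solid-state storage. A quick back-of-the-envelope check:

```python
# Raw (uncompressed) data rate of the FluidCam focal plane array:
# 4 megapixels x 10 bits per pixel x 90 frames per second.
pixels = 4_000_000
bits_per_pixel = 10
frames_per_second = 90

bits_per_second = pixels * bits_per_pixel * frames_per_second
print(bits_per_second / 1e9)      # 3.6 Gbit/s
print(bits_per_second / 8 / 1e6)  # 450.0 MB/s
```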

MiDAR

Since 2015, NASA has continued to fund enhancements to the technology developed for FluidCam. MiDAR (multispectral imaging, detection and active reflectance) combines an array of high-intensity multispectral light-emitting diodes (the MiDAR transmitter) with the FluidCam imaging system (the MiDAR receiver). The instrument offers high-resolution (2048 by 2048 pixels at 30 fps) multispectral imaging in seven channels from 405 nm to 940 nm.

NASA’s MiDAR instrument advances the FluidCam design with the addition of a transmitter composed of an array of high-intensity, multispectral light-emitting diodes. Source: NASA

MiDAR improves on existing passive remote sensing systems flown on aircraft and satellites, offering a signal-to-noise ratio several orders of magnitude higher. It is being deployed on aircraft and underwater remotely operated vehicles (ROVs) to enable a new method for remote sensing of living and nonliving structures in extreme environments. The technology also has potential applications in mineral identification, UV and fluorescent imaging, and three-dimensional reconstruction using structure from motion (SfM) techniques.

NASA is continuing to support development of the instrument with a 32-channel version in the works. Meanwhile, MiDAR’s spectral range was recently improved to include five ultraviolet bands from 280 to 400 nm, enabling remote identification of mineral types and measurement of organic proteins and compounds through stimulated fluorescence.

To contact the author of this article, email eric.olson@ieeeglobalspec.com

