
Is Hyperspectral Imaging Ready to Take Off?

20 October 2014

After the Deepwater Horizon oil rig exploded and spilled oil in the Gulf of Mexico in April 2010, the U.S. space agency, NASA, deployed its Earth Resources-2 aircraft to detect the presence, extent and concentration of the spill. Equipped with the airborne visible/infrared imaging spectrometer (AVIRIS), the aircraft collected data that helped researchers identify thicker parts of the spill by measuring how the water absorbed and reflected light.

AVIRIS records calibrated images of spectral radiance in 224 contiguous spectral bands. By comparison, a traditional RGB (red, green, blue) sensor used in color cameras captures three bands. The NASA-deployed technology, known as hyperspectral imaging or imaging spectroscopy, allowed AVIRIS to document what satellites could not by distinguishing material properties otherwise undetectable by the human eye.

Although hyperspectral imaging is used primarily in remote sensing applications such as the Deepwater Horizon accident, new applications that include food safety inspection and life sciences are bringing the technology from the skies overhead to the plant floor.

Color infrared composite image made from three spectral bands of NASA's MASTER instrument mounted on the high-altitude ER-2. Red areas depict green vegetation in Las Vegas, Nev., on May 30, 2014. Source: NASA/Dean Neeley/Jeff Myers

Hyperspectral imaging adds a third, spectral dimension to standard 2D pictures, so that each pixel contains dozens or even hundreds of values, each representing a section of the electromagnetic spectrum. A standard 2D image taken with a digital camera captures three values per pixel, splitting the visible spectrum into red, green and blue. In other words, a hyperspectral image is visualized as a “data cube” representing spectral and spatial information, typically covering the electromagnetic spectrum between 300 nanometers (nm) and 2,600 nm. Hyperspectral imagers record 100 bands or more in the spectrum, whereas multispectral imagers typically capture about 20 channels.
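The data-cube idea can be sketched with a small NumPy example. The dimensions below are illustrative (224 bands, as with AVIRIS; the spatial size is arbitrary), and random values stand in for real radiance data:

```python
import numpy as np

# Illustrative hyperspectral "data cube": two spatial dimensions plus one
# spectral dimension. Here: 100 x 100 pixels, 224 spectral bands.
rows, cols, bands = 100, 100, 224
cube = np.random.rand(rows, cols, bands)

# Each pixel holds a full spectrum rather than three RGB values.
pixel_spectrum = cube[50, 50, :]
print(pixel_spectrum.shape)   # (224,)

# A single band slice is an ordinary 2D grayscale image --
# the scene as viewed through one narrow slice of the spectrum.
band_image = cube[:, :, 100]
print(band_image.shape)       # (100, 100)
```

Slicing along the spectral axis recovers the "stack of images" view described below, while slicing at a pixel recovers its spectral signature.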

Within the hyperspectral sensor, a specialized optic called a “diffraction grating” spatially separates the electromagnetic spectrum by wavelength. The resulting data cube, essentially a stack of images with each one viewed through a narrow band of the electromagnetic spectrum, holds more data on the material properties of an object in the imager’s field of view than standard imaging techniques provide.

“Each object under view has a unique spectral signature, so based on how the sensor views the scene, we can make a real-time determination as to the material composition of the object,” says David Bannon, CEO of Headwall Photonics, a spectral imaging manufacturer based in Fitchburg, Mass.
Hyperspectral machine vision has two critical components, says Adam Stern, senior scientist at Resonon Inc. in Bozeman, Mont. One component is the hyperspectral imaging itself and the other is real-time statistical pattern-recognition software that uses the hyperspectral data to control robotic actuators.

Benefit or Weakness?

Until recently, hyperspectral imaging’s greatest benefit has also been its weakness. While it provides far more data on a scene, the data flow can be too much for computers to handle in real time. As a result, specialized knowledge is required to extract usable information from the massive amounts of visual data.

“Not only do you have a picture with a million pixels, but each pixel has 240 12-bit data points, so the datasets are enormous," Stern says. "Computers are finally getting big, fast and inexpensive enough to make this an economically feasible technology.”

A technician could train the hyperspectral system to recognize almonds based on spectral data. Source: Resonon

These advances have enabled the adoption of hyperspectral imaging in automated sorting applications where existing machine vision technologies or manual sorting fail. In the case of almond sorting in the food industry, for example, a hyperspectral system can be trained to recognize almonds by their spectral signatures. Standard machine vision systems equipped with monochrome or three-color cameras often don't acquire this spectral data, or they acquire it without the specificity to reliably make an automated decision.

Using software algorithms developed for multi- and hyperspectral imaging systems, automated sorting systems can analyze data in real time and instruct a robot or other material handling system as to what to reject and what to accept.
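The article does not detail the vendors' proprietary algorithms, but one common, minimal approach to this kind of accept/reject decision is the spectral angle mapper (SAM), which compares each measured spectrum against a trained reference signature. A toy sketch with hypothetical 4-band spectra:

```python
import numpy as np

def spectral_angle(s, r):
    """Spectral Angle Mapper: angle (radians) between a measured
    spectrum s and a reference signature r. A smaller angle means a
    closer match, largely independent of overall brightness."""
    cos = np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def accept_or_reject(pixel, reference, threshold=0.1):
    # Accept the object if its spectrum falls within `threshold`
    # radians of the trained reference; otherwise flag it for rejection.
    return spectral_angle(pixel, reference) < threshold

reference = np.array([0.2, 0.5, 0.9, 0.4])   # trained signature (toy values)
good = np.array([0.21, 0.49, 0.88, 0.41])    # close spectral match
bad = np.array([0.9, 0.1, 0.2, 0.7])         # different material

print(accept_or_reject(good, reference))  # True  -> accept
print(accept_or_reject(bad, reference))   # False -> reject
```

In a production line, a decision like this would run per object (or per pixel) in real time, with the result routed to a robot or other material handling system.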

Food inspection facilities have been early commercial adopters of hyperspectral imaging, says Bannon, because of government safety regulations. In fact, Headwall sold its first hyperspectral inspection instrument in the food safety industry to a major poultry processor five years ago following research and development conducted with the U.S. Agriculture Department.

“When you are in a regulated environment that is being controlled or overseen by a human, it’s a difficult job to be able to inspect food products at very high speeds,” he says. “The hyperspectral sensors will run continuously and provide accurate and repeatable results.”

Not only can hyperspectral imaging improve upon human visual inspection, it can also replace traditional biological or chemical laboratory sampling that requires additional time and resources.

Expense and Training

While advances in microprocessors, memory and high-speed data interfaces make hyperspectral imaging more attractive to industrial users, the imagers themselves can be expensive and require trained technicians to keep the hardware calibrated.

Nanoelectronics research center imec (formerly the Interuniversity Microelectronics Centre, based in Leuven, Belgium) builds its hyperspectral filters directly on top of, and as an extension to, the image sensor. Other hyperspectral instruments combine separate optical components that must be precisely aligned along the optical path.

In the sensor, each row essentially acts as a line-scan camera filtered to a specific spectral band. As the camera scans across a target area, each row acquires a single line of a 2D image for its spectral band. In software, these successive lines are assembled into a 2D image for each band, and stacking the images from all rows produces the 3D data cube, with the number of spectral bands limited by the size of the area-array image sensor. Because different filters can also be positioned over individual pixels, other arrangements that do not require line scanning are possible as well.
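The line-scan assembly described above can be sketched as follows. The frame source and dimensions are illustrative stand-ins, not imec's actual sensor geometry; each simulated frame is one spatial line of the scene, resolved into spectral channels:

```python
import numpy as np

# Pushbroom (line-scan) acquisition sketch: each frame captures one
# spatial line of the scene, resolved into `bands` spectral channels
# per pixel. Scanning across the target adds one line per frame.
spatial_pixels, bands, scan_lines = 640, 100, 480

def read_frame(i):
    # Stand-in for a real frame grab from the sensor.
    return np.random.rand(spatial_pixels, bands)

frames = [read_frame(i) for i in range(scan_lines)]

# Stacking the scanned lines builds the full 3D data cube.
cube = np.stack(frames, axis=0)   # (scan_lines, spatial_pixels, bands)
print(cube.shape)                 # (480, 640, 100)

# Selecting one band across all scan lines yields a conventional
# 2D image of the scene at that wavelength.
band_image = cube[:, :, 42]       # (480, 640)
```

The number of usable bands is bounded by how many filtered rows fit on the area-array sensor, which is why sensor size limits spectral resolution in this design.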

A camera with a standard area-array sensor can be fitted with an image sensor that carries imec’s hyperspectral filter. The resulting camera retains the dimensions and form factor of the original but gains spectral capabilities. This enables the mass fabrication of hyperspectral cameras and opens new applications that previously could not use them, says Andy Lambrechts, program manager and team leader of Integrated Imaging at imec.

Imec manufactures the hyperspectral sensors at wafer level, with each wafer containing tens to hundreds of imagers that are manufactured together. The process can use semiconductor industry equipment, which enables the alignment of spectral filters on a per-pixel scale. This approach reduces the optical complexity and cost of the hyperspectral imaging camera, Lambrechts says.

Multiple Manufacturers

Imec is partnering with multiple camera manufacturers, including Adimec, Tattile, Bayspec, 3D-One and VRMagic, to bring this technology to multiple markets. One example is the xiQ USB3.0 camera manufactured by XIMEA of Münster, Germany. The cube-shaped camera measures about 26 mm on a side and weighs 27 grams. As Earth observation platforms move from expensive satellites to more cost-effective, widely deployable unmanned aerial vehicles, Lambrechts says the need for compact hyperspectral cameras will grow.

XIMEA CEO Max Larin sees expanding applications in a variety of industries, including life science instrumentation and medical imaging.

“You now have a portable device for express skin diagnostics, for example, that you can bring to the patient rather than bring the patient to the system,” he says.


Hyperspectral imaging has been used in remote sensing for about 30 years, but the technology is still in its early stages within industrial and medical settings. As with any new implementation, growing pains are expected.

“Speed and resolution will always be a challenge for this technology, but it’s getting better all the time,” Stern says.

Furthermore, there is a lot of information in the shortwave infrared (SWIR) spectral range that cannot be obtained with standard sensors. The near-infrared (NIR) to SWIR range from 900 to 1,700 nm can be measured with conventional InGaAs cameras, but many materials have reflectance signatures that extend to 2,500 nm, Stern says. “Sensors in this spectral range often cost $50,000 or more, and the technology is just not there yet to affordably capture that data.”

Ease of use also will drive adoption. Headwall Photonics’ Hyperspec imaging systems integrate a sensor, an embedded processor (containing a library of spectral signatures for comparison against data acquired by the imager) and a diagnostic module within an IP-rated enclosure. This configuration reduces the number of components a customer has to buy.

Headwall also has prioritized software development. “We need to come in with robust, flexible application software capabilities that allow our customers to immediately understand the spectral composition of the product in the terms and nomenclature that they are familiar with,” Bannon says. Headwall’s software interfaces with upstream and downstream instruments such as robotic vacuum arms on the processing line to help users act on the data that is received.

Fulfilling hyperspectral imaging’s commercial promise so far has seemingly been just one more technological advance away. Thanks to advances in processing power, economical hyperspectral imaging sensors and software that simplifies the physics, the technology's day in the sun may be near.



Powered by CR4, the Engineering Community
