Supercomputer Titan Conquers Big Data
Ed Brown | January 30, 2017
The Titan supercomputer at the U.S. Energy Department's Oak Ridge National Laboratory is helping researchers make sense of data they could easily have missed.
Computer scientist Ramakrishnan Kannan says that in the past, scientists only had to focus on one instrument at a time. Now, however, experiments can involve several instruments at once, and the amount of data can be overwhelming.
Kannan has developed a distributed machine-learning tool that collects and condenses data in a fraction of the time required by other methods. It helps researchers harness Titan's 18,688 nodes (20 petaflops) to extract meaningful information.
"This technology condenses the information into what’s significant, enabling us to better understand very large high-dimensional data," he says.
Kannan designed what he describes as off-the-shelf data analysis algorithms with some modifications to handle large amounts of scientific and Internet data with speed and efficiency.
His approach minimizes network traffic among compute nodes by pooling many small messages and communications into fewer, larger ones, and it sequences operations to avoid unnecessary communication.
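The payoff of pooling messages can be sketched with a toy cost model (the costs and batch size below are hypothetical, not from the article): each message carries a fixed per-message latency overhead, so sending fewer, larger messages cuts that overhead sharply.

```python
import math

# Toy cost model: assume a fixed latency overhead per message sent,
# regardless of its size. Bandwidth cost is ignored for simplicity.
def send_individually(n_messages, per_message_cost=1.0):
    # one overhead charge for every small message
    return per_message_cost * n_messages

def send_batched(n_messages, batch_size, per_message_cost=1.0):
    # pool messages into larger batches; overhead is paid once per batch
    n_batches = math.ceil(n_messages / batch_size)
    return per_message_cost * n_batches

print(send_individually(10_000))      # 10000.0 units of latency overhead
print(send_batched(10_000, 512))      # 20.0 units -- far fewer round trips
```

The 500x reduction in overhead here is an artifact of the chosen numbers, but it conveys why pooling communications matters at Titan's scale.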
In a laboratory, his technique can capture molecular movements in great detail and eliminate background noise to identify precisely when a significant event occurs. Beyond the laboratory, Kannan says that his technique may be useful for analyzing video of highways and intersections, for example, which could aid in the design of better roads and help reduce congestion. It could also help researchers better understand trending social topics in near real time at different geographical levels, from rural to urban.