Watch: Motion Capture Through AI
Tony Pallone | December 20, 2018

“Motion capture” suits are full-body costumes loaded with sensors and used by movie studios to transform actors into a wide variety of CGI characters, from Hobbits to Hulks, and everything in between.
An artificial intelligence project at Princeton takes that concept a step further, with a new tool designed to automatically track animals' body parts in existing video. The tool, called LEAP, can be trained within minutes to track an animal's individual body parts over millions of video frames with high accuracy, and no physical markers or labels are required.
Hollywood isn’t the goal, however.
"The method can be used broadly, across animal model systems, and it will be useful to measuring the behavior of animals with genetic mutations or following drug treatments," explained Mala Murthy, an associate professor of molecular biology at the Princeton Neuroscience Institute (PNI).
Murthy developed LEAP along with Joshua Shaevitz, a professor of physics at the Lewis-Sigler Institute for Integrative Genomics. A paper detailing the technology was published today in the January 2019 issue of Nature Methods, although an open-access version has already led to the software’s adoption by other labs.
"The way it works is to label a few points in a few videos and then the neural network does the rest,” said Talmo Pereira, a PNI grad who served as the paper’s first author. “We provide an easy-to-use interface for anyone to apply LEAP to their own videos, without having any prior programming knowledge."
In principle, any type of video data can be used. Although the initial subjects were primarily flies and mice, Pereira also demonstrated LEAP's abilities on a motion-tagged video of a giraffe taken from the live feed at the Mpala Research Centre in Kenya.
For the neuroscientists, the database of motion and behaviors collected by LEAP can be used to draw connections to the neural processes behind them. According to Shaevitz, this could lead to a better understanding of how the brain produces behaviors, and even to explorations of future diagnostics and therapies that rely on computer interpretation of someone’s actions.