Press "Enter" to skip to content

GlowTrack: Unleashing the Power of AI To Track Human and Animal Behavior

Human hand marked with GlowTrack fluorescent tags. Credit: Salk Institute

Salk scientists create GlowTrack to track human and animal behavior with better resolution and more versatility.

Movement offers a window into how the brain operates and controls the body. From clipboard-and-pen observation to modern artificial intelligence-based techniques, tracking human and animal movement has come a long way. Current cutting-edge methods utilize artificial intelligence to automatically track parts of the body as they move. However, training these models is still time-intensive and limited by the need for researchers to manually mark each body part hundreds to thousands of times.

Introducing GlowTrack

Now, Associate Professor Eiman Azim and team have created GlowTrack, a non-invasive movement-tracking method that uses fluorescent dye markers to train artificial intelligence. GlowTrack is robust, time-efficient, and high definition—capable of tracking a single digit on a mouse’s paw or hundreds of landmarks on a human hand.

The technique, published on September 26, 2023, in Nature Communications, has applications spanning from biology to robotics to medicine and beyond.

From left: Daniel Butler and Eiman Azim. Credit: Salk Institute

“Over the last several years, there has been a revolution in tracking behavior as powerful artificial intelligence tools have been brought into the laboratory,” says Azim, senior author and holder of the William Scandling Developmental Chair. “Our approach makes these tools more versatile, improving the ways we capture diverse movements in the laboratory. Better quantification of movement gives us better insight into how the brain controls behavior and could aid in the study of movement disorders like amyotrophic lateral sclerosis (ALS) and Parkinson’s disease.”

Overcoming Current Limitations

Current methods for capturing animal movement often require researchers to manually and repeatedly mark body parts on a computer screen—a slow process that is prone to human error. Because artificial intelligence models specialize to the limited training data they receive, human annotation also means these methods can usually be used only in a narrow testing environment. For example, if the lighting, the orientation of the animal’s body, the camera angle, or any number of other factors were to change, the model would no longer recognize the tracked body part.

To address these limitations, the researchers used fluorescent dye to label parts of the animal or human body. With these “invisible” fluorescent dye markers, an enormous amount of visually diverse data can be created quickly and fed into the artificial intelligence models without the need for human annotation. Once fed this robust data, these models can be used to track movements across a much more diverse set of environments and at a resolution that would be far more difficult to achieve with manual human labeling.
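The article does not include the team’s code, but the core idea—deriving a keypoint label from where the dye glows rather than from a human click—can be sketched in a few lines. The snippet below is a minimal, hypothetical Python/OpenCV illustration, not GlowTrack’s actual pipeline; the paired-frame setup, the brightness threshold, and the function name are assumptions for demonstration only.

```python
# Illustrative sketch (not the authors' code): turning a glowing marker into an
# automatic keypoint label for training a pose-tracking model.
# Assumes paired frames of the same pose: one under normal light (the training
# image) and one under fluorescence-exciting light, where only the dye is bright.
import cv2
import numpy as np

def label_from_fluorescence(visible_frame, fluorescent_frame, threshold=200):
    """Return an (image, keypoint) training pair derived from the dye signal.

    visible_frame     -- BGR frame under normal lighting (what the model sees)
    fluorescent_frame -- BGR frame under excitation light (where the label comes from)
    threshold         -- hypothetical brightness cutoff for the fluorescent signal
    """
    # Isolate the glowing marker in the fluorescence frame.
    gray = cv2.cvtColor(fluorescent_frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)

    # Use the centroid of the bright region as the keypoint coordinate.
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return None  # marker not visible in this frame
    keypoint = (moments["m10"] / moments["m00"], moments["m01"] / moments["m00"])

    # The visible-light image plus the automatically derived keypoint becomes
    # one training example -- no human clicking required.
    return visible_frame, keypoint
```

Run over many frames, a procedure like this could label a body part across widely varying poses, lighting conditions, and camera angles far faster than manual annotation, which is the kind of visual diversity the article describes feeding into the models.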

This opens the door for easier comparison of movement data between studies, as different laboratories can use the same models to track body movement across a variety of situations. According to Azim, comparison and reproducibility of experiments are essential in the process of scientific discovery.

“Fluorescent dye markers were the perfect solution,” says first author Daniel Butler, a Salk bioinformatics analyst. “Like the invisible ink on a dollar bill that lights up only when you want it to, our fluorescent dye markers can be turned on and off in the blink of an eye, allowing us to generate a massive amount of training data.”

Looking Ahead

In the future, the team is excited to support diverse applications of GlowTrack and pair its capabilities with other tracking tools that reconstruct movements in three dimensions, and with analysis approaches that can probe these vast movement datasets for patterns.

“Our approach can benefit a host of fields that need more sensitive, reliable, and comprehensive tools to capture and quantify movement,” says Azim. “I am eager to see how other scientists and non-scientists adopt these methods, and what unique, unforeseen applications might arise.”

Reference: “Large-scale capture of hidden fluorescent labels for training generalizable markerless motion capture models” 26 September 2023, Nature Communications.
DOI: 10.1038/s41467-023-41565-3

Other authors include Alexander Keim and Shantanu Ray of Salk.

The work was supported by the UC San Diego CMG Training Program, a Jesse and Caryl Philips Foundation Award, the National Institutes of Health (R00NS088193, DP2NS105555, R01NS111479, RF1NS128898, and U19NS112959), the Searle Scholars Program, the Pew Charitable Trusts, and the McKnight Foundation.

Source: SciTechDaily