Complex neuroimaging data can be interrogated through translation into audiovisual form – a video with an accompanying musical soundtrack – to help interpret what happens in the brain while specific behaviors are performed. David Thibodeaux and colleagues at Columbia University in the US present this technique in the open access journal PLOS ONE on February 21, 2024. Examples of these beautiful “brain movies” are included below.
Recent technological advances have made it possible to record multiple components of brain activity in real time. Scientists can now observe, for example, what happens in a mouse’s brain when it performs certain behaviors or receives a certain stimulus. However, such research produces large amounts of data that can be difficult to explore intuitively to gain insights into the biological mechanisms behind patterns of brain activity.
Previous research has shown that some brain imaging data can be translated into audio representations. Building on such approaches, Thibodeaux and colleagues have developed a flexible toolkit that enables the translation of different types of brain imaging data—and accompanying video recordings of laboratory animal behavior—into audiovisual representations.
The researchers then demonstrated the new technique in three different experimental settings, showing how audiovisual representations can be prepared with data from various brain imaging approaches, including 2D wide-field optical mapping (WFOM) and 3D swept confocally aligned planar excitation (SCAPE) microscopy.
The toolkit was applied to previously collected WFOM data that captured both neural activity and changes in brain blood flow in mice engaged in different behaviors such as running or grooming. The neural data was represented by piano sounds played in time with spikes in brain activity, with the volume of each note indicating the magnitude of the activity and its pitch indicating the location in the brain where the activity occurred. Meanwhile, blood flow data was represented by violin sounds. Heard together in real time, the piano and violin sounds convey the interconnected relationship between neuronal activity and blood flow. Viewed alongside a video of the mouse, these representations let the viewer discern which patterns of brain activity correspond to different behaviors.
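The mapping described above – pitch encoding brain location, loudness encoding activity magnitude – can be illustrated with a minimal sketch. This is not the authors' toolkit; the function name, threshold, and MIDI-style pitch scheme are assumptions chosen for the example.

```python
def sonify(activity, threshold=0.5, base_pitch=60):
    """Map a regions-by-timepoints activity matrix to note events.

    activity  : list of lists, one row of values per brain region
    threshold : minimum activity level that triggers a note (assumed cutoff)
    base_pitch: MIDI note for region 0; each region sits one semitone higher,
                so pitch encodes location, as in the piano mapping above

    Returns sorted (time, pitch, velocity) tuples; velocity (loudness)
    is scaled to the MIDI 0-127 range in proportion to activity magnitude.
    """
    # Peak activity across all regions, used to scale loudness;
    # fall back to 1.0 if the recording is entirely silent.
    peak = max(max(row) for row in activity) or 1.0
    events = []
    for region, row in enumerate(activity):
        for t, value in enumerate(row):
            if value >= threshold:  # only spikes above threshold become notes
                velocity = round(127 * value / peak)
                events.append((t, base_pitch + region, velocity))
    return sorted(events)
```

For example, `sonify([[0.0, 0.9], [0.6, 0.0]])` yields `[(0, 61, 85), (1, 60, 127)]`: region 1 fires a quieter, higher note first, then region 0 fires at full loudness. A real pipeline would render these events with a synthesizer and a second instrument timbre for the blood-flow channel.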
The authors note that their toolkit is not a substitute for quantitative analysis of neuroimaging data. However, it could help scientists screen large data sets for patterns that might otherwise go unnoticed but be worthy of further analysis.
The authors add: “Listening to and watching representations of [brain activity] data is an immersive experience that can tap into our capacity to recognize and interpret patterns (consider the online security feature that asks you to ‘pick the traffic lights in this image’ – a challenge beyond most computers, but trivial for our brains)… [It is] nearly impossible to watch and focus on both time-varying [brain activity] data and behavioral video at the same time; our eyes have to flick back and forth to see things happening together. You generally need to keep replaying clips over and over to understand what happened at a given moment. Having an audio representation of the data makes it much simpler to see (and hear) when things happen at exactly the same time.”
Journal Reference:
Thibodeaux, D. N., et al. (2024) Real-time audio-visualization of neuroimaging data. PLOS ONE. doi.org/10.1371/journal.pone.0297435.