New open-source tool helps to detangle the brain


In late 2023, the first drug with the potential to slow the progression of Alzheimer’s disease was approved by the U.S. Food and Drug Administration. Alzheimer’s is one of many debilitating neurological disorders that together affect one-eighth of the world’s population, and while the new drug is a step in the right direction, there is still a long journey ahead to fully understanding it, and other such diseases.

“Reconstructing the intricacies of how the human brain functions on a cellular level is one of the biggest challenges in neuroscience,” says Lars Gjesteby, a technical staff member and algorithm developer from the MIT Lincoln Laboratory’s Human Health and Performance Systems Group. “High-resolution, networked brain atlases can help improve our understanding of disorders by pinpointing differences between healthy and diseased brains. However, progress has been hindered by insufficient tools to visualize and process very large brain imaging datasets.”

A networked brain atlas is, in essence, a detailed map of the brain that helps link structural information with neural function. To build such atlases, brain imaging data must be processed and annotated. For example, each axon, or thin fiber connecting neurons, must be traced, measured, and labeled with information. Current methods of processing brain imaging data, such as desktop-based software or manually oriented tools, are not yet designed to handle human brain-scale datasets. As a result, researchers often spend a great deal of time slogging through an ocean of raw data.

Gjesteby is leading a project to build the Neuron Tracing and Active Learning Environment (NeuroTrALE), a software pipeline that brings machine learning, supercomputing, and ease of use and access to this brain-mapping challenge. NeuroTrALE automates much of the data processing and displays the output in an interactive interface that allows researchers to edit and manipulate the data to mark, filter, and search for specific patterns.

Untangling a ball of yarn

One of NeuroTrALE’s defining features is the machine-learning technique it employs, called active learning. NeuroTrALE’s algorithms are trained to automatically label incoming data based on existing brain imaging data, but unfamiliar data can introduce errors. Active learning allows users to manually correct those errors, teaching the algorithm to improve the next time it encounters similar data. This combination of automation and manual labeling ensures accurate data processing with a much smaller burden on the user.
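The loop behind that combination can be sketched in a few lines of Python. The toy below is illustrative only and is not NeuroTrALE’s actual code: a simple one-dimensional classifier flags low-confidence predictions, a human “oracle” corrects them, and the corrections are folded back in so that similar data is handled automatically on the next pass. All function names and thresholds are hypothetical.

```python
# Minimal active-learning sketch (hypothetical names, not NeuroTrALE's API):
# predictions near the decision boundary are sent to a human "oracle" for
# correction, and the corrections immediately refine the model.

def fit_threshold(labeled):
    """Fit a 1-D decision boundary: the midpoint between the two class means."""
    axon = [x for x, y in labeled if y == 1]
    background = [x for x, y in labeled if y == 0]
    return (sum(axon) / len(axon) + sum(background) / len(background)) / 2

def predict(threshold, x):
    """Return (label, confidence); confidence shrinks near the boundary."""
    label = 1 if x > threshold else 0
    confidence = min(1.0, abs(x - threshold) / 10.0)
    return label, confidence

def active_learning_pass(labeled, unlabeled, oracle, min_confidence=0.5):
    threshold = fit_threshold(labeled)
    for x in unlabeled:
        label, confidence = predict(threshold, x)
        if confidence < min_confidence:
            labeled.append((x, oracle(x)))      # manual correction becomes training data
            threshold = fit_threshold(labeled)  # the model improves for the next sample
    return threshold

# Toy usage: intensities above 20 are "axon" (label 1); the seed data sits far
# from that boundary, so the early, uncertain samples get routed to the oracle.
seed = [(0.0, 0), (40.0, 1)]
boundary = active_learning_pass(seed, [18.0, 22.0, 35.0], oracle=lambda x: int(x > 20))
print(round(boundary, 1))  # 20.0
```

The key design point is that the human is consulted only where the model is unsure, which is what keeps the manual labeling burden small.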

“Imagine taking an X-ray of a ball of yarn. You’d see all these crisscrossed, overlapping lines,” says Michael Snyder, from the laboratory’s Homeland Decision Support Systems Group. “When two lines cross, does it mean one of the pieces of yarn is making a 90-degree bend, or is one going straight up and the other going straight over? With NeuroTrALE’s active learning, users can trace these strands of yarn once or twice and train the algorithm to follow them correctly moving forward. Without NeuroTrALE, the user would have to trace the ball of yarn, or in this case the axons of the human brain, every single time.” Snyder is a software developer on the NeuroTrALE team along with staff member David Chavez.

Because NeuroTrALE takes the bulk of the labeling burden off of the user, it allows researchers to process more data more quickly. Further, the axon tracing algorithms harness parallel computing to distribute computations across multiple GPUs at once, leading to even faster, scalable processing. Using NeuroTrALE, the team demonstrated a 90 percent decrease in the computing time needed to process 32 gigabytes of data compared with conventional AI methods.
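The parallel pattern itself is straightforward: split a large image volume into chunks, process each chunk independently, and merge the results. The sketch below shows that pattern with ordinary CPU worker processes and a placeholder trace_chunk function; it is not NeuroTrALE’s implementation, which distributes the per-chunk work across GPUs.

```python
# Chunk-and-distribute sketch (illustrative only; trace_chunk is a placeholder
# for real axon tracing, and the workers here are CPU processes, not GPUs).
from multiprocessing import Pool

def trace_chunk(chunk):
    """Stand-in for per-chunk tracing; here it just counts bright voxels."""
    return sum(1 for voxel in chunk if voxel > 0.5)

def split(volume, n_chunks):
    """Split a flat list of voxels into roughly equal contiguous chunks."""
    step = max(1, len(volume) // n_chunks)
    return [volume[i:i + step] for i in range(0, len(volume), step)]

if __name__ == "__main__":
    volume = [(i % 2) * 0.9 for i in range(1_000_000)]  # stand-in for imaging data
    with Pool(processes=4) as pool:                     # one worker per device
        partial_results = pool.map(trace_chunk, split(volume, 4))
    print(sum(partial_results))                         # merge per-chunk results
```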

The team also showed that a substantial increase in the amount of data does not translate into an equivalent increase in processing time. For example, in a recent study they demonstrated that a 10,000 percent increase in dataset size resulted in only a 9 percent and a 22 percent increase in total data processing time, using two different types of central processing units.
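A back-of-the-envelope reading of those figures (an illustration, not part of the study, and the CPU labels below are placeholders since the article does not name them): a 10,000 percent increase means the dataset grows to 101 times its original size, while total processing time grows only by factors of 1.09 and 1.22.

```python
# Worked scaling arithmetic from the figures quoted above (illustrative only).
size_factor = 1 + 10_000 / 100        # a 10,000 percent increase = 101x the data
for name, time_factor in [("first CPU type", 1.09), ("second CPU type", 1.22)]:
    print(f"{name}: ~{size_factor / time_factor:.0f}x more data per unit of time")
# -> ~93x and ~83x
```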

“With the estimated 86 billion neurons making 100 trillion connections in the human brain, manually labeling all of the axons in a single brain would take lifetimes,” adds Benjamin Roop, one of the project’s algorithm developers. “This tool has the potential to automate the creation of connectomes for not just one individual, but many. That opens the door for studying brain disease at the population level.”

The open-source road to discovery

The NeuroTrALE project was formed as an internally funded collaboration between Lincoln Laboratory and Professor Kwanghun Chung’s laboratory on the MIT campus. The Lincoln Laboratory team needed to build a way for the Chung Lab researchers to analyze and extract useful information from the large volume of brain imaging data flowing into the MIT SuperCloud — a supercomputer run by Lincoln Laboratory to support MIT research. Lincoln Laboratory’s expertise in high-performance computing, image processing, and artificial intelligence made it exceptionally suited to tackling this challenge.

In 2020, the team uploaded NeuroTrALE to the SuperCloud, and by 2022 the Chung Lab was producing results. In one published study, they used NeuroTrALE to quantify prefrontal cortex cell density in relation to Alzheimer’s disease, finding that brains affected by the disease had a lower cell density in certain regions than those without. The same team also pinpointed where in the brain harmful neurofibers tend to become tangled in Alzheimer’s-affected brain tissue.

Work on NeuroTrALE has continued with Lincoln Laboratory funding and funding from the National Institutes of Health (NIH) to expand NeuroTrALE’s capabilities. Currently, its user interface tools are being integrated with Google’s Neuroglancer program — an open-source, web-based viewer application for neuroscience data. NeuroTrALE adds the ability for users to visualize and edit their annotated data dynamically, and for multiple users to work with the same data at the same time. Users can also create and edit a number of shapes such as polygons, points, and lines to facilitate annotation tasks, as well as customize the color display for each annotation to distinguish neurons in dense regions.
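As a concrete picture of what such annotations might look like as data, here is a minimal sketch; the field names and structure are hypothetical and do not reflect NeuroTrALE’s or Neuroglancer’s actual schema.

```python
# Hypothetical annotation records (not NeuroTrALE's or Neuroglancer's schema):
# points, lines, and polygons, each with its own color and label.
from dataclasses import dataclass
from typing import List, Tuple

Coordinate = Tuple[float, float, float]  # (x, y, z) position in the image volume

@dataclass
class Annotation:
    kind: str                   # "point", "line", or "polygon"
    vertices: List[Coordinate]
    color: str = "#ff0000"      # per-annotation color helps separate dense neurons
    label: str = ""

soma = Annotation("point", [(120.0, 64.0, 12.0)], color="#00ff00", label="cell body")
axon = Annotation("line", [(120.0, 64.0, 12.0), (310.0, 70.0, 15.0)], label="axon segment")
region = Annotation("polygon",
                    [(90.0, 40.0, 12.0), (150.0, 40.0, 12.0), (150.0, 90.0, 12.0)],
                    color="#3366ff", label="region of interest")

for annotation in (soma, axon, region):
    print(annotation.kind, len(annotation.vertices), annotation.color, annotation.label)
```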

“NeuroTrALE provides a platform-agnostic, end-to-end solution that can be easily and rapidly deployed on standalone, virtual, cloud, and high-performance computing environments via containers,” says Adam Michaleas, a high-performance computing engineer from the laboratory’s Artificial Intelligence Technology Group. “Moreover, it significantly improves the end-user experience by providing capabilities for real-time collaboration within the neuroscience community via data visualization and simultaneous content review.”

To align with NIH’s mission of sharing research products, the team’s goal is to make NeuroTrALE a fully open-source tool for anyone to use. And this type of tool, says Gjesteby, is what’s needed to reach the end goal of mapping the entirety of the human brain for research, and eventually drug development. “It’s a grassroots effort by the community where data and algorithms are meant to be shared and accessed by all.”
