Blog: Artificial Intelligence Tool Vastly Scales Up Alzheimer’s Research – ECNmag.com
Researchers at UC Davis and UC San Francisco have found a way to teach a computer to precisely detect one of the hallmarks of Alzheimer’s disease in human brain tissue, delivering a proof of concept for a machine-learning approach to distinguishing critical markers of the disease.
Amyloid plaques are clumps of protein fragments in the brains of people with Alzheimer’s disease that destroy nerve cell connections. Much like the way Facebook recognizes faces in uploaded images, the machine learning tool developed by a team of University of California scientists can “see” whether a sample of brain tissue contains one type of amyloid plaque or another, and do it very quickly.
The findings, published May 15 in Nature Communications, suggest that machine learning can augment the expertise and analysis of an expert neuropathologist. The tool allows researchers to analyze thousands of times more data and to ask new questions that would not be possible with the limited data-processing capacity of even the most highly trained human experts.
“We still need the pathologist,” said Brittany N. Dugger, Ph.D., an assistant professor in the UC Davis Department of Pathology and Laboratory Medicine and lead author of the study. “This is a tool, like a keyboard is for writing. As keyboards have aided in writing workflows, digital pathology paired with machine learning can aid with neuropathology workflows.”
In this study, she partnered with Michael J. Keiser, Ph.D., an assistant professor in UCSF’s Institute for Neurodegenerative Diseases and Department of Pharmaceutical Chemistry, to determine if they could teach a computer to automate the laborious process of identifying and analyzing tiny amyloid plaques of various types in large slices of autopsied human brain tissue. For this job, Keiser and his team designed a “convolutional neural network” (CNN), a computer program that learns to recognize patterns from thousands of human-labeled examples.
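The core operation inside a CNN can be sketched without any of the study's details: a small filter (whose weights a real network learns from labeled examples) slides across an image, and the output records how strongly each region matches the filter's pattern. The toy "tissue patch" and hand-made "blob" filter below are illustrative inventions, not data or code from the paper:

```python
# Minimal sketch of the convolution at the heart of a CNN: slide a
# small filter across an image and record how well each region
# matches. Real CNNs stack many learned filters with nonlinearities.

def convolve2d(image, kernel):
    """Valid-mode 2D cross-correlation of `image` with `kernel`."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out

# Toy 5x5 "tissue patch" with a bright blob in the center.
patch = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 2, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]

# A hand-made "blob" filter; in a trained CNN these weights are learned.
blob_kernel = [
    [0, 1, 0],
    [1, 2, 1],
    [0, 1, 0],
]

response = convolve2d(patch, blob_kernel)
# The strongest response lands where the filter best matches the blob.
```

Here the response map peaks at the center of the toy blob, which is the sense in which a filter "recognizes" a pattern; training adjusts the filter weights so that peaks line up with what the human labels say matters.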
To create enough training examples to teach the CNN algorithm how Dugger analyzes brain tissue, the UCSF team worked with her to devise a method that allowed her to rapidly annotate, or label, tens of thousands of images from a collection of half a million close-up images of tissue from 43 healthy and diseased brain samples.
Like a dating app that lets users swipe left or right to label someone’s photo “hot” or “not,” the web platform they developed allowed Dugger to look, one at a time, at highly zoomed-in regions of potential plaques and quickly label what she saw there. This digital pathology tool—which the researchers called “blob or not”—allowed Dugger to annotate more than 70,000 “blobs,” or plaque candidates, at a rate of about 2,000 images per hour.
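The rapid-annotation idea reduces to a simple loop: present candidates one at a time, record a binary label, and accumulate the human-labeled training set the CNN needs. The sketch below is a hypothetical illustration of that workflow, not the team's actual "blob or not" platform; the patch names and the stand-in judgment rule are invented for the example:

```python
# Illustrative "blob or not" workflow: show plaque candidates one at a
# time and record a binary expert judgment, building a labeled dataset.

def annotate(candidates, judge):
    """Collect (image_id, is_blob) pairs from a labeling function.

    `judge` stands in for the expert's snap decision: True for
    "blob" (plaque candidate), False for "not".
    """
    return [(image_id, judge(image_id)) for image_id in candidates]

# Toy run with invented patch names and a stand-in judgment rule.
candidates = ["patch_001", "patch_002", "patch_003"]
labels = annotate(candidates, judge=lambda name: name.endswith("2"))
# labels -> [("patch_001", False), ("patch_002", True), ("patch_003", False)]
```

At the article's quoted rate of about 2,000 images per hour, labeling 70,000 candidates this way works out to roughly 35 hours of expert time, which is what makes a streamlined one-decision-per-image interface worthwhile.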