Travis Johnson punched a few computer keys and, a moment later, his monitor glowed with what appeared to be an abstract painting, with swirls of green, red, orange and blue.
It wasn’t modern art but a data visualization showing changes in a mouse’s brain when it is fed a high-fat diet—a basic research step in trying to discover a new drug to treat obesity.
The colorful images on Johnson’s screen, powered by a supercomputer from Indiana University, were generated from about one terabyte of data—equivalent to tens of millions of rows of data if it were loaded into a spreadsheet.
A stack of printer paper with all that data would soar as high as 44 Empire State Buildings, said Johnson, director of bioinformatics at the Indiana Biosciences Research Institute and assistant professor of biostatistics at the Indiana University School of Medicine.
“If you had 100 people, with average reading speed, reading for eight hours a day every day, it would take them 17.6 years to finish reading all this data,” he added.
But by using artificial intelligence tools, Johnson and his colleagues can, in just two or three days, process all the data into visualizations and produce a list of genes associated with disease states and cell types, which can then be used as a candidate list for novel drug targets.
At the research institute, a not-for-profit based at 16 Tech innovation district on the western edge of downtown, scientists and data experts are using AI tools to search through reams of information to find new ways to treat conditions from obesity to Alzheimer’s disease.
So are thousands of other researchers at medical centers, pharmaceutical companies, biotech companies and AI specialty firms, all in the race to find better drugs and treatment for diseases.
Making the most of AI
Artificial intelligence is not new to drug making. Pharmaceutical and biotech firms have been working with early forms of it for years.
But a raft of new AI tools, including generative and predictive technologies, is turbocharging the process. Over the last nine years, cumulative AI investment across the pharma and biotech sectors has increased almost thirtyfold, to $24.6 billion as of last December, according to Deep Pharma Intelligence, a London-based market research firm.
Drug discovery—the process of identifying drug candidates for testing—is notoriously time-consuming and expensive. It often costs hundreds of millions of dollars and typically takes three to six years before a drug can even be tested in humans.
But a group of AI tools is revolutionizing nearly every stage, with the potential to cut years and millions of dollars out of the process.
At the early stage of drug discovery, AI tools are used to identify proteins or genes that can counteract particular diseases. They do that by doing the type of grunt work no human could: analyzing troves of genomic data, health records, medical imaging, clinical trials and publications.
In the middle stage, the tools hold the potential to examine huge libraries of molecules and predict key properties, such as toxicity, bioactivity and chemical characteristics of molecules.
Finally, in the late stage, they might generate entirely new molecules from scratch.
In the case of the biosciences institute’s research into obesity, digging into the huge collection of mouse-cell data is an AI approach to finding answers in a mountain of information that would otherwise be all but inaccessible.
“Artificial intelligence is a way to kind of reorganize and regroup this data so that you can have an understanding of what’s happening in the cell,” said Mary Mader, the institute’s vice president of molecular innovation and a former oncology researcher at Eli Lilly and Co. “You can do it in a way that the older-style methods of analysis would make it much more laborious.”
Venture capital companies are pouring billions of dollars into small and growing AI vendors and AI-driven biotechs, which are, in turn, forming an increasing number of research partnerships with large drugmakers.
“Since 2017, there has been an obvious shift in the perception from skepticism and cautious interest all the way to a realization of a strategic role AI has to play in the emerging ‘data-centric’ model of innovation,” according to Deep Pharma Intelligence’s overview report last year.
The change in perception was underpinned by a number of factors, including several commercial successes and milestones, reached mostly by smaller “end-to-end” AI drug discovery companies.
At Lilly, a 147-year-old pharmaceutical company that is one of Indianapolis’ largest employers, researchers are increasingly turning to AI tools, said Ramesh Durvasula, senior vice president of research and development information technology.
“We are hiring significant amounts of talent,” he said. “We are massively upscaling our existing talent base, as well.”
He declined to specify what AI tools the company is using or to name the drugs under development that were discovered using AI. He said he wants to wait for clinical trials to be completed—a process that can take five years or more—before naming specific diseases, projects or novel compounds that used AI.
“Once we have more clinical validation of our efforts, we certainly will have high enthusiasm and opportunity,” Durvasula said. “… Then we can certainly talk more robustly.”
Even as Lilly keeps tight-lipped, small startups are making announcements on the topic. In May, XtalPi Inc., a Cambridge, Massachusetts-based biotech that boasts it is “powered by artificial intelligence and automation,” announced an AI drug discovery collaboration with Lilly that could be worth up to $250 million in upfront and milestone payments.
The collaboration will use XtalPi’s AI capabilities and robotics platform to design and deliver drug candidates for an undisclosed target, the announcement said.
“In a discovery process, you want to funnel wide,” Lilly CEO David Ricks told Business Insider, a financial news website, in June. “In the past, perhaps humans would just think of the things they already knew about. The machine doesn’t. It just knows about everything that was there, and it comes up with constructs that humans just don’t.”
Large drugmakers are racing to strike up such partnerships. In 2021, French drugmaker Sanofi struck a deal with Baidu, a Chinese technology company known for its AI research, and the two began a program to integrate Baidu’s LinearDesign algorithm into Sanofi’s product-design pipeline for vaccine and drug development.
Drugmaker Merck, based in New Jersey, last year teamed up with BigHat Biosciences, a machine-learning specialty biotech firm based in San Mateo, California, in Silicon Valley, to design drug candidates. BigHat, founded in 2019, said it will use its machine-learning-enhanced Milliner design program to “synthesize, express, purify and characterize molecules” for Merck.
Google parent Alphabet, based near BigHat in Mountain View, California, recently launched Isomorphic Labs based on AI breakthroughs at its DeepMind AI operation. The unit is “in talks with multiple pharmaceutical companies on potential partnerships,” according to Information Age, a technology news website.
The work will build on DeepMind’s AlphaFold2, an AI program that demonstrates the “unprecedented capability to accurately predict” all possible protein shapes in the human body, Information Age said in December.
Meanwhile, other large medical-research organizations are doubling down on AI in the race to discover drugs to treat dread diseases, critical ailments such as cancer and stroke.
At the Cleveland Clinic last year, researchers created an AI-based tool for analyzing vast amounts of genetic data related to Alzheimer’s disease.
The tool, built on deep-learning technology—a program that “learns” from large amounts of data to identify and analyze information—will help examine 156 risk-associated genes. The model has already identified gemfibrozil, a cholesterol medication prescribed to reduce blood fat levels, as a strong candidate for preventing and treating Alzheimer’s disease.
Researchers from Indiana University joined a nationwide effort in 2020 to use artificial intelligence to study Alzheimer’s disease under a five-year effort funded by a $17.8 million grant from the National Institutes of Health.
Forty investigators at 11 research centers, including the IU School of Medicine in Indianapolis, are joining forces in an effort to comb through huge data sets to bolster precision diagnostics and develop new treatments.
“You can never have too much data,” said Andrew Saykin, director of the Indiana Alzheimer’s Disease Research Center, based at the medical school. “If we’re to have studies that are valid, that are going to be replicated, it’s just critical to have large data sets and multiple data sets.”
Some of that data will come from magnetic resonance images of the brains of Alzheimer’s patients, and some from massive collections of positron emission tomography scans that reveal plaques and tangles in patients’ brains.
Once that information is sliced and diced across hundreds of possible patient types, it can seem overwhelming without AI tools.
“I mean, there’s 3 billion DNA base pairs in the human genome,” Saykin said. “So, each individual who is sequenced has billions of data points in just their DNA analysis. … It would be impossible for a human to comb through all that data and find anything significant.”
He said the team is trying to use all the leading AI tools to look for associations or insights that would lead to potential treatments.
Technology has advanced to the point that researchers can no longer ignore it. And engineers are developing tools that help researchers craft the right prompts so AI can find the proverbial needle in the haystack, said Shaun Grannis, vice president for data and analytics at Indianapolis-based Regenstrief Institute.
“We have finally developed the rocket engine in AI,” Grannis said. “Now we need to learn how to control it, use it, attach it to a ship pointing in the right direction.”•