Developing effective treatments for cancer is perhaps the greatest health care challenge facing modern society.
One-third of adults will develop some form of this disease within their lifetime. Because it touches so many of us, we are all invested in trying to find cures.
But in this pursuit, we must not over-invest emotionally, scientifically and financially in one approach at the expense of another.
I fear a trap we have fallen into is our commitment to the philosophy of “big data” and “machine learning” in attacking cancer. The belief is that if we collect enough data on enough patients, then we will be able to find statistical patterns indicating the best way to treat patients and find cures.
This is fundamentally misguided.
Machine learning relies on properties of large groups of people that hide characteristics of the individual patient — this is especially problematic for a disease that manifests itself so differently from person to person. Indeed, cancer is not even one disease; there are dozens of cancers. Recent high-profile efforts applying big-data methods to oncology indicate other approaches must be explored.
In May of 1961, President John Kennedy proposed that the United States send a man to the moon within the decade. We just celebrated the 50th anniversary of that successful program. Contrast this with President Richard Nixon’s proclamation in December of 1971 when he boldly stated it was time to attack cancer with the same resources as the moonshot.
Nearly 48 years after he signed into law the National Cancer Act, we still have not conquered this disease. While there have been tremendous advances against many types of cancer, too many patients have seen too little improvement.
A fundamental difference between these two programs is that the mathematical theory for gravity was known for 274 years at the time of Kennedy’s proposal, but we did not have a mathematical theory of cancer in 1971 — and we still don’t. And without a mathematical theory, we are left with trial and error. Imagine trying to get to the moon by launching thousands of rockets, recording the events, and analyzing the patterns to find the right combination of rocket features to make the journey. It’s too absurd to even discuss. Yet, this is what the “big data only” approach would have us try in the fight against cancer.
When scientists use the word “theory,” they mean a systematic explanation of natural phenomena that makes predictions that have been rigorously tested against observation. For example, without Albert Einstein’s theory of relativity, the GPS on your phone would not work.
There are a few teams around the country that have been laboring to develop a mathematical theory of cancer relying on patient-specific differences — the very things that make us individuals. The idea is to write down mathematical equations that describe how tumors grow and respond to treatment in terms of established biology and physics.
Then, when working with a particular patient, the person’s individual data determines the parameters in the mathematical model. This model then makes a prediction about this specific patient using only his or her characteristics.
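To make this concrete, here is a minimal sketch of patient-specific model calibration, assuming a simple logistic tumor-growth law as the candidate equation. This is an illustrative toy only; the specific model, parameter names, and synthetic measurements are assumptions, and real models in this field are considerably richer. The point is the workflow: an individual patient’s measurements determine the parameters, and the calibrated equation then predicts that patient’s future trajectory.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic_volume(t, r, K, V0=0.5):
    """Closed-form solution of dV/dt = r*V*(1 - V/K) with V(0) = V0.

    r  : patient-specific growth rate (1/day)
    K  : patient-specific carrying capacity (cm^3)
    V0 : initial tumor volume (cm^3), held fixed here
    """
    return K * V0 * np.exp(r * t) / (K + V0 * (np.exp(r * t) - 1.0))

# Hypothetical measurements for one patient: tumor volume (cm^3) at a
# few imaging visits (days). Here they are synthetic -- generated from
# the model with known parameters plus measurement noise.
rng = np.random.default_rng(0)
t_obs = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
v_obs = logistic_volume(t_obs, r=0.15, K=10.0) + rng.normal(0.0, 0.05, t_obs.size)

# Calibrate the growth rate r and carrying capacity K to THIS patient's
# data, rather than to averages over a large population.
(r_fit, K_fit), _ = curve_fit(logistic_volume, t_obs, v_obs, p0=[0.1, 5.0])

# The calibrated model now predicts this patient's future tumor burden.
v_pred_day60 = logistic_volume(60.0, r_fit, K_fit)
```

In practice the same loop applies to far more detailed equations (reaction-diffusion models, treatment-response terms), and the fitted parameters become the patient’s individual signature on which therapy can be optimized.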
The major effort in the field is to discover the “best” equations to describe cancer, and then test these equations to optimize therapy for the individual patient. Investigators working in this area are determined to transform oncology from population-based to patient-based therapy. But financial support for these efforts is dwarfed by support for the methods of big data and machine learning.
Although it is undeniable that machine learning has a role to play, it is time to rein in our unrestrained excitement about, and financial investment in, the “promise of big data.” Rather, we advocate for a more balanced attack on the problem from a direction that has served us well for millennia, using what the great physicist Eugene Wigner termed the “unreasonable effectiveness” of mathematics in describing the natural world.
Thomas Yankeelov is the W.A. “Tex” Moncrief Chair of Computational Oncology and a professor of biomedical engineering, diagnostic medicine and oncology at The University of Texas at Austin.
A version of this op-ed appeared in The Hill.