Oncologists see value in real-world evidence, but remain skeptical about using it in treatment decisions
Twenty years ago, no one was talking about real-world evidence, but I clearly remember the day I began to understand its importance. While leading one of the largest oncology practices in the country, I cared for many patients with advanced colon cancer who had limited treatment options. When a new combination regimen called IFL was heralded by ASCO and then approved by the FDA, my partners and I embraced it with great anticipation. Unfortunately, reality did not meet expectations. A few weeks after I treated my first patient with IFL, he was admitted to the ICU with life-threatening toxicity. Disturbingly, my partner's patient was in the next bed, suffering the same outcome from IFL treatment. I queried our new practice EMR on the first 20 IFL patients and found that 17 had been hospitalized, half of them in the ICU, a far different outcome than that seen in the lauded clinical trial. Based on this real-world data, we quickly pivoted to a safer way of administering these drugs, known as FOLFIRI, but it was four more years before IFL was formally abandoned in the U.S.
That experience helped me recognize that clinical trial data, which serves as the foundation for most clinical decisions, provides a limited and often unrepresentative assessment of a treatment's efficacy and toxicity. In oncology, only three percent of adult patients participate in clinical trials, and they tend to be younger, healthier and less diverse than patients who don't participate. To understand the other 97 percent, we need to rely on real-world evidence (RWE).
Over the past few years, we have seen growing awareness of the importance of RWE in the healthcare industry. With the passage of the 21st Century Cures Act, Congress advocated for the use of more RWE in regulatory submissions and, earlier in 2019, the FDA issued guidance to sponsors to clarify and simplify the process for including RWE in regulatory applications. At the same time, the pharmaceutical industry has deepened its commitment to real-world research – funding everything from claims-based health economics studies to medical record (chart) reviews to patient-reported outcomes research. These studies have helped both healthcare providers and payers better assess the value of the treatments administered and more clearly understand the patient experience.
In light of these developments, my colleagues and I at Cardinal Health Specialty Solutions were eager to understand oncologists' views of real-world evidence and how they use it in their practices, if at all. Our latest research, gathered through surveys of more than 170 oncologists in the fall of 2019, was published in our Oncology Insights report.
Our overall finding was that, while oncologists acknowledge the growing importance of RWE, there is a gap between that recognition and their understanding and adoption of RWE within their own practices.
Broadly speaking, 67 percent of oncologists agreed that RWE is necessary to inform treatment decisions given the limitations of clinical trials, and 63 percent said they use RWE at least somewhat frequently to inform their decision-making.
However, when asked specifically how RWE adds value to their practices, respondents said they primarily see value in using RWE for select patient populations, such as older adults or patients with poor performance status, who are underrepresented in clinical trials. Only 15 percent agreed that RWE helps inform most or all of their treatment decisions.
The survey also indicated that many oncologists do not fully understand what real-world evidence is. When asked to define RWE, fewer than half (48 percent) of respondents correctly identified observational data collected outside a clinical trial as RWE, and even fewer correctly identified pharmacy and health insurance databases (32 percent) and patient-powered research networks (25 percent) as sources of real-world data.
We saw mixed perspectives when we asked whether RWE should be used to inform dosing and scheduling in label expansions. While 51 percent of the oncologists surveyed said it should be considered unconditionally, 38 percent said it should be used only for limited populations, such as older patients, and 10 percent believed it should not be used at all.
Part of the reason oncologists have been slow to embrace RWE may be the way the data is structured. Oncologists have been trained to rely on clinical trial data, which is both rigid and uniform. Real-world data is inherently more varied, which some providers may perceive as less reliable. Clinical trials of treatments for advanced-stage solid tumors typically assess efficacy and safety with defined, specific measurements, such as CT scans performed every eight weeks. In the real world, physicians are less rigid in their patient assessments and documentation, so surrogate RWE measures such as Time to Next Treatment (TNT) are used to demonstrate efficacy and toxicity. To address this lack of uniformity, some RWE researchers are now focused on conducting studies with validated instruments and research designs that deliver data that is not only more predictable and measurable but can also be compared directly to clinical trial outcomes.
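As a rough, hypothetical sketch of what a surrogate measure like TNT looks like in practice, the snippet below derives time to next treatment from an invented table of per-patient regimen start dates. The column names, dates and pandas-based approach are illustrative assumptions only, not drawn from the survey or from any specific RWE study.

```python
# Hypothetical illustration only: deriving a Time to Next Treatment (TNT)
# surrogate from real-world treatment records. The table, column names and
# dates are invented for this sketch; real EMR or claims extracts differ.
import pandas as pd

records = pd.DataFrame(
    {
        "patient_id": [1, 1, 2, 2, 3],
        "regimen": ["IFL", "FOLFIRI", "IFL", "FOLFOX", "FOLFIRI"],
        "start_date": pd.to_datetime(
            ["2003-01-10", "2003-04-02", "2003-02-01", "2003-07-15", "2003-03-20"]
        ),
    }
)

# Order each patient's treatment history, then pair every line of therapy
# with the start of the next line for the same patient (NaT if none).
records = records.sort_values(["patient_id", "start_date"])
records["next_start"] = records.groupby("patient_id")["start_date"].shift(-1)

# TNT in days; missing values mark the last observed line of therapy.
records["tnt_days"] = (records["next_start"] - records["start_date"]).dt.days
print(records[["patient_id", "regimen", "start_date", "tnt_days"]])
```

In a fuller analysis, TNT would typically be handled as a time-to-event endpoint, with censoring for patients who have not yet started a subsequent line of therapy.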
While some skepticism about RWE may persist, oncologists may soon find that they need to get on board or risk being left behind. The embrace of RWE by policymakers and drug developers is rapidly changing the landscape of the industry. In just the past six months, five oncology drugs have gained FDA approval with submission datasets that included RWE, and as the pressure to get new products to market more quickly continues, reliance on RWE is expected to increase. Oncologists who embrace the possibilities of RWE may be positioned not only to understand the best treatment options more clearly in an increasingly complex field, but also to deliver care more efficiently and effectively.