
Not many patients would be happy to hear there’s a lag of about 17 years between when health scientists learn something significant from rigorous research and when practitioners change patient care as a result.

But that’s what a now famous study from the Institute of Medicine uncovered in 2001.

The study reflects a major problem that has plagued healthcare for decades: the timely integration of high-quality scientific evidence into daily patient care.

If you knew there was research available to guide the healthcare you required, wouldn’t you want your care provider and the health system to use it? Wouldn’t you want to receive care that’s scientifically proven to be of benefit, rather than care that’s proven to be of no benefit?

Although it’s been clear for centuries that science contributes to advancing medicine and improving disease-specific survival rates (for example, the discovery of penicillin and its effect on infection-related mortality rates), this concept was only popularized within the medical community in the last quarter of the 20th century, through the evidence-based medicine movement.

More recently, those who work in the field of ‘knowledge translation’ have been working hard to close the gap between research and practice. For the most part, they’ve succeeded by making abundant research findings more accessible to policy-makers, professional societies and practitioners, and by nudging them to adopt evidence-based practices in a more timely fashion.

Their methods have largely focused on the adoption of new beneficial practices – drugs, tests or interventions with substantial evidence behind them. But a pattern has emerged from the scientific literature: new is not always better and too much healthcare can be bad for your health.

Owing to the recognition that unnecessary practices may negatively affect patient outcomes – and contribute to burgeoning costs within healthcare – there is now a movement to promote the discontinuation of practices that research finds to be of no benefit or potentially harmful. Initiatives such as the Choosing Wisely campaign and the Less is More and Reducing Research Waste series have sprung from medical professional societies and high-ranking medical journals to help reduce the practice of too much healthcare.

It turns out that cervical cancer screening in women under 30 years old is not beneficial and may cause unnecessary follow-up testing; the use of bone cement to treat painful spine fractures among patients with osteoporosis doesn’t improve pain any more than usual care; and placement of stents in the coronary arteries of patients with narrowed arteries but minimal symptoms is no better than treatment with medications alone.

Other examples include reducing the use of a sophisticated monitoring device (pulmonary artery catheter) to obtain frequent measures of heart function in patients with heart failure and tightly controlling blood sugar using intravenous insulin in patients admitted to intensive care units.

In each of these cases, new research demonstrates that the practice doesn’t improve patient outcomes, yet each persists to some degree in clinical practice.

The 17-year gap between research and practice traditionally refers to the time required to adopt new practices. Unfortunately, new research shows it may take even longer to abandon unnecessary practices. Shortening the gap between research and practice has been a long time coming, and can only help improve outcomes for patients and control health spending.

How do we get there?

Shortening the time between research and practice will require a better understanding of what it takes to implement new research, and a reduction in the time it takes for new research to be reflected in professional guidelines.

Guidelines also need to be less cumbersome and directed more toward use at the point of care, rather than serving simply as reference documents. Healthcare systems need to be engineered so frontline providers have a greater likelihood of providing care congruent with current science. This is likely best facilitated by comprehensive electronic medical records. Given that many healthcare systems still rely on traditional paper-based charting and order systems, this will require considerable financial commitment.

Moving from research to improved practice more rapidly will also take an engaged group of stakeholders – professional societies, healthcare providers, patients and their family members, medical administrators and governments – who appreciate the long-term benefit possible from such considerable initial investment of time and money.

A healthcare system that enables providers to consistently deliver care that aligns with recommended best practice should be a national priority.

Daniel Niven is an intensive care physician and assistant professor in the Departments of Critical Care Medicine and Community Health Sciences in the Cumming School of Medicine at the University of Calgary.

Daniel is one of our contributors.


The views, opinions and positions expressed by columnists and contributors are the authors’ alone. They do not inherently or expressly reflect the views, opinions and/or positions of our publication.

© Troy Media
Troy Media is an editorial content provider to media outlets and its own hosted community news outlets across Canada.