Getting the right evidence to the right person at the right time to make a difference
Diane Watson
Dr Diane Watson outlined criteria for “useful evidence”

Making evidence and data useful in policy and action takes a lot more than publishing a research paper, says Dr Diane Watson*, who offered insights into “useful evidence” in a keynote address to the 45 and Up Study annual meeting. 

For many researchers, the ultimate goal is to get a paper published in a peer-reviewed journal. Of course, that’s something to be proud of, but does it make that research useful?

In reality, publication alone is unlikely to make it useful; evidence suggests that “we are awash in a sea of unread journal articles”.¹

I recently tried to access one of my own papers on what makes evidence useful.² I wrote it at the request of the journal editor four years ago, and I no longer have a copy. I found the link to the paper online ‒ then hit a paywall. I didn’t go any further. The irony!

If I, the author, am stymied by barriers to access, how can that commentary become useful more broadly?

In recent years, there has been rapid growth in demand for research evidence and for meaningful information derived from analyses of health data. Governments, health insurers and providers are all building more in-house capacity than ever before to inform and monitor their efforts to improve health policy and care.

My experiences at the Bureau of Health Information and the National Health Performance Authority (NHPA) showed clearly that if evidence derived from big health data, using the tools of research, is going to have the best chance of making a real impact, it must meet a number of key criteria.

Of course, it must be both credible and impartial ‒ but that’s not enough. The research must also be accessible and relevant to the intended audience, and it needs to be presented to, or accessed by, the right person at the right time.

Making evidence accessible and relevant

That means thinking about the type of “product” in which research is published or disseminated, as well as considering the intended audience ‒ those you hope will see and act on that evidence.

If you want to get your message to parents or patients, why write a publication more than a couple of paragraphs long? Very few people will spend more than 10 minutes reading about something; more will spend three minutes, even more will spend one minute, and a greater number still will spend just five seconds.

If you have written a 25-page paper, you probably need another one-page version. The first paragraph of that one-page version should highlight what you’ve learned and entice the reader to continue.

After we released several reports on the performance of hospitals and primary care at the NHPA, it became apparent that people working in local health networks were interested in how their area measured up across a number of measures ‒ but they needed to see that data at a glance rather than search for it in numerous different reports.

We put together a one-page infographic showing how each local area measured up against other comparable areas on 17 different measures, ranging from the percentage of adults who were overweight and waiting times to see specialists, to immunisation rates and rates of potentially avoidable deaths. The graphic was printed on posters and sent to each area ‒ and many then displayed that data for all staff to see and act on.

Who is the target audience?

When it came to data on immunisation rates, the NHPA discovered that many in our target audience, including the nurses who do the vaccinations, had never seen immunisation rates for their own local areas.

The NHPA released the data on an interactive webpage where anyone could drill down to their local area by postcode to see what percentage of children were immunised, and how many were not.

It was a hit on day one, and within 30 days of its release in 2013 it had generated 560 media items. The website was accessed 38,000 times over the subsequent two years.

When we revamped that interactive tool in 2016, it was accessed 22,000 times within the first two days of its release ‒ and 60% of those users were from Facebook, with posts all generated by people spreading the word about immunisation in their local communities.

Putting the evidence in context

It’s not only the content of the evidence but also its context that determines how relevant it will be to an audience.

In looking at variations in care, the NHPA named local communities where the rate of avoidable hospitalisations for chronic health conditions was nine times higher than in other similar communities, and released details of public hospitals whose running costs were twice those of other comparable hospitals providing similar services to similar patients. This information is available on the MyHealthyCommunities and MyHospitals websites.

On the flipside, we also highlighted hospitals that had achieved remarkable improvements in efficiency and were seen as wonderful role models.

Because we ensured meaningful information was just a click away, the NHPA’s websites were attracting 200,000 page views each month before the organisation closed on 30 June 2016.

But we also need to ensure that our research reaches the right people, at the right time. An ideal example of doing just that was the Australian Commission on Safety and Quality in Health Care’s release of the Australian Atlas of Healthcare Variation, with recommendations relating to antimicrobial dispensing, diagnostic interventions, surgical interventions, mental health and psychotropic medications, opioid medications, and interventions for chronic conditions. At the time of release, the report was accompanied by acceptance or action statements from 12 peak bodies ‒ the data was already having an impact.

Useful data for the future

It’s not only the way we report on new evidence that is changing. Traditionally, researchers studied one topic with one dataset; increasingly, we need data produced by inter-professional research collaborations, in conjunction with extensive stakeholder engagement.

The future will no doubt see more research based on linked data, greater demand for data predicting the likely outcomes of planned interventions, and a greater need for evaluations showing how interventions and policy programs have worked on the ground.

We are moving towards getting the right information, to the right people, at the right time to make a difference ‒ and it is that “useful evidence” that will become the catalyst for positive change in Australia’s health system.

*Dr Diane Watson is a Senior Adviser at the Sax Institute. She was the inaugural and only chief executive of the National Health Performance Authority between 2012 and 2016, and prior to that was the inaugural chief executive of the NSW Bureau of Health Information.

An edited version of this article appeared in The Australian as “Big health data research wasted if no one reads it”.

References

1. O’Grady K & Roos N. It’s time for a global movement that pushes academic research beyond journal paywalls so it makes a difference in the world. Policy Options, 1 August 2016. Accessed 9 September 2016 at: http://policyoptions.irpp.org/magazines/august-2016/linking-academic-research-with-the-public-and-policy-makers/

2. Watson DE. Can a Book of Charts Catalyze Improvements in Quality? Views of a Healthcare Alchemist. HealthcarePapers, 12(1), April 2012: 26-31.
