Sax Institute researchers shared new insights into the diverse ways in which research and evidence can be translated into policy at a recent conference in Sydney.
The Emerging Health Policy Research Conference, hosted by the Menzies Centre for Health Policy at the University of Sydney, explored the policy response to health challenges both locally and globally, as well as showcasing new health policy research.
Gai Moore, Principal Analyst in the Institute’s Knowledge Exchange division, presented early results from a study into how knowledge brokers assist policy makers in commissioning rapid reviews, such as the Sax Institute’s Evidence Checks.
Knowledge brokering is a strategy frequently used to enhance dialogue between policy makers and researchers, she said. The Institute uses knowledge brokers in brief, one-off interventions to commission rapid reviews that help inform decision making across a broad range of health and community services.
“Using knowledge brokers has been found to increase the clarity of rapid review proposals, yet we know very little about the process by which they assist policy makers to define their rapid research needs,” she said.
Ms Moore led a study analysing transcripts from 15 knowledge brokering sessions, focusing on three themes: how knowledge brokers elicit information, how trust is enacted in the knowledge brokering session, and how the content of rapid reviews is negotiated.
“Early results suggest that brokers are skilled facilitators, and that trust is built by a broker’s open and neutral stance, and by their knowledge of the policy context,” she said.
The SPIRIT of research utilisation
Abby Haynes, Senior Research Officer for CIPHER (Centre for Informing Policy in Health with Evidence from Research), outlined findings from the process evaluation of a novel, year-long intervention trial designed to increase policy agencies’ capacity to use research in policy and program development.
Six Sydney-based health policy agencies took part in Supporting Policy In health with Research: an Intervention Trial (SPIRIT), and an in-depth process evaluation was conducted to explain how and why the intervention functioned as it did in each site.
Although the realist analysis was still in progress, Ms Haynes told the conference that a number of key mechanisms had been identified that explain the causal relationship between the intervention, agency contexts and the process outcomes.
“Findings suggest that research utilisation interventions in policy agencies that activate these mechanisms can engage busy professionals and provide valuable content that informs practice,” she said. “Our research also emphasises the need to consider agencies’ contextual characteristics in study design and implementation planning because the same intervention can result in quite different process outcomes in different sites.”