Lessons from the rapid evaluation conference

The rapid evaluation in health care conference, held earlier this month, brought together representatives from the rapid evaluation community, including analysts, users and commissioners, to explore evidence-based decision-making in an era when budgets are tight. With the recordings of the sessions now available to watch, Stuti Bagri gives her reflections on what she learned at the event.

Blog post

Published: 29/05/2024

The NIHR Rapid Service Evaluation Team’s evaluation of patient-initiated follow-ups (PIFU) led me to spend many weeks with the data, learning to recognise its quirks and testing the many sets of assumptions on which our estimated answers held true. But my relationship with the data, as an ‘evaluator’, was fairly insular, and I knew little about the context in which evidence can and should be mobilised. Attending the rapid evaluation in health care conference earlier this month, with its theme of mobilising evidence when budgets are tight, offered plenty of food for thought on the world of rapid evaluation beyond the horizon of analysis.

In some ways, the research process should change

The evidence produced from a rapid evaluation cannot amount to much unless it is picked up and used by the right people. However, the link between evidence production and evidence use is not always forged seamlessly. From many speakers throughout the day, I heard that researchers and decision-makers need to understand each other’s worlds better and learn, or at the very least attempt, to speak each other’s language if collaboration between the two is to be realistic (watch the “From rapid evaluation to rapid change” session). Carving out the capacity to do so, whether by allocating more project time to dissemination or by changing the words used to present data, deserves serious consideration. This is especially important for those relatively early in their careers, as it is easier to build lasting changes to working habits before becoming set in particular ways of working.

In other ways, the research process will inevitably change

Some sessions shed light on how the world of research and evaluation is changing. First, patient and public involvement and engagement (PPIE) is, rightly, becoming an expectation. As one of the panellists in the PPIE session demonstrated (watch “Patient and public involvement in rapid evaluation”), even £100 of a project’s budget can be put towards gaining meaningful feedback from patients or members of the public in different ways. Second, the integration of AI into methodology has begun to take off, with some researchers outsourcing parts of their data analysis to AI (watch “Using AI as a tool in evaluation”).

Both will soon become regular considerations in research planning. Learning to use AI tools, or working collaboratively with a PPIE representative on a project, can go a long way in demonstrating the ability to work to contemporary standards. For researchers at any stage of their career, looking out for opportunities to learn these skills can help kickstart the modernisation of their research processes.

Learning while 'out of office'

As a researcher, one of the health service datasets I spend the most time with covers urgent and emergency care. So when a conference attendee put a question to a panellist and identified herself as the head of urgent and emergency care emerging policy, my ears perked up. Her views on the challenges and opportunities of rapid evaluation in accident and emergency settings gave me richer insight into data I see regularly, in a way I would not have gained from reading documentation in the office.

Admittedly, there was some feeling of disillusionment by the end of the conference, after hearing how difficult it can be for evidence to be mobilised. What stuck with me most was a mantra proposed by one of the speakers (watch “Making evidence based decisions”): “better evidence + better decisions = better outcomes”. It highlighted that evidence is only half the battle. However, much was also said about what we can do better, such as communicating through better-distilled summaries (watch “Current challenges in the system – where next for rapid evaluation?”) and using theory-of-change models, to make collaboration possible.

Paying attention to what you can control, rather than what you cannot, is the thought with which I will strive to incorporate these lessons from the conference.

*The Rapid evaluation in health care 2024 conference was held earlier this month. You can watch the sessions from the event here.

Suggested citation

Bagri S (2024) “Lessons from the rapid evaluation conference”, Nuffield Trust blog
