Webinar: Improving Health System Performance Through Economic Planning and Post-Implementation Evaluation of Health Services
The fourth webinar in the PCHSS 2020-2021 webinar series was on Improving Health System Performance Through Economic Planning and Post-Implementation Evaluation of Health Services.
In this webinar, our two speakers discussed the importance of embedding evaluation within health systems to inform decisions to improve, to expand or to contract existing services. In a constantly evolving healthcare landscape, formative and ongoing embedded evaluations provide a robust way to assess which programs should be broadly implemented, maintained, or ultimately abandoned. Moderated by Professor Henry Cutler (Macquarie University), the session featured presentations from Professor Jon Karnon (PCHSS and Flinders University), and Associate Professor Yvonne Zurynski (PCHSS and Macquarie University).
To begin, Professor Karnon discussed the “Embedded Economist” program, which links economists with health care managers and clinicians to evaluate local health services. Associate Professor Yvonne Zurynski followed with a discussion of “Embedded Evaluation” and the value of evaluation to support health program sustainability.
Professor Jon Karnon
Professor Karnon leads the PCHSS Priority-Setting and Decision-Making in Healthcare Organisations Research Stream. He is a Health Economist at Flinders University who has been developing and applying methods for the economic evaluation of health technologies and services for over 25 years. He is past president of the Health Services Research Association of Australia and New Zealand and has been a member of the Economic Sub-Committee of the PBAC since 2009. He is leading the South Australian node of the ‘Embedded Economist’ project, developing and applying processes for the use of health economics in a local health service.
Associate Professor Yvonne Zurynski
Yvonne Zurynski co-leads the PCHSS’ Observatory on Health System Sustainability. She is Associate Professor of Health System Sustainability at the Australian Institute of Health Innovation, Macquarie University. She has broad experience and expertise in research and education across health sectors and disciplines, having conducted research in settings ranging from primary care to intensive care. She has conducted numerous studies of patient and provider experiences of the health system, including health services use and costs (such as out-of-pocket expenses incurred by consumers). Her research focuses on integrated models of care delivery that cross traditional boundaries of disease-, sector- or profession-specific care, and underscores the importance of evaluating implemented healthcare services.
During the session, the speakers received several questions from attendees and provided the following responses:
We work in a complex environment both within implementation science and economic evaluation. How do we embed research into policy and clinical decision making, and how do we convince others that our research matters?
A/Prof Zurynski: I think involving policy makers and managers from the get-go and making sure that we are addressing policy issues that they are concerned about is important, and keeping them in the loop along the way. It’s not just a matter of implementing, evaluating and then feeding information back and hoping that something will happen. You need to do quite a bit of work, and that takes time, and it takes engagement and building of trust between sectors. Having that conversation at the very beginning, having a shared vision, and accepting the results that come out of these things and acting on them is really what is needed.
Prof Karnon: Having the pathway to decision making from the start is what we have found to be important. Otherwise, you finish, you publish, and you move on, and there is no link into the decision-making process. I think in the future my philosophy will be only to get involved in local economic evaluations where there is that link to a decision-making process in the local health service.
Do you think that decision making processes need to change somewhat?
Prof Karnon: Yes. Working with SALHN (the Southern Adelaide Local Health Network), one of the areas that they have asked us to look at is their process for assessing business cases. This financial year they have received almost 140 business cases, and a significant portion of those will be requesting additional resources for new services and technologies. They have requested help in terms of how economic evaluation can be used to improve the quality of business cases that are being submitted, but also to inform how decisions are made on the basis of those business cases.
What is an ‘embedded’ researcher/economist/evaluator? How is this different to an employed researcher/economist or evaluator?
A/Prof Zurynski: It is someone who is part of the team and is involved with the team that is actually making decisions and developing solutions. They are not just an employee who does the evaluation; they are actually embedded with the decision makers and with the people delivering the improvements.
Prof Karnon: I’d just build on that to say that from my perspective they are the link between academia and the health service. You’ve got someone who is working with the health service, but they’ve got a whole bunch of resources they can draw on, in this case at Flinders, but it could be any university.
[In reference to Professor Karnon’s presentation]: Was there any assessment of unplanned re-presentations in Case Study 3?
Prof Karnon: We did look at unplanned re-presentations and subsequent inpatient admissions and found no difference between equivalent patients who were seen by the physios or the ED physicians.
You both work in a complex environment; Jon, your background is mostly health economics whereas Yvonne, yours is mostly implementation science and integrated care. How do we learn from each other, i.e., how does economics learn from implementation science, how does implementation science learn from economics?
A/Prof Zurynski: There is no doubt that we need to work a lot more closely together; I think we bring complementary skills to the evaluation and implementation space, and we are doing that through the PCHSS to some extent. We are trying to develop those collaborations and actually look at interventions from multiple points of view.
Prof Karnon: I think researchers and individuals with experience of implementation are key people to inform the process of interpreting published evidence in the local context. They have insights into the intervention, and know what are likely to be the issues in terms of implementing it in the local context, which is a key input to any economic evaluation.