Other handy tools from Health Economics (guest blog 4/4)

Dr. Eliana Jimenez Soto, experienced Health Economist

The Beacon Strategies team would like to welcome Dr. Eliana Jimenez Soto as a guest blogger for the coming months. Eliana is a health economist with over 15 years of expertise in health financing, data analytics and M&E. From a PHN perspective, Eliana has held roles as a Senior Technical Advisor to a number of PHNs, providing technical advice on outcomes-based commissioning and financing, and on the overall use of evidence to improve program performance and outcomes. Eliana has also been a consultant to WHO and UNICEF on health economics and equity-based financing. Her international assignments include the development of complex financing, costing and epidemiological models to improve service delivery and evaluate outcomes for disadvantaged populations.

Economic tools deal with how well (or poorly) we are using available resources, from dollars to human capital, and as the old saying goes, ‘there is more than one way to cook an egg’. In this last blog, we draw on a couple of examples to show how these tools can be used to help commissioning organisations.

Spending reviews to evaluate the big picture of programmatic areas

A common problem for many organisations is that after just a few years of incremental budgeting, patterns of expenditure become extremely distorted and it is easy for managers to lose sight of the big picture.

In many instances, the first step is to evaluate the overall expenditure patterns of programmatic areas. This is the purpose of so-called Spending Reviews, which underlie any serious performance-based (or, for that matter, outcomes-based) financing. Spending Reviews:

  • Aim to answer the question: are we investing in the ‘right things’?

  • Involve a strategic analysis of programmatic expenditure against priorities and best-practice

  • Put all the pieces of the expenditure puzzle together to reveal where there is poor alignment between expenditure, priorities and KPIs

‘Aha’ moments usually come when managers see which priorities are slipping through the cracks and receiving insufficient funding; identify spending bottlenecks; or realise that a sizeable proportion of funding goes to activities with limited capacity to benefit, not supported by evidence, or only tenuously linked to commissioning priorities.

Basic modelling of the large transaction costs of atomised funding or benchmarking large investments can complement Spending Reviews and make a convincing case for change.   
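As a toy sketch of the alignment check at the heart of a Spending Review (all programme names, priorities and dollar figures below are invented for illustration), the basic mechanics can be as simple as mapping each expenditure line to a stated priority and flagging the gaps:

```python
# Hypothetical Spending Review alignment check.
# All programme names, priorities and dollar figures are invented.

expenditure = {
    "after-hours GP clinic":     {"priority": "access",        "spend": 1_200_000},
    "community mental health":   {"priority": "mental health", "spend": 450_000},
    "legacy awareness campaign": {"priority": None,            "spend": 600_000},  # no priority link
    "suicide prevention":        {"priority": "mental health", "spend": 150_000},
}
stated_priorities = {"access", "mental health", "equity"}

# Total spend per priority, including spend with no priority link (None).
by_priority = {}
for line, info in expenditure.items():
    by_priority[info["priority"]] = by_priority.get(info["priority"], 0) + info["spend"]

total = sum(info["spend"] for info in expenditure.values())

# Priorities slipping through the cracks: stated but receiving no funding.
unfunded = stated_priorities - {p for p in by_priority if p is not None}
# Share of spend only tenuously linked (or not linked) to priorities.
unlinked_share = by_priority.get(None, 0) / total

print(f"Unfunded priorities: {sorted(unfunded)}")
print(f"Share of spend with no priority link: {unlinked_share:.0%}")
```

In practice this cross-tabulation would sit on top of a full chart of accounts and a formally agreed priority list, but the logic of ‘putting the pieces of the expenditure puzzle together’ is the same.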

Open-box evaluations with cost-consequence analysis

When looking at individual projects, a common mistake is to undertake M&E exercises that leave out critical economic considerations.

And I am not talking about the type of cost-effectiveness analysis (CEA) that we read about in academic journals and that most health managers are familiar with.

CEA is critical for many investment decisions. However, for commissioning organisations pursuing broader goals such as the Quadruple Aim, a single measure of effectiveness (e.g. QALYs) is not suitable, as it leaves out important outcomes related to person-centred care, staff satisfaction, equity or community engagement. While Cost-Benefit Analysis offers the possibility of adding a monetary value to those outcomes, the required evidence is rarely available.

Though gaining currency in countries like the UK, Cost-Consequence Analysis (CCA) remains a much undervalued and underused economic evaluation framework.

CCA provides a systematic way of assessing costs (including the very important opportunity costs) against activities, outputs and outcomes without imposing a single metric or extraneous assumptions.  

The catch is to open the box rather than hide it behind modelling assumptions, and to clearly spell out the activities and resources (properly costed) that the program is using vs. its intended outputs and outcomes.

CCA is an ‘acid test’ for those fuzzy program logics/models of care that make it into contracts, since vaguely defined activities/inputs cannot be properly costed.

By blending insights from economics, implementation and service management in an open-box evaluation, CCA becomes a powerful tool for clearly identifying which elements of the service are adding/detracting value. 
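To make the ‘open box’ concrete, here is a minimal sketch of what a cost-consequence table looks like (all service elements, costs and outcome measures are invented for illustration): costs sit alongside each consequence, and no single summary metric is imposed.

```python
# Hypothetical cost-consequence table: costs (including opportunity
# costs) are reported side by side with each outcome, rather than
# collapsed into one metric. All figures are invented.

cca_rows = [
    # (service element, annual cost ($), consequences)
    ("nurse-led clinic", 320_000,
     {"clients seen": 1_800, "client satisfaction": 0.86, "staff turnover": 0.05}),
    ("outreach van", 210_000,
     {"clients seen": 600, "client satisfaction": 0.91, "staff turnover": 0.12}),
]

print(f"{'Element':<18}{'Cost ($)':>10}  Consequences")
for element, cost, consequences in cca_rows:
    detail = "; ".join(f"{name}: {value}" for name, value in consequences.items())
    print(f"{element:<18}{cost:>10,}  {detail}")
```

The point of the format is that decision-makers, not the analyst’s weighting assumptions, judge the trade-offs between columns.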

Basic behavioural economics for co-design and monitoring & evaluation

I have found that important insights from behavioural economics, such as those related to performance incentives, though highly relevant for market shaping and outcomes-based commissioning, are usually absent from many co-design and M&E exercises.

Performance incentives were perhaps the most contentious topic when I taught health financing to Public Health graduates. Understandably so, since health professionals take pride in their occupation, which many have chosen because of deeply held personal values.

Unfortunately, the financial pressures facing many service delivery organisations are putting health and social service professionals between a rock and a hard place. High coverage targets might create pressure to focus on low-cost clients in order to manage caseloads. If commissioning organisations then add a client-mix target, service frequency might drop below required levels, affecting quality and client outcomes.
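The trade-off can be put in toy numbers (every figure here, from clinician hours to visit lengths and targets, is invented for illustration): with fixed staff hours, a coverage target plus a client-mix target mechanically squeezes visit frequency.

```python
# Toy arithmetic sketch of the coverage vs client-mix trade-off.
# All numbers (hours, targets, visit lengths) are invented.

HOURS = 160          # clinician hours available per month
LOW, HIGH = 2, 5     # hours per visit: low-need vs high-need client
TARGET = 40          # coverage target: clients on the books
MIN_VISITS = 2.0     # assumed clinically required visits/client/month

def visits_per_client(share_high):
    """Monthly visits each client receives under a given client mix."""
    avg_hours_per_visit = share_high * HIGH + (1 - share_high) * LOW
    return HOURS / (TARGET * avg_hours_per_visit)

# With no mix target, the caseload drifts towards low-cost clients
# and the frequency floor can just be met:
print(f"all low-need:  {visits_per_client(0.0):.2f} visits/client")
# A 40% high-need mix target, with no extra hours, pushes frequency
# below the clinical floor:
print(f"40% high-need: {visits_per_client(0.4):.2f} visits/client")
```

Neither target is wrong in isolation; the point is that in-built incentives like these should be surfaced during co-design rather than discovered in an evaluation.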

Adaptive strategies to manage performance expectations (i.e. the in-built performance incentives) are just common sense. Any co-design or M&E exercise that fails to consider where those incentives lie will leave the door open to the unwanted side-effects of staff burn-out, low quality, poor coverage and unsustainable models of care.

Data analytics with the economics ‘oomph’ - a larger framework

I am unapologetically convinced that good commissioning requires insightful data analytics and modelling. And yes, ‘insightful’ data analytics involves much more than clever use of spreadsheets, statistical packages and BI tools. 

Economics can certainly add the ‘oomph’ factor. Even so, to really harness the power of data analytics we need to integrate this work into larger frameworks geared towards supporting organisations to manage change and continuously improve quality.

On its own, good data analytics/modelling produces insights that go into dashboards and reports that only a few people browse. Only within such a larger framework can organisations effectively use those insights to build organisational capacity, shape markets and influence change.

