We say it works, but does it? Rethinking effectiveness in health care
This article is part of our Impact Governance series, exploring the core domains that shape how health service organisations maintain systems that deliver safe, effective, person-centred care.
To support this work, we’ve developed an Impact Governance Self-Assessment Tool to help organisations reflect on their current maturity and identify practical next steps. Access the self-assessment tool here.
———
Effectiveness is often assumed, rarely examined. In many organisations, it is mistaken for effort. We count hours worked, services delivered, staff in post. These are signs of activity, but they do not always mean that outcomes are being achieved.
Effectiveness, in clinical governance, is not about how busy a system is. It is about whether what is being done makes a meaningful difference to the people it is meant to serve. That requires us to ask harder questions. Are our services achieving what we say they will? Do our actions result in better health, better connection, better quality of life? Are our decisions grounded in outcomes, or just in operations?
These questions matter because health, care, and support services are rarely short on commitment. Most teams are doing a great deal. But not all effort translates into impact. A program can run on time and on budget and still fail to improve anything. A clinical team can follow every guideline and still deliver care that misses the mark for the person receiving it.
High-performing systems recognise this. They do not just collect data. They learn from it. They distinguish between activity metrics and outcome indicators. They connect strategy to experience and use that connection to drive change. Where effectiveness is understood as a governance concern, organisations build the capacity to reflect, adapt, and realign as conditions shift.
In the literature, this is well established. Research into early clinical governance systems in the United Kingdom found that organisations with stronger accountability structures and clearer leadership were significantly more likely to act on outcome data. These were not necessarily the systems with the most sophisticated reporting tools. They were the ones where governance forums were structured to ask better questions and had the authority to act on what they found.
This is not simply about having the right indicators. It is about whether those indicators inform the decisions that follow. Outcome data should not sit separately from operational planning or quality discussions. It should shape them. When measurement becomes a retrospective task, something done to complete a report, the link between evidence and action is broken.
Many organisations fall into the trap of treating audits as proof of effectiveness. But audits only tell us whether a task was completed. They cannot tell us whether that task mattered. Effectiveness requires a deeper kind of scrutiny. It involves asking whether services achieve what they intend to, for the people they are designed to help.
When this works, it is not always dramatic. Often, it looks like small course corrections made early. A shift in priority because the data suggests a program is drifting. A conversation about outcomes that changes how a service is designed. A decision not to expand a model that is well liked but not working. These moments reflect a culture where effectiveness is not just something to declare, but something to pursue.
This kind of governance is not oppositional. It does not undermine teams. It supports them. It helps staff know that their effort is going somewhere and that systems are designed to understand and amplify what works. In environments that are constantly under pressure, this is one of the most important contributions governance can make.
Effectiveness is not a fixed state. It is a practice. It requires alignment between intention and action, between evidence and response. Where that alignment is missing, services can remain busy and well-intentioned but ultimately unaccountable. Where it is present, even limited resources can be used to real effect.
Governance systems must be set up to hold that alignment. This includes how data is selected, how success is defined, how results are interpreted, and how decisions are made when performance falls short. Most of all, it includes whether the organisation treats effectiveness merely as a technical goal or as a matter of integrity.
If safety is the floor that prevents harm, effectiveness is the roof that gives care its meaning. One without the other is incomplete. Good governance must hold both.
Reflections for Your Organisation
Are we using outcome data to shape decisions, or simply to validate them after the fact?
Do our governance forums ask whether our services are working, not just whether they are running?
When performance falls short, do we have the will and the systems to adjust, or do we press on regardless?
Practical Actions to Strengthen Effectiveness Governance
Select one major service or program and gather its outcome data from the past six months. In a cross-functional session, ask three questions: What were we trying to achieve? What evidence do we have that it happened? What changed as a result of what we learned?
Review the most recent board or executive meeting agenda and note where effectiveness was discussed. Was it based on data, experience, or assumptions? Take one area of vague or uncertain impact and commission a short outcome review before the next meeting.
Interview three service leads and ask what data they rely on to understand whether their program is working. Note whether their answers include meaningful outcomes or mostly refer to activity or compliance. Share the themes with your quality or strategy team to improve the metrics being used.
Revisit a recent quality improvement initiative and trace whether the improvement was defined in terms of outcomes or process completion. If the latter, ask whether those processes actually led to a better experience or result for the people involved. Use that learning to reframe how improvement is scoped and measured in future.
Add a standing question to governance and leadership meetings: “What are we currently doing that is not working?” Make space for discomfort and ensure there is a process to translate insights into action rather than defensiveness.
Identify one area of real-time monitoring in your service and assess whether it produces actionable insight or just confirms known trends. Use that finding to either refine the indicator or reduce the reporting burden so that time is spent where it adds value.
———
Beacon Strategies supports health and human service organisations to strengthen governance systems that deliver measurable impact.
If you’re looking to assess and strengthen your approach, our Impact Governance Self-Assessment Tool offers a structured starting point for boards and executive teams. Access the tool here.