How can we save lives in hospitals? Start by looking for and investigating red flags

State health departments should continuously monitor the hospital activity data they collect for red flags. Tyler Olson/Shutterstock

Earlier this month, the Victorian Health Department released the results of an independent review into its handling of the tragic events at Bacchus Marsh hospital between 2013 and 2014.

Over this period, seven babies suffered avoidable deaths as a result of deficiencies in clinical care. The review shows a serious failure of clinical governance, with the responsible health service (Djerriwarrh) failing to respond appropriately to a number of serious safety breaches and complaints about the hospital.

The report raises the question: would public reporting of hospital safety measures have brought unsafe practices to light much earlier, perhaps triggering timely and potentially life-saving intervention?

Public reporting has a long history – dating back at least to Florence Nightingale – but its effect is at best indirect. It relies on hospitals acting to improve performance because of perceived reputational risk or market pressure. In situations of oversupply or monopoly supply, neither motivation may be important.

Public reporting is a second-best solution. A far better improvement strategy is to make sure, through performance monitoring and regulatory strategies, that hospitals own their performance issues and act on them.

Public in the dark about poor safety

Before news of the deaths broke in October, the Bacchus Marsh community would have had no idea how unsafe their hospital’s obstetric practice was.

The My Hospitals website gave no sign of any problems. Though it was designed to give the public easy access to hospital performance information, My Hospitals publishes only two indicators for hospital safety: hand hygiene compliance and rates of staph infections. Neither tells you much about overall safety at a hospital, particularly in smaller hospitals where staph cases are extremely rare.

Those who like to keep a close eye on their local health service’s annual report might have noticed that Bacchus Marsh hospital failed a safety review in 2013. As is the case with most official reports, however, the information was strategically presented, so readers could easily have missed the gravity of this disclosure.

The public would have known that one of the obstetricians at the hospital had been reported for unsafe practice, had the responsible body investigated the complaint in a timely fashion. As it was, it took the Australian Health Practitioner Regulation Agency (AHPRA) 28 months to conduct its investigation. The restricted conditions on the doctor’s registration were not made public until June 2015, by which time he had retired.

Finally, the public did not know about the hospital’s high fetal and infant (perinatal) death rate. These rates are calculated from five years of data and published for all hospitals – except those with fewer than five perinatal deaths in any one year of analysis. This rule excluded Bacchus Marsh.
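A rough sketch of how that publication rule works (my own illustration of the rule as described above, not the official reporting code; all figures are hypothetical):

```python
def five_year_rate(yearly_deaths, yearly_births):
    """Return a hospital's five-year perinatal death rate, or None if suppressed.

    Following the rule described above: the rate is not published if the
    hospital had fewer than five perinatal deaths in any single year.
    """
    if min(yearly_deaths) < 5:
        return None  # suppressed from publication
    return sum(yearly_deaths) / sum(yearly_births)

# Hypothetical counts: one low-count year keeps the whole five-year rate unpublished.
print(five_year_rate([3, 6, 7, 5, 8], [450, 470, 460, 455, 480]))  # None
```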

Pros and cons of public reporting

Public reporting in Australia is in its infancy and its impact has not been evaluated. The research evidence, mainly from the United States’ experience, on the value of performance “report cards” is mixed.

Well-designed information can be useful to patients, though surgeons may not alert their patients to it. A bad report card also doesn’t seem to affect the number of referrals a surgeon gets from other doctors and specialists. Report cards may therefore not stimulate improvement in hospital quality.

Inevitably, things will sometimes go wrong in hospitals. What matters is that the organisation is able to learn from those events and reduce the likelihood they will happen again. That will happen if clinicians have information about their performance relative to peers in other hospitals.

Such a change relies on everyone involved feeling able to report errors and “near misses” without thinking they will be singled out and blamed. What hospitals must aim for is a “just and trusting culture”.

One problem with report cards is that it is easy for a hospital to be “named and shamed” in the media, as has happened recently.

This in turn may create a “name and shame” culture internally. When people fear adverse consequences from reporting problems, they tend to stop reporting them. The problems may never be picked up at all, whether internally or by the public. Report cards thus need to be handled with care.

A better way

A much better solution is for state health departments themselves to monitor continuously the hospital activity data they collect for red flags.

If departments follow up early danger signs with clinical audits that are designed to support improvement, rather than punish failure, the government can strengthen safety while minimising the risk of hospitals under-reporting.
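As a minimal sketch of what such a red-flag screen could look like (an illustration only, assuming a simple Poisson comparison of each hospital’s deaths against what the statewide rate would predict; the hospital names and figures are hypothetical and this is not any department’s actual method):

```python
import math

def poisson_tail(observed, expected):
    """P(X >= observed) when X ~ Poisson(expected): how surprising the count is."""
    if observed <= 0:
        return 1.0
    cumulative = sum(math.exp(-expected) * expected ** k / math.factorial(k)
                     for k in range(observed))
    return 1.0 - cumulative

def red_flags(hospitals, statewide_rate, alpha=0.01):
    """Flag hospitals whose perinatal death count is improbably high.

    hospitals: dicts with 'name', 'births' and 'deaths' over the same period.
    statewide_rate: perinatal deaths per birth across all hospitals (the benchmark).
    alpha: how improbable a count must be before the hospital is flagged for audit.
    """
    flagged = []
    for h in hospitals:
        expected = h["births"] * statewide_rate
        p = poisson_tail(h["deaths"], expected)
        if p < alpha:
            flagged.append((h["name"], h["deaths"], round(expected, 1), p))
    return flagged

# Hypothetical figures for two hospitals reporting over the same period.
hospitals = [
    {"name": "Hospital A", "births": 1200, "deaths": 12},
    {"name": "Hospital B", "births": 900, "deaths": 4},
]
print(red_flags(hospitals, statewide_rate=0.004))
```

In practice a department would use more sophisticated tools – risk adjustment, control charts, funnel plots – but even a crude screen like this, run routinely, would surface outlier hospitals as candidates for clinical audit rather than punishment.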

Routine use of data to monitor performance and prioritise safety audits should already be commonplace. The data has been available for decades now and is still under-used.

Only after its own safety scandal in 2005 did Queensland Health begin using such data to monitor safety. The same will probably be true in Victoria, if the department draws the right lessons from the failings at Bacchus Marsh.

Stephen Duckett was responsible for the design of Queensland’s approach to monitoring quality and safety of health care following the Bundaberg Hospital scandal.

Terri Jackson has previously received research funding from the Australian Commission on Safety and Quality in Health Care, and from the Victorian Department of Health and Human Services.

Danielle Romanes does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond the academic appointment above.