Failures of Automation: Misunderstanding of the Hierarchy of Evidence

Researchers who are into blood sports would be hard pressed to find a more engaging fight than that between the Cochraneists and the Clinicians over oseltamivir (Tamiflu).

The Clinicians derisively refer to the epistemic flaws of the Cochraneists as 'Methodolatry' - an allusion to the Evidence Based Medicine movement's supposed fetishisation of the Hierarchy of Evidence and the methodologies it employs. To cut a long story short, the aggregate data from systematic reviews of oseltamivir could not definitively demonstrate sufficient benefit to outweigh the potential harms.

Meanwhile, the eminent gentlemen over at Science Based Medicine were adamant that on the ground, anti-flu drugs are a crucial weapon in the arsenal against a deadly disease and possibly a pandemic.

So... whom to believe?

The collators of data because data doesn't lie, right?

Or the skeptical clinicians, because skeptical clinicians are inured against various biases, right?

The source of my facetiousness should be obvious.

Clearly, clinical expertise is useless without knowledge gleaned from randomised controlled trials. Meanwhile, randomised controlled trials can throw up perverse results for all sorts of reasons, and are in and of themselves useless without interpretation by people with clinical expertise.

Indeed, this fight is a classic example of automation failure.

Automation, at its most basic, is simply the removal of human input from a process - whether that process involves manufacturing, computing, or analysing evidence.

Automation is beloved of any individual or industry valuing standardisation and efficiency divorced from the vicissitudes of human folly.

The Hierarchy of Evidence is exactly this: a tool created to obviate the inherent biases in human - specifically, clinical - judgement. It constructs a framework for evaluating evidentiary weight that is supposedly divorced from judgement calls that might be influenced by human bias.

The obvious objection to this process is known in the trade as GIGO: Garbage In, Garbage Out.

Basically, if a case report sits on the lowest rung of the hierarchy and a systematic review on the highest, the hierarchy of evidence by itself has no procedural mechanism for valuing the absence of a conflict of interest behind a case report, or for devaluing a meta-analysis built from poorly designed studies.

But there are further, deeper objections to over-reliance on this hierarchy.

The Hierarchy exists in an imaginary epistemic ecosystem in which all information - good and bad - is published and accessible. We know, however, that this is not the case - that countless trials returning 'uninteresting' results die a quiet death at the bottom of a researcher's desk drawer.

This is not to say that clinicians are always right either. Medical history is replete with doctors' fidelity to butchery and other harmful practices that were only stopped when well-designed trials demonstrated that the harms outweighed the benefits.

As always, in any issue dealing with complexity - and there is no more complex system in the universe than the human body and brain - there is still no substitute for human intelligence. Human intelligence is needed to create the automation process and human intelligence is needed to parse automation's results.


© 2018 Macroscope Consulting Network | We publish on lands stolen from the Wurundjeri and Boonwurrung people of the Kulin Nation. Until there is redress, we ask for forgiveness and forbearance. We pay respect to their Elders, past and present, and stand with them in their struggles for recognition, liberation, and justice.