Dowsing the flames
The headline article in The Independent caught my attention this morning: ‘Head of bomb detector company arrested in fraud investigation’. “This is an act of terrible betrayal”, wrote the Independent’s defence journalist Kim Sengupta in a parallel piece – clearly an accurate comment, given that the detectors in question failed to detect literally tons of explosives used to kill and maim hundreds in Iraq in a single suicide-bombing, and in all too many others like it.
As I read the article, my heart sank still further – though perhaps not for the reasons you might expect. Yes, the ‘bomb-detector’ has proved to be unreliable: there are huge problems on that score, without doubt. But to me the ‘betrayal’ turns out to be much more complex than it seems on the surface – because despite the ‘military-hardware’ packaging of the device in question, and its impressive-looking dials and cables and the rest, the underlying technology of the ‘bomb detector’ is a plain old ordinary everyday dowsing-rod.
Dowsing has been a serious interest of mine for several decades: over the years I’ve written what are now some of the best-known books on dowsing, in fact. Hence – unlike many of the critics – I do have some solid understanding of what’s going on in this case. And because of that longstanding background in the field, I’ll freely admit that I have few fundamental doubts about the use of dowsing in this context, not least because there’s plenty of long-documented, long-proven military practice in dowsing for land-mines and the like (contact the British Society of Dowsers for case-studies in Aden, for example, or the American Society of Dowsers for US use in Vietnam). Like most people, I would much prefer a predictable and reliable machine to do the job, if there’s one available and it actually does work – which many don’t. But when lives are on the line and you don’t have anything else, a dowsing-rod in experienced hands can work wonders: so at least that part of this sad, messy story is no fraud. Yet that point about ‘experienced hands’ is extremely important: in unskilled hands a dowsing-rod can easily be worse than useless – as those on the receiving-end of those undetected explosives would have discovered to their cost…
(This is getting very long: better put a ‘Read more…’ link in here.)
Despite the protestations of pseudoscientist ‘skeptics’ like James Randi, the blunt fact is that dowsing works. Interestingly, how it works is almost irrelevant, though these days we do have a much better understanding of the psychology and physiology of what’s going on in dowsing – especially how the brain enacts pattern-processing against unconscious cues, much as in some aspects of proof-reading, for example, or ‘reading’ a stock-market ticker or a full-on fire in an apartment-block. But we also know that the instrument itself has very little impact on the quality of dowsing: the relevant physics are so trivial that good dowsing-work can be done with a couple of bits of bent fence-wire or even a used tea-bag. The physiological constraints are also trivial – so trivial that just about anyone can do it one way or another if they put their mind to it. And the amount of knowledge needed to get started is trivial too – so trivial that most people can pick up the basics in a couple of minutes. But where so many people go so badly wrong with dowsing is that beyond that simple base, everything else depends on personal skill, on observation and self-observation, experience and interpretation – and none of that is trivial at all.
Every true skill depends on the development of judgement and awareness: my real professional interest is around identifying common-factors that apply in every skill, and using that knowledge to improve skills-education in any domain. In that sense, dowsing has been a very good test-case for that research – research which has been applied in a whole swathe of much more ‘conventional’ skills, from archaeology to enterprise-architecture, from software-development to quality-system design, and just about everything else in between. (Except sports, for some reason. I’ve never understood sports. I don’t know why, but there ’tis. 🙂 )
Any skills-based technology will depend on the specific combination of the equipment and the operator: it needs to be understood as an interacting, interdependent system, not solely in terms of its individual components. The closer we get to a ‘pure’ skill, the more important the capabilities of the operator become – and dowsing is actually one of the closest examples we have to a ‘pure’ skill, because it consists of almost nothing but ‘judgement and awareness’. In practice, the choice and characteristics of a dowsing-instrument are often almost irrelevant, because the real ‘instrument’ you’re using in dowsing is you.
So yes, I do have some real concerns about the company selling a dowsing-based ‘bomb-detector’ for £15,000 each (or £45,000 each in Iraq, apparently), because in principle at least the job could have been done just as well with tuppence-worth of power-cable. (Belief and credibility do play an important part in dowsing-success, so there are some arguments for putting a serious price-tag on what is always going to be a very simple piece of kit – but for ethical reasons if nothing else, that price-tag should be measured in tens or low hundreds at most, not tens of thousands!)
Yet what worries me far more is the risk that this has been presented as a ‘deus ex machina’, inducing people to rely on the ‘machine’ itself (which, as the ‘skeptics’ correctly state, has just about zero capability on its own) rather than the personal skills of the operator (who may well have the required capability, but only if the right skills-development has been applied). If the marketing-literature claims that the ‘bomb-detector’ itself does the task, that would be technological incompetence at best, because dowsing simply does not work that way – and the company should certainly have known this before they sold anything at all. If they knew, and went ahead anyway – especially without a rigorous focus on really solid skills-development – then it would indeed be fraud of a very serious kind. I would hold back any judgement on that until we’d had a chance to scrutinise the training-regime. I’d have to admit, though, that so far it doesn’t look good – which is unfortunate, to say the least.
Yet it’s important to tackle the right target here: and I’m sure that in this case dowsing itself isn’t it. But you may think otherwise, of course – your comments, perhaps?
Update: 24 Jan
Seems there’s been quite a follow-up on this on the BBC. As usual they’ve tried to make sense of it in conventional ‘deus ex machina’ terms, with the obvious and correct conclusion that it doesn’t and can’t make sense in those terms: see the BBC article ‘Export ban on ‘useless’ detector’. The ‘sensor card’ turns out to be a very ordinary RFID tag or some such, without any connection to any real electronics: there’s no possible way in which it can work in any conventional physical or chemical sense. But dowsers would recognise this straight away as what’s known as a ‘sample’ or ‘witness’: in effect, it’s best understood as a psychological trick to focus the operator’s mind on only the specified substance, in much the same way as the ‘cocktail party effect’ selects out a single conversation at a very noisy party – though here the basic signal-to-noise ratio is vanishingly small, and needs to be enhanced in any way that we can. As with all dowsing, it’s based far more on psychology than physics, so attempting to assess it in strictly physical terms not only makes no sense, but is literally unscientific. Worse, as with all skills – in fact, exactly as with the operation of a conventional mine-detector – the ‘bomb-detector’ process needs to be understood as a complete system, the intersection of equipment and operator: but in this kind of analysis they’ve ignored that fundamental constraint, and instead tested the least-active part of the overall system, which again is flat-out unscientific – applying controls for parameters which are not even in play, and applying no controls for the parameters that do affect the system. Would be good if some of those self-styled ‘scientists’ had any real grasp of what true scientific investigation actually requires, but there ’tis…
Hence bleakly amusing to see the BBC’s evident surprise when the Iraqi Interior Ministry says that it does trust the devices: see ‘Iraqi Interior ministry still backing ‘bomb detector’’. Part of that trust may come from the fact that they’ve so far spent a staggering $85m on the devices, of course. 🙂 But at least that article does also make clear that a lot of emphasis has been placed on proper training: so actually there’s a fair chance that the device will work – in dowsing terms at least. And unlike the BBC, they do seem to be aware of the centrality of the operator in the system; but even so, the reliability will always be somewhat in question, especially for tired, scared, bored operators out on the street, day after day, dealing with an endless stream of vehicles and insults.
Judging from the descriptions, I really don’t think it’s fraud: at least, not in the sense of deliberately selling something that they know can’t work, though it may be fraud in other ways, of course. Only a proper investigation will be able to settle that – and I don’t include the clumsy, myopic BBC hatchet-job as a ‘proper investigation’…
In many ways this whole sad mess is best understood as a clash of worldviews: the Western view, which focusses solely on the ‘objective’ world and tries to remove the person from the equation wherever possible; versus a more systemic worldview – pretty much any ‘non-Western’ worldview, in fact – which focusses far more on the person as an intrinsic and interdependent component of the ‘system’ in scope. The other, deeper clash is between science-as-religion, which is obsessed with finding the Ultimate Answer to how everything ‘really works’, but will only accept answers within the confines of a very rigid set of materialist assumptions; versus the much more practical technology view, which is less interested in ‘how things work’, and far more in ‘how things can be worked’ – which is not the same question.
The saddest part of this will be obvious to anyone who understands the psychology of skills-education, though will certainly not be obvious to the BBC or the other ‘skeptic’ critics. This is that a very large part of this overall system depends on belief, and especially on the operator’s belief that the ‘sensing’ is separate from themselves, even though the operator is actually responsible for virtually all of the sensing and sense-interpretation – a complex double-bind described by the psychologist Kenneth Batcheldor as ‘ownership resistance’. The result is that if we ‘prove’ that the device ‘does not work’, in a suitably convincing manner, the belief in its efficacy will be destroyed, and hence the overall system will cease to work. Which would no doubt be taken as vindication of the aggressive ‘investigation’ – but in fact the investigation itself is the primary cause of the failure. (Given his evident dislike of this class of technologies, Dave Snowden will no doubt not like me saying this, but this is in fact an exact illustration of his own dictum that “every diagnostic is also an intervention” – in this case, an inherently destructive intervention.) Which, in practice, could well leave Iraq without any form of mass-scale streetside bomb-detection – which would not be a good outcome… yet an outcome that arises directly from the so-called ‘scientific’ investigation.
I’ve no doubt that the BBC investigators were unaware of these complexities or their impacts: but there are real and very serious ethical issues here, which they appear to have failed to understand at all. Religious fanatics like James Randi are beyond the pale, and beyond reason, of course; but given that the only real ‘fraud’ here is the fundamentally non-scientific nature of their investigation, and that the destruction of a dowsing-based capability may well cost far more lives than the inevitable human-based limits on the system’s reliability, I would certainly urge the BBC and others to be a good deal more careful next time about the consequences of their no doubt well-meant but potentially lethal actions.