I have found myself increasingly troubled by democracy as a system of government. I know that this is not unusual, and that it is sometimes regarded as just the least worst option. But it does feel like the form of democracy I see in the UK and US is more flawed than it needs to be, and may well not be the least worst form of democracy.
There are several causes I can see, including cognitive biases, lack of critical reasoning skills, inadequate models for assessing preference, and probably more. I thought it might be mildly entertaining to try and list examples of how democratic models – both decisions taken in a democracy, and the voting decisions of the electorate – might be flawed due to cognitive biases. For more info on what such biases are, and a huge list of them, see Wikipedia ( en.wikipedia.org/wiki/List_of_cognitive_biases ). It’s not intended to be a deeply thought out list BTW – just a bit of fun. I’ve also only included a sub-set of those where I could immediately think of an example. Feel free to post other, better or more specific examples as a comment.
Bandwagon effect (tendency to believe what others do – ‘after all, they can’t all be wrong’)
Example : In the recent non-election fracas in the UK, the only real data about the different parties presented on the national news for several weeks related to opinion poll results and commentary on them. These polls moved over time, and were used as growing evidence of falling Labour ratings … but there was no material new information going to the voters apart from the poll results themselves, so there was a horrible self-fulfilling loop based on the bandwagon effect.
Choice supportive bias (tendency to remember your choices as better than they actually were)
Illusion of control (tendency to believe that one has more control over a situation than one really does)
Example : Politicians routinely take credit for things that, on more detailed inspection, look no better than chance (e.g. “the economy has grown X under the last Y years of a Labour government”), but do not want to take the hit for the chance bad events. Voters are then (mis)informed by the claims made.
Confirmation bias (tendency to search for evidence that supports one’s preconceptions)
Example : Hard to know where to start. This is absolutely endemic, but probably most visible in the statistics selected in defence of a point of view. Politicians and others with an agenda routinely take different points of view, then quote statistics that support that point of view. No real effort seems to be made to get balance in that point of view, and the multi-party democratic system reinforces this. Soundbite reporting then nails the coffin down hard. It’s well worth reading “Damned Lies and Statistics” for a more detailed spotter’s guide to how statistics are routinely mis-used.
Contrast effect (mis-weighting a data set due to a recent piece of non-aligned data)
Anthropic bias (tendency for one’s conclusions to be skewed by the particular observations one happens to make, even when those observations are incomplete or unrepresentative)
Example : a single hot or cold few months in one part of the world is enough to ‘show’ or ‘refute’ global warming, according to various silly press articles I’ve seen over the last few years. These may be intended to be tongue in cheek, but it’s scary what some folks will believe, so it wouldn’t shock me at all to discover that some people make exactly this kind of assessment.
Extreme aversion (tendency to avoid extreme choices in favour of a middle ground)
Example : Global warming – there are many different points of view, from ‘it’s nothing to worry about’, through ‘as long as it doesn’t cost anything we might do something’, to ‘it’s huge, we need to really make big changes’. The last of these has massive scientific support; the first two have virtually no defensible support. But, in the last few years, there seemed to be a lot more emphasis on the middle one for no more readily apparent reason than that it was less extreme than the last one.
Hyperbolic discounting (tendency to favour immediate/near-term pay-offs over longer-term ones – and the closer, the better. Note that this goes far beyond the correct discount for time and risk uncertainty – hence hyperbolic)
Example : Pensions and health-care for the elderly. These are interesting in that they are entirely predictable, demographically driven topics. It should be possible to have an entirely rational discussion about how we should adjust things like retirement age. But politicians know full well that there will be hyperbolic discounting by the electorate, making it virtually certain that the issue will be fudged. As I think about it, I suspect that this is confounded by ‘one person one vote’ effects, where the retired and near-retired population is large, so can dominate the thinking; but the people who will be doing the lion’s share of supporting this population in 20-30 years’ time don’t have a vote in the matter at all!
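To make the ‘hyperbolic’ point concrete, here is a quick numerical sketch of the standard textbook forms – constant-rate (exponential) discounting versus hyperbolic discounting. The rates used are purely illustrative assumptions, not empirical values:

```python
import math

# Illustrative sketch, not empirical: compare 'rational' exponential
# discounting with hyperbolic discounting of a future pay-off.

def exponential_discount(amount, years, r=0.05):
    """Constant-rate discounting: value falls by the same factor each year."""
    return amount * math.exp(-r * years)

def hyperbolic_discount(amount, years, k=0.5):
    """Hyperbolic discounting: value = amount / (1 + k * years).
    Falls steeply in the near term, then flattens out."""
    return amount / (1 + k * years)

for years in (0, 1, 5, 20, 40):
    print(f"year {years:2d}: exponential {exponential_discount(100, years):6.1f}  "
          f"hyperbolic {hyperbolic_discount(100, years):6.1f}")
```

The characteristic (and arguably irrational) feature is visible in the output: the hyperbolic curve punishes near-term delay much harder than the exponential one, which is exactly why a pension reform whose pain arrives next year and whose benefit arrives in 30 gets fudged.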
Neglect of probability (tendency to completely disregard probability when making a decision)
Example : The media has a tendency to tell a story by way of a particularly emotive example of a single person’s situation. The probability of that situation is very low, but never talked about – and even if it were, this bias means it would be pretty much disregarded. A specific example might be the risk of death by accident whilst travelling. Since train crashes very sadly tend to kill many people at once, the view of the money that it is worth spending on train safety is massively higher than that for road safety, where even more sadly there are far more deaths … just only in ones and twos. The figures I think I recall from a few years ago were that the marginal cost of work to avoid a single death on the road was £500k, but for trains it was £1.5 million, and for the Advanced Train Protection (ATP) system it was something like £3 million. This puts the cost of train travel up, so more folks travel by road, despite it being a higher actual risk.
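The arithmetic behind that comparison is worth spelling out. Using the figures I recall above (which may be dated, so treat them as illustrative), the same £1 million of safety spending goes much further on the roads:

```python
# Illustrative calculation using the (recollected, possibly dated) figures
# from the text: marginal cost, in pounds, per death avoided.
COST_PER_DEATH_AVOIDED = {
    "road": 500_000,
    "rail (general)": 1_500_000,
    "rail (ATP system)": 3_000_000,
}

def deaths_avoided_per_million(cost_per_death):
    """Deaths avoided by £1m of safety spending at this marginal cost."""
    return 1_000_000 / cost_per_death

for mode, cost in COST_PER_DEATH_AVOIDED.items():
    print(f"{mode}: {deaths_avoided_per_million(cost):.2f} deaths avoided per £1m")
```

On these numbers, a pound spent on road safety saves roughly three to six times as many lives as the same pound spent on rail safety – yet the emotive, low-probability train crash drives where the money actually goes.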
Outcome bias (tendency to judge a decision on its final outcome, not the quality of the information available when it was made)
Example : The Iraq war, on the basis of suspicions of weapons of mass destruction. I do recognise that this is a slightly flawed example, since it is now known that the quality of information was materially deficient, and so I think we should never have invaded. But, if you cast your mind back, there was a time when the view stated was that we shouldn’t have invaded because WMDs were not found. That’s irrelevant – what mattered was whether the decision was made well based on the quality of information available, not whether the weapons were found.
Post-purchase rationalization (tendency to justify, after the fact, the worth of something that was bought)
Example : Slightly off-topic from democracy, but a potentially good example is the choice of religion that someone follows. Almost all religious affiliations follow family patterns – someone brought up as a Jew is likely to have Jewish religious affiliation, and the same for Muslims, Catholics etc. But many who do follow the familial affiliation explain why they believe in the specific affiliation in words that imply an active choice.
Availability heuristic (tendency to judge likelihood by how easily examples come to mind – typically the most vivid or emotionally charged outcomes)
Example : Recent American reaction to imports, with a linked view that they take American jobs – and a failure to see that there must be, in the medium-to-long term, a completely matching value in exports that create jobs. Of course the ‘one person one vote’ issue comes to a head again. Blue collar work that is off-shored is likely to have employed more people directly than the white collar work that is likely to be exported. The import ‘losers’ will also remember it much more, and may vote on that basis. The exporters are unlikely to balance this out, even if there were as many of them. So it becomes sensible behaviour for politicians to wilfully mislead voters about imports and exports. Democracy loses.
It’s getting late, so I’m pausing here. I’m not sure if I can load this to the blog via the hotel television anyway.