Saturday, 19 April 2014

EMBRACING WRONG

STOP PRESS: I am delighted to announce that Arton 'Float Sting' Baleci and I are running a new course on learning - for coaches, teachers and other learners! For more information, and to register, please go to: http://t.co/hqdr4ksJPq



From an early age, we are conditioned to want to be right. We learn to need to be right.

Parents applaud their children when they do something well, or answer a question correctly. At school, we are rewarded and celebrated for passing tests, and can suffer in any number of ways for failing them.

The lesson is continually reinforced as we grow older. Promotions, reputations, careers are built on our capacity for right thinking. Or, at least, the appearance of right thinking. To be right is to succeed.

This is fine. It is entirely understandable. But this all-encompassing valuing of being correct comes with a risk. It can make us feel that being wrong is always a bad thing, and so something always to be avoided.

Right equates to success; wrong means failure. Newspapers are filled with stories of mistakes made by experts and public figures, often followed by the demand that they be punished or sacked or in some way called to account for their errors.

If something goes wrong we seem compelled to look for blame, even if it is difficult to figure out where the fault really lies.

Whatever the wisdom or justice of this mindset in public life, it carries with it an implicit assumption: that error is somehow a break from the norm of rightness. Being wrong is understood as an aberration or a glitch.

And this is obviously not true.




We are all wrong. A lot. We are wrong about small, inconsequential things, and we are wrong about big, important matters. A moment's reflection will remind us that our lives are full of mistakes. We might not want to accept them; we might even try to hide them (from ourselves as well as others).

This is the terrain of behavioural economics and the psychology of decision-making. Daniel Kahneman's 'Thinking, Fast and Slow' is possibly the best known of a series of recent books that have highlighted that our evolved human mind is not, as we might like to believe, a rational computer. It is more accurately conceived as a veneer of reason on top of a collection of biases, hunches, and prejudices. So, to borrow Kahneman's terms, we assume that we live our lives guided mainly by slow, rational thinking. However, reason actually plays a relatively minor role in much of our day-to-day decision-making. Instead, we rely on fast, intuitive thinking, which often operates below the level of consciousness. In other words, we believe (and want to believe) that we reason, when we usually just react. Reason occasionally steps in, in times of difficulty, but by then we may have long since acted.

This presents something of a problem for us, and especially for those of us who would like to live lives in which reason and evidence have a say, because our minds have evolved over millions of years for survival and reproduction. And they evolved in environments very different from the ones in which the vast majority of us now live.

So, we tend to be extremely good at tasks that require quick judgements and actions, but not so good in situations where reflection is needed. For example, evidence from research over the last 20 years shows that the biases, hunches and prejudices that come pre-installed in the human mind can lead to a wide range of quirks:

Confirmation Bias - the tendency to accept evidence that confirms our beliefs and to reject evidence that contradicts them.
The Gambler's Fallacy - the sense that the odds of an event with a fixed probability increase or decrease depending on what has recently happened (illustrated in the sketch after this list).
Probability Neglect - our inability to grasp a proper sense of risk, which often leads us to overstate the risks of relatively harmless activities while understating those of more dangerous ones.
Attribution Asymmetry - the tendency to attribute success to internal characteristics (such as talent and innate abilities) and to attribute failures to external factors (like simple bad luck). 
Repetition Bias - the willingness to believe what one has been told most often and by the greatest number of different sources.
Cognitive Inertia - the unwillingness to change thought patterns in light of new circumstances.
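
To make the Gambler's Fallacy concrete, here is a minimal simulation sketch in Python. The fair coin, the streak length of three and the number of flips are all arbitrary choices of mine for illustration; none of this comes from the research mentioned above. It asks whether heads becomes more likely after a run of tails:

```python
import random

# Illustrative simulation of the Gambler's Fallacy: after a run of
# tails, is a fair coin any more likely to come up heads?
random.seed(42)

N_FLIPS = 1_000_000
STREAK = 3  # how many consecutive tails we wait for

flips = [random.choice("HT") for _ in range(N_FLIPS)]

heads_after_streak = 0
opportunities = 0

for i in range(STREAK, N_FLIPS):
    # Did the previous STREAK flips all come up tails?
    if all(f == "T" for f in flips[i - STREAK:i]):
        opportunities += 1
        if flips[i] == "H":
            heads_after_streak += 1

print(f"P(heads) overall:                 {flips.count('H') / N_FLIPS:.4f}")
print(f"P(heads after {STREAK} tails in a row): {heads_after_streak / opportunities:.4f}")
```

Both figures come out at roughly 0.5: the coin has no memory, however strongly our fast, intuitive thinking insists otherwise.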

And there are numerous other biases and intuitions that 'feel' right, even if they are leading us astray.

So, we had better think again about our attitude to being wrong, because to err really is human. It is scientifically, measurably human.

The philosopher Karl Popper argued that learning is best characterised as a process of trial and error-elimination. It begins with guesses that are, to all intents and purposes, blind to their outcomes. We cannot discover whether they are right or wrong, whether they work or do not, until we test them against experience or criticism. So, for Popper, error is an integral feature of learning. If we shy away from the possibility of being mistaken, we dramatically limit our guesses, and consequently block learning.
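
As a playful aside, Popper's guess-test-eliminate loop can even be sketched in code. The toy Python example below is my own illustration, not Popper's: a 'learner' starts from a blind guess, proposes blind variations, and keeps a variation only when testing it against experience eliminates an error. The target string and mutation scheme are arbitrary choices for this sketch:

```python
import random

# Toy illustration of trial and error-elimination: blind guesses are
# generated, tested against experience (an error count), and kept
# only when the test eliminates an error.
random.seed(0)

TARGET = "EMBRACING WRONG"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def errors(guess: str) -> int:
    """Test a guess against experience: count mismatched characters."""
    return sum(g != t for g, t in zip(guess, TARGET))

# Start from a completely blind guess.
guess = "".join(random.choice(ALPHABET) for _ in TARGET)

trials = 0
while errors(guess) > 0:
    trials += 1
    # Propose a blind variation: change one character at random.
    i = random.randrange(len(guess))
    trial = guess[:i] + random.choice(ALPHABET) + guess[i + 1:]
    # Error-elimination: keep the variation only if it removes an error.
    if errors(trial) < errors(guess):
        guess = trial

print(f"Reached '{guess}' after {trials} trials")
```

The guesses themselves are blind; all of the learning is done by the error-elimination step, which is exactly the point Popper was making.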


With this in mind, I recommend an excellent TED Talk by the journalist Kathryn Schulz. Her focus is different from mine here, but the take-home message is the same: we had better start embracing wrong, because - for much more of our lives than we might wish to admit - wrong is what we usually are!

