The dark side of technology
Excessive reliance on initial opinions
We are also blinkered if we believe we already understand a problem or an idea, because then we either overlook key information that is in plain sight, or reinterpret or dismiss it so that it fits our preconceived views. Again, this is not prejudice but inherent human nature. This unexpected and subtle reason for rejecting information comes from our skill in making a rapid value judgement when we first meet a person or situation. It is a good strategy, as we rapidly distinguish a possible friend from a threat. Unfortunately, once we have decided, we subconsciously look for subsequent evidence to reinforce our initial view.
If we gain more information that contradicts our first view, then we have several choices. The first option is to ignore the new information; the second is to try to reinterpret it so that it fits our preconceived viewpoint. In either case, we are deluding ourselves. The third possibility, that we were wrong, is very low on our agenda.
Inability to accept being wrong is classic human behaviour (particularly for those who are insecure), and it seems to run across the whole spectrum of our dealings with people and facts. In science, there are many examples of excellent scientists who have missed known key information that would have helped them. In social contexts, we make the same mistakes. A particularly serious, and little-publicized, frequent example is the behaviour of jury members in a court trial. In times of danger, we have needed the instant decision of ‘fight or flight’, so we automatically and unwittingly make rapid initial assessments of the guilt or innocence of the accused on first sight. Jury members then tend to overlook evidence that contradicts their initial view. So look innocent when you enter the court! There is no second chance to make a first impression.
In studies of how doctors scan an X-ray image to look for cancers, it is possible to track their eye movements and see how diligently they view the entire picture. A very common situation is that, once they detect a suspicious region, this absorbs all their concentration. They will think that they are continuing to scan the total picture, but in reality their eyes only briefly flit over other parts of the image (which may include a second cancerous site). Having found one abnormal site, the usual behaviour is to focus on it and exclude any other information in the picture. This is not just the behaviour of inexperienced doctors but equally a failure of experts.
Further, even the best of multi-tasking brains can handle only a very small number of tasks and pieces of information at once. So if the image being scanned is complex, we cannot cope. This inherent human weakness (not just with medical images) is one justification for using computer-based pattern recognition, as it avoids these human traits and scrutinizes every element of the picture equally well. Once the highlights have been identified in this way, the human expert will happily accept, and view, all the crucial features. Perhaps this is because the attitude towards the information is less personalized, so there are no inhibitions or attachments to a limited focus on one area.
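The contrast drawn above, between a human reader who fixates on the first suspicious finding and a machine that scrutinizes every element equally, can be made concrete with a toy sketch. The function below is hypothetical and is not a real radiology algorithm; it simply scans an entire grid of intensity values with no early exit and reports every region above a threshold, so a second anomaly is never missed once a first has been found.

```python
# A minimal sketch (hypothetical, not a real medical-imaging tool) of
# exhaustive scanning: every cell of the "image" is scored, and all
# suspicious regions are reported, not just the first one detected.

def find_suspicious_regions(image, threshold):
    """Return coordinates of every pixel whose intensity exceeds the
    threshold, scanning the whole grid without stopping early."""
    hits = []
    for row_idx, row in enumerate(image):
        for col_idx, value in enumerate(row):
            if value > threshold:
                hits.append((row_idx, col_idx))
    return hits  # the human expert can then review each highlight

# Toy image with two bright (suspicious) spots; a fixated human reader
# might stop after the first, but the exhaustive scan flags both.
image = [
    [0, 0, 0, 0],
    [0, 9, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 8, 0],
]
print(find_suspicious_regions(image, threshold=5))  # → [(1, 1), (3, 2)]
```

Real systems use far more sophisticated pattern recognition, but the design point is the same one the text makes: the machine's attention budget does not shrink after the first finding.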
Rejection is equally apparent when new ideas and information clash with thoughts that have been ingrained from childhood by the local culture, place in society, or religion. Here one can find a host of highly contentious examples, but rather than consider an extreme one, I will just comment on one that was caused by scholarship in the seventeenth century. An Irishman, Bishop Ussher, used the written sources available to him to estimate the age of civilization. He demonstrated that there were well-documented written records going back to at least 4000 BC in the Western world. (He did not have access to older writings from, say, China.) However, his estimate was later presented as the date of creation, and therefore the age of the universe. The underlying problem is that we, as selfish humans, wish to believe we are the most important creature in the universe and that it was created specifically for us. Effectively, it is a ‘spoilt child’ syndrome.
Any evidence that demonstrates humans were merely a species that evolved from earlier life forms downgrades this self-importance, and therefore many people reject such possibilities.
Viewed with twenty-first-century data, we can now say that Earth has existed and evolved over billions of years, and the universe, as far as we can detect it, has expanded over some 13.8 billion (thousand million) years. These are scientific data that were not available to Ussher. They of course offer no comment on creation by a deity, except that instead of creation being specifically for our benefit, we now need a far more impressive event for the entire universe. From the spoilt-child view of humanity, we have sunk to being a very minor part of the total universe. Hence the instinctive reaction is to say the scientific data must be false, rather than to face up to, and comprehend, the unimaginably large scale of reality. In fact, this is not the end of the pattern, as astrophysicists try to think beyond the detectable universe to ask whether there were forerunners, or hidden but parallel universes. For most of us, this is a challenge beyond our comprehension.