Rejection of Knowledge and Information

How eager are we to learn?

In our inflated image of human progress, it may seem obvious that we will be keen to learn new ideas and skills. Reality is different: we often actively reject new ideas as well as new factual information. At first sight this attitude seems incongruous. Why should we not wish to learn new facts or concepts, or be so blinkered that we cannot comprehend them? Yet the problem is well known and very apparent to all teachers, not just in schools, where children may be present somewhat unwillingly, but also at higher levels, including colleges and universities. Rather than assume this happens solely from lack of interest or concentration, it now appears that we may be genetically programmed to ignore novel ideas. (This may already be a test of my comment!) Learning is quite selective; there is a mental barrier against all but ideas close to those we already understand. A few subjects have a special appeal for us, so our inhibitions are reduced; for the rest, it is hard work or rejection. With this knowledge, and a better understanding of human behaviour, I feel I should now mentally apologize to the many students who seemed to ignore my lectures.

Self-taught skills are gained by experience and experimentation; for example, by tasting new foods. Since such experiments can go badly wrong, caution is a wise strategy for babies and children who need to try unknown foods, and in practice the same is true for new ideas. Caution is also sensible for most things in life: picking friends, making investments, starting new activities, or believing in new ideas. The pattern is always the same: when we already have some related experience, we are more willing to experiment, branching out from a secure position into new territory. This preference for whatever fits our existing beliefs has been termed confirmation bias; we may also display a disconnect between our beliefs and our actions, termed cognitive dissonance. Overall it means that ideas far from our experience are automatically rejected at first, no matter how good the supporting evidence, whereas ideas or facts close to those we already believe to be true will be favourably considered and probably accepted. This may seem a little odd, but hearing things we understand is pleasant: it triggers the release of brain chemicals such as dopamine and serotonin, which we enjoy. New ideas have the opposite effect. Our liking for pleasurable sensations can equally colour many other decisions in subtle ways. Even driving errors and plane crashes have been linked to a failure to recognize and make difficult decisions.

Rejection of new ideas need not be permanent; given time, we can gradually edge towards accepting innovative concepts. Confidence in them is noticeably improved if we trust the presenter and source of the idea, especially if it is someone we believe to be an expert in the topic. Parents, teachers, clergy, and TV stars may fit this ‘expert’ role if we have been conditioned to believe them, but, in general, novelty is low on our list of priorities, and we prefer existing concepts. Philosophy and religion are invariably based on opinion, so rejection of new input may not be surprising, but for scientific examples backed by hard and reproducible evidence, our rejection is less rational. Nevertheless, totally new scientific concepts are nearly always rejected at first, and only slowly accepted over the next 20 years (i.e. roughly a new generation). Once in fashion, the novelty moves from way-out to mainstream (even if it is wrong!).

In social terms, changes in attitudes can be very much slower than a generation. For example, in the UK it took nearly a century to legislate against slavery, and far longer to approach equality in other respects. In many cases we still appear to have made no progress at all.

The bias towards accepting knowledge from experts is a tried and tested route to progress, but it has some serious flaws. Not all ‘experts’ are correct, and if they are wrong, it is difficult to question them (they think they are experts). I want to offer new ideas, but they may be blocked by your instinctive reactions. So I will cheat. I have already made some comments on medicine, so I will reiterate them as examples. Subconsciously, this is no longer new material to you, so this time you will be more likely to believe it, and happy brain chemicals will flow.

This repetition is an efficient approach, and it is widely used in religious and political indoctrination via repeated mantras (especially at loud volume, with repetitive music). In a large group of people, with everyone else saying the same phrases, it becomes difficult to disagree; after a while you are trapped and believe the teaching.

Before the middle of the twentieth century, our attitude towards doctors and other people in authority was subservient and unquestioning. Because doctors had specialized knowledge, many people assumed they were infallible, or at least felt they should never be questioned. It is only in recent years, with easy Internet access to opinion and knowledge, that dissent and a range of conflicting medical opinions have become evident to the general public.

Many examples are highlighted by TV coverage, ranging from statins and hormone replacement therapy to the effect of salt on hypertension. We want to trust the experts, but each style of treatment can have value in some cases yet come with long-term side effects in a large fraction of the population. Sometimes the side effects are worse than the original problem. We hope experts will help, but in reality they may draw completely opposed conclusions, even from the same statistical data. For us, this is totally confusing.

In medicine and the biological sciences, the scope for misunderstanding, and for incorrect models, predictions, and use of new drugs, is considerable, as there are very many factors to consider. Furthermore, with living creatures, whether human or other animals, we do not have ideal and reproducible experimental samples. So if we are looking for a particular effect and we find it in some studies, we automatically see it as evidence for our ideas; the fact that other evidence contradicts us is then dismissed as mere abnormality. There are also difficulties of interpretation. For example, statistics appear to link the onset of Alzheimer’s in old age to the use of antidepressants when young. One view may be that the drugs precipitated the condition; another, that the two factors were already related in the patient. Statistics alone cannot distinguish between these possibilities.

Another example, which from our perspective is obviously wrong, is that in the 1920s women were encouraged to smoke on the claim that it would stop them gaining weight. Statistically this may even have been true, but for totally the wrong reasons.

New scientific ideas in engineering, technology, or medicine, where we have minimal expertise, are in many cases simply beyond us. So our only option is to trust our chosen experts. Unfortunately, the gurus of one branch of science may be completely ill informed in other areas, so even this is not an ideal strategy. There is also a conditioning effect: an established leader in a field will find it extremely hard to accept new thinking in the same area, particularly if it implies the expert has been wrong over a long period of time. Ego and conditioning are far stronger than factual logic. Long-established experts have difficulty distancing themselves from their background and seeing new perspectives. By contrast, outsiders may have no such inhibitions, and so spot new ways forward. The dilemma for outsiders is that they will have great difficulty convincing the people who consider themselves already skilled and expert in the field. Outsiders may also feel intimidated about challenging official experts (in the way we defer to anyone in a white lab coat).

Expert pronouncements are not infallible, and we should be extremely cautious in deciding whom to trust, even if they have Nobel Prizes or run multimillion-dollar companies. There is considerable amusement in finding comments from famous people who turned out to be totally wrong, and a search of websites will reveal many examples. Before being too critical, we need to readjust our thinking to the knowledge available at the time, plus the fact that intervening progress may have changed the relative importance of their comments and perspectives. Nineteenth- and early-twentieth-century errors seem obvious, but there is no guarantee that our current twenty-first-century predictions are not equally missing key ideas. Here is a list of unfortunate early quotations and oversights.

  • 1840s: Colladon and Babinet separately invented alternative ways of bending light around curved paths, in either water or glass, and their ideas were used for fountains and stage lighting. However, applications such as endoscopy and communications were thought to be impossible. The views of industrial leaders had not changed much even by the 1960s, when optical fibres were still seen only as a laboratory gimmick that could never displace radio communications.
  • 1876: The telephone fared no better, as it had ‘too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us’ — Western Union.
  • 1878/80: Electric lighting similarly struggled: ‘When the Paris Exhibition closes, the electric light will close with it, and no more will be heard of it’; ‘it is a conspicuous failure’.
  • 1890s: ‘X-rays will prove to be a hoax’ — Lord Kelvin (then president of the Royal Society). X-rays were only discovered by Röntgen in 1895.
  • 1903: ‘The horse is here to stay, but the automobile is only a novelty—a fad’ — advice from a bank against investing in the Ford Motor Company. Indeed, in 1903, with poor roads, this was not such bad advice. A real incentive for motor power came from the dark side of technology: weapons and tanks killed several million horses during the First World War, exposing their vulnerability in warfare.
  • 1946: ‘Television won’t last’ — hopeful thought from 20th Century Fox.
  • 1959: ‘The world potential market for copying machines is 5,000, at most’ — IBM. For the machines then being produced, this was fair comment.
  • 1977: ‘There is no reason for an individual to have a computer in his home’ — DEC. This was not an unreasonable view, as at that time machines with significant processing power were immense and costly to operate.

Most such pronouncements are the result of ignorance and lack of foresight, but some errors are driven by wishful thinking or bias. I have seen biased or blinkered views many times when acting as a scientific consultant. Invariably, I am initially extremely ignorant in the field compared with those I am advising. It is therefore essential to ask very carefully what difficulty they want to overcome, how they have proceeded in the past, and why those methods were favoured. The typical response is that they have always done it that way, and no one ever questioned it. As an uninhibited outsider, it is then possible to offer new ideas. In these consultancy roles the in-house commercial experts are willing to listen, because they are paying for the advice, or have approached me for it. If an unsolicited outsider had volunteered the same thoughts, the companies would invariably have ignored them. I am told by wiser consultants that companies are more receptive if the consultant charges a higher fee.

This is not a new idea: raising the price to increase the desirability of a product is a familiar and successful marketing ploy, working in areas from perfumes and clothing to cars and holidays.

 