The Dark Side of Technology
Twentieth-century agricultural technology
By the early twentieth century, the pattern of growing crops in ever larger fields had become routine, as tractors supplied power that had previously been limited to the horse and the ox. The acre was nominally the area that could be ploughed in a day; mechanization increased that a hundredfold. The obvious side effect of larger fields was the rapid depletion of nutrients from the soil, because without a mixture of crops there were fewer mechanisms to regenerate the biochemistry of the soil. Technology was flourishing, particularly in chemistry, and there were concerted efforts to replenish the nitrogen consumed by crops. One route was the industrial production of ammonia for fertilizers. A successful chemical process was devised by Fritz Haber and Carl Bosch. It was of immediate value in agriculture: Haber received the 1918 Nobel Prize in Chemistry for the synthesis of ammonia, and Bosch later shared the 1931 prize for his work on high-pressure chemical methods.
Interestingly, Haber was equally determined to help Germany to a rapid victory in the First World War, believing that a quick end to the fighting would save lives on all sides. To that end he developed chemical poison gas, which was first deployed at Ypres. After the war, he returned to his main priority, the chemistry of agricultural processes.
The need for more food was clear, and the major companies found it easier to work with very large, single-product farms and, wherever possible, to eradicate anything that might compete with the growth of the plants. This was clearly an ill-considered approach, as it put all the eggs in one basket. The difficulties in recognizing the dangers were partly economic and partly social. Focus on a single crop is a dangerous strategy, as seen, for example, in the Irish potato famine of the 1840s. Irish farmers had come to grow one highly productive variety at the expense of all others, and were struck by a blight that particularly favoured their chosen type of potato. With no alternative varieties to fall back on, the food supply collapsed and the famine of 1845 ensued, with consequent political turmoil that has rumbled on into the present time.
A parallel example came with Norman Borlaug's development, from the 1940s onwards, of superior strains of wheat with stronger stalks to support more grain. His solution was to find a dwarf variety, as its short, sturdy stalk could support the larger heads of grain. It was financially successful, but not perfect: although the head was larger, the nutritional value was lower than that of the taller varieties. Nevertheless, since it was highly marketable, it moved into pole position as a monoculture crop. The downside of this technological progress is that a single strain can be wiped out by a disease that targets it, whereas a diversity of strains is far more likely to include survivors resilient to any given disease.
Technologies in agriculture have generally been introduced with good intentions (as well as for profit); the undesirable consequences emerged from a lack of foresight, from narrow viewpoints focussed on a single feature, and from concepts that were easy to implement from the standpoint of marketing and distribution of agricultural products.
Not only chemistry, but also biology and physics were advancing at an unprecedented rate during the twentieth century, and all were stimulated by a vast range of inventions during the Second World War. The destruction of the war put intense pressure on all countries (winners and losers) to provide food, and technology seemed an ideal contributor. Further results of the war included strong pressures of control from governments, a degree of secrecy stimulated by military activities, and societies that would never question decisions from government, industrialists, or those in privileged strata of society. In part, this was ignorance on the part of farmers and employees, and in part it was a cultural attitude still present in the mid-twentieth century. The UK was, in many respects, an ordered, stratified society in which one never queried decisions made by those assumed to be more knowledgeable: a decision by a doctor or a priest, for example, was rarely questioned by the general public. Further, the involvement of complex science inhibited lay people from criticizing new technologies, and the major chemical and agricultural businesses were totally blinkered. Looking back half a century, the situation now seems scarcely believable, as we are so used to ready access to information, at least in most Western countries.
However, the value of Internet access to information is probably overrated, as in many nations it is blocked for political or religious reasons. Further, the range of Internet and electronic entertainment material is vast, and this can distract us from any serious information retrieval. Also, many people will only look at sources and websites that confirm their existing views (or prejudices). Indeed, many sociologists consider that social networking sites actually isolate subsets of people: each group forms a community large enough to be self-supporting in its views, no matter how extreme or ill-founded those views may be. Just as in the 1950s, information of a technical nature is still difficult to access and understand without training, and it is even harder to decide between conflicting interpretations. Of course, I doubt that these are problems for anyone who has read this far through this book, but such readers are exceptional!
Nevertheless, we have progressed a little and have developed far better and more realistic critical attitudes towards the wisdom offered by 'experts'. We have also moved to a society in which many people have a modest breadth of scientific knowledge, freely communicated and accessible through the media and the Internet. This should allow, and indeed has allowed, us to be more confident in challenging questionable practices.