Historic examples of unfortunate technology

The Victorian era offers many examples of exciting and fashionable technology that was eventually found to have unfortunate side effects. Whilst examples can be cited from all periods of history, the nineteenth century was clearly one of great innovation, imagination, and courage in trying out new ideas. It is therefore not surprising that we can look back and recognize that many of its products were seriously flawed in terms of their impact on users. Starting with examples from Victorian times has the bonus that we are unlikely to be emotionally involved, and so are unprejudiced by current arguments, opinions, and modern commercial pressures. This will not be the case when I present more current examples.

Many of these historic innovations were imaginative and basically sound, but the available materials and production methods made them unsafe. From a twenty-first-century viewpoint, their ignorance of chemistry, biology, and physics can seem surprising, and the examples of medical practice are often horrific. Accident rates were especially high with the new devices based on gas and electricity, not because of the principles, but because of the ignorance of the installers and users. So baths directly heated by gas (exactly like a saucepan!) were a luxury compared with boiling water and carrying jugs of hot water to the bath. But the possibility of cooking oneself whilst taking a bath was real; in addition, there was potential for overheating the water and the metal bathtub, scalding, and gas explosions. Such accidents were frequently reported in the newspapers.

Electricity in the home was equally hazardous, as it was a novelty that was very poorly understood. The workmen who installed it were therefore untrained and inexperienced regarding the potential hazards. Even the design of switches, insulation, and cabling lagged far behind the marketing. It is true that Victorian switches had attractive brass toggles and covers, with pretty designs, which may appear preferable to modern plastic casings, but the metal units were potentially lethal (as indicated on many death certificates).

The more affluent Victorians presumably liked the colour green, as they enthusiastically used green patterned wallpaper. One hidden danger was that the colour resulted from the inclusion of arsenic compounds, which reacted with moisture in the atmosphere to give off arsenic vapour. As a consequence, many people were taken ill or died from this hidden killer in their fashionable rooms. Manufacturers of course disputed the link to their wallpaper; at least one major wallpaper designer also owned mines that produced arsenic, so inevitably he would have had a blinkered perspective. The mining activities are never mentioned when his work and cultural influence are praised.

Green was similarly popular in the designs of tinted glass, but one of the metals used as a colouring agent was uranium. Uranium provides an interesting green/yellow colour and is fascinating because, on a dark night, the glass has a pale, ethereal glow. At that time, no one had considered the possibility of atomic nuclei, and so there was no concept that atoms might decompose and emit energetic radiation. Uranium glass is certainly decorative, but the owners are unwittingly exposed to a source of uranium radiation. Times have not changed, and neither has the physics: I know of modern collectors of such glass who have acquired quite large quantities. In one example, a physics colleague was passing her supervisor’s office with a Geiger counter and picked up a healthy signal (actually an unhealthy one). He had stored his glass collection under a sofa on which he would rest, as he frequently felt fatigued!

Radiation was also a hidden killer for those who painted fluorescent numbers on watch faces. In order to have a fine-pointed brush, the workers would lick it to provide just a small amount of water. The radium this transferred to their tongues frequently induced cancers.

At the end of the nineteenth century, X-ray sources were being developed. The machines gave nice images of the skeleton, and the users would happily demonstrate the fascinating shadow pictures of the bones. There was no sense of danger to the person from the energy of the X-rays, although, in hindsight, the fact that many users died of cancers should have been a warning. The complacency towards radiation persisted well into the mid-twentieth century. I recall, as a small boy, seeing and using X-ray sets in shoe shops that were designed so one could look at the way the foot fitted in the shoe. Customers were exposed to quite large radiation doses, but the poor assistants were exposed on every occasion, and so for them cancers were not an uncommon outcome.

For modern youth, it may seem astonishing that in the 1950s the attitude to radiation exposure was very positive: it was seen as a source of energy for the receiver. Mineral spas and bottled water were proudly labelled with their radioactivity content. Higher values led to better sales. We need to remember that at that time chest X-rays might be required for some job applications, and there was a lack of understanding of the hazards from radiation. Anyone who has seen the newsreel films of atomic bomb tests will realize there was a minimal attempt at protection for the people involved. Workers returned to the sites almost immediately afterwards to assess the scale of the damage to buildings and vehicles that had been left in the test zone. Radiation safety standards then started to set lower allowable exposure limits that dropped by around ten times per decade for the next half-century, a cumulative factor of roughly 100,000 (an amazing difference in terms of the levels that were considered to be safe).

The reaction to the Cold War and atomic bomb tests caused a complete cultural reversal in attitude to any product that appeared to be linked to ‘radiation’ or ‘nuclear’. It went from desirable to anathema. There was no logic or real understanding by the public, so when a very fine advance in medical imaging came from a new technique called, quite correctly, ‘nuclear magnetic resonance’, both patients and doctors rejected it because of the word ‘nuclear’. This was despite the fact that the diagnostic information was good and there were no harmful effects. Good publicity and a renaming of the technique as MRI (magnetic resonance imaging) suddenly made it acceptable.

By contrast, because X-ray images had been around for 60 years, there was little concern about the use of X-ray imaging, despite the inevitable ionization, damage, and mutations produced in our cells. X-ray examinations can unfortunately never be free of these effects; one of their more common outcomes is the development of cancers. This complacency still exists, and there is a demand for high-grade X-ray imaging, dental X-rays, mammography screening, and CAT scans (computed axial tomography), which all depend on X-ray irradiation. The better-quality images with more in-depth information (as in the CAT scans) can only be achieved with higher dosage.

The fact that X-ray diagnostics actually induce cancers is still overlooked by many doctors and patients. This is not a trivial side effect, as various medical sources estimate that around 2 per cent of cancers are actually caused by X-ray imaging. To me, 2 per cent seems high, but this is the current optimistic figure with state-of-the-art, high-sensitivity X-ray detectors, whereas in earlier decades the downside of examinations was far worse. In hindsight, in some early historical cases the figure was tens of per cent of the patients! For others, 2 per cent will now sound like an acceptable risk, and perhaps it is, but to offer a different perspective, it is worth noting that around one million European women have breast cancers each year, and a 2 per cent value means 20,000 could have been triggered by diagnostic X-ray exposure. From this viewpoint, it is totally unacceptable, especially since there are other, non-damaging methods that could be used.

In order to minimize radiation exposure, image quality can be compromised. The exposure is indeed reduced, but there are two consequences. First, poor image quality may mean we overlook small, early-stage cancers. Second, and more significantly, it is easy to interpret the poorer images as giving false positives (i.e. to believe there are non-existent tumours). Some authors claim that, in many screening programmes, there are more false positives than actual tumours detected. This error leads to anxiety, more examinations and, not infrequently, unnecessary surgery, plus considerable cost to the health service.

Returning now to the Victorians: they were justifiably proud of their indoor toilets, but ignorance of the plumbing designs needed to cope with released gases of methane and hydrogen sulphide meant that such gases could become trapped and build up in concentration. There was then the excitement of quite major explosions if they came in contact with a candle or gaslight. Once again they provided familiar little newspaper stories. Another problem was that toilets and household water used plumbing with lead pipework, which contaminated the water. Modern understanding not only recognizes the medical dangers of lead; we also know there are many earlier precedents. The wealthy ancient Romans had lead plumbing, whereas the peasants did not. The social benefit of good plumbing was definitely a double-edged sword, as the side effects of the lead have been linked to numerous diseases, madness, and infertility. The plumbing technology may even explain some of the excesses and ultimate failure of their civilization 1,600 years ago.

The Victorians also invented new materials, such as the plastic called celluloid, which was cheaper than ivory and looked fine, but with ageing became unstable and could explode or self-ignite! Because celluloid was used in clothing, the hazard potential was considerable. Once again, many newspaper stories record the accidents. The same fate of spontaneous crumbling or ignition befell the early motion picture industry, as the reels of film were made of an unstable, celluloid-based compound. This caused fires not only when running in hot film projectors, with an intense lamp behind the film, but even in storage containers. The hazard was sufficiently serious that many old films were deliberately destroyed and reprocessed to recover their silver content, rather than being preserved as historic examples of the industry.

Another popular construction material of the nineteenth century was asbestos. It has excellent thermal properties for insulation and building, but it was many years before it was admitted that the dust particles could cause serious lung damage. Usage of asbestos continued for nearly a century after the hazards had become obvious. Cash outweighed care.

The list of hidden horrors is considerable, but my examples underline that progress can have side effects that are unknown when the new technologies are first used, and, as with asbestos, may be deliberately ignored for decades because the product is effective and cheap.

To offer just one modern example, one could consider modern composite materials such as aluminium cladding over a plastic or polyurethane filling. These have many excellent properties, and are used on the exteriors of futuristically designed skyscrapers. Nevertheless, they are potentially flammable, and are thought to have been contributing factors in several skyscraper fires that spread up the outside of the buildings. Twenty years from now, will we look back and say they were an unwise choice?

 