Twenty-first century technology: control of trace contaminants
The real difficulty with chemical reactions such as those between chlorine and ozone is that the chlorine drives the reaction but is not consumed by it, so it can react over and over again. Chemists call such a chemical a catalyst; similar trigger chemicals in biology are termed enzymes. An important feature of catalytic processes is that very small quantities of one material can make a large volume of another react. But precisely because they work in this way, catalysts have often been overlooked: only in the late twentieth century did we gain the ability to detect them when they were hidden at levels below parts per million.
Parts per million may seem a very small amount, but it is in fact well above the level of purity that must be achieved in many industrial processes. High purity costs money: the purer the material, the greater the cost and development time. So the economics mean that efforts at extreme purity and cleanliness in production are made only for goods of very high intrinsic value. The two most familiar examples are the materials used to make semiconductor devices and the glasses for optical fibres. Both require fairly small amounts of material, but the products sell at very high prices per unit weight, so the effort and production cost of achieving very pure materials is economically worthwhile.
There are actually two challenges. The first is to find ways to measure and quantify the component elements of compounds, or the impurities within a material, down towards detection levels of parts per billion. The second, separate difficulty is to find ways to make such pure materials. Over the last few decades our detection skills have improved, and we can now identify many chemicals present in minute quantities of parts per billion. There has also been major industrial progress, first in purifying the starting materials and then in adding precisely controlled amounts of impurities back into these highly pure materials.
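To get a feel for what a parts-per-billion purity target means in absolute terms, a small sketch helps (the 10 kg batch size and 1 ppb limit here are illustrative assumptions, not figures from the text):

```python
def impurity_mass_grams(batch_mass_g: float, parts_per_billion: float) -> float:
    """Mass of impurity permitted at a given ppb level (by weight)."""
    return batch_mass_g * parts_per_billion / 1e9

# A hypothetical 10 kg batch of semiconductor-grade silicon
# held to a 1 ppb impurity limit:
allowed = impurity_mass_grams(10_000, 1)
print(f"{allowed * 1e6:.0f} micrograms")  # prints "10 micrograms"
```

Ten micrograms of stray material in ten kilograms of product: detecting, let alone controlling, quantities that small is exactly the challenge described above.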
Purifying and then adding controlled amounts of contaminants is not a modern idea. A classic Victorian example was the manufacture of iron and steel in a Bessemer furnace. The strength and hardness of steel depend critically on the amount of carbon included with the iron. Early steel making melted the natural mixture of ore and carbon, then removed some of the impurities (the slag). The steel maker hoped the composition was correct and uniform. This was a poor assumption: because the starting material was variable, the quality of the steel was equally variable, and the consequences were often obvious when railway lines or bridges collapsed. The major advance was to start with the original ore and try to eliminate all the carbon (emitting it as CO2 into the atmosphere). Once this was done, it was possible to add a measured amount of charcoal and so predict the composition. It worked: the properties of the iron and steel became controllable.
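The arithmetic behind "strip everything out, then add a measured amount back" can be sketched as follows. The batch size and 0.8% carbon target are hypothetical figures for illustration, and the sketch treats the added charcoal as pure carbon, which real charcoal is not:

```python
def carbon_to_add(iron_mass_kg: float, target_carbon_pct: float) -> float:
    """Carbon (kg) to add to carbon-free iron to reach a target weight-percent.

    The target fraction f must satisfy: carbon / (iron + carbon) = f,
    so carbon = iron * f / (1 - f).
    """
    f = target_carbon_pct / 100
    return iron_mass_kg * f / (1 - f)

# Hypothetical: 1000 kg of fully decarburized iron, aiming for 0.8% carbon steel
print(round(carbon_to_add(1000, 0.8), 2))  # prints 8.06
```

The point is not the number itself but that it is predictable: starting from a known-pure base makes the final composition a matter of weighing rather than guesswork.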
Precisely the same approach is needed for semiconductor electronics, where materials such as silicon are deliberately ‘contaminated’ with impurities, including phosphorus and boron. Without these trace elements there would be no semiconductor devices. The electronics industry is intelligent enough not to market its skill in the emotive terms of impurities or contaminants; instead it calls the additives ‘dopants’, emphasizing that they are knowingly added to the silicon. Exactly the same style of progress occurred with the glass used in optical fibres. Immense effort goes into removing the impurities that absorb light travelling along the fibres; these have been reduced to parts per billion, yielding glass a million times more transparent than window glass. Nevertheless, the fibre makers then have to add many other dopants to control the fibre properties.
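Fibre transparency is usually quoted as attenuation in decibels per kilometre, and a short sketch shows how that converts into the fraction of light surviving a span. The 0.2 dB/km figure used here is a typical value for modern telecom fibre, not a number from the text:

```python
def transmitted_fraction(atten_db_per_km: float, length_km: float) -> float:
    """Fraction of input light remaining after a given fibre length."""
    return 10 ** (-atten_db_per_km * length_km / 10)

# After 100 km of fibre at an assumed 0.2 dB/km, about 1% of the light remains:
print(f"{transmitted_fraction(0.2, 100):.3f}")  # prints "0.010"
```

Window glass absorbs a comparable proportion of light within centimetres rather than tens of kilometres, which is the sense in which fibre glass is roughly a million times more transparent.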