<2Here we go again>2 MRS THATCHER has cunningly provoked the Labour party and the media into pushing her into holding a general election at the best time for the Conservative party. From the number of lengthy interviews she has given in recent weeks, it seems pretty obvious that the opposition has been playing the Prime Minister's game--she clearly intended to hold an election next month. In those interviews, Margaret Thatcher has made much of science and technology. Not since Harold Wilson prattled on about the white heat of the technological revolution--or some similar meaningless platitude--and launched Concorde, has there been so much talk about innovation and our intellectual heritage. Mrs Thatcher has made much of Britain's scientific brilliance and innovative poverty. She has almost certainly overestimated both factors; but in general she is correct in saying that Britain comes up with good ideas and often ends up importing the products that stem from those ideas. There is no denying that the present government has done something to influence both sides of this equation. There is now a more equable climate for innovation and investment in new technology. It could be that fewer of those bright ideas will go astray. Unfortunately this is not just because the ideas will be taken up, but also because the savaging of the universities, especially the technically minded ones, will reduce the number of ideas. But let us turn to specific details of the present government's technological achievements. The Tories may have shaken up British Telecom because they dislike a state-run monopoly that seems to care more for itself than its customers; but the main impact on BT has little to do with monopolies or state ownership. BT was technologically sluggish before the government put the frighteners on it. 
BT is now crawling into the age of the silicon chip, and the death of the British Telecommunications Bill, which would have allowed the government to sell off about half of BT, need not calm this wind of change even if the Labour party wins the election and reinstates BT's monopoly. It is too late for British Telecom to return to its old ways, if only because the public now knows that it does not have to put up with a telephone system built for the 1950s. Another technological phrase that gained currency under the present government was "the cabling of Britain". This seems to mean two different things--cable television and optical communications. To begin with, the government wants private operators to cast a spider's web of cable-TV over the country. It isn't obvious that this is the best way to go about it. Private investors will want a quick return. To achieve this they may have to adopt conservative technology--that is, technology that has only a fraction of the communications power of optical fibres--and to pander to the needs of a mass audience. (If Channel 4 finds it hard to attract an audience, what are the chances of Channels 5 to 15, or even 50?) BT could, if given the incentive and the wherewithal, take the country into a genuine information revolution that goes far beyond the plans for cable television. Of course, such a step demands close control on the government's part. Conservatives don't like the idea of "meddling" in industrial affairs. Indeed, Sir Keith Joseph's two years as Industry Secretary were remarkably <1laissez-faire.>1 It was only after Patrick Jenkin became Industry Secretary that we began to see the proliferation of industrial support schemes. By contrast, the Labour party has been silent about new technology, preferring to mutter about unemployment without putting forward detailed proposals as to what it would do about the problem beyond throwing public money at the dole queues. 
(Let us not forget, though, that it was Mr Callaghan's government that took the first faltering steps in the revolution in information technology.) One way to reduce unemployment would be to attack the problems of a less glamorous but equally important technology. The country's civil engineering "infrastructure" is crumbling (see this issue, p 364). You don't win elections by mending roads and sewers, but you might lose them if you don't. Fibre-optic communications, push-button telephones and microcomputers won't stop the country from literally collapsing around our ears. It is encouraging that the technical will does come from the top. The Prime Minister talks a lot about science and technology. (She sometimes gets it wrong, as she did when awarding Cesar Milstein a Nobel prize for "cloning" something.) She even claims to be the government's science policy coordinator--a role we shouldn't attach too much weight to, according to Mrs Thatcher, who once told <1New Scientist>1 off for inflating this part of her job beyond what she really had in mind. Thatcher is quite right not to want to be the Minister for Science as well as Prime Minister. To a great extent she has to act as referee--she might prefer "umpire"--to see that her ministers play the game according to her rules. However, the idea of having a Science Minister who has other ministerial responsibilities does make sense. Perhaps Britain should follow the Canadian example. There the science and technology portfolio has passed around among ministers, sometimes landing on a peculiar desk. It is now in the hands of the Minister for Economic Development. Not only does this mean that science is handled by a senior member of the Canadian cabinet--a tax lawyer who once said that the last thing that Canada needs is more tax lawyers--but also it means that science and technology are seen as a part of the process of economic development. 
Britain does have a Secretary of State for Education and Science, but there the science is secondary to the education portfolio and is too narrowly defined. Perhaps whoever becomes Secretary of State for Industry after the general election in four weeks' time--what a mercifully short election period--should be called the Secretary of State for Industry and Technology. [] <2THIS WEEK>2 <2CERN physicists find the Z particle>2 CERN, Europe's centre for research into the fundamental nature of matter, may have pulled off another important discovery--the second this year. One group of researchers has found signs of an elementary particle known as the Z⁰ (pronounced Z-nought). This news comes hot on the heels of the announcement earlier this year concerning W particles, the electrically-charged partners of the neutral Z <1(New Scientist,>1 27 January, p 221). The W and Z particles figure prominently in attempts to develop a single unified theory of the particles that form the matter of the Universe and the forces that mould them together. Generally, different theories are used to explain widely different phenomena, from the forces in the atom to gravity. But unlike the W particles, which can be predicted by other "non-unified" theories, the Z⁰ is a feature only of a theory that unifies forces in a particular way. As such the Z⁰, more than the W particle, is an indicator that the theorists are on the right track. The theory that predicts the existence of the W and Z particles evolved largely through attempts to understand the weak nuclear force. This force is effective at distances of one hundredth the diameter of the proton, or 10⁻¹⁵ cm. Without this weak force the Universe would be very different, for the force mediates the radioactive decays of many subatomic particles and atomic nuclei. Such weak interactions underlie the processes that fuel the Sun and other stars. The idea was to develop a theory in which the weak force is carried by a type of particle. 
The precedent for this approach lies in the successful theory of quantum electrodynamics (QED), which describes the electromagnetic force in terms of the exchange of photons. The photons are the massless packets--"particles"--of energy that constitute light and all other forms of electromagnetic radiation. QED pictures electrically-charged particles, such as electrons, interacting by exchanging photons, much as rugby players interact by passing a ball between each other. In the case of the weak interaction the "balls" must be very heavy, because the force is so feeble and has such a short range: it would be difficult to throw a rugby ball made of lead very far, very often. The carrier of the weak force would also have to be electrically charged, for a neutron to decay into a proton, for example. Thus was born the idea of the W particle, which, according to the latest versions of the theory, should weigh in at some 85 times the proton's mass, just as two teams at CERN found earlier this year. A theory with only charged W particles turns out to have infinite quantities that arise in calculating certain effects. But these embarrassing infinities cancel out if the theory includes not only the weak force, but also the electromagnetic force, and two neutral carriers as well as the charged W particles. One of these neutral particles is simply the photon, the carrier of the electromagnetic force. The other is the Z⁰, which is predicted to have a mass in the region of 95 times that of the proton. The experiments that are now turning in the goods are on CERN's largest particle accelerator. This takes protons close to the speed of light, when their energies are hundreds of times greater than that needed to create the mass of a proton. But the device can also accelerate antiprotons, the antimatter equivalents of protons with the same mass but negative electric charge. 
The antiprotons travel in the opposite direction to protons around the ring of magnets that steer the particles on a circular path, so they pass many times through accelerating electric fields. At energies roughly equal to 270 proton masses each, the two beams of matter and antimatter collide, creating a host of subatomic particles from their combined pool of energy. The group of researchers that has seen the first indications of a Z particle is code-named UA1, and is one of the two teams that reported the evidence for the W particle earlier this year. UA1 is a collaboration of 100 or so physicists from Europe and the US, led by Alan Astbury from the UK's Rutherford-Appleton Laboratory and Carlo Rubbia from CERN. The team has been scrutinising new data collected during April, and last week it found one proton-antiproton collision recorded in its apparatus which bears the hallmarks of a Z⁰. What the apparatus "sees" is the decay of the Z⁰ into an electron and a positron (an antielectron) flying off back to back from the point of the collision. If they really do emanate from a Z⁰ the electron and positron should together carry energy equal to the particle's mass, and this seems to be the case. Clearly the UA1 team remains cautious about claiming this single "event" to be a Z particle. But the researchers are optimistic. The mass is apparently about right, and the Z⁰ has appeared at just the right frequency. The UA1 team has now found four more possible W particles in the latest data, to add to the five found earlier. So everything fits neatly. The crucial test of the electroweak theory will come only when the experimenters have several Z particles under their belts and can say exactly what the Z⁰'s mass is. Then the masses of the W and Z particles should relate in a precise way according to the electroweak theory. If this relationship proves right, it will be a triumph not only for CERN, and for the electroweak theory, but for particle physics in general. 
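For readers who want the "precise way" the two masses should relate spelt out: in the standard electroweak theory the ratio of the W and Z masses is fixed by a single measured parameter, the weak mixing angle θ_W. The sketch below uses the then-current measured value sin²θ_W ≈ 0.22, which is not quoted in the article and is our assumption:

```latex
% Electroweak prediction linking the W and Z masses:
M_W = M_Z \cos\theta_W
% With the measured value \sin^2\theta_W \approx 0.22,
% \cos\theta_W = \sqrt{1 - 0.22} \approx 0.88, so
\frac{M_Z}{M_W} = \frac{1}{\cos\theta_W} \approx 1.13
% which is roughly consistent with the masses quoted in the text:
% M_W \approx 85\,m_p implies M_Z \approx 85 \times 1.13 \approx 96\,m_p,
% close to the predicted "region of 95 times" the proton mass.
```

Once several Z⁰ events have pinned down the mass precisely, this single ratio is the make-or-break test: either it matches 1/cos θ_W or the unified theory is in trouble.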
[] <2Victorious particle physicists face cash crisis>2 THE TIMING of the discovery of the Z particle has a particular irony for Britain's particle physicists. For they have just been faced with the prospect of losing £1 million of the money they have to spend on research this year. The £1 million is the particle physicists' share of the extra cost of Britain's subscriptions to international laboratories, such as CERN, caused by the weakness of the pound. These subscriptions amount to about 20 per cent of the Science and Engineering Research Council's budget. The nuclear physics board will lose £1 million out of its £45 million total budget. £30 million of the board's money is accounted for by the CERN subscription, and a further £10 million goes on salaries and maintenance, leaving only £5 million for capital investment in vital equipment tailor-made for experiments at CERN and other laboratories. Of this, £1 million is a large chunk that will reduce the UK's ability to contribute to the type of research that led to the discovery of the Z⁰. [] <2Lead: car firms bid to call shots>2 EUROPE'S car-makers have decided to fight Britain's proposal to convert the continent's new cars to low-octane, lead-free petrol by 1990. The counter-attack will begin next week in the office of Giles Shaw, under-secretary at the Department of the Environment. There, representatives of British manufacturers will try to overturn the government's commitment, made only last month, to put the onus on the car firms to produce vehicles that could run on low-octane petrol by 1990 at the latest. The manufacturers want the burden of making the change to lead-free fuel put onto oil companies who could, at a price, produce substitutes for lead in petrol. This would allow high-octane petrol to remain the staple of British garage forecourts. Last week, Shaw announced that tripartite meetings between himself and the oil and motor industries would begin within a fortnight. 
Their purpose would be to set a timetable for the conversion of British cars to low-octane fuel. But this week the Society of Motor Manufacturers and Traders told <1New Scientist>1 that it intended to force a reopening of the debate on whether low-octane fuel should become the norm in Britain. It will tell Shaw that the 92-octane norm is a "non-starter", according to a spokesman. Sister bodies in Europe have agreed to tell their own governments the same thing. Britain's car manufacturers and oil companies now accept the inevitability of lead-free petrol. But they are battling over how to go about it. So far the oil firms are winning. They want low-octane fuel because they can manufacture it without having to spend money to make substitutes for lead, which would be necessary to retain higher-octane fuel. "It is the most energy- and cost-effective way to introduce lead-free petrol for the mass market," Douglas Harvey, director-general of the UK Petroleum Industries Association, said last week. The Royal Commission on Environmental Pollution and government ministers accept this view. The motor industry thinks differently. Retooling to make cars that run on low-octane fuel will cost money. More important, the poorer performance of low-octane cars might reduce sales and could lead to a renewed onslaught on the British market by the Japanese car firms, which have a head start in the production of low-octane cars. The battle between the oil and car firms has been mirrored among environmentalists. Des Wilson, head of CLEAR, the anti-lead campaign, has accepted the oil industry's case and is claiming outright victory. But other campaigners, such as Professor Derek Bryce-Smith from Reading University, believe victory may be a long way off. It will take many years to phase out existing cars that run on high-octane leaded petrol. A high-octane lead-free substitute could be run in existing cars almost immediately. 
There are two ways of producing high-octane lead-free petrol. You can do it by more intensive refining of crude oil. This increases the proportion of hydrocarbons with stronger and more compact molecular structures in the petrol and so raises the octane value. But it requires more crude oil than "normal" petrol. The second method is to add a substitute for lead, such as methanol or methyl tertiary-butyl ether (MTBE). Methanol is toxic and dissolves the soldered joints in petrol tanks. It also needs a second additive to help it dissolve, and raises the vapour pressure of petrol, which can cause havoc with a car's fuel pumping system. MTBE is more expensive but better behaved. In fact Shell, Britain's leading oil company, marketed ether blends of petrol in the 1950s for hot-rod enthusiasts. At the moment the entire European production of MTBE is around 500 000 tonnes per year, though this will double if a company called Highland Hydrocarbon decides to go ahead with its plan for a new plant making MTBE from North Sea gas at Nigg Bay in Scotland. The motor industry remains hopeful of eventual success. It calculates that drivers will be angry when they hear that the switch to lead-free petrol could reduce performance--especially in West Germany, where there are no speed limits on autobahns. They say that 20 per cent of American motorists who can run cars on unleaded petrol are switching back to lead because it gives a better performance. Also, the pressure on ministers to hustle along with the removal of lead may subside after a general election. As one senior civil servant in the Department of the Environment's directorate on environmental pollution told an anti-lead campaigner after the minister's announcement: keep on lobbying. [] <2Canvey go-ahead>2 IN ITS dying days, the British government has given the go-ahead for a controversial plan to build another oil refinery on Canvey Island in the Thames estuary. 
The island already has more hazardous plants than any other part of Britain. Ministers also found in favour of allowing a methane terminal, run by the British Gas corporation less than a kilometre from the refinery site, to stay open. [] <2Science you can touch--Bristol fashion>2 BRISTOL may soon host Britain's first centre for "interactive science". The Exploratory will give visitors the opportunity to discover why mirrors invert sideways but not up and down, to measure their heart rates while pedalling a bicycle, and to re-enact Galileo's experiments with falling weights on inclined planes. Richard Gregory of the Brain and Perception Laboratory at Bristol University is a key figure on the board of trustees for the Exploratory. He says that the centre is not in competition with existing science museums, where valuable exhibits are preserved inside glass cases. Visitors will "re-enact experiments with the technology available in previous centuries, to show the immense significance of technology for science from the earliest times". It will be a place to link technology with science, to stimulate and amuse, a place to find out about the natural and man-made world. Initially the Exploratory will house 500 exhibits. Gregory hopes it will become a centre for research into communication and understanding in science. The Exploratory's trustees want to raise £2 million from industry (their main target is computer firms) to set up the centre and guarantee its future for at least three years. A likely site for the centre is an old tobacco warehouse. It could be linked with other tourist attractions in the city by running steam trains along an existing track through the docks. The Nuffield Foundation (which paid for the feasibility study) has given £30 000 and the trustees are advertising for a project director this week. 
[] <2Last chance to see the IRAS Comet>2 TONIGHT (Thursday) is the last chance in Britain to see the comet that has passed closer to the Earth than any for two centuries--and has caused much confusion over who discovered it. Comet IRAS-Araki-Alcock is widely claimed to be the first comet discovered by an orbiting observatory--the Infrared Astronomical Satellite, IRAS. Moving objects spotted by IRAS are analysed at the Jet Propulsion Laboratory (JPL), California. But analysis at the JPL is slow. So John Davies and Simon Green of the University of Leicester have collaborated with the Rutherford-Appleton Laboratory in Britain to look for fast-moving objects in IRAS signals as they arrive from the laboratory's receiving station in Oxfordshire. They spotted a fast-moving object on 26 April and messages went out to optical observatories to identify it. Astronomers in Tokyo took a picture but reported nothing visible, although they now say there was a bright object and "we cannot imagine how we missed it". Swedish astronomers photographed the region on 27 April, and saw the comet--but the British bank holiday delayed the confirmation for several days. In the meantime, two amateur astronomers found it simultaneously--Genichi Araki in Japan, and George Alcock in England. Alcock found it within one minute of starting his regular binocular search of the sky, after nine nights of cloud. It is his fifth comet discovery. Astronomers at JPL, afraid of being scooped by the amateurs, put out a press release claiming IRAS discovered it first--but they failed to mention the British team that found it for them. The latter is clearly stung. Leicester University's Professor Jack Meadows told <1New>1 <1Scientist>1 they were suffering "dignified pained surprise". Ironically, George Alcock maintains that he does not mind sharing the honours with the professionals. 
Meanwhile, the comet swept past the Earth yesterday at less than 5 million kilometres, the closest since Lexell's comet of 1770. Tonight it is in the constellation Hydra (to the left of the bright star Procyon), more-or-less due south. It should be visible to the naked eye as a fuzzy patch (without a tail) two to three times the size of the Moon, and moving at a relatively high speed--by about its own diameter every hour. By tomorrow night (Friday) it will be too far south to be easily seen from Britain, as it heads towards its closest point to the Sun on Saturday week. [] <2Europe plans to boost research>2 OFFICIALS at the European Commission in Brussels hope to receive approval next month for an ambitious programme to breathe new life into conventional industries using new technology. The project is part of a four-year plan to double the amount that the European Community spends on research and development. The officials want to promote innovation in traditional industries, such as steel-making and clothing, to the tune of £210 million over four years. The cash would pay for research into technologies such as lasers and fund demonstration projects in factories. The aim is to fight off competition from developing nations like Brazil and South Korea. Also a part of the overall plan to boost research is the European Strategic Programme for Research and Development in Information Technologies (ESPRIT). The project being proposed by the Commission would put up £450 million for collaborative work in computers and automation. The cash would be matched by a similar sum from private companies. The rubber stamp for the proposals could come at a meeting of the Community's Council of Ministers at the end of June. Officials say that the German government--which presides over the council until 1 July--is anxious to debate the research plans before its period of stewardship is finished. 
In their proposals for the conventional industries, officials at the Commission have earmarked a list of technologies. For example, research in composites, polymers and other new materials could aid a range of engineering industries. Other disciplines in which the Commission wants to fund R&D include surface science, new testing methods, joining techniques, catalysis and corrosion research. At the top of the list of industries which the Commission wants to help is the clothing business. In recent years, clothing firms in Western Europe have shrunk by half. Yet the Commission says the industry still has a lot of life in it. For instance, lasers could cut cloth into small segments. Then robots could transfer the cloth between different work stations in a factory, where automated sewing machines would put the pieces together. The programme on the conventional industries forms a key part of a "framework document" on research that the Commission produced last November. It would shift the balance of Community research away from energy projects--which at present take up 63 per cent of the Commission's research funds--and towards schemes that promote industrial competitiveness. Peter Marsh [] <2Acid rain kills Welsh fish>2 ACID RAIN is decimating the fish stocks of many Welsh rivers, according to a report presented to the Welsh Water Authority by Roscoe Howells, its director of scientific services, this week. A survey has revealed that "many of the upland streams, rivers and lakes draining afforested catchments in Dyfed and Gwynedd [south-west and north-west Wales respectively] cannot now support natural fish populations and have depleted populations of aquatic plants and animals". In the river Tywi in mid-Wales "native brown trout cannot survive the combined effects of the acidity and elevated aluminium concentrations found in water draining from conifer forests in the area." 
The Berwyn catchment in north Wales is now too acidic even to support the American brook charr, which was introduced specifically to cope with the acid. Howells adds: "The genetic implications of introducing exotic species are causing concern." In the Brianne catchment of mid-Wales, streams with a hardness of less than 8 milligrams per litre of calcium carbonate "have a depleted flora and fauna and salmonid fish are either absent or present only in small numbers". The report goes on to warn that "failure to neutralise acidic waters can result in excessive corrosion of [water] mains with, in some areas, unacceptably high lead levels due to plumbosolvency" [that is, lead in pipes being dissolved by the acidic waters]. [] <2Kohl's call to rearm science>2 WEST GERMANY'S new Chancellor, Helmut Kohl, has called for the revival of scientific elites and more competition between universities in order to revive his country's technological fortunes. "Our society needs technical progress," Kohl said in his first policy declaration since his election in March. The first visible sign of his drive for progress has already been set in the nuclear field. Kohl has decided to go on with a fast-breeder reactor at Kalkar on the Rhine, although development costs have quadrupled to 6.5 billion DM. [] <2Canadian satellite U-turn>2 THE CANADIAN government has made a complete about-face over the regulation of private satellite-TV dishes. In the process it has relieved 5000 Canadians who already own dishes from the threat of prosecution. In the past, Ottawa has taken such owners to court in an attempt to maintain regulatory control over what TV programmes Canadians would be allowed to see. The government has now effectively given up trying to protect Canadian television (much of which is publicly funded) from new competition from US satellites. Instead, it is providing extra money for the development of film for TV. 
The decision will create problems for Canada's new pay-TV channels, which have been sending signals via satellite to cable companies. Dish owners will receive the programmes free, unless the companies switch to the expensive option of transmitting scrambled signals. [] <2Smoking cure claim>2 A MONTREAL doctor has come up with a smoking cure that he claims is better than 85 per cent effective. Dr William Najaar has concocted a mint-flavoured mouthwash. The green product is now available in Canada at about $14 for a 100 millilitre bottle that will last for ten days--long enough, he says, to cure anyone's smoking habit. Tabinal is used as a rinse twice a day, after breakfast and dinner. The mineral salt in the solution reacts with the salivary glands and taste buds. Najaar claims that it does not have any side-effects and that it does not affect the taste of food. But when a user smokes, his mouth will taste like "four dozen chewed-up cigars". [] <2Sizewell's pressure vessel under the microscope>2 THE CENTRAL Electricity Generating Board's decision not to design the pressure vessel, the massive steel "heart" of its proposed pressurised-water reactor at Sizewell, in larger components came under fire last week from two eminent nuclear engineers. Concern over the risk of stress affecting the lengthy girth welds on the upper part of the pressure vessel has prompted criticism from Dr David Leslie, professor of nuclear engineering at the University of London. His critique is supported by an analysis of the reactor's design carried out by Dr Karl Kussmaul, director of West Germany's state laboratory for the testing of materials and a member of the German Reactor Safety Commission. Their reservations were voiced at the long-running Sizewell public inquiry, where the Suffolk local authorities have just finished their evidence. Professor Leslie is the authorities' safety consultant. 
His pro-nuclear views are well known--he was a founder of the pressure group A Power for Good--but he said he was "unconvinced by the board's choice of style of pressure vessel". Dr Kussmaul argues that the integrity of the pressure vessel can be more adequately assured if the whole component is cast as one unit. The generating board is proposing that the part of the vessel around the coolant nozzle, where the cooling system joins the vessel, will be cast in separate parts. Dr Kussmaul says these parts should be combined into one "single, smooth highly reinforced ring . . . the resulting monolithic structure has lower stresses than the composite one with through-wall nozzle weldments." Doubts over the safety and reliability of the PWR pressure-vessel design were a key reason for the initial rejection of the PWR for the British nuclear programme. Following requests from objectors, the inquiry inspector, Sir Frank Layfield, has directed the United Kingdom Atomic Energy Authority to produce a hitherto-unpublished report which is critical of current CEGB assurances on the likelihood of a significant crack developing and affecting the integrity of the pressure vessel. The NII, the government's safety watchdog, last week finished explaining what issues still need resolving. Six major points require fundamental design changes, perhaps costing as much as £60 million. The pressure vessel is not among them. [] <2Genes test for employees>2 EIGHTEEN large American companies are now testing prospective employees for genetic traits that might put them at special risk in their work. The first survey ever of such tests has found that another 59 companies say they may start testing. Congress's Office of Technology Assessment asked America's 500 largest companies, plus several utilities and unions, to report whether they monitored workers or screened applicants. 
Those that do are, it turns out, using methods that do not meet scientific criteria for routine use in the workplace, the OTA reported. Most companies said they used the results as research, although a few said they advised workers that they were at higher risk in certain jobs than others. Although some 10 million American workers are regularly exposed to dangerous chemicals or radiation, very little is known about links between induced genetic damage and subsequent disease. Genetic tests of body fluids fall into two categories. Genetic monitoring regularly assesses workers for cytogenetic damage--major changes to chromosomes--or may apply other, less perfected, tests that locate mutagens in body fluids or look for damage to individual genes. Genetic screening inspects workers before exposure to seek out inherited traits such as sickle-cell, a defect in blood cells that was the most common of the genetic signposts the companies looked for. Congressman Albert Gore, who requested the survey, noted that screening carries Orwellian implications. He said that there are no guidelines in the US for controlling what tests are done and how their results are employed. He fears that groups of workers might become the subject of discrimination based on inherited traits. Some companies search for a deficiency in an enzyme, glucose-6-phosphate dehydrogenase (G-6-PD). Controlled by a single gene, the deficiency is usually harmless, but certain drugs for malaria, or a type of bean, can cause those without the proper gene to suffer acute anaemia, and it is thought that some chemicals might do the same. [] <2Spacelab will fly--but half blind>2 SPACELAB, Europe's first major voyage aboard America's space shuttle, will probably hobble through its paces at less than full capacity, according to scientists involved in the project and the National Aeronautics and Space Administration. 
Last month, the first of two Tracking and Data Relay Satellites (TDRS), which are needed for the Spacelab mission, failed to reach geosynchronous orbit after being launched from the space shuttle Challenger. The 2250-kilogram communication satellite is now in an egg-shaped orbit well below its target, some 35 000 km above Brazil. Last week, officials from NASA began the first of several dozen firings to nudge TDRS-A into its proper orbit. They said they were optimistic. However, the unexplained failure of the upper-stage booster attached to the $100 million satellite makes the launch of TDRS-B "very remote", say NASA's experts. Scientists from Europe and from NASA met this month to determine whether Spacelab could operate with just one TDRS satellite. They resolved that despite the "significant" loss of scientific data from some of the 70 experiments on board, Spacelab should be launched on schedule, on 30 September this year. The relay satellites, each of which can relay 300 million bits of information, or about five million words, per second, were needed to handle the massive flow of data collected and immediately beamed down by Spacelab's instruments. Without them some data gathered in orbit will have nowhere to go. According to officials of the European Space Agency (ESA), about 60 per cent of the experiments will go ahead as planned. The rest will suffer some loss. Biological experiments will be particularly hard-hit, they said, although they declined to be drawn on which particular experiments could be expected to suffer. Because the shuttle will be out of contact with a TDRS for about half its 90-minute orbit of the Earth, scientists on board will be cut off from those on the ground who designed the experiments. Scientists are concerned that a special event, such as a sunspot or supernova, might occur during this long period of communications blackout.
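As a quick check, the quoted data rate and the "five million words" figure are consistent, as a back-of-envelope sum shows. The bits-per-word assumption below is ours, for illustration, not a NASA specification:

```python
# Checking the TDRS figure quoted above: 300 million bits per second
# against "about five million words" per second.
bits_per_second = 300_000_000

# Assume an average word of six letters plus a space, at 8 bits per
# character -- an illustrative assumption, not a NASA specification.
bits_per_word = 7 * 8

words_per_second = bits_per_second / bits_per_word
print(f"{words_per_second:,.0f} words per second")  # roughly five million
```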
Also, Spacelab was designed to relay its data immediately; now some data will have to be recorded on 25 to 30 videotapes that NASA is now making room for aboard the space shuttle. Christopher Joyce, Washington DC <2Etna-watchers slip up on lava flow>2 THIS WEEK's attempt to divert the flow of lava gushing out of the erupting Mount Etna in Sicily could be an expensive failure--because a British team of vulcanologists taking essential measurements has had to return home after its money ran out. Meanwhile, Europe's scientists are claiming that observation of the volcano is a dog's dinner. They say the observers are underfunded and disorganised. The result is both a danger to the local population (such as the town of Nicolosi, which is in the path of the present flow) and a loss of valuable research opportunities. John Guest, from University College, London, spent three weeks at Etna, along with two colleagues, at the start of the eruption six weeks ago. But they had to return because an emergency fund from the Royal Society, which paid their bills, ran out. While Guest was there he measured the flow rate and thickness of the lava outpouring--the kind of data which, he says, is essential for understanding what will happen when the flow is diverted into a side channel. As it is, the estimates by engineers this week of how the diverted lava will flow, and how soon it might return to its old course, will be based largely on guesswork. Delegates at a conference on the surveillance of volcanoes, held in Palermo, Sicily, last week, heard that observations at Etna were being neglected. Speakers had masses of data on well-studied Hawaiian and Japanese mountains, but little on the one erupting within sight of the conference room. A new observatory on the north-east slope of Etna, which was finished in 1981, stands empty. A row within Italy's National Research Council means that there is no money to run it.
Haroun Tazieff, the French commissioner for natural disasters, attacked the "armchair vulcanologists" of the University of Catania's Institute of Vulcanology. They lack the stamina and the courage to keep close tabs on the mountain, he said. [] <2Greek physicists perfect earthquake prediction>2 THE earthquake in California last week came without warning and highlighted the need for a reliable method of predicting earthquakes. Even a few hours' warning can save lives and reduce injuries. Now a Greek team, headed by Professor Cesar Alexopoulos, of Athens University, has developed a technique which seems to provide just that much warning--at least for the earthquakes that have hit Greece repeatedly over the past two years. The story goes back to the major earthquake, magnitude 7 on the Richter scale, which rocked Greece in February 1981. It was centred on Athens and felt by more than a third of the population. Since then, the region has experienced a rash of smaller earthquakes. Amidst the excitement after that earthquake, a team of physicists from Athens University discussed how they could put their knowledge of the behaviour of solids to practical use for predicting earthquakes. Alexopoulos and his co-worker Panayotis Varotsos applied their work on point defects in crystals to this cause. The key feature of the crystal work is an electric current (known as the piezoelectric effect) produced when some crystals are squeezed, or when pressure on them is relaxed. When a solid which contains electric charges, produced by the presence of impurities in the crystal lattice, is subjected to a steadily increasing pressure, it will give a short-lived pulse of electricity--a transient electric current--well before it ruptures. Could this property be useful in earthquake prediction? The Athens team found that it could.
In the weeks after the 1981 earthquake, it discovered a jumble of these electric pulses superimposed on the normal electrical properties of the solid Earth. These seemed to be correlated with the aftershocks from the 'quake, predicting the shocks several hours ahead. The team developed a system to amplify, record and monitor these electric pulses, and gave it the name "VAN box", from their initials. With detectors spaced 100 km apart, they found that the strength of the pulse before an earthquake depends on the distance of the detectors from the epicentre of the 'quake. This gave them the key to predicting both the location and approximate intensity of the earthquakes, with a delay of between 7 and 11 hours from the prediction to the event. A minimum of three VAN stations is required to fix the location by triangulation, but the modest range of the technique means that a network of many stations would be needed to cover even a country the size of Greece. With little official funding or backing, the team went ahead to establish just such a network. Today, they have 18 VAN-equipped automatic stations in different parts of Greece. The stations are connected by telemetry to the suburban house of Varotsos in Glyfada, near Athens, where he and his wife Mary maintain a 24-hour watch on the instruments. Continuously rolling recorder charts are checked for the distinctive blips. Three more stations are operated manually. The 21 stations are all located in Army camps, where volunteers assist with the monitoring. The cost of the work has been met piecemeal by the state as the network has developed. Very roughly, Varotsos guessed, the total cost of hardware is about 20 million drachma (£160 000). Earthquakes can now be reliably predicted for the greater part of Greece. Predictions of the location of the epicentre are accurate to within 50-80 km, and strength on the Richter scale has also been predicted successfully.
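The triangulation step can be illustrated with a toy calculation. Everything here is invented for illustration (the station positions, the distances, the crude grid search); the real VAN method infers distance from pulse strength and is doubtless more refined:

```python
import math

# Three hypothetical VAN stations (coordinates in km) and the
# distance each infers to the epicentre from pulse strength.
stations = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
distances = [70.7, 70.7, 70.7]   # invented measurements

# Brute-force search for the point whose distances to the three
# stations best match the measurements -- crude, but adequate when
# the quoted accuracy is 50-80 km.
best, best_err = None, float("inf")
for x in range(101):
    for y in range(101):
        err = sum((math.hypot(x - sx, y - sy) - d) ** 2
                  for (sx, sy), d in zip(stations, distances))
        if err < best_err:
            best, best_err = (x, y), err

print(best)   # the epicentre estimate
```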
These successes apply to 400 seismic events since March 1981. No major earthquakes have occurred in the region since 1981, so there is no certainty that the prediction technique will work for such events. There is also no way of knowing what use the authorities will make of the information if and when the VAN team does give them a warning that another Richter-7 earthquake is due in 7 to 11 hours' time. But interest in the work is running high, and Russian and Chinese teams have established contact with the Greeks and are drawing up plans to establish their own networks. Will California be next? Costis Stambolis [] <2Genes give a lift to small children>2 BIOTECHNOLOGY seems set to come to the aid of the short children of America. Human growth hormone is the next in a long line of chemicals to be manufactured industrially, using recombinant-DNA techniques. Genentech, a San Francisco-based firm, aims to supply the product to the 10 000 or so American children who suffer from a deficiency of the hormone, which is vital for normal growth. But critics have warned that a plentiful and cheap supply of the hormone is open to abuse in a nation that is obsessed with the idea that "big is better". The company hopes to receive approval to put the product on the market within the next few months. Dr Alfred M. Bongiovanni, professor of paediatrics at the University of Pennsylvania, Philadelphia, told the American Academy of Paediatrics that once the growth hormone reaches the market, doctors might be pressured by parents to give the hormone to children who are slow developers. These children make up the majority of patients in growth disorder clinics. "Very often these children achieve satisfactory final heights without any treatment whatsoever," he said. Currently, fewer than 3000 children in the US who are deficient in growth hormone are being treated with hormone that is derived from the pituitaries of dead people.
Some 2200 receive free treatment through the national hormone and pituitary programme, in Baltimore. The cost of therapy with natural growth hormone runs to about $5000 to $10 000 per year. Little is known about the side-effects of synthetic growth hormone, particularly in normal children. But Dr Bongiovanni said that, in animal studies, very high doses of the hormone have induced diabetes. And normal adult volunteers, given growth hormone derived from DNA at twice the normal dose, have developed raised blood-sugar levels. But other doctors disagree. Dr Raymond Hintz, head of paediatric endocrinology at Stanford University, for instance, said that none of the adults in his studies became diabetic. And, during his previous experience treating children with natural growth hormone, none ever developed elevated blood-sugar levels. Genentech says that once the company gains approval from the Food and Drug Administration it will distribute the drug only to hospitals. However, the company has its eye on the estimated 200 000 American children with "normal variant short stature", who produce growth hormone but apparently lack sufficient receptors to put it to work. Some of these children respond well to doses of growth hormone. Just how many are helped is a question now being taken up by the Lawson Wilkins Society, a group of paediatric endocrinologists. [] <2Nuclear submarines should be buried in the desert---not at sea>2 THE UNITED STATES' navy is under heavy fire for its plan to dispose of 100 or more of its old nuclear submarines off the Atlantic and Pacific coasts over the next 20 or so years. A committee of scientists, commissioned by the Oceanic Society, has chastised the navy for its "weak scholarship". Burying the submarines in land dumps would be much safer, it says. The society claims that the navy has ignored the effect of radioactivity on deep-sea ecology, a subject about which scientific knowledge is still sketchy.
"The principal concern", says the committee's report, "is radioactivity which, during operation of the submarine's propulsion system, accumulates in the stainless steel equipment which comprises the vessel's reactor and steam generator." The navy is considering dropping the submarines either into existing nuclear dumps in South Carolina and Washington state or into watery graves some 4000 metres down off Cape Hatteras, North Carolina, or Cape Mendocino, California. The navy says that burial at sea is less expensive, demands less shipyard work, and is isolated from human activity. Corrosion-resistant metals in the hull and reactor plant would keep radioactivity in check after burial at sea. "Most of the radioactive nuclides would have decayed to stable atoms before they could possibly be released to the environment by the slow corrosion process." It adds that 99.9 per cent of the radioactivity remaining in each submarine is "an integral part of the corrosion-resistant alloy forming the plant components." The Oceanic Society does not accept many of the navy's presumptions. The navy says an accident would not give anybody a dose higher than the four millirems per year limit set for drinking water in the US. But the society says the navy has left little margin for error in a "worst case" incident. And the society's scientists claim that the navy's consideration of land burial is "sketchy and inadequate". They recommend that the sealed sections of the hull containing the reactor compartments should be buried in an open trench in an arid or semi-arid environment. The climate is crucial, because a reactor compartment exposed to dry air will rust more slowly than one buried in soil or sunk at sea. The navy appears to have ignored literature on the migration of radionuclides from wastes into the water, sediments and organisms of the ocean, the scientists say.
Communities of fish and invertebrates are attracted to structures such as oil platforms and nuclear waste dumps. The waters off North Carolina host numerous fish, a potential biological vector for transport of waste materials. [] <2Pit-fire could last a hundred years>2 ON COLD, damp spring days, the streets get so smoky in Centralia, Pennsylvania, that traffic is forced to a standstill until visibility improves. And the smell of coal gas forces residents to wear makeshift masks. This small American town (population 1100) is being engulfed by smoke from a huge underground fire in coal seams beneath its streets. The fire is burning at temperatures up to 500°C in the honeycomb of mine tunnels 150 metres below the surface. The danger of toxic gases, such as carbon monoxide and sulphur hexafluoride, has led to the installation of a computerised gas-detection system in 60 homes. One day, engineers fear, the town will collapse into a fiery hole. The fire probably started in a rubbish dump, above an old strip mine a few miles east of the town, in May 1962. From there it spread through the abandoned shafts and tunnels, feeding on oxygen in the shafts. The federal government has spent $2.8 million drilling 1635 holes, injecting 122 556 tons of fly ash, and flushing 117 220 cubic yards of sand into the burning tunnels in an attempt to put out the fire. All efforts have failed. If left unchecked, the fire could consume 24 million tonnes of coal in the next 100 years. The only solution is to relocate the entire town, cut out all the burning coal, fill the cavity with gravel, and then dig a trench around the perimeter of the old fire. In 1977 the cost of this solution was put at $67 million. There have been no injuries from the fire. The nearest accident came in February 1981 when a 12-year-old boy was nearly swallowed by a sudden opening as he walked across his grandmother's lawn.
He clung to a tree root until his cousin saved him from slipping into the abyss. [] <2British relief for AIDS victims is threatened>2 A NEW technology developed at a British university, which relieves the symptoms of acquired immune deficiency syndrome (AIDS)--the fatal condition rife among American homosexuals--may languish for lack of extra funds. Scientists at Heriot-Watt University in Edinburgh are using biotechnology to produce, in large quantities, a protein which can "filter" the blood of AIDS patients and remove the agent causing the worst symptoms. But the uncertain future of Speywood Laboratories, the small biotechnology company which funded the project, threatens its development. The protein, protein A, forms part of the cell wall of the bacterium <1Staphylococcus>1 <1aureus.>1 It binds IgG, one of the major immunoglobulins produced by the human blood in response to attack from outside, and which doctors think may be destroying the AIDS patients' own tissues. At the moment protein A is so scarce that it costs between £5 and £10 a milligram. Charles Brown, professor of microbiology at Heriot-Watt University, received about £30 000 from Speywood last year to develop a method of producing protein A continuously. It could yield kilograms of the material at a time. He needs an extra £100 000 to scale up his method to 10 litres--big enough for clinical trials. Brown says that additional funding looks uncertain. Speywood recently fell foul of its backers, the British Technology Group and Prutec, the investment arm of the Prudential Assurance company (New Scientist, 17 March, p 704). These companies expected Speywood to spend their £4 million investment on perfecting and marketing blood proteins from conventional technology--not funding new biotechnology projects. Some of these projects, including Charles Brown's, are currently under review. Most are destined for the chop.
The DHSS has pumped £21 million into a new blood purification complex at Elstree which has its own research and development facilities. This factory should make Britain self-sufficient in blood by 1985. At the moment the health authorities import between 60 and 70 per cent of their blood from the United States, where AIDS and hepatitis B contamination is rife. The laboratory has no money to pick up Charles Brown's project, should AIDS arrive here with a vengeance, although it is currently collaborating with Speywood to adapt its conventional purification technology to human blood. [] <2Searching for cases of AIDS in Britain>2 LATEST figures show that, in Britain, there are eight cases of acquired immune deficiency syndrome (AIDS). One of these AIDS victims was a haemophiliac who probably contracted the disease through infected blood products. The number, which has been confirmed by the National Communicable Disease Surveillance Centre in London, is about half the figure frequently reported. The discrepancy, however, is the result of the problem of defining AIDS. Diagnosis is a problem on several counts. Its early symptoms are similar to those of other disorders. One of its features, a lessening of the body's resistance to infection, seems to be common in homosexuals who do not have AIDS. Indeed, as many as four out of every five American homosexuals may have acquired such an immune deficiency which has not yet reached the full-blown AIDS stage. And, finally, a skin cancer called Kaposi's sarcoma, which is an indication of the syndrome, occurs in any case in elderly people who do not have AIDS. The problem of confusing AIDS with other conditions has led the Communicable Disease Surveillance Centre, based at Colindale, north London, to adopt a set of criteria. A person has AIDS if: o The patient has a reliably diagnosed disease which is at least moderately indicative of a defect in part of the body's defence mechanism.
Such a disease is likely to be "opportunistic"--in other words it is unable to attack a normal healthy body where the defence system is fully operational. The most common opportunistic infections associated with AIDS are a type of pneumonia, called pneumocystis pneumonia, and Kaposi's sarcoma. o There is nothing else to explain the suppression of the immune system, such as treatment with drugs, such as steroids, that are deliberately used for their immunosuppressive effects. And, to eliminate the possibility that Kaposi's sarcoma may be the normal type, the patient must be under 60. Armed with this definition, the director of the centre, Dr Spence Galbraith, and his colleague, Dr Marian McEvoy, have collected information in an attempt to assess accurately the extent to which AIDS has penetrated the population of Britain. Two AIDS cases have been identified from death certificates where Kaposi's sarcoma was mentioned as the cause of death. Three came from weekly hospital and public health laboratory reports, and two more from clinical reports from venereologists and dermatologists. But Galbraith admits there will be more: "Our results could seriously misrepresent the total number because sufferers do not necessarily go to venereologists or dermatologists. Many could well be going to ordinary doctors," he said. Since publishing an open letter in the <1British Medical Journal>1 asking all doctors to cooperate in the survey, the centre has had enquiries from doctors who may have patients suffering from AIDS. There seems little doubt that AIDS is caused by an infective agent of some kind. This would explain its prevalence among the more promiscuous gay communities, and its apparent transmissibility through blood products such as factor VIII, used by haemophiliacs. If the agent does exist then it must attack one of the two subdivisions of the body's immune system.
This subdivision, the cell-mediated immune response, is hampered by a lack of a certain type of white blood cell called the T-cell. The other subdivision, the B-cell branch, is left untouched. But the lack of T-cells, which are supposed to recognise foreign pathogens entering the body, means that antibodies made by the B-cells are just not produced. It is the T-cells which stimulate the B-cells to produce the necessary antibodies to protect the body against an invading pathogen. This explains why AIDS victims are prone to pick up, and indeed eventually die from, opportunistic infections, such as pneumonia caused by the protozoon <1Pneumocystis>1 <1carinii.>1 But why is it that AIDS victims suffer from such obscure infections? (<1P.>1 <1carinii>1 is well known but by no means common.) One suggestion is that AIDS patients, like the rest of us, have suffered from common viral and bacterial infections and so already possess the necessary antibodies for defence. There are fears that AIDS is the forerunner of new, and disturbingly dangerous, breeds of viruses which attack the body's defence mechanism. However, many virologists, including Tony Waterson from the Royal Postgraduate Medical School, disagree. It is just that there have been "colossal changes in the transfer mechanisms", he says; the widespread use of blood products and increased homosexual promiscuity are but two of them. As a group of German doctors said in a letter to <1The Lancet,>1 the AIDS epidemic does not even indicate a new infectious agent but "merely the introduction of an old pathogen into a group whose promiscuous lifestyles would ensure rapid spread". In America the Centers for Disease Control have identified 1336 AIDS cases.
As AIDS continues to spread, the 80 per cent mortality rate underlines the need for a greater understanding of its root cause. [] <2Nurses as researchers>2 ---------------------------------------------- Nurses are admirably placed to carry out clinical research. Now the few who do are banding together ---------------------------------------------- Georgina Ferry THE nearest a nurse usually gets to clinical research is to act as an assistant to the research team--consisting mostly of doctors. She collects samples, takes measurements and no doubt makes the tea: she is not expected to analyse the data she collects, and certainly not required to think about the questions those data might answer. But there are exceptions. A growing number of nurses are making a personal contribution to clinical research: some are even aiming for a higher degree. In any one institution such nurses are very few, and they find themselves cut off from research nurses elsewhere, as well as from nurses engaged in routine work on the wards. Some are now attempting to combat the isolation they feel by setting up a national body to represent their interests. But they have had to overcome a surprising amount of resistance to the idea that nurses should be involved in research, chiefly from within the nursing profession itself. In London, a small group of clinical research nurses based in the St Peter's group of hospitals and the Royal College of Surgeons has been meeting every two months for over a year, calling themselves the Clinical Research Nurses Association. At these meetings, the members can share their experiences while presenting papers on their own research. The five research nurses at Abingdon Hospital, near Oxford, independently came up with the idea that there was a need for a national representative body. The obvious place to start was the Royal College of Nursing (RCN).
The RCN has its own research society, to which a number of nurses in clinical research had individually applied for membership. Apart from the more obvious advantages of belonging to such a group, the RCN had an indemnity insurance scheme that clinical research nurses wanted to join. But they were rebuffed on the grounds that the society was not concerned with clinical research. Its members conduct studies that are mainly sociological in nature, looking at questions like the importance of counselling to hysterectomy patients. To some of the clinical research nurses, the attitude of the RCN seemed to be that clinical research was simply not the province of nurses. Disturbed but not deterred by this unpromising start, the Abingdon and London groups independently approached the RCN to discuss ways in which their activities could be brought under its wing. It was clear to the research nurses that they needed the RCN's administrative and financial backing in order to get a national association off the ground. Tom Keighley, research adviser to the RCN, took up their case. He admits that there has been "a degree of concern" within the RCN about the activities of nurses in clinical research. But he maintains that the attitude is now far more open-minded. He is currently involved in negotiations with the Abingdon and London groups of nurses with a view to providing a forum within the RCN through which nurses in clinical research can maintain contact, but could not say exactly when such a forum might be established. Keighley has also been involved in the arrangements for the first conference of clinical research nurses, at which they hope formally to launch a national organisation (still to be named). The conference takes place at the Royal Society of Arts on 27 May.
It will air questions such as the role and status of the clinical research nurse, legal aspects of her work and industrial relations; speakers will include the nurses themselves and the doctors they work with. What do the nurses hope to achieve by launching a national body? Their chief concern is that their status as skilled specialists should be recognised and respected. Patsy Poppleton, who is engaged in research at the Pain Relief Clinic at Abingdon Hospital, resents the fact that research nurses are often looked on as "slaves and handmaidens to doctors". She feels that a period in research should be an asset to a nurse's CV, rather than being regarded as time off from "real" nursing. Chiew Kong, who is working towards her PhD in cardiovascular physiology at St Philip's Hospital in London, agrees that a research nurse's expertise is often underestimated. She also outlined another misconception that could explain the suspicion research nurses often encounter among other nurses. This is the idea that research is incompatible with the nurse's prime concern, which is the well-being of the patient. Chiew sees her own research as being exclusively directed towards the well-being of patients. Moreover, she feels that the presence of a nurse on the research team offers tremendous reassurance to the patient undergoing tests. "Patients respond to nurses in a way they don't to doctors or other researchers," she told <1New Scientist.>1 A nurse who is trained in research methods is ideally placed, she believes, to act as a go-between between doctor and patient. Her manner and her uniform will secure the patient's trust as she carries out the necessary investigations, while her understanding of the aims of the study will ensure that she can offer the research team more than columns of figures. Doctors who include nurses in their research teams are enthusiastic about the contribution they make. Professor J. P.
Payne of the Department of Anaesthetics at the Royal College of Surgeons insists that a research nurse should always be present when tests are carried out on a patient. He has given the nurses every support in their efforts to gain recognition, and will speak at their conference. Dr Derek Thompson of St Philip's Hospital, who is supervising Chiew Kong for her PhD, says that research nurses perform a "very useful and vital role". He points out that their skill in handling patients is something doctors often lack; he regards the combination of basic nursing techniques and research expertise as a valuable asset to his investigations. Nursing is often a thankless occupation: the work can be tedious and dirty, the hours are unsocial and the pay is derisory. It is hardly surprising that some nurses, without wishing to leave the profession to which they are dedicated, should welcome the intellectual challenge of clinical research. But inevitably they have had to run the gauntlet of disapproval excited by any members of a well-defined class who seem to be getting ideas above their station. Doctors and nurses tend to keep each other at arm's length; no wonder the RCN had trouble coming to terms with the status of nurses who seemed to be engaged in the same work as doctors. The research nurses are establishing the principle that the rigid compartments of the medical profession are not as watertight as they appear. Science can have a human touch. [] <2The Empire's last stand>2 ---------------------------------------------------------------------------- Victorian sewers, water mains, railways, sea walls, reservoirs and canals are the last vestiges of British empire-building. They are still the foundation of most British cities. Now they are crumbling away. Who will step into the breach? Fred Pearce and Mick Hamer IT IS almost one hundred years since the abolition of the Metropolitan Board of Works in London brought to an end a 20-year orgy of building in the Victorian capital.
Now the board's successor, the Greater London Council, says that Victorian "infrastructure"--the sewers, water mains, railway lines and the rest that serve the capital--needs replacing. The cost over the next 10 years will be several billion pounds. London is not alone. The same story of crumbling Victorian buildings, tunnels, pipes and walls is being repeated in all Britain's major cities. Almost everywhere these edifices of civil engineering, the basis of life in urban Britain, have been taken for granted. The Victorians built to last. But not forever. The same is true wherever Britain's Victorian engineers ventured as they built the empire in those last decades of the 19th century. The sewers of Cairo, the buildings of Bombay, the railways of Africa. All are overloaded or broken down. SEWERS: At the end of last year the House of Lords' select committee on science and technology reported on the state of Britain's water industry. "There is a significant risk," the committee said, "of decay in the sewerage system getting beyond the water authorities' control . . . Too little has been spent on maintaining the system in the past and the industry, faced with signs of accelerating failure rates, does not yet appear to be doing enough to contain the rate of decay. The current recession makes the task more daunting but does not remove the need to tackle it." Most people in Britain have heard about Manchester's crumbling sewers. The city's chief engineer, Geoffrey Reed, and the gentlemen of the North West Water Authority have made sure of that. For most of the past decade at least one major thoroughfare in the city centre has been shut to traffic while workmen repair a large hole that has opened up in the tarmac after the collapse of one of the old brick sewers beneath.
There are tales about the double-decker buses that have fallen in (actually they have not yet, but the holes are big enough) and of the sewage flowing through the basement of the Midland Hotel. But Manchester is just the best-known centre of decaying sewers. There are 5000 collapses and blockages every year in England and Wales and, says the Water Research Centre, the number will carry on increasing in the coming decades. The Corporation of Manchester was among the first to build sewers. The giant brick structures were laid during the earliest days of the industrial revolution in Manchester, several decades before London got down to the task of comprehensive sanitation for its citizens. Sewers are showing serious signs of decay in other northern theatres of industrialisation: the steel town of Sheffield and the pottery towns around Stoke-on-Trent, for example. The Lords' select committee reported: "Since the UK has one of the oldest sewerage systems it is the first to face the demand for rehabilitation." The problems are not restricted to the big cities and industrial towns. Take the case of Tholthorpe village, just north of York on the main railway line to Edinburgh. A one-page note to the policy and resources committee of the Yorkshire Water Authority last summer revealed that: "The village is drained by sewers via two outfalls to the Derrings Beck. The flow from one outfall is discharged to the Beck without treatment, whilst flow from the other outfall passes through a small overloaded septic tank. The stream is at best grey and turbid and can be black and septic. On the three days per week that animals are slaughtered at the village abattoir the stream is discoloured by the resulting effluent." It goes on: "Frequent blockages and collapses occur in the sewerage system . . .
the sewer running along the road into the village from the south is overloaded and flooding of the forecourt of the New Inn and the highway near the village green occurs several times a year." Yorkshire has dozens of other villages with similar problems from primitive, overloaded sewage systems. In its 1982 review, the National Water Council records that replacing the sewers of England and Wales would cost £31 000 million at today's prices. The Lords committee, noting this, estimated that "assuming the average life of the sewers to be about 100 years, the industry ought to be spending £310 million a year to maintain the system in its existing state". Annual investment in renewal is in fact running at about £205 million. This, as the NWC modestly puts it, "appears too low". Quite apart from the fact that the existing state of the sewers is hardly good enough, the assumption of a 100-year life-span for sewers becomes increasingly unrealistic. Today's sewers are built with an expected life of 60 years. The principal faults of the sewerage system, says the Lords report, are twofold: "first, the hydraulic overloading; and second, physical deterioration due to age, ground subsidence, disturbance by traffic or excavation by other utilities." Subsidence due to coal mining is a notorious problem in such towns as Barnsley in South Yorkshire. The report describes, too, the extraordinary effect of the collapse of quite a small sewer. "When a sewer is damaged and hydraulically overloaded, internal water pressure forces water out into the surrounding ground; when the pressure falls the water re-enters the sewer, bringing silt with it, thus clogging the pipe and, more important, weakening the external support, thus accelerating the failure. A void grows around the sewer or in the vicinity of it and eventually the ground above collapses into the void. The hole resulting from a damaged 250 mm sewer can be big enough to swallow a double-decker bus."
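The committee's arithmetic on renewal spending is simple enough to set out explicitly. A minimal sketch in Python, using the figures quoted above (the shortfall line is my own subtraction, not a number from the report):

```python
# Lords committee arithmetic on sewer renewal (figures from the NWC's 1982 review).
replacement_cost = 31_000    # £ million: full replacement cost of England and Wales's sewers
assumed_life = 100           # years: the committee's assumed average sewer lifespan
actual_spending = 205        # £ million per year: renewal investment actually being made

required_spending = replacement_cost / assumed_life   # £310 million a year
shortfall = required_spending - actual_spending       # £105 million a year

print(required_spending, shortfall)
```

Note that the required figure only rises if, as the text argues, the realistic lifespan is nearer 60 years than 100.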
In London the Greater London Council has found that "most of central London's sewer system is more than 70 years old, of which a substantial proportion is more than 100 years old". Yet the Thames Water Authority has been cutting its spending on sewer renewal. It spent £18 million in 1980, but plans to reduce that to £13 million in 1984. The introduction of newly-developed linings that can be pushed down into crumbling sewers to extend their life is widely touted as bringing the cost of renovating the nation's sewers down somewhere nearer to what impoverished water authorities can afford. But many of the plastics being used in the linings are relatively untried and nobody has any real idea how long they will last. As Donald Rees from the South West Water Authority told the Lords: "It is all very well lining a sewer at half the price that it would cost to build a new one, but in fact it may only last a quarter of a normal funding life and may therefore be uneconomic." But the temptation of the quick fix is obvious for every water authority, faced with both its customers and its government demanding that water rates be kept down. As water engineers have begun to face up to the problems of crumbling sewers, they have become concerned, too, with the effect of holes in their other, parallel supply system, the water mains. Because most of the water system is not metered, there is a fair amount of guesswork involved in establishing how much water is lost. But most estimates put the figure at more than a quarter of all the water that enters the water mains network. Throughout the 1960s and 1970s, when water engineers were falling over themselves to find new pieces of countryside to flood for reservoirs that would meet rising water demand, their water mains were leaking more and more water.
(Ironically, now that the extent of this leakage is being understood, there are too many reservoirs, collecting too much water for a depressed industrial market that is switching away from water-guzzling businesses like iron and steel.) London's water mains leak 29 per cent (up from 19 per cent in the mid-1960s); Birmingham loses a similar amount. But parts of Liverpool and Glasgow and much of rural Scotland lose as much as 50 per cent of the water. Meanwhile, many thousands of kilometres of water mains are corroding. Soft acid waters, from the peaty hills that supply much of Britain's water, are eating into the iron water mains of many towns. That not only increases leaks, it also discolours the water and increases metal contamination. Last year the Yorkshire Water Authority tentatively began the first stage of a £150 million programme to eliminate the worst coloured waters from such towns as Huddersfield, Halifax, Bradford and Sheffield. A report as long ago as 1975 found that more than two-thirds of Yorkshire's water mains needed replacing, mainly because of corrosion. It found that "little or no investment . . . on renovation or renewal works had taken place in the last century". The task of maintaining and improving Britain's crumbling system of sewers and water mains is gargantuan. The water authorities, which are widely blamed in Britain for raising water rates, are wrestling with a timebomb inherited from local councils in 1974. Water is an industry where investment is spent on pipes that, once laid underground, may be forgotten about for up to 100 years. The potential for expediency in planning is vast. Cut spending and nobody will notice in the short term. Increase spending and the same is true. But the long-term consequences of those decisions, as Britain is starting to discover, can be immense.
ROADS: Almost every week traffic restrictions are placed on a new section of the motorway network, reducing the number of lanes open to traffic from three to two, or even one. The M1 is perpetually being resurfaced, a task that seems to be as endless as repainting the Forth bridge. Parts of the M1 and the M6 are being dug up and rebuilt, because the roads are not strong enough to bear the weight of traffic. The M5 motorway at Taunton is undergoing major repairs that began in 1980--five years after the section was opened. And on the Midland Links motorway, the 13 kilometres of elevated road at the core of the country's motorway network, repairs are so frequent that the Department of Transport issues weekly press releases. The decay of Britain's roads has been reflected in a sharp increase in central government's spending on road maintenance. The Department of Transport is responsible for maintaining the 25 000 kilometres of trunk roads in England. (The Welsh Office and the Scottish Office maintain roads in Wales and Scotland.) In 1971 central government spent just £34 million on maintaining roads. In 1981 it spent £252 million. Road maintenance now takes a quarter of the roads budget, compared with roughly 10 per cent as recently as the middle 1970s. Yet, on minor roads as well as trunk roads, studies find that standards are declining. Part of the reason for this increase is that most of the motorway network was built during the 1960s and that, as the roads have got older, so they need more spent on them. But the major reason is that the large increase in the number of heavy goods vehicles was not foreseen when the motorways were designed. The roads were not strong enough. Tests by the American Association of State Highway Officials in the early 1960s are the basis for calculating the damage to roads. The rule-of-thumb "law" that emerged from these tests was that road damage varied with the fourth power of the axle load.
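The fourth-power rule lends itself to a two-line sketch. Here is a minimal illustration in Python; the 0.5-tonne car axle is my own assumption, back-derived from the 160 000 figure quoted in the text, not a number the article gives directly:

```python
def relative_damage(axle_load: float, reference_load: float = 1.0) -> float:
    """Road damage relative to a reference axle, by the fourth-power rule."""
    return (axle_load / reference_load) ** 4

# A 10 per cent heavier axle does roughly 46 per cent more damage:
print(relative_damage(1.1))          # ~1.46

# A 10-tonne lorry axle versus a car axle (assumed 0.5 tonnes,
# back-derived from the article's 160 000 figure):
print(relative_damage(10, 0.5))      # 160 000
```

The steepness of the curve is the whole story: doubling an axle load multiplies the damage sixteenfold, which is why the shift towards the heaviest lorries matters far more than the total number of vehicles.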
Thus a 10 per cent increase in axle weight would increase road damage by nearly a half. And the damage inflicted by one lorry axle weighing 10 tonnes is 160 000 times that of a car axle. The number of heavy goods vehicles using the roads did not change much over the past 10 years. However, there was a very large increase in the number of the heaviest vehicles. The number of lorries weighing over 10 tonnes unladen shot up by 230 per cent. The M1 in Northamptonshire was designed to carry just over 8000 heavy goods vehicles a week by 1979. In that year it carried up to 26 000 heavy goods vehicles a day. An additional problem is that the most heavily used motorways are being designed to carry 100 million standard axles over 20 years. Yet the government's Transport and Road Research Laboratory has tested roads only up to 20 million standard axles. The DOT has cut the research budget of the TRRL. The Commons' select committee on transport commented: "In view of the present lack of empirical data on the performance of road pavements capable of carrying more than 20 million standard axles this appears to us to be an extremely short-sighted cutback by the Department." RAILWAYS: British Rail's problem is that most of its structures were built in the last century. Many are more than a hundred years old. And yet at the same time the government is squeezing the railways' investment budget in an attempt to improve efficiency. In 1976 BR told the government: "Current levels of investment are quite inadequate to keep pace with the rate at which the system and its assets are running down and wearing out . . . to maintain the present investment ceilings would entail the progressive decline and eventual closure of a substantial proportion of the system and its services." Since 1976 BR's investment budget has been cut still further. In 1976 it spent £24 million on renewing its railways and structures--£48 million at today's prices.
Yet in 1981 it spent only £38 million on repairs to railways. Last year its total capital spending was smaller than that of the Paris metro. And, although BR has not closed a railway line since 1976, it cannot afford to repair a major structure. Further closures are inevitable. The prime candidate is the line across the Pennines from Settle to Carlisle. BR estimates that it will cost £6 million to repair just one structure on that line, the viaduct over the Ribblehead valley. The viaduct is not yet dangerous, but it will soon become so. In 1980 an attack of shipworm on the Barmouth viaduct nearly forced the closure of the Cambrian Coast Line, from Machynlleth to Pwllheli. The viaduct, near the Machynlleth end of the line, crosses the beautiful Mawddach estuary. It is a wooden structure and was built in 1867. But the <1Teredo>1 shipworm has eaten into 69 of the viaduct's pillars. BR then said it would cost £2.5 million to repair. Eventually it carried out a more temporary repair, costing £500 000, which entailed replacing 30 of the 113 wooden piles. The bridge was closed for six months. From the junction at Machynlleth a southern spur runs to Aberystwyth. Both this spur and the line to Pwllheli run along the coast. At high tides, sections of the line are submerged, whilst the sea scours away the track bed. It is probably the only railway in Britain where trains run axle-deep in the sea, and where high tides can disrupt services. Another closure on the cards is that of Marylebone station. The immediate stimulus for this closure is the cost of replacing the signalling between Marylebone and Neasden. The signalling dates from the First World War and will cost £400 000 to replace. But these examples are the tip of the iceberg. All over the railway system, similarly expensive bridges and tunnels are nearing collapse. Many of them will need major works this century. If BR's investment budget remains at its present level then many more lines will close.
Ironically the most modern tunnel in BR's possession is that under the Pennines at Woodhead. This line between Manchester and Sheffield was opened only 30 years ago. BR closed it last year. []

<2Keeping a finger in the dyke>2
ENGLAND has almost 3000 km of coastline. One of the island's most important defences is against erosion or inundation by the sea. Some 1100 km of coast is protected. An assessment of those walls, banks and groynes, published last year by the Department of the Environment, found that "many go back to the 19th century and so, notwithstanding that over £2 million per year is spent by the district councils on maintenance, heavy expenditure on renewals continues to be needed". "Many properties have been lost around the coast," it went on, and even more open land is being lost because "works could not be justified". A 70-page list of sea defences in the report finds that long stretches of England's coastline defences are in "poor" condition because of old age. One notorious example is the 5.5 km of the east side of the Spurn peninsula on Humberside. The report describes the old military walls as "derelict and mostly ineffective". The sea is widely expected to break through the narrowest point of the peninsula any year now. Worst hit is the Isle of Wight, where most of the north coastal defences from Ryde to Cowes are listed as old and poor. []

BRITAIN has some 2000 large reservoirs, each of which can hold more than 23 million litres of water. Many are redundant. More than half are 100 or more years old. Many are unsafe. According to the Lords' report on the water industry, 191 do not appear to have an owner, and valid engineers' reports (required under the 1930 Reservoirs Act) cannot be found for more than half of them. Any one of these may be a disaster waiting to happen.
In Lancashire towns like Oldham, Bolton and Bury, old reservoirs that once supplied mills or local water supplies sit unused and uninspected, threatening to inundate surrounding residential areas. Since 1969 there have been six "near misses". The Institution of Civil Engineers said in evidence to the Lords: "The fact that there has been no major disaster in the United Kingdom for more than 50 years should not be allowed to obscure the ever-increasing danger in the future. The need for properly enforceable means of control over the supervision and maintenance of reservoirs becomes increasingly urgent." After the Lords' report was published, ministers decided to implement the 1975 Reservoirs Act, which had lain on the statute book for eight years. It gives local authorities the right to carry out repairs to reservoirs and charge the owner--and the duty to compile a national register of large reservoirs. []

THE CANALS are the oldest of Britain's post-industrial revolution transport systems. Like British Rail, the British Waterways Board, which controls 3600 km of inland waterways, loses money. But it also suffers from a lack of government confidence in its usefulness, either as a carrier for freight traffic or for pleasure cruising. In 1976 a consultant's report (the Fraenkel report) supported the BWB's case that the maintenance backlog was £38 million. When Denis Howell, the environment minister, appeared before a House of Commons select committee, he defended the government's inaction by casting doubt on the competence of the BWB. The select committee retorted: "Once the case for funds to meet the maintenance backlog has been made out it is the task of the department and the minister . . . to use their best endeavours to get the money made available." In 1981 the BWB chairman, Sir Frank Price, said: "The Fraenkel figure for the backlog of maintenance at today's prices must be about £100 million.
We are still waiting for an assurance that the money will be forthcoming. The longer we delay the worse it will become and the more expensive it will be to the nation at large." At the end of 1981 the government finally gave way, or at least a little bit. It increased the BWB's grant from £29 million to £38 million. As a result work has started on the Blisworth tunnel (on the Grand Union canal), which was closed to navigation because it was dangerous, and a reservoir which feeds the Oxford canal. []

<2Bad workmanship, bad materials and bad design>2
IF ANYTHING is falling apart as fast as Britain's Victorian infrastructure it is Britain's post-war infrastructure. A mixture of bad workmanship, bad materials and bad design has left an extraordinary legacy of decay. Headlines about local councils demolishing blocks of flats 10 years old or younger have become commonplace in the past four years. So, while the GLC estimates that there are almost one million houses in London that are more than 70 years old and in need of modernisation, councils are having to give top priority to repairing more modern buildings. Almost 40 per cent of Britain's £8000 million annual building repair bill is spent on structures built since 1960. Overwhelmingly the biggest problem in post-war British buildings (and other structures, such as bridges) is concrete. The jinx has often centred on the simple art of mixing concrete. Millions of pounds have been spent demolishing houses and blocks of flats that were built using concrete with too much calcium chloride in it. As a result, steel reinforcing in the concrete has corroded, cracks have formed and water has penetrated the walls of the building. Calcium chloride-induced corrosion is behind the recent demolition of blocks built on the Reema design system in Wandsworth, south London. There are 40 000 more Reema houses in Britain.
In Sheffield, Leeds, Manchester and Hull, millions of pounds are being spent on blocks that used the 5M system. This system, developed by the ministry of housing itself, has left a legacy of cracked and crumbling concrete wall-panels falling out of their steel frames. Some 50 000 families all over Britain live in flats built using the Bison Wallframe design. Here, too, metal reinforcements set in the concrete have rusted because the concrete contained too much calcium chloride. Already councils have demolished 5000 Bison homes. Calcium chloride was used to speed up the hardening of ordinary Portland cement and to allow concreting to continue through cold weather without the threat of damage from frost. Decaying concrete is behind the rush by environment ministers to act to repair the thousands of pre-fabricated houses built in Britain in the early 1950s. Airey houses, for example, were built with concrete planks fixed with copper wire to vertical reinforced concrete columns. In some 20 000 Airey houses the concrete has carbonated, making it porous. The steel tube reinforcements in the columns are corroding, causing cracks in the columns. The national bill for rectifying those faults alone is some £350 million. As the effects of calcium chloride continue to emerge, a new concrete "disease" has been uncovered in the past year. It is known as the alkali-silica or alkali-aggregate reaction. More widely, engineers call it concrete cancer. It happens when alkali chemicals, present in cement, react with silica, which is in some aggregates that are mixed with cement to make concrete. The reaction causes a gel to form around the stones. This absorbs water and expands, so cracking the concrete. More water then gets in; freezing widens the cracks further and may rust steel reinforcements. Bridges have been hit by concrete cancer in Devon, Derbyshire, Surrey and Birmingham.
A recent outbreak of aggregate reactions in the concrete transformer bases of 11 electricity power stations has thrown up the possibility that most of the sands and gravels removed from the Trent valley (one of the most widely-used sources of aggregates in Britain) may contain the forms of silica that cause the reaction. []

<2CAN WE CONTROL INSECT PESTS?>2
Pesticides alone are not the best way to achieve long-term control of agricultural pests. A carefully orchestrated combination of biological, cultural and chemical controls may be the answer. GRAHAM MATTHEWS

OVER the last four decades, man has relied increasingly on a powerful armoury of chemicals in the war against pests. The "silent spring" predicted by Rachel Carson some 20 years ago has not arrived, even though global sales of pesticides last year totalled £7 billion; but persistent organochlorine insecticides, such as DDT, are damaging some populations of birds, and chemicals sometimes destroy beneficial natural enemies and thereby actually create pests. Pests are also becoming resistant to some pesticides. Above all, when toxic chemicals are used in ignorance, as in many tropical countries, they can be hazardous to human health. These serious problems have led to the concept of integrated pest management (IPM), which does not rely solely on pesticides. A great variety of techniques for controlling insects are now available and, sensibly co-ordinated, they could safely and effectively keep pests under control. In an ideal integrated scheme, cultural controls (changes in farming habits) and biological controls would largely supplant pesticides; in practice, chemical control continues to be an essential component of IPM when a pest population has to be reduced rapidly to minimise financial loss. Even so, integrated management can reduce the farmer's reliance on pesticides.
He can time an application of a pesticide more accurately to ensure that the cost of pesticide is justified by the increase in yields, use more selective chemicals, and apply the minimum dosages needed. Such careful spraying can go hand in hand with long-established techniques such as crop rotation and planting resistant varieties. But for IPM to become a reality our scientific understanding of pests must be transmitted to the man in the field; IPM demands sophisticated techniques for decision-making and management. Subsistence peasant agriculture and highly mechanised intensive farming are two extremes in agricultural development which present quite different problems for integrated pest management. Peasant farming, low on capital but rich in labour, relies on traditional cultural controls. Farmers employ mixed cropping systems and plant local cultivars with some resistance to pests. So most farmers in the Third World use few pesticides. But in the 1960s attempts to increase crop production in these areas introduced the higher-yielding varieties of the "green revolution". These varieties need pesticides and more fertiliser; and the fertilisers often exacerbate attack by pests. Farmers use the cheapest products, such as parathion and endrin, which are highly toxic. Health hazards arise since full protective clothing is unbearable in a tropical climate, even if the poor farmers could afford it. More people now die of pesticide poisoning in Sri Lanka than from certain important diseases, including malaria. Some countries now list only the less toxic pesticides, but official recommendations have not always stopped the marketing of more hazardous products. Few countries have satisfactory legislation on pesticides or the trained manpower to enforce it. In contrast, extensive use of pesticides in Europe, North America and Japan is backed by government legislation or voluntary schemes that provide farmers with detailed advice.
A demand for undamaged fruit or vegetables for freezing has led to excessive "cosmetic spraying". Here, the first step to IPM has been the introduction of "supervised control", which avoids routine, or prophylactic, treatments. We can begin a programme of IPM by growing varieties of plants resistant to particular pests. Plants are highly resistant to the great majority of potentially damaging species, and plant breeding programmes continue to manipulate genes that confer immunity or a high degree of resistance. But this resistance to one particular race or biotype of a pest ("vertical resistance") is often only temporary because the pests themselves alter. And resistance to one pest can often increase susceptibility to another. So the search is on for a more durable, "horizontal" resistance to a variety of races of a pest. At the Plant Breeding Institute in Cambridge, Dr Martin Wolfe is mixing different genetic lines of a cereal crop to create "multilines" that effectively break up vast tracts of a single crop, which normally present the pest with a continuous sea of susceptible hosts. By mixing the varieties intimately within the field, the spread of a fungal infection can be drastically slowed. Spores of the fungus developing on one particular plant are likely to land on neighbouring plants that are so genetically different that the spores cannot germinate and grow. Multilines of spring barley, designed to control the fungal infection powdery mildew, are now marketed by the Sinclair McGill seed company; such heterogeneous barley now covers 60 000 hectares in the UK and a larger area in Denmark. Dr Wolfe, working with colleagues at the Plant Breeding Institute and Dr John Barrett of the University of Liverpool, is now devising ways of making things even more difficult for the pathogens.
Next year the National Seed Development Organisation will release a mixture of three varieties of spring barley, all with good malting qualities (good for beer and whisky), but with different resistances to mildew. One of the varieties of seed will also be treated with a fungicide against mildew. The pathogen will thus be faced with four problems simultaneously--the need to overcome three different resistances in the hosts, plus the fungicide. In future years, crop breeders may be able to juggle varieties and fungicides endlessly, to keep one step ahead of the pests. Even a small reduction in the susceptibility of a crop to a pest can be important, especially if it complements other control techniques in an IPM programme. If a pesticide has to be used, a smaller dose may be adequate. Unfortunately some regulatory authorities believe that smaller dosages will increase the possibility of selecting for resistant pests; the idea is that whereas large doses should kill everything stone dead, small doses allow marginally resistant individuals to survive and breed. Actually, resistance is more likely to occur with prolonged exposure to <1large>1 dosages, which leave only a few "resistant" individuals to form the nucleus of the next generation; and it is one of the characteristics of pests that a few individuals can rapidly give rise to large populations. Agricultural advisers often inadvertently encourage pests by advising farmers to cultivate more intensively. In a number of large irrigation schemes in the tropics, the continuous cultivation of a range of crops now allows certain key pests to survive; previously a closed season or fallow provided a natural check on insect populations. In the Gezira region of the Sudan, for example, growing more sorghum and groundnuts has increased the cotton pest, the <1Heliothis>1 bollworm.
Broad-spectrum insecticides used against bollworms have in turn decimated the natural enemies of whiteflies (which live on the undersides of the leaves and are thus protected from aerial sprays). The whiteflies then damage cotton; they downgrade the cotton lint by sticking to it. A completely fresh IPM approach is needed, to consider the overall cropping policy as well as pest control. Removal of crop debris after harvesting can successfully control many pests--provided all the farmers in a region accept the policy. In parts of India, goats graze the cotton after harvest and so remove the last few infested bolls and stalks. But independently-minded farmers need to be convinced of the benefits. In many African countries, where uprooting of cotton stalks is legally enforced, pink bollworm is no longer a problem, but in other areas, where the stalks are a valuable source of fuel, pink bollworm remains a serious pest. Biological control can be a powerful tool in an IPM programme. In classical biological control, a natural enemy is introduced to control an organism that has become a pest in its absence. This method of control is principally limited to geographical or ecological islands and perennial crops. The Commonwealth Institute of Biological Control has been involved in various successful programmes. 
In Peru, a chalcidoid parasite, <1Aphytis roseni,>1 attacks the pest, the rufous scale, <1Selenaspidus articulatus.>1 In Tanzania sugar cane scale insects <1(Aulacaspis tegalensis)>1 are kept in check by a coccinellid (ladybird), <1Lindorus lophanthae.>1 A Brazilian moth, the sugar cane borer, <1Diatraea saccharalis,>1 has been controlled by introductions of its natural enemy, a small wasplike braconid, <1Apanteles flavipes.>1 In Sri Lanka the coconut leafminer, <1Promecotheca cumingi,>1 is controlled by a larval parasite, <1Dimmockia javana.>1 Unfortunately natural enemies alone may not be sufficiently effective, especially in annual crops where climate or agricultural practices destroy these enemies at a critical period. Natural enemies then need to be introduced repeatedly, and special skills are needed to produce the "product" to sell to farmers. In Europe specialists rear the parasite <1Encarsia formosa,>1 a chalcid wasp that attacks the glasshouse whitefly <1(Trialeurodes vaporariorum),>1 a pest of cucumbers and tomatoes. Also reared is the predatory mite <1(Phytoseiulus persimilis)>1 which attacks the red spider mite <1(Tetranychus urticae)>1, another pest of cucumbers and tomatoes. Elsewhere there is renewed enthusiasm for rearing <1Trichogramma,>1 a small wasp which parasitises the eggs of many species, such as <1Heliothis,>1 the cotton bollworm. New techniques of mass-rearing should generate large numbers. Pheromones offer all kinds of possibilities: they are chemicals produced by an animal that effectively act as external hormones, and attract (or repel) members of the same species. Many pheromones have now been characterised and can be produced artificially. A simple technique is to spray the crop with a pheromone of the pest species--which may encourage the parasites to stay on the crop. But so far, this type of biological control has been most successful in China, where plentiful labour is available for intensive rearing of natural enemies.
Much more can be done to improve the conservation of natural enemies in the field. In California, blackberry bushes are kept in some vineyards to provide a secondary host for a parasite, enabling it to survive the winter. Care is needed to avoid killing these helpful natural enemies by a sudden change in pesticides: many orchards, for instance, have strains of natural enemies, such as phytoseiid mites, that are resistant to commonly-used pesticides such as azinphos methyl. Applications of pesticides also need to be carefully timed, both to have maximum effect on the pest and to avoid damaging natural enemies. Often the first application should be delayed as long as possible to avoid killing parasites and predators. Some parasites of insect eggs, for instance, can stop a small population of pests from increasing, so application of a pesticide has to be timed according to the number of the larvae of the pests rather than the number of their eggs. Crops should be routinely monitored to determine what is happening. Sometimes rapid chemical treatment is the best strategy; early spraying can check a population of some migratory insect pests before the natural enemies arrive. Ladybirds, for instance, often reach a crop after their aphid prey. Here a non-residual insecticide such as pirimicarb is essential. Careful timing and choice of chemical can greatly enhance the effectiveness of a natural enemy. But few insecticides now marketed are adequately selective, so some natural enemies are invariably killed. There are a few exceptions: one selective insecticide, diflubenzuron, affects the synthesis of chitin, the tough polysaccharide that forms the insect exoskeleton. It kills the larvae of lepidoptera (butterflies and moths) which eat it, but does not harm the predatory mites such as <1Typhlodromus>1 that come into contact with spray deposits. Baited sprays offer another approach.
They selectively attract a pest to spray deposits; these were used in the much publicised campaign against the Mediterranean fruit fly in California. More studies are needed to ensure that the baits effectively attract the pest. The selectivity of a pesticide can be greatly enhanced by using sex pheromones, the insect's own chemical attractants. A pheromone, unique to a particular species, can lure the pest to traps laced with insecticide. The idea is to attract the pest, usually the males before mating occurs, in order to reduce the number of viable eggs. Recently, flies have been successfully controlled by mixing one of their pheromones with a non-specific insecticide. But scientists from the UK's Centre for Overseas Pest Research, evaluating mass trapping of the cotton leafworm, <1Spodoptera littoralis,>1 in Egypt, found that vandalism of pheromone traps can be a problem. An alternative is to spray a pheromone, instead of an insecticide. The aim is to permeate an area with so much pheromone odour that males are confused and unable to orientate and mate successfully. UK scientists have used special micro-encapsulated formulations to achieve greater stability of the pheromone in sunlight, while in the US the pheromone is released from short lengths of polythene capillary tubing or laminated chips, applied with a special sticker. Gossyplure, the sex pheromone of the pink bollworm, has been the most successful to date. Repeated applications add to the costs but, unlike broad-spectrum insecticides, pheromones are more likely to complement than to interfere with the biological control of the pests. Pheromones can also be used to detect and monitor pests; some farmers in the UK already use pheromone-baited collecting traps to time sprays against the pea moth and the codling moth, a pest of apples. In the US and India, pheromone traps are being used to study the movement of the cotton bollworm. 
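Using trap catches to time sprays amounts to a simple threshold rule. A minimal sketch in Python, assuming a purely illustrative action threshold of five moths per trap per week (the real figure varies by pest, trap design and crop):

```python
# Hypothetical sketch of timing a spray from pheromone-trap catches.
# The threshold of 5 moths per trap per week is an invented example,
# not a recommendation from any advisory service.

def spray_advised(weekly_trap_catches, threshold=5):
    """Return the index of the first week in which the mean catch
    per trap reaches the action threshold, or None if it never does.

    weekly_trap_catches -- list of weeks; each week is a list of
    catch counts, one per trap in the field.
    """
    for week, catches in enumerate(weekly_trap_catches):
        mean_catch = sum(catches) / len(catches)
        if mean_catch >= threshold:
            return week
    return None
```

A farmer scouting two traps might record `[[0, 1], [2, 3], [6, 8]]` over three weeks and be advised to spray in the third week, when the mean catch first passes the threshold.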
So far few commercial companies have taken an active interest in pheromones; in the UK most of the identification and synthesis is confined to government and university laboratories, notably the ODA Tropical Products Institute, the Agricultural Research Council, and the Wolfson Unit of Chemical Entomology at Southampton University. Evaluating the toxicology of any new pesticide is now so expensive that few new compounds are reaching the market. Those that do are generally broad-spectrum chemicals, designed to achieve the large sales needed to recoup the costs of development and registration. An IPM programme needs more selective products. Few companies have taken an interest in biological insecticides such as <1Bacillus thuringiensis>1 (its spores or toxin) or the baculoviruses, because of lack of protection by patent and difficulties in producing formulations that are stable in the field. There is also consumer resistance to products that do not have an immediate effect on the pest population: several days may pass before the bacteria have a noticeable effect on the pest. Unfortunately these bacterial products have usually been considered as alternatives to a synthetic pesticide, rather than as a component of IPM. A baculovirus spray (containing viruses specific to insects) might, for instance, reduce a population sufficiently for predators to cope with the remainder, and so make it unnecessary to apply chemicals. Alternatively, farmers could treat plants that act as alternative hosts for the pest and so reduce the number of pests migrating to a susceptible crop. In practice, IPM will have to incorporate broad-spectrum chemicals for the foreseeable future. But decisions on whether and when they are needed can be much improved. The rational application of pesticides is stymied by ignorance of how pest attack at different stages of crop development affects yield. 
Some plants can compensate for quite heavy infestations provided there is sufficient time, and weather conditions remain favourable. But a very few aphids transmitting a virus can cause havoc to sugar beet, for instance. Fluctuations in market prices of some produce also make it difficult to assess an economic threshold for timing sprays. Furthermore, changes in the weather affect the accessibility of crops, so farmers often have to spray "when they can". Despite these problems, excessive use of sprays can be avoided by monitoring crops for pests or damage. In Nicaragua, many cotton farmers applied over 30 sprays a season, while in Central Africa, where scouting was introduced in the 1960s, most farmers apply fewer than 12 sprays. Short-term monitoring rather than forecasting is still the main emphasis. Only for the black bean aphid, <1Aphis fabae,>1 has it been possible to develop over the last 25 years an effective system to predict migration several months ahead and advise farmers when to spray. In winter, the aphid lays its eggs only on the spindle bush in hedgerows in the UK; so counts of eggs on these bushes enable Professor Michael Way at Imperial College to predict how prevalent the aphids will be in the summer. The technology of applying chemicals can also be improved. A major disadvantage of chemical control is that pesticides are now applied very inefficiently. Beekeepers are concerned about the insecticide that drifts outside treated areas, albeit in small volumes; and small amounts of drifting herbicide can also seriously damage neighbouring crops. Moreover, 70 per cent or more of a spray aimed at foliage can be lost in droplets falling to the soil, and deposits on plants may be washed down by rain. Spraying methods have remained basically unchanged for more than 100 years. Farmers still dilute a concentrated formulation to apply it in 200 litres or more of water per hectare. 
Most accidents occur when the concentrate is measured out, and as spray equipment is seldom calibrated, there is no guarantee that the right dosage is applied. Even with tractor equipment, time is wasted transporting water to fields. An alternative to water in sprays is needed in the tropics, where water is often very difficult to obtain or transport to the field; it also evaporates so rapidly that the effectiveness of sprays is further reduced. Research on ultra-low-volume (ULV) application with greater control of droplet size has shown that pesticides can be applied with much less liquid. In many crops as little as 0.5 litres/hectare is effective; this permits oil-based formulations to be used economically and increases the persistence of deposits after rain. With volumes this low, the farmer no longer has to dilute his sprays and risk contamination. Products can be marketed in a container which plugs directly into the applicator. Small droplets are spun out by rotary discs rather than being forced through a nozzle. <2Taming the droplet>2 But so far, few agrochemical companies have registered formulations suitable for ULV. Farmers are reluctant to use ULV sprays partly because some formulations are more expensive than wettable powders. There is also the fear that small droplets might drift and contaminate the operator; but drift need not be a problem if droplets are electrostatically charged. ICI in the UK has developed a new low-energy system of charging sprays suitable for both hand-held and tractor-mounted sprayers. More research is needed to optimise the size of droplets and their charge-to-mass ratio for different pesticides and targets. For instance, if the pesticide is best deposited on the top of a plant, a small droplet with a high charge works best; while a large droplet with a smaller charge, which is readily attracted to the ground, is better for killing weeds. 
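Some rough arithmetic shows why droplet size matters so much at ultra-low volumes. The sketch below, assuming idealised uniform spherical droplets deposited evenly (real sprays produce a spectrum of sizes), estimates the coverage a given application volume can deliver:

```python
import math

def droplets_per_cm2(volume_l_per_ha, droplet_diameter_um):
    """Estimate droplet density on the target, assuming uniform
    spherical droplets and even deposition (both idealisations)."""
    radius_cm = (droplet_diameter_um * 1e-4) / 2        # micrometres -> cm
    droplet_volume_cm3 = (4.0 / 3.0) * math.pi * radius_cm ** 3
    total_volume_cm3 = volume_l_per_ha * 1000.0         # 1 litre = 1000 cm3
    hectare_cm2 = 1e8                                   # 1 ha = 10^8 cm2
    return (total_volume_cm3 / droplet_volume_cm3) / hectare_cm2
```

Even at 0.5 litres/hectare, 100-micrometre droplets give roughly nine or ten droplets per square centimetre, whereas the same volume in 400-micrometre droplets gives only about one droplet per seven square centimetres--which is why ULV work depends on spinning discs that produce small, uniform droplets.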
By selectively applying pesticides to, say, the outer part of a crop canopy, much lower dosages will control exposed pests, such as aphids on cereals. The ground is less contaminated and beneficial predators, such as carabids, spiders and ants, may survive. In an IPM system, the pesticide is applied by whatever means best suits a particular case. Granules, seed treatment, bait sprays or selective placement may be better than an overall spray. Aerial ULV sprays will be most suitable when a pest has to be controlled over vast areas at the same time: in the control of tsetse fly, for example, and in forests, to control the pine beauty moth in Scotland. We now have a range of techniques for controlling insects, so why has the implementation of IPM remained disappointingly low? Some authorities feel that policy makers remain unconvinced about the practicality of integrating several methods of control, while others blame the overriding influence of the chemical industry. Most important, perhaps, is the lack of trained manpower to interpret changes in populations of pests and decide what action to take. Already, independent specialist consultants are helping farmers by walking fields and advising on strategies of control. Their decisions are largely based on experience, but computer programs are needed to store all the relevant data and assist the decision-making process. In the US in particular, scouts can telephone data to a regional computer and receive advice. Farmers in the UK can now get up-to-date advice on pest attacks using teletext, and some consultants have access to a Dutch disease-forecasting system, EPIPRE. A multidisciplinary approach to pest management is essential. Integrated pest management will undoubtedly become more widespread once meteorologists, engineers, biologists and systems analysts--to name only a few--all work together. 
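The kind of computer-assisted decision-making described above can be caricatured in a few lines. This is a hypothetical sketch, not the logic of EPIPRE or any real advisory system; the thresholds and categories are invented for illustration:

```python
# Invented rule-of-thumb for a spray decision, combining scouting data
# on pests and their natural enemies. All figures are illustrative.

def recommend_action(pests_per_plant, predators_per_plant, crop_stage):
    """Advise "spray" only if pests exceed an economic threshold AND
    natural enemies appear too scarce to contain them; else "monitor"."""
    # A crop at a vulnerable growth stage tolerates fewer pests.
    threshold = 10 if crop_stage == "vulnerable" else 25
    if pests_per_plant < threshold:
        return "monitor"
    if predators_per_plant >= pests_per_plant / 10:
        return "monitor"  # natural enemies may cope; re-scout soon
    return "spray"
```

A real system would weigh many more factors--weather, market prices, pesticide selectivity--but even a crude rule like this captures the central point of IPM: chemical treatment is a decision to be justified by field data, not a calendar routine.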
[] <2The landscape of rational agriculture>2 A great range of techniques is now available for controlling pests. Taken singly, none is completely reliable. But a judicious combination of methods can effectively control pests for years on end. Sowing a mixture of resistant varieties of barley in a field reduces losses and slows the spread of fungal diseases. Pests overwintering in stubble or soil are foiled by clearing stubble after harvest or letting the land lie fallow. Natural enemies of the pest are introduced and encouraged to thrive in hedgerows. Pests are monitored with traps laced with attractive pheromones, to time limited chemical spraying with ultra-low-volume equipment. Small droplets of the pesticide are electrostatically charged to reduce the tendency of the liquid to drift. Pheromone traps laced with insecticide selectively kill the pest, while spraying pheromones disrupts mating. [] <2Integrated pest control in action--Success in Texas>2 The history of cotton-growing in post-war Texas vividly illustrates the value of integrated pest management. New insecticides, first DDT and then organophosphorus compounds, designed to kill the key pests of cotton, the boll weevil and the cotton fleahopper, also killed the natural enemies of minor pests, the bollworm and tobacco budworm. With no predators to keep their numbers down, these minor pests became major ones. More insecticides applied more often hastened the evolution of strains of pests resistant to the chemicals. The cotton industry was virtually destroyed in 1968 when the bollworm and tobacco budworm became resistant to all available insecticides. Salvation for Texan farmers came in the form of an integrated control scheme designed by P. L. Adkisson and his colleagues at Texas A & M University (<1Science,>1 vol 216, p 19). The scheme gave farmers a way of controlling weevil and fleahopper without causing an outbreak of bollworm and budworm. 
By destroying stalks after harvest, farmers kill most weevils during diapause (that is, in prolonged "resting" periods of low biological activity). Early and limited spraying in spring then kills most of the remaining weevils and the fleahoppers, without decimating the natural enemies of bollworm and budworm. New varieties of cotton grow more rapidly and quickly pass through their most vulnerable phase (when the carpel surrounding the cotton is thin). Cotton is profitable again--Texas produces half of the US cotton crop--and much less insecticide is now applied to the crop. [] <2MONITOR>2 <2Who made the Laetoli footprints?>2 SOME extraordinary and conflicting interpretations of the famed footprints found at Laetoli in Tanzania have emerged in recent days. The prints were left by hominids (human-like creatures) in moist volcanic ash about 3.5 million years ago. The gait of the creatures that made the four sets of trails was described in detail two weeks ago at a conference in Berkeley, California. It has been claimed that modern-day pygmies of Malaysia and the Philippines or the Indians of Central America could have left similar prints. "If the prints were undated, or if they had been given younger dates, most experts would probably accept them as having been made by <1Homo>1," said Russell Tuttle from the University of Chicago, who has recently analysed casts of the footprints at the invitation of Mary Leakey. "A small barefoot <1Homo sapiens>1 could have made them." But a central conclusion of Tuttle's, that the creatures of Laetoli were not the same species as "Lucy" and the "first family" of Ethiopia, was challenged immediately. Researchers at the University of California at Berkeley have reconstructed a skeletal foot to show that Lucy and her kind <1(Australopithecus afarensis)>1 would fit into the footprints at Laetoli. 
The conference that gave rise to these events was organised by Donald Johanson, who unearthed Lucy's half-complete skeleton in 1974 at Hadar in Ethiopia, about 1100 miles north of where the Laetoli trails were uncovered between 1976 and 1979. In an effort to find some common ground, Johanson brought together leading palaeoanthropologists who have argued intensely for years, both scientifically and personally, about the uniqueness of the Hadar bones and the Laetoli footprints. The other scientific argument raging at the conference involved the start of bipedality. Was <1A. afarensis>1 a crude biped and still dependent on the trees 3 to 3.5 million years ago, or had man's ancestors well and truly learnt to walk, possibly more than three million years before? If the former is true, <1A. afarensis>1 could be the "missing link" between apes and humans. Tuttle said that of the four bipedal tracks at Laetoli, three were made by hominids; the other, he suggested, was made by a bear. Of the two most discernible hominid trails, one was made by a creature 112-129 cm tall, while the other creature was 130-149 cm tall. The great toe often left a deep impression similar to the final toeing-off by humans before swinging their foot. Both individuals were walking slowly, with mean steps of about 43 cm and mean strides of about 87 cm. But the angles of their feet varied noticeably. The larger creature placed its feet straight forward while the other toed-out markedly, suggesting a pathological condition. "In all discernible morphological features, the feet of the individuals that made the trails are indistinguishable from those of modern humans," Tuttle declared. The tips of the long curved toes of the Hadar specimens were not evident in the footprints, suggesting two species. Nor were the creatures curling their toes under, as has been suggested by Jack Stern and Randall Susman from the State University of New York at Stony Brook. 
The footprint controversy is as much political as it is scientific. Tuttle, by maintaining that the Laetoli prints are closer to <1Homo>1 than the Hadar foot bones, and that different species were involved, is falling very much into the Leakey camp, which believes that the earliest "true man" is more likely to be found in Tanzania than in Ethiopia, and that <1A. afarensis>1 is no more than a sub-branch of evolution. Tuttle stopped short of saying that the Laetoli footprints <1were Homo-->1fossil remains of the same age were needed at Laetoli, he said. Tim White from the University of California at Berkeley, who fell out with Mary Leakey not long after the footprints were discovered, challenged Tuttle. Along with Johanson, White maintains that <1A. afarensis>1 was ancestral to both <1Homo>1 and the <1Australopithecus>1 line that died out some 1.5 million years ago. He agreed with Tuttle that the Laetoli creatures were not curling their toes under. His proof of compatibility--and of a single species--is based on a composite foot spanning both time and species. White and a graduate student, Gen Suwa, amalgamated the talus (ankle), metatarsal heads, and calcaneus (heel) of a 1.8-million-year-old foot from <1Homo habilis>1 at Olduvai Gorge in Tanzania--Lucy's closest relative according to White--with large phalanges (toes) from Hadar, and the toe extremities of a chimpanzee. Some scaling down was done, but this was legitimate based on Lucy's talus, said White. The result was a small foot in the proportions likely to be found at Hadar. It easily fitted the Laetoli prints. Lengthy debates over the beginnings of bipedality also split the conference. Arguing for Lucy's dependence on arboreality were Stern and Susman from the State University of New York at Stony Brook, whose recent paper in the <1American Journal of>1 <1Physical Anthropology>1 (vol 60, p 279) was summarised in <1New Scientist>1 (20 January, p 172). 
They said Lucy could walk but she needed trees for protection and food. Their case for arboreality rests largely on Lucy's long, curved and heavily muscled hands and feet, which suggest grasping. The hip and knee indicate that she could have existed on the ground, providing evidence that she was close to the point of divergence. Dr Owen Lovejoy and his student Bruce Latimer from Kent State University confronted Stern and Susman head on: <1A.>1 <1afarensis>1 was fully adapted to bipedality in "excruciating detail". Also, it was a "totally unique" animal, and too far along the evolutionary ladder to be the missing link, which occurred much earlier. The lower limb, said Lovejoy, had been totally reorganised for upright walking and there were no compensatory modifications in the upper limb to suggest the creature was still partly arboreal. Lovejoy and Latimer would not accept that curved phalanges (toes and fingers) necessarily meant grasping and hence arboreality. Stern and Susman, they say, have described minor anatomical variations and traits much like the differences between populations of <1Homo sapiens,>1 such as Samoans and Basques. "There are thousands of traits, all with different intensities of selection--you have to look for the critical ones." For Lovejoy and Latimer that means the joints of the knee and hip, not the hands and feet. Moreover, they regard the <1A. afarensis>1 toes as comparatively short and say that Stern and Susman have been selective in the toe bones they analysed, concentrating on those from the largest specimen. "The length of the toe bone is one of the last things to change in the loss of arboreality," said Lovejoy. Also, Lovejoy and Latimer say that predation--and therefore Lucy's supposed need for protection in the trees--is the most overrated selective factor in all of evolution. "What kills animals," said Lovejoy, "is infection, injury, thirst, and starvation: predation accounts for a minor amount." 
[] <2Separating metals with sunshine>2 A SCHEME for filtering metals from sea water using sunlight sounds like science fiction. Yet chemists are working on just such a project, and one group in Japan has managed to selectively filter copper, using light to drive it through a filtering membrane. Akira Ohki, Tadashi Takeda, Makoto Takagi and Keihei Ueno, of the Faculty of Engineering at Kyushu University, have published details of their system in <1Chemistry Letters>1 (1982, p 1529). It is against the laws of chemistry for a dissolved substance to move of its own accord from a solution of low concentration to one of high concentration. When two solutions of different concentrations are separated by a membrane, the tendency is for movement through the membrane to occur until eventually the solutions on both sides are the same strength. There are, however, natural systems in which dissolved metals move freely through membranes, irrespective of concentration. For example, when a muscle contracts, sodium ions (Na⁺) enter through the cell wall membrane and then seconds later have to be pumped back out again. This involves an input of energy, and the sodium is trapped inside a large molecule, called an ionophore, that acts as a ligand (a compound that can attach itself strongly to a metal). Exactly what this natural ligand is remains undiscovered, but inorganic chemists have made several ligands that can complex metals in this way. To move copper ions, Cu²⁺, through a membrane, the Japanese researchers used the ligands bathocuproine (I) and benzo-14-thiacrown-4 (II). They were contained in the liquid membrane, ready to attach themselves to a copper ion but only once it had been reduced to Cu⁺. The chemical change Cu²⁺--->Cu⁺ requires the copper ion to gain an electron (that is, to be reduced), something which light on its own cannot bring about. The Japanese workers overcame this problem in the following way. 
First, light excites a molecule of triethanolamine (TEA), which transfers an electron to acriflavine (AF). This in turn passes the electron on to methyl viologen (MV), now a sufficiently strong reducing agent (that is, an electron donor) to convert Cu²⁺ to Cu⁺. The Cu⁺ ion enters the membrane, forming a complex with the ligand, and passes through to the pure water on the other side, where it is oxidised back to Cu²⁺ (in other words, it loses the electron) by dissolved oxygen from the air. Copper will continue to pass through the membrane and into solution on the other side even when the concentration of copper in the pure water exceeds that of the original solution. The copper is then moving against a concentration gradient, yet it will continue to do so as long as light is shone on the system to initiate the chain of transferred electrons. Japan is poor in natural resources, yet as the second largest industrial country it is in need of them. One resource the country does have is the surrounding sea, and in theory this could supply all its essential metals, including gold. Whether the Japanese will eventually be able to use solar energy to obtain metals from the sea remains to be seen, but they have proved that it is possible for copper at least. And while the sea may be too dilute to make recovery of copper possible, the technique may eventually be used in reclaiming copper from industrial wastewater. [] <2Small--and fat--is beautiful for male moorhens>2 MALE moorhens <1(Gallinula chloropus)>1 do most of the incubating of eggs, while females compete with one another to secure the services of the best males. This discovery is further evidence in favour of Charles Darwin's ideas about sexual selection--his suggestion that the competition for mates amongst like-sexed members of a species is a driving force in evolution. 
A recent investigation of moorhens is one of the first studies to show exactly what it is that makes for a high-quality mate (<1Science,>1 vol 220, p 413). Marion Petrie, of the University of East Anglia, watched moorhens over three seasons. Females tended to initiate courtship, and there were often fights between females when a hen approached a courting pair. It is generally the heavier bird that wins these fights. There is no shortage of males in the flocks during pair formation, so hens are probably competing for high-quality males. As heavier females win the fights, they should end up with higher-quality males. So by looking at the mates of the heavier hens, Petrie could discover what confers desirable quality in a cock. It turned out that victorious females selected small cocks, and small birds were more likely to be in good condition. This is probably because small birds need less food. (Condition refers to stored energy, in the form of fat deposits; Petrie used a complicated measure of condition to get rid of the effects of size alone. Big birds weigh more, but a smaller proportion of that weight is fat.) Corroboration comes from the observation that males who had not formed pairs were in worse condition than those that had. The importance of selecting a male in good condition is that a male with larger food reserves should be able to do more incubating. This was supported by the correlation between the male's condition at the start of the season and the number of days he spent incubating eggs. Petrie also found a strong positive correlation between the number of clutches a pair started in a season (when earlier ones were destroyed by predators) and the male's condition. High-quality mates will always be in demand. Petrie has shown that in moorhens the male's fat reserves are important, and these are likely to be higher in small males. 
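A standard way to build a size-corrected condition index of the sort Petrie needed is to regress body mass on a measure of skeletal size and take the residuals: a bird lying above the fitted line is heavier, and so presumably fatter, than expected for its frame. The sketch below illustrates that general method; it is not necessarily Petrie's own calculation:

```python
# Generic size-corrected condition index: residuals from a least-squares
# fit of body mass on a skeletal size measure. Illustrative only; Petrie's
# actual measure may have differed in detail.

def condition_index(masses, sizes):
    """Fit mass = a + b * size by least squares and return the residuals.
    A positive residual means the bird is heavier (in better condition)
    than expected for its size."""
    n = len(sizes)
    mean_s = sum(sizes) / n
    mean_m = sum(masses) / n
    b = sum((s - mean_s) * (m - mean_m) for s, m in zip(sizes, masses)) \
        / sum((s - mean_s) ** 2 for s in sizes)
    a = mean_m - b * mean_s
    return [m - (a + b * s) for s, m in zip(sizes, masses)]
```

The point of the residual approach is exactly the one in the parenthesis above: big birds weigh more simply because they are big, so raw weight cannot measure fat reserves; only the deviation from the size trend can.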
It would be nice to know whether females with high-quality mates do indeed succeed in rearing more young than those with low-quality mates; they ought to, but there aren't yet enough data to be sure. [] <2Minds and motions>2 CHRONIC constipation could be caused by an excess of the body's own opiate-like chemicals, and cured by a drug which reverses their effects. Junkies have always accepted constipation as part of the price paid for the "high" offered by opiate drugs like heroin. We have also known for a long time that morphine cures diarrhoea. These observations persuaded Mary-Jeanne Kreek, Jack Fishman, and their colleagues at New York's Rockefeller University that patients with a history of constipation might produce an excess of their <1own>1 opioids <1(Lancet,>1 1983, vol 1, p 261). To test the idea they gave the opiate-blocking drug naloxone to two of their patients. Both women had had severe constipation for over 25 years. Nine operations between them had failed to cure the problem, and they managed only by almost daily use of laxatives and enemas. In both patients naloxone led to great improvement, supporting the idea that endogenous opiates are involved in the condition. One explanation is that there is a simple excess of such substances in people with constipation; another that there is something abnormal about their opiate <1receptors.>1 The New York researchers now have data on eight people with extreme constipation. Naloxone produced an improvement in all but one case. The work is being extended to patients with irritable bowel syndrome (IBS). Three have been studied so far; naloxone helped relieve gas accumulation and discomfort in all of them. IBS is a confusing but common and distressing condition in which patients complain of recurrent abdominal pain and alternating bouts of constipation and diarrhoea. Predominantly a disorder of young people, and especially of women, it seems to involve a psychosomatic element. 
Sufferers show greater signs of anxiety and stress, though whether this is cause or effect is disputed. Gastroenterologists Virginia Alun Jones, John Hunter, and colleagues at Addenbrooke's Hospital <1(Lancet,>1 1982, vol 2, p 1115) have shown that some patients improve when their diet is altered, suggesting that IBS is an aspect of food intolerance. But so little is agreed about the cause of the disorder that endogenous opiates may well be involved. The New York group is extending its naloxone work, says Jack Fishman, "to more common-or-garden varieties of constipation", such as those occurring after operations and in geriatric patients. These plans were aired at a recent conference celebrating the 80th birthday of Hans Kosterlitz, one of the discoverers of endogenous opiates. There was new emphasis on the action of these substances not just in pain and pleasure, but in more general regulatory roles in the body. Opiates affect respiration and temperature, food intake--and now perhaps excretion. They probably also influence our responses to disease. Emphasising these "good housekeeping" roles for opiates, Huda Akil, a professor of psychiatry at Michigan, said: "People want very glamorous jobs for this system to do. People think of 'highs' and euphoria and addiction to our own 'stuff'. But it doesn't need to be like that to be important." Nevertheless, from brain to bowel is something of a comedown for the endogenous opiates. But there is a certain appropriateness. Ten years ago, in the attempt to prove that the mammalian brain naturally contained a substance with opiate-like activity, Kosterlitz's colleague John Hughes regularly got up at 4 o'clock in the morning, collected pigs' brains from a slaughterhouse and pulverised them with a steel rod in the basement toilet of a laboratory in Aberdeen. 
[] <2Making a hepatitis vaccine from cowpox virus>2 WITH the eradication of smallpox by WHO's successful vaccination campaign, it may have seemed that vaccinia, the cowpox virus that is the basis for the vaccine and which gave vaccination its name, might at last be ready to bow out from its central role in world medicine. But if a team of biotechnologists at the US National Institutes of Health has its way, its retirement may be only temporary. Geoffrey Smith and his associates Michael Mackett and Bernard Moss have been experimenting with the possibility of modifying vaccinia to make a vaccine against hepatitis B--a major health problem in the Third World--and it is beginning to look as if it might work <1(Nature,>1 vol 302, p 490). According to Smith and his colleagues, about 200 million people are chronically infected with hepatitis B virus, which, if it is not immediately lethal, may cause death more gradually, either by slowly rotting the liver, or by inducing liver cancer. The hepatitis virus itself is too dangerous to make into a vaccine--especially as viral genes hiding in a liver cell may cause tumours--and the vaccines currently in use do not contain whole viruses. Instead, they are based on one vital surface protein--the hepatitis B surface antigen (HBsAg)--taken either from the blood of carriers of the virus, or from bacteria genetically engineered to produce it. These vaccines have two serious disadvantages, however. First, they are not very efficient at eliciting immunity, and secondly, they are expensive to make. This has not prevented their use in Europe and North America, where in any case the risk of infection is confined to relatively small groups, such as medical personnel; but it rules out widespread use where a vaccine is most needed, in the Third World. Which is what inspired Smith and his colleagues to try to insert the gene for HBsAg into vaccinia. 
They argued that as vaccinia is cheap and easy to grow and, even more important, to administer, persuading the vaccinia virus to make the hepatitis antigen might provide an answer to the Third World problem. The strategy they used involved an ingenious mixture of design and chance. The design part consisted of the construction of a plasmid--a small circle of DNA--containing the gene for HBsAg along with DNA taken from vaccinia virus. The vaccinia DNA included a promoter region--a genetic "on" switch that would ensure the production of the protein from the HBsAg gene. To put this carefully engineered plasmid into the virus, the NIH team had to rely on chance. The researchers infected cells with both the plasmid and the virus, and relied on similarities between the vaccinia genes in the virus itself and those they had put in the plasmid to ensure that, occasionally, the two would recombine. The result, they hoped, would be a virus that had incorporated the HBsAg gene along with the rest of the DNA in the plasmid. The strategy turned out to be successful, and the resultant recombinant viruses produced reasonably large amounts of HBsAg in infected cells. The next phase of the experiment was to test the ability of the virus to produce immunity to the hepatitis antigen--something that the antigen on its own is not very good at. Smith and his colleagues accordingly inoculated some rabbits with the recombinant and measured the amount of antibody the animals produced. This proved to be far in excess of the minimum required for immunity, and although the researchers cannot be certain of directly comparable results in man, the prospects seem reasonably good. The hope cherished by the NIH team is that recombinant vaccinia viruses may be used to combat hepatitis B virus in Africa and Asia according to exactly the same strategy that succeeded so brilliantly with smallpox. But there are two snags. 
First, vaccinia itself is not without its dangers: vaccination against smallpox carries a risk of encephalitis serious enough to have prohibited its use in countries where smallpox was not endemic well before the success of the eradication campaign. That means the recombinant virus would never be used in Europe and North America. In Asia and Africa, however, Smith argues, the benefits would far outweigh the risks--as they did with smallpox. The other snag, ironically, arises from the very success of the smallpox vaccination campaign--because, as a result, most of the population of the Third World is already immune to vaccinia and wouldn't allow it to flourish long enough to produce the hepatitis antigen.

TECHNOLOGY

<2Happy days for software pirates>2

OVER the past two months, an electronic device called the 810 enhancement has been arriving in Britain from the US. The American company that makes the 810, Happy Computing of California, calls it the "Happy Chip". But the name raises no smiles among computer manufacturers and software houses: Happy Chip is a tool for copying programs for Atari home computers. Although it has innocent uses, devices like Happy Chip enable pirates to steal programs worth millions of pounds.

Software companies are reluctant to say how much piracy costs them. Psion Software, which produces material for Sinclair home computers, puts the figure at £2.9 million a year--30 per cent of its turnover. The pirates are even less keen to say how much they make. The authorities have uncovered large-scale illicit operations in the US (a software company called MicroPro International won $250,000 damages from one pirate) but no one has yet been prosecuted in Britain. This is partly because piracy in Britain is more a game for enthusiasts than a serious business. Operations centre around computer clubs and groups of enthusiasts who cannot resist the challenge of copying something designed to be uncopiable (see Box).
The pirates play an intricate and expensive cat-and-mouse game with computer firms. As the pirates crack the latest "protection programming", or copyguarding, the software writers devise new methods that are harder to crack. In Britain, the hobby is growing. Most groups of pirate enthusiasts number around 10, but <1New Scientist>1 has heard of a group of 60 people swopping bootleg software. At least six groups are operating in and around south-east London alone. One group turns out 40 copied programs every week. The retail value would be around £1000. The pirates pay £200.

The main targets for pirates are computer games that run on microcomputers. Business programs and educational material have not yet been bootlegged on any large scale in Britain. Business programs are difficult because they generally need periodical back-up from the supplier, and computers are not well enough established in schools for educational piracy to be worthwhile. In the US, however, tutors with squeezed budgets are beginning to cause problems for software houses by pirating programs.

For the enthusiast, making a copy of a games program is more fun than playing the game itself. The piracy scene already has its own folklore. One story is set at a computer show in Birmingham: a company representative visits the stand of a software firm, borrows its latest product, and runs off a copy in five minutes back at his own stand.

But although many pirates see it all as a jolly jape, people in the software business are getting worried. David Potter, managing director of Psion Software, says: "If piracy and copying escalate and become endemic, hardware managers will have to take drastic measures, such as setting all software in ROM [read-only memory] cartridges." Such solid-state software is more difficult to copy than a disc. But it is not impossible. Also it would involve computer firms in expensive re-designs, and at least part of this expense would find its way through to the end user.
Some firms, such as Texas, already supply all their software in this form. However, some experts believe that technical solutions are dealing only with the symptoms, not the causes of piracy. Mike Wilding is a software analyst at Atari with special responsibility for combating piracy. He points out that it is impossible to stop it completely: "There is no way to protect the electronic pulse. Record companies have tried and failed. It is a little easier to protect computer software--but it is in the nature of the material that it <1must>1 be copiable to be produced."

David Potter gives an example of "ultimate copiability". One of Psion's engineers had collected a master copy of a new program. When he arrived at headquarters he discovered that he had, by mistake, been given a "slave" disc--one with copyguards built in. He decided it would take less time to break the copyguards than to go back for the correct disc. It took him 20 minutes.

The companies have one other option, apart from making copying technically difficult--take the pirates to court. Atari and Psion, like most companies, have nothing against enthusiasts making one or two copies for personal back-ups. But when money starts to change hands, they take a harder line. Atari says it takes legal action against as many pirates as it can find, and has two big prosecutions pending at the moment. Psion has a softer approach. The company writes to the culprits, invoking the copyright laws as it reads them. It also asks the pirates to hand over any illegally produced material and any money that the pirates have earned by selling it. Finally, the bootleggers have to promise not to do it again. Potter says this approach has been satisfactory, although he would not say how many times the company had used it, but Psion has never had to take a case to court. But the long-term solution to piracy will remain elusive, unless, in the words of an Atari engineer, there is "an improvement in public morals".
Psion's approach is to turn out software as cheaply as possible, making it uneconomical to bootleg copies. Steve Jobs, the founder and chairman of Apple, echoes the approach. He looks forward to the day when home computers are so widespread that the price for software will drop by 75 per cent. Even then, Jobs predicts, illegal copies will circulate, but computer companies will be able to cost piracy in much the same way as shops cost shoplifting--as part of their business plans. Jobs believes that losses will eventually stabilise at around 6 to 7 per cent.

Meanwhile, tools like the Happy Chip continue to sell. Happy Computing claims that the chip will copy any disc that fits Atari's standard disc drive. A computer enthusiast in Britain used one last week to run off a copy of a £35 program in a little over four minutes. Further copies would take only 45 seconds. A blank disc costs £1.49. Possibly with a tongue in its cheek, Happy Computing warns its customers: "Before you violate the copyrights of others, remember that software suppliers work very hard to provide these products. They are very deserving of the small charge they ask." And in the advertisements that the company places in magazines for Atari enthusiasts, it reminds customers to respect Atari's trademarks. And the firm has its own worries about pirates: it has erased the serial numbers of all the chips in the device. But at least one British pirate, who works as a computer engineer, is putting the problem of identifying them to his employer's IBM computer, which is normally employed in designing circuits. In a couple of months, Happy Computing could have its own problem with pirates.

<2The cat-and-mouse world of copying>2

EVERY type of home computer and data storage system has its own method of copying. Here, we describe how pirates working on Atari 400 and 800 computers with disc drives deal with the company's "copyguards".

<2ROM cartridges:>2 These are programs written on read-only memory chips.
To copy them, the pirate has to feed the program into the computer's memory, then read it back onto a disc. Atari makes it more difficult by writing into the operating system (the instructions that govern how the computer handles data) an instruction that does not allow the computer to talk to disc drive and cartridge at the same time. Enthusiasts <1can>1 overcome this by ramming the cartridge into its socket while the drive is running. But this is unreliable, and can cause the machine to crash. A much more elegant method is to re-program the computer's operating system to remove the safeguard. However, only skilled programmers can do this. One group of pirates has made a small add-on circuit board which makes it easier to copy ROM cartridges. It sells for £90--only to trusted friends. Once the pirate has the program on the disc, he can transfer it onto a blank chip via an EPROM programmer.

<2Standard disc drive:>2 This used to be the easiest data store to copy. All the pirate had to do was to read the source (original) disc into the computer's memory, replace the source disc with a blank one, and write the program back on it. Computer firms soon introduced copyguarding measures to stop pirates doing this. The most common defence is the "unreadable" sector. Each disc consists of 40 tracks, each of which contains 18 sectors. Each sector can hold 128 bytes of data. The drive reads these sectors, and feeds them to the computer. If the sector contains information, it goes to the computer; if it is blank the message is a series of zeros. An unreadable sector sends the message "I cannot read that". If the computer does not receive this message, it does not execute the program. Standard disc drives cannot duplicate the "I cannot read that" instruction because the unreadable sectors have data written on them at twice the normal speed.
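The logic of the unreadable-sector copyguard can be sketched in miniature. The Python toy below is an illustration, not Atari's actual code: all the names are hypothetical, and a real drive signals read errors in hardware rather than as strings. It models a disc with the geometry given above (40 tracks of 18 sectors of 128 bytes, which works out at 90 kilobytes) and shows why a naive sector-by-sector copy fails the check.

```python
TRACKS, SECTORS, BYTES = 40, 18, 128   # disc geometry given in the article

def blank_disc():
    # None marks a deliberately "unreadable" sector; bytes are ordinary data
    return [[b"\x00" * BYTES for _ in range(SECTORS)] for _ in range(TRACKS)]

def master_disc():
    disc = blank_disc()
    disc[0][0] = b"GAME" + b"\x00" * (BYTES - 4)   # the program itself
    disc[39][17] = None                            # the guard sector
    return disc

def read_sector(disc, track, sector):
    data = disc[track][sector]
    if data is None:
        return "I cannot read that"    # the error the loader expects
    return data

def loader_runs(disc):
    # The copy-protected loader refuses to run unless the guard sector
    # really is unreadable.
    return read_sector(disc, 39, 17) == "I cannot read that"

def naive_copy(src):
    # A standard drive can only write ordinary data, so the unreadable
    # sector comes out as a readable blank -- and the check then fails.
    dst = blank_disc()
    for t in range(TRACKS):
        for s in range(SECTORS):
            data = read_sector(src, t, s)
            if isinstance(data, bytes):
                dst[t][s] = data
    return dst

master = master_disc()
pirate_copy = naive_copy(master)
print(loader_runs(master))       # True
print(loader_runs(pirate_copy))  # False: the copyguard spots the duplicate
```

The pirate's counter-move, described next, is simply to reproduce the guard sector faithfully, error and all, which is exactly what the drive's speed control makes possible.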
However, every drive has a speed control, so once the pirate knows where the unreadable sectors are, he can duplicate the message at the right speed. Software such as the Diskey program, which helps to investigate the format of discs, is essential for the process.

<2Happy Chip:>2 The 810 enhancement and back-up is a circuit board which plugs into the standard disc drive for Atari computers. It costs $230 in the US. When installed with a disc (the Happy back-up) it automatically performs all the operations needed to copy a guarded disc. Happy Chip works by telling the drive to run at twice its design speed. It also has an "enable tracer" which cuts down the time needed to run off a large number of copies from one disc. (An innocent enthusiast would have no use for this.) Finally, because some software houses have already started to write in copyguards that can detect Happy Chips, it has a "slow it down" mode which prevents the copyguard from spotting the intruder. []

<2Wet windmill>2

A WINDMILL <1under>1 a Canadian river is putting 20 kW of power into an electricity grid. Officially, it is called a vertical axis hydraulic turbine, and is the brainchild of Barry Davis of Nova Scotia. The prototype, which Davis built for Canada's National Research Council, is running under the St Lawrence River. The turbine is an adaptation of the standard "egg beater" windmill. It is 2.4 m in diameter, and made up of three vertical blades which rotate in the river's current, connected to a generator. The whole thing is moored to a floating platform. The blades, which turn at 30 rpm, do not seem to harm aquatic life. The strong currents in the area keep down aquatic plants which might foul the blades, and the whole system has less impact on the environment than the paraphernalia of a conventional hydroelectric station would. All an underwater turbine needs is a current of at least 1 metre per second in a stream or river at least 3 metres deep.
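The 20 kW figure is roughly consistent with the standard hydrokinetic power formula, P = ½ρAv³Cp, where A is the area swept by the blades. A back-of-envelope check in Python follows; note that only the 2.4 m diameter comes from the article, while the blade height (3 m), current speed (2.5 m/s) and conversion efficiency (35 per cent) are plausible assumptions, not reported figures.

```python
RHO = 1000.0   # density of fresh water, kg/m^3

def hydro_power_kw(diameter_m, height_m, speed_ms, cp):
    """Power = 0.5 * rho * swept_area * v^3 * Cp, in kilowatts.

    For a vertical-axis ("egg beater") turbine the swept area is
    roughly diameter x blade height.
    """
    area = diameter_m * height_m
    return 0.5 * RHO * area * speed_ms ** 3 * cp / 1000.0

# 2.4 m diameter from the article; 3 m blades, 2.5 m/s current and
# 35 per cent efficiency are assumed.
print(round(hydro_power_kw(2.4, 3.0, 2.5, 0.35), 1))   # 19.7 -- close to the quoted 20 kW
```

Because power rises with the cube of current speed, the 1 m/s minimum quoted in the article would yield only a few per cent of this output, which is why strong tidal or river currents matter so much to the economics.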
The only problem so far has been to anchor the device when the current gets too strong. []

<2Calling all cars>2

A TAXI company in Kyoto, the ancient capital of Japan, is trying out a scheme to get cabs to customers more quickly. At busy crossroads throughout the city, the company has rigged up poles which transmit a continuous location code. When one of the firm's 600 taxis passes within 600 metres, its receiver picks up the code, stores it in memory, and simultaneously relays it back to base. A second code tells the controller if the cab is available. At the centre, dispatchers keep track of taxis on monitor screens, which can display up to five cabs per pole. Colour codes show how far each cab is from each pole. The system updates the information every 30 seconds, and the taxi's memory can retain a pole's code for up to three minutes. The scheme is an adaptation of an idea that the Japan Electronic Machinery Industry Association developed three years ago to guide ambulances and rescue vehicles through cities. The taxi company says it will cut costs all round: better directions will reduce the cost of fuel, and save manpower at the dispatching centre. It will also save radio time--controllers will not have to keep calling the cars. []

<2Acid test>2

JAPANESE scientists have discovered a new way of dating the fossils of crustaceans and animals by measuring changes in an amino acid called isoleucine. Professor Kaoru Harada, who leads the team at the University of Tsukuba, says that after an organism dies, isoleucine changes gradually from an optically-active compound into one that is optically inactive. "Type L" isoleucine in the living organism and "type D" isoleucine in the dead organism have almost the same chemical characteristics, but when they are hit by light they deflect it in different ways. Harada says that checking the ratio of type L to type D in an amino acid analyser tells the age of the specimen.
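The article gives no dating equation, but the gradual L-to-D change is commonly modelled as a reversible first-order reaction whose D/L ratio climbs towards an equilibrium value. The Python sketch below illustrates that kinetics only; the rate constant depends strongly on temperature and species, and both it and the equilibrium ratio used here are assumed values for illustration, not figures from Harada's work.

```python
import math

# Reversible first-order racemisation: the D/L ratio rises from about
# zero at death towards an equilibrium K_EQ. Both constants below are
# assumptions, not data from the article.
K_EQ = 1.3   # assumed equilibrium D/L ratio for isoleucine

def age_years(dl_ratio, k_per_year, dl_initial=0.0):
    """Invert  ln[(1 + D/L) / (1 - (D/L)/K_EQ)] = (1 + 1/K_EQ) * k * t
    to estimate time since death from a measured D/L ratio."""
    def f(dl):
        return math.log((1 + dl) / (1 - dl / K_EQ))
    return (f(dl_ratio) - f(dl_initial)) / ((1 + 1 / K_EQ) * k_per_year)

# With an assumed rate constant of 1e-6 per year, a measured D/L of 0.3
# corresponds to an age of roughly 300,000 years.
print(round(age_years(0.3, 1e-6)))
```

In practice the rate constant has to be calibrated against specimens of known age from the same site, which is why the comparison with carbon dating reported below matters.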
Tests on 16 fossilised shellfish showed that acid dating is as accurate as carbon dating. []

<2Open road for the busway>2

THE BUILDERS of a revolutionary busway in Adelaide have denied that escalating costs are hitting the project. Engineers with Daimler Benz, which is building the busway, said that the "O-bahn" will cost no more than conventional light railways. However, a report in last month's <1Railway Gazette>1 says the terrain has been a problem, and that complicated foundations for the all-concrete track bed are proving expensive. It says the O-bahn, which beat a light rail alternative on cost, "will not have the capacity or the convenience of a tram and is certainly more expensive".

The German engineers last week agreed that they had met problems with the clay earth, which had proved unstable because of burrowing by rodents, and because it dries out in the summer. Consequently, they had to drop concrete piles 8 metres into the soil to provide a foundation. The engineers said that the O-bahn scores over light railways because the same buses can pick up and drop passengers in local streets before driving at high speed along the busway. This gives feeder services at no extra cost. With a light rail system, only 13 per cent of passengers would not have to change vehicles, while on the O-bahn four-fifths of the passengers could travel without changing.

The O-bahn consists of two main parts, the concrete trackway and the vehicles. The vehicles are modified Daimler Benz buses, powered by diesel engines or by overhead electric cables. The trackway has horizontal concrete slabs, which bear the weight of the vehicle, and vertical guiderails to engage the side-wheels which steer the bus. The buses can run at up to 100 km/h. Off the busway the buses operate on the streets in the same way as any other bus. The 12-kilometre route in Adelaide is the second commercial installation of O-bahn. The first section of this route was opened in November last year.
The first O-bahn track, 1.3 km long, was opened in 1980 in Essen. This month it is to be extended by nearly 1 km. The final extension of the system is planned to be through a 2.5 km tramway tunnel, which is unventilated. The full route will then be able to demonstrate O-bahn operation with diesel and trolley buses and over normal roads, busways, tunnels and track which is shared with trams.

* Heinz Kaufhold, from the Hamburg research centre, told the World Conference on Transport Research in Hamburg that trials in Essen have proved that a trolleybus with batteries is too expensive to run. He said that this "duo bus" needs an extra 173 hours a year for maintenance compared with a trolleybus, a diesel bus or any other bus. Despite the duo-bus's advantages (it is quiet and pollution-free), "its application in practice is unsatisfactory. The batteries are still too heavy, require too much maintenance and are too inefficient and expensive for general introduction into public transportation." []

<2IBM beefs up the satellite business>2

IBM, the world's biggest computer firm, is lining up to grab a slice of a new growth market--processing images from satellites. The company's European scientific centres have developed a terminal-based system, now moving out of the prototype phase, with high resolution graphics and sophisticated software. Over the past year, IBM has presented the results of its work with a prototype to international conferences in the developed and developing world. Now an IBM marketing centre in Milan is selling the product, the 7350 and the associated Hacienda image processing system. IBM's exact marketing intentions are not clear. But competitors say the company may be testing the market as the world prepares to exploit the rapidly developing technology of receiving and processing data from satellites. The 7350 is now the only image processor sold by the company.
IBM engineers in Europe claim that the company's European research on satellites is ahead of its US counterpart. NASA's reported spurning of IBM as a supplier of image-processing equipment may be a result of this. But IBM France, Spain and Italy have done a lot of work on the technology. IBM was an early participant in the simulation trials for SPOT, the Earth observation satellite that a French company plans to launch in 1984. IBM has processed simulated SPOT data (from high-flying aircraft) to analyse vegetation cover in Upper Volta and to look at vegetation growth after drought in the Sahel. The high resolution of SPOT pictures compared with the rival Landsat series is easily apparent from the results of these tests, and from the more novel images of the SPOT simulations. These clearly show the bridges of Paris: with pictures from Landsat B, they cannot be made out.

Scientists at IBM say that governments, agronomists, land planners and mineral prospecting companies are all prospective users of data from satellites. The software is also usable in other research, such as studying the structure of viruses. In Rome, the system even helps to restore old masters. The system is expensive. What IBM describes as a "Rolls Royce" costs several hundred thousand dollars. Many other companies already supply products in the image-processing field, and some criticise IBM as always offering systems that are too centralised and need a big expensive central processing unit to work. But many will be encouraged: IBM's entry into a market gives it added credibility, and usually makes it grow even faster. []

<2Belgian revolution>2

THE GOVERNMENT of Flanders, the region of Belgium most badly hit by the collapse of traditional industries, has launched a "do or die" attempt to create economic growth through new technologies. The most dramatic manifestation of the new drive was the success of the world's largest technology fair--the first Flanders Technology exhibition.
More than 25,000 visitors crammed the 500-plus stands on the first day of the show, in Ghent, to look at new developments in materials science, biotechnology, microelectronics and engineering. The organisers predicted that 130,000 people would visit over the whole five days. Flanders has an unemployment rate of 11 per cent. The Belgian prime minister, Wilfried Martens, said Flanders' "third industrial revolution" would be the only way of "reversing the collapse of our industrial base and the dramatic consequences this would bring with it". Initiatives for a third industrial revolution began last year, with a series of conferences, when the government decided to encourage industries based on microelectronics, biotechnology and new materials. []

<2Science museums on the move>2

Museums should not simply entertain, they should also educate. Some are showing how to do both at the same time

Roger Miles and Brian Lewis

UNTIL quite recently, even our largest museums tended to be very much on the periphery of the educational system. They have always been notable sources of reference for serious scholars, of course, and there have always been just a few teachers and parents who have made it their business, over the years, to arrange educational visits both for themselves and for schoolchildren. However, the overwhelming majority of lay (non-specialist) visitors did not come for predominantly educational reasons. They came more out of curiosity or (in the case of parents) out of a sense of duty. There were no great expectations that they would actually enjoy the experience, let alone learn anything really worthwhile. To a large extent, their numbers were made up of tourists, people who happened to be passing by, people with time to kill, and people who simply fancied doing something different. There are good historical reasons for this general lack of enthusiasm among the public at large.
Not so many years ago, museums were definitely rather sombre and sobering places. They were the kinds of places in which learned authorities paraded such things as the wonders of nature, the relics of past ages, and the achievements of great people, before the eyes of a deferential and often uncomprehending populace. It was all too easy for the masses to feel humbled by experiences of this kind, and to leave with the conviction that the entire subject matter would be forever beyond their grasp.

Over the past decade or so, all this has changed. The traditional museum activities of collecting, conserving, interpreting, and the like continue as before. But the public face of museums--particularly science museums--has undergone a radical transformation. The once-daunting atmosphere has become genuinely welcoming. The emphasis has switched to trying to <1reach>1 the general public, rather than talk down to them. Above all, increasing attempts are being made to provide the lay visitor, if he or she so desires, with a genuinely worthwhile educational experience. Museums are finally coming in from the cold, to take their place alongside a whole range of other cultural facilities such as zoos, botanic gardens, planetariums, libraries, and the like--all of which are striving, in their own distinctive ways, to contribute to the <1informal>1 education of the public as a whole.

The transition has not been easy. In their quest for greater public acceptability, science museums have tended to move in two different, and not always compatible, directions. First of all, they have tried to shake off their somewhat starchy traditional image, by mounting exhibitions that are explicitly designed to draw in the crowds. Secondly, they have tried to arrange more and more of their long-term or permanent exhibitions (not just the ones that have transitory mass appeal) in ways that give visitors the opportunity to <1learn,>1 rather than merely to "gawp".
The first of these trends, towards greater popularism, is easier to pursue than the second. Given the requisite financial resources, there is no great difficulty about launching exhibitions that have wide public appeal. It is almost invidious to single out specific examples of success. But nobody can seriously doubt the popularity of exhibitions such as <1The Story of the Earth>1 (at the Geological Museum in London's South Kensington), or <1The Challenge of the Chip>1 (at the next door Science Museum), or <1Human Biology>1 (at the Natural History Museum). Similar success stories can be found in other countries--for example, the aerospace exhibition at the Smithsonian Institution in Washington. And there are numerous "science centres" the world over, generally having no collections of relics or precious objects, but dealing instead with "hands-on" experience and aiming at the lively presentation of ideas in preference to artifacts.

Exhibitions such as those that we have mentioned have proved to be outstandingly successful, both in terms of the <1numbers>1 of visitors attracted, and in terms of the manifest <1satisfaction>1 that most visitors experienced. The only question that remains is whether, at the end of the day, the public has really had what professional educators would describe as "a good learning experience".

The question is worth pressing. And it is worth pressing for two reasons. First of all, we can concede that most adults like to have their education packaged up as "entertainment". At the same time, we must recognise that not all entertainment is good education. Indeed, there is a danger that, in the very attempt to entertain its public, a museum may well be tempted to treat its subject matter in a superficial and even frivolous way. What this means is that the opportunity to make worthwhile teaching points, about the subject matter on display, will be missed.
Instead, the exhibits may (for example) be presented in a "gee-whiz, would-you-believe-it?" fashion. In consequence, most visitors will leave the exhibition mightily pleased, but poorly informed. The possibility of their leaving mightily pleased, and well informed, will have been lost.

The second reason for pressing the educational question is that only the largest of our national museums can afford to mount the kinds of "spectacular" exhibitions cited two paragraphs back. If our much smaller, and much more numerous, local museums are also to come in from the cold, the only route open to them is the one of providing a stimulating and memorable <1learning>1 experience. Let us therefore look at the salient issues more closely.

It is almost impossible to visit a museum, and learn nothing from it at all. At the same time, it often seems well nigh impossible--especially for the tyro or greenhorn--to visit a museum and learn anything that serious educators would agree to describe as significant or "worthwhile". The most common result of a visit to a museum is to come away with only disparate fragments of information, together with some general impressions which may well be, in certain important respects, incipiently misleading. The trouble is that museum exhibitions, as traditionally arranged, do little more than offer a catalogue of seemingly unrelated assertions about the subject matter. In the limiting case, the visitor is simply told that certain specimens and artifacts have certain kinds of <1names.>1

In an effort to improve on this state of affairs, some museums have tried to regale the visitor with assorted snippets of information in ordinary English. As a result of this, it is now possible to leave a natural history museum knowing that some fleas can jump 130 times their own height, and that elephants cannot jump at all, and that as many as a thousand dead ants have been found inside the stomach of a single mole.
This may be excellent fodder for people who are aspiring to be the next winner of <1Mastermind.>1 But it is not good education. And the reason why it fails as education is that there is no connecting story line--not even a glimmer of an informing theory that might help to relate one snippet of information to another. It is like giving a trainee taxi driver a sample list of street names, and nothing more. In the absence of any information about the relatedness of the streets, such a list is operationally useless.

An essential characteristic of all good teaching is that it must foster, in the learner, the ability to continue learning. Contrary to much public opinion, good education is not just a matter of dispensing facts and theories. Rather, it is a matter of carefully showing the learner what to do <1next.>1 At all stages of the learning process, the learner must either (a) know what to do next in order to take his learning a step further, or (b) know what sorts of questions need to be asked and answered in order to discover what needs to be done next. The trouble with isolated snippets of information is that, however delectable they may be, they fail to provide this "ability to continue". The museum visitor who is told that some fleas can jump 130 times their own height simply has no idea where to go from that point onwards.

In effect, the requirement of the last paragraph is a requirement that visitors to museums should be <1initiated along a path of learning,>1 and should be given enough information (for example by means of captions, or via supplementary information in, say, the museum guidebook) to enable them to proceed further, if they so desire, in a reasonably sensible and efficient way. Fortunately, this is not asking too much. As one of us has shown (Miles <1et al, The Design of Educational Exhibits,>1 George Allen and Unwin, 1982), it is entirely within the scope of most exhibitions to provide this kind of introductory guidance.
What is more, it can be done without in any way making the exhibition more "dull" or "serious" than it would otherwise be. Visitors who have no desire to be educated will find that their enjoyment of such exhibitions is unimpaired. But the occasional visitor who wants more will find that he has a genuine opportunity to engage in worthwhile learning.

In recent years, more and more science museums have been trying to meet the dual challenge of increasing their popularity and enhancing their educational provision. While some museums have courted popularity at the expense of educational considerations, an increasing number of museums are now trying to meet both challenges at once. The first goal is to get the public through the museum doors by providing exhibitions that they will really enjoy. The next (and more difficult) goal is to incorporate the kinds of story lines and narrative that will enable those visitors (possibly a small minority) who want to learn something worthwhile to do just that.

Does the second goal really matter? There are several reasons for believing that the answer is a resounding <1Yes.>1 First of all, we can note that there is nowadays an increasing emphasis on the idea of "life-long education"--that is to say, education that continues through the whole of adulthood. One way in which adults can test their interest in some novel subject matter is to watch television programmes, or read books. A potentially much richer way is to wander through a learning environment, such as a science museum, that has been specially laid out to introduce visitors, in a systematic way, to particular kinds of subject matter. With the help of audio-visual aids, computer-assisted instruction, and other teaching devices, a museum can bring a subject alive in ways that compare favourably with a single television programme, or a book selected almost by chance from the local shop.
The kind of help that museums can give to adults can equally well be given to schoolchildren, and to teachers who bring parties of schoolchildren for specific purposes. Museums should be rewarding learning environments, and any attempt to settle for mass popularity alone is to sell museums short.

Finally, at a time when the demand for public accountability has never been greater, it is worth remembering that many museums receive substantial grants towards what is supposed to be their <1educational>1 provision for the general public. Museums which take this grant, while doing little more than offer the occasional public lecture, or some minimal help to schoolteachers who arrive with their pupils, risk having such financial support severely cut back, or even withdrawn. Educational theory does not by any means have the answers to all our educational problems. But educational theorists do at least know that it is possible to do better than this. []

<2Vague records confuse plutonium issue?>2

Britain's civil nuclear power programme is said to have produced a stockpile of plutonium big enough to make 14,000 nuclear warheads. No one is accountable for the destination and use of this material

Roger Milne

REVELATIONS that the National Nuclear Corporation is considering selling Magnox reactors to Chile and Bangladesh have set alarm bells ringing. The NNC, part-owned by the government, acts as agent to the electricity supply industry for the construction of Britain's nuclear power stations. It is looking desperately for overseas orders to bolster a declining workload in the UK. There is no secret of its wish to market versions of Britain's first generation reactor to developing countries. Turkey is a case in point. The reason for concern is the risk of the proliferation of nuclear weapons. As well as generating electricity, Magnox reactors are particularly efficient at producing plutonium.
And plutonium, provided it is of sufficiently high quality, can be used for arms manufacture. It is not just the prospect of overseas sales which is troubling the nuclear industry and its critics at present. The question of plutonium produced from the UK civil nuclear power programme is equally a source of controversy, revived this month at the long-running Sizewell inquiry. Past links between the civil use of nuclear power and its military application are clearly documented, giving the lie to government assurances that there are no connections. The UK's first so-called commercial nuclear reactor, the Calder Hall Magnox station in Cumbria, began operating in 1956. It was designed, and that particular type of reactor developed, with a dual purpose in mind: to produce plutonium for the military, with 50 megawatts of electricity as a byproduct. Calder Hall's four reactors and the quartet at Chapel Cross, Dumfriesshire, now run by British Nuclear Fuels Ltd, produce plutonium for the Ministry of Defence. Britain's civil nuclear power stations also produce plutonium. The official stance is that none of this plutonium is destined for anything other than non-military purposes. The accuracy of that contention has been challenged at the Sizewell inquiry, where a Central Electricity Generating Board witness, board member John Baker, is to face his first bout of full cross-examination. Back in January, at the beginning of the hearing, Baker said: "No plutonium produced in CEGB reactors has been applied to weapons use either in the UK or elsewhere and it is the policy of the government and of the CEGB that this situation should continue." Recent statements, expressing similar sentiments, have been made by Energy Secretary Nigel Lawson and his parliamentary secretary John Moore. The facts suggest otherwise. In March, material provided for the inquiry by Robert Priddle, a very senior civil servant in the energy department, gave a rather different account.
Among a bundle of parliamentary answers on the topic submitted by Priddle are extracts from <1Hansard>1 of 9 March this year. They include a written exchange involving under-secretary John Moore. He explained that plutonium derived from CEGB and South of Scotland Electricity Board Magnox power stations was exported to the United States before 1971. Moore said that the plutonium was used for civil purposes in the US. In return, under a barter arrangement, a quantity of highly enriched uranium was obtained for the UK defence programme. "The export of plutonium has therefore benefited the UK defence programme," conceded the junior minister. He went on to say that there have been no subsequent transfers of plutonium from the British civil nuclear programme under the UK-US defence agreement which governed that deal. There have been other developments which point to an overlap between civil and military use of nuclear power. In the late 1950s the government asked the CEGB to modify the design of a number of advanced gas-cooled reactors. This involved changes in the refuelling system; it was a bid to make it easier to obtain weapons-grade plutonium. In the event only one station, Hinkley Point A, was modified. According to a government statement two years ago: "notwithstanding this limited contingency arrangement Hinkley Point A has not been operated in order to produce military-grade plutonium." The same year, 1981, details emerged of a proposed plan to export UK plutonium, from the civil programme, to the US for its fast reactor programme. Opposition to the deal built up because of fears that this plutonium would free US stocks for use in its weapons programme and so indirectly help US weapons expansion. The most recent government statements on the deal indicate that it has been shelved indefinitely.
<2Plutonium exports>2 Since 1971, 1280 kg of plutonium produced in the UK has been exported for civil purposes, principally to Belgium, France, the Federal Republic of Germany, Switzerland, Japan and the United States. BNFL has reprocessed 1930 kg of material for overseas customers in this list plus Italy and Canada. According to a parliamentary statement in 1981: "All the above material was exported for civil use principally for R&D on fast reactor programmes or the recycling of plutonium in thermal reactors." As for the export to the US of Magnox plutonium prior to 1971, the government has refused to say how much was exchanged. The bulk of the material was in the form of "coupons" for research. A small quantity, probably over 200 kg, has been used in the manufacture of californium, for medical purposes. The residual plutonium is apparently being held in the form of highly radioactive waste. A small quantity went to Argonne and Battelle for experimental purposes. <2Private plutonium>2 One of the most bizarre elements of the whole plutonium jigsaw is the fact that at one time the government sanctioned, at cabinet level, two private companies to own some of the material even though it had been produced by the electricity generating industry. The deals, under power pricing contracts, involved British Aluminium Metals, the operators of the now defunct Invergordon smelter, and Anglesey Aluminium Metals. In the case of the former the South of Scotland Electricity Board was the partner; the CEGB made the "sale" to Anglesey from Dungeness B in Kent. A 1968 White Paper, "Industrial investment in production of private aluminium", provided the rationale, which Whitehall subsequently disowned. Now the Department of Energy is in the ludicrous position of trying to persuade the SSEB to accept responsibility for the Scottish plutonium, currently unseparated from the rest of the spent fuel, while the CEGB is under pressure to take responsibility for the Kent plutonium.
This, though, is only a "paper" transaction. Dungeness B is so far behind schedule that it is only now providing energy to the grid. No matter: because of the agreement, the aluminium company has so far enjoyed 13 years of the theoretical value of plutonium which has not been produced. Of course, Whitehall feels that the carrot of granting the companies title to the plutonium set a dangerous and perhaps foolhardy precedent. The catch is that the Department of Energy has had to offer financial terms to the SSEB and the CEGB to persuade them to take on the responsibility. How much cash is involved is being treated as a commercial secret. <2Euratom safeguards>2 At the Sizewell inquiry, and in parliament and in ministerial statements, much play has been made of the existence of international safeguards and treaties designed to prevent proliferation of nuclear weapons and to block civil facilities being used for military purposes. The UK is a signatory to these. For instance, all Britain's civil nuclear installations are subject to Euratom (the European Atomic Energy Community) safeguards procedures; all the civil nuclear installations are subject to International Atomic Energy Agency safeguards, enshrined in a tripartite agreement between Britain, Euratom and the IAEA. However, as a nuclear weapons state, Britain has the right to withdraw any nuclear materials from safeguards "for national security reasons". The government has repeatedly said that it will not switch civil material to military use. But the fact remains that the safeguards themselves do not prevent the government using civil plutonium for defence purposes. If a government went back on this assurance there would be no IAEA safeguards which could do anything about the position. <2Likely civil stocks>2 At the end of 1981 BNFL was holding 14.5 tonnes of separated plutonium at Sellafield (the government has promised to update the tally as far as March 1983 and will report to parliament in June).
It expects a further 29 tonnes on completion of the CEGB/SSEB Magnox programme plus an additional 34 tonnes of plutonium from the existing and committed AGR programme. This gives a total of 77.5 tonnes. A PWR is estimated to produce 8 tonnes of plutonium over a 35-year operational life, and an AGR 5 tonnes over 25 years. In addition to the 14.5-tonne plutonium stock referred to above, as at 1 April last year 5.5 tonnes had been leased to the United Kingdom Atomic Energy Authority for fast breeder reactor R&D. Plutonium still in AGR and Magnox reactor fuel was estimated to total 8.5 tonnes; discharged but not yet reprocessed plutonium amounted to 3.5 tonnes; while plutonium awaiting processing, but in an intermediate form, came to just half a tonne, the same amount as had been exported for civil purposes. <2The unanswered questions>2 To date the government and the industry have refused to be drawn on the isotopic concentration of the plutonium exported to America--critical information if objectors are to be wholly satisfied that no weapons-grade material got across to the other side of the Atlantic. There is the question: why was civil plutonium exported, in the first place, under a defence agreement? There are doubts over the efficacy of the safeguarding arrangements, particularly in the light of the difficulties of "policing" the system. In addition, with co-processing the order of the day at Sellafield, it is proving very difficult to keep tabs on who has what grade of plutonium where at any given moment. CND, for instance, which has already had one stab at teasing out answers from the Department of Energy, is trying to persuade the CEGB to provide annual information on which reactor was contributing to the growing plutonium stockpile--and when. Ironically, it has been one of the board's own employees, Dr Ross Hesketh, who has been questioning many of the rather bland assurances on what has happened to British plutonium from the civil programme.
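The stockpile figures quoted under "Likely civil stocks" can be checked with a line of arithmetic. A minimal sketch in Python; the tonnages are the article's own, and the variable names are mine:

```python
# Civil plutonium stocks quoted in the article (tonnes); the figures
# are the article's, not independently audited data.
separated_at_sellafield = 14.5   # held by BNFL at the end of 1981
magnox_to_come = 29.0            # expected on completion of the CEGB/SSEB Magnox programme
agr_to_come = 34.0               # from the existing and committed AGR programme

total = separated_at_sellafield + magnox_to_come + agr_to_come
print(f"Total expected civil stock: {total} tonnes")  # 77.5 tonnes, as stated
```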
Hesketh has received short shrift from the CEGB, having recently been moved to an out-of-the-way office at Barnwood, the central headquarters base for the board's engineering development and design teams. <2The plutonium path>2 Plutonium is a by-product of the controlled fission reaction in a nuclear reactor. Before the plutonium can be used it has to be reprocessed, the process by which it is separated from the unused uranium and radioactive waste in spent, irradiated fuel. In the UK, all reprocessing takes place at BNFL's Sellafield (formerly Windscale) complex. At present BNFL has insufficient reprocessing capacity to handle the anticipated spent fuel from the AGR programme, though the position will be eased once THORP (thermal oxide reprocessing) comes on stream in 1990. Plutonium is graded by its isotopic composition. This is determined by the burn-up of the original uranium and varies with the reactor type and the length of time the fuel remains in place. As the fuel is burnt up, nuclei of uranium are converted into plutonium 239. If the fuel remains in the reactor for long enough, other isotopes--such as plutonium 240--are produced. The higher the burn-up, the lower the proportion of plutonium 239 remaining. On-load refuelling, the ability to remove and insert fuel rods while a reactor is running, makes it easier to control burn-up. Reactor-grade plutonium typically contains between 50 and 80 per cent of the plutonium 239 isotope; weapons-grade normally contains more than 90 per cent. Magnox reactors produce plutonium containing about 75 per cent of the plutonium 239 isotope; AGRs 50-60 per cent; and PWRs (like the proposed Sizewell B) 57 per cent. Fast reactors, which are still at the development stage, are fuelled by plutonium or a mixture of plutonium and uranium. Because the neutrons are not slowed by a moderator, a fast reactor breeds further plutonium in the blanket of uranium which surrounds the reactor core.
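The grading thresholds just described reduce to a rough rule of thumb. The sketch below is a hypothetical illustration using only the percentages quoted above; a real classification rests on the full isotopic composition, and the function is my own invention:

```python
def plutonium_grade(pu239_per_cent: float) -> str:
    """Rough grade from plutonium 239 content, per the article's figures:
    weapons-grade normally >90 per cent; reactor-grade typically 50-80."""
    if pu239_per_cent > 90:
        return "weapons-grade"
    if 50 <= pu239_per_cent <= 80:
        return "reactor-grade"
    return "outside the typical ranges"

# Typical plutonium 239 content by reactor type, as quoted in the article
for reactor, pct in [("Magnox", 75), ("AGR", 55), ("PWR", 57)]:
    print(reactor, "->", plutonium_grade(pct))
```

All three civil reactor types fall in the reactor-grade band on these figures, which is why the later passage turns on how such material might be upgraded.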
The plutonium bred in this way has a high proportion of the plutonium 239 isotope and is normally considered weapons-grade. The UK's growing stockpile of plutonium is being stored in preparation for the development of a commercial fast breeder, which is unlikely to take place until well into the next century, if at all. Atomic bombs need between 2 and 10 kilograms of plutonium 239. Britain's stock of plutonium from the civil power programme is estimated to be enough to build 14 000 missile warheads. Once Calder Hall and Chapel Cross, the military reactors, reach the end of their operational life, it is conceivable that the military could produce weapons-grade plutonium from the civil nuclear programme in one or more ways. Reactor operation could be manipulated, though for a PWR this would involve shutting down the reactor more frequently than usual. Ordinary reactor-grade plutonium from the stations could be purified to weapons-grade using lasers (research is fairly well advanced), or plutonium could fuel a fast reactor in the UK or abroad which would create weapons-grade plutonium. (Source: <1Sizewell B and the Bomb>1 by Rob Edwards.) [] <2Is unemployment a health hazard?>2 Being on the dole may well be bad for people's health. But can this be measured? Ian Miles IN THE 1970s everyone seemed to be worried about the dangers of work: Jeanne Stellman and Susan Daum, with <1Work is Dangerous for your Health,>1 and Patrick Kinnersley, with <1The Hazards of Work,>1 were among the authors to bring the problems to our attention. In the 1980s the emphasis has changed. One in seven of the working population is unemployed: 3¼ million people. Now we must ask, what are the dangers of being out of work? Already there have been many studies of the effect of unemployment on health; do they give useful results? Not surprisingly, except perhaps to those who maintain a Poor Law mentality, unemployed people are generally more unhappy than their employed counterparts.
A survey I carried out in Brighton last summer indicated that unemployed men report themselves to be much less satisfied with their lives than do employed men. On a scale ranging from 0 (least satisfied) to 10 (most satisfied), unemployed men's average rating was 4.7--compared with a rating of 7.4 for a group of employed men in Brighton, and 7.5 for men in general in a nationwide survey by the Social Science Research Council (SSRC) in the mid-1970s. This dissatisfaction may reflect more than being hard up. Being unemployed also means that you are deprived of the social contacts and involvement in collective activity that go with paid employment; you lack its external stimulus to activity and time-structure; and you lack the status and sense of value provided by a job and expressed in terms of social esteem as well as money. These are good reasons for expecting <1psychological>1 well-being to be at risk. Ill-health is another matter. There is a lot of evidence that unemployed people tend to be less healthy than their employed counterparts. (The figures opposite show that this is particularly true for those out of work for more than a year--the long-term unemployed.) Less systematic evidence is provided in the observations of doctors and other professionals, as well as by the well-publicised cases of suicide and mental breakdown following job loss. A longitudinal study by John Fox and Peter Goldblatt of City University, London, showed that men unemployed at the time of the 1971 census were much more likely to have died over the subsequent four years. Most startling was their conclusion that the death rates from accidents and violence (of which suicide is a major component) were more than twice those of employed men. <2The anchors of employment>2 There are various ways of explaining such results.
We might argue, as does Professor Marie Jahoda--a founding mother of social research on the experience of unemployment--that the social contacts, collective purposes, time-structure and status provided by employment are important factors in anchoring people to the real world. Without these anchors, perhaps people are less likely to take good care of themselves, are more likely to suffer psychosomatic problems, and to succumb, and succumb deeply, to infections and environmental threats to health. But such arguments are not easy to substantiate. For one thing, illness renders workers more likely to become and stay unemployed: health problems may be the cause of unemployment rather than vice versa. Of course, this cannot apply to <1all>1 the unemployed: our unemployment levels today are due to government policy and the world economic crisis, rather than to worsening national health! But it is one factor that is likely to influence the statistics. More importantly, the unemployed are far from a random sample of the workforce. Those occupational categories where health is likely to be worse--those with lower incomes, unskilled jobs, etc--are hardest hit in a recession. So the average unemployed person is likely to be less healthy than the average employee. It could be argued then that the basic problem is one of poverty. Even though they come from low-wage jobs, unemployed people are usually financially hard-hit by job loss. Stories about being better off out of work apply only to relatively few people, those with large families and particularly exploitative employers. (In 1977, 88 per cent of unemployed men received benefits that were at least 10 per cent lower than their previous earnings, according to DHSS data.) Such workers themselves are often married to other breadwinners, and when, as is not uncommon, both partners lose their jobs, the drop in income can be dramatic.
Despite nutritionists' demonstrations that an adequate diet can be maintained on a pittance, family food habits are hard to change. The costs of durables, housing and energy are crippling for many poor people, and reasonable warmth and hygiene levels may be hard to achieve. Poverty itself, then, remains a major factor in the production of ill-health. Clearly the problems are deeply intertwined. This inevitably means that any attempt to analyse macrosociological data--aggregate national statistics for levels of income and unemployment, trends in infant mortality and the like--is bound to give rise to controversy. One of the key proponents of macrosociological analysis in recent years is Professor Harvey Brenner, from the School of Hygiene and Public Health at Johns Hopkins University. After making a considerable impact with his American research, he announced on a <1World in Action>1 television programme that his results held just as strongly for Britain. For every 1 per cent increase in unemployment in the UK between 1950 and 1975, he said, there was a 2 per cent increase in mortality--something like 17 000 deaths per year. Brenner followed up this report later in the year in <1The Lancet,>1 using data spanning the period 1936-76. Again he found the same link between unemployment and mortality. By the end of the year he was predicting, in research carried out for the Scottish National Party, that unemployment would have a worse impact in Scotland than elsewhere in the UK, reflecting the poorer social conditions prevalent there. Claims such as Professor Brenner's can be political dynamite. But as numerous commentators have observed, and opinion polls bear out, most people now regard unemployment as a fact of life to be endured, almost as beyond politics as the weather. Even the unemployed place government policy rather low in a list of factors responsible for high unemployment levels.
Fatalism is the order of the day--and if your best efforts have failed to find a job, it's understandable that you might feel fatalistic. Gerald Vaughan, the health minister, simply reiterated in October 1981 that there is no clear evidence to link unemployment and ill health, and summarised the results of a DHSS study as showing that unemployment has different effects on different families (to the chagrin of the researcher involved, who argued that his findings showed that the unemployed were more likely to suffer from physical and psychological ill health, unless they worked in hazardous occupations). Brenner's remarkably assured claims did, however, invite considerable interest and scrutiny among social and medical researchers. If his results were correct, they fitted well into emerging perspectives on the social causes of disease. Some researchers saw Brenner as further confirming the connections between undesirable life events, stress and illness indicated in community medical research. Others were sceptical about his conclusions, and his methods. Basically, Brenner's method involves applying the statistical technique of regression analysis, such as econometricians use to identify simple equations whereby the value of one variable can be predicted given the values of other variables. Brenner used it to identify the relations between macroeconomic variables and measures of national health during the economic cycles that took place in the course of several decades. These regressions indicate that mortality is related to swings in unemployment levels, over and above the improvement in mortality related to the long-term trend for disposable incomes to increase. The specification of such regression models necessarily involves many assumptions.
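Brenner's approach can be illustrated in miniature. The sketch below is hypothetical: it fits synthetic data, not Brenner's, by ordinary least squares, regressing a mortality series on a long-run income trend plus cyclical unemployment, which is the shape of model the text describes:

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 1976)
trend = np.linspace(0.0, 1.0, len(years))      # stand-in for rising disposable income
unemployment = 3 + 2 * np.sin(years / 4.0)     # synthetic cyclical swings (per cent)

# Synthetic mortality: falls with the long-run trend, rises with unemployment
mortality = 12 - 3 * trend + 0.5 * unemployment + rng.normal(0, 0.05, len(years))

# Ordinary least squares on [constant, trend, unemployment]
X = np.column_stack([np.ones_like(trend), trend, unemployment])
coef, *_ = np.linalg.lstsq(X, mortality, rcond=None)
print("unemployment coefficient:", round(coef[2], 2))  # recovers roughly +0.5
```

Even in this toy form the difficulty is visible: the fitted coefficient says nothing about who in the population bears the extra mortality, and it is sensitive to what else is, or is not, included in the model.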
For example, Brenner assumes, on theoretical grounds, a time-lag between an increase in unemployment and that in mortality: but Joseph Eyer argues that Brenner's <1lagged>1 relationship between unemployment and ill-health (roughly, unemployment at time <1A>1 causes illness at time <1B>1) really reflects a close relationship between work and ill-health (roughly, employment at time <1B>1 causes illness at time <1B>1). Eyer suggests that the equations are really measuring increased ill-health due to the intensification of work during periods of higher prosperity! Mortality statistics, too, are not restricted to the unemployed: how do we know that they are not indicating greater health problems in the population <1as a whole>1 during times of economic difficulty? Other researchers have pointed out that improvements in medicine and nutrition, and changes in other health-related variables, such as the types of jobs that people do, are not taken into account in Brenner's study. Such variables may be particularly relevant when data from the interwar period (when both unemployment and mortality rates were high) are included in the analysis. In 1981, H. S. E. Gravelle and G. Hutchinson at Queen Mary College and J. Stern of the Centre for Labour Economics at the London School of Economics reported that Brenner's equations were not stable over time. In Cambridge, J. M. Winter of Pembroke College argued that infant mortality data were better explained in terms of long-run socioeconomic changes than short-term fluctuations. The controversy that has surrounded Brenner's results illustrates how difficult it is to draw firm conclusions about the role of any one factor, such as unemployment, on a state as loosely defined as "health". So are there other sources of evidence that can offer deeper insights?
Most of the surveys showing that unemployed people are less healthy than the employed are cross-sectional in nature (comparing different people at the same time, rather than following the changes in people's lives over a period of time). Even if statistical controls are used to make sure that people with similar work histories and social backgrounds are being compared, the key question of what causes what remains unanswered. There is an increasing number of surveys which have looked at individuals over a period of time. D. Metcalf and S. J. Nickell, also of the Centre for Labour Economics, analysed retrospective data from the National Training Survey, in which information was elicited from respondents about their careers from 1965 to 1975. These data suggest that unemployment had a greater impact on major illness (sickness lasting three months or more) over this decade than vice versa, although the effect is not tremendously powerful. On the other hand, Sue Ramsden and Clive Snee found that most unemployed men said their health had stayed the same since losing their jobs in 1978, with roughly equal small proportions thinking it had got worse or better. Ramsden and Snee used data from the DHSS Cohort Study. The Cohort Study interviewed a group of unemployed men one, four and twelve months after they had registered as unemployed in 1978, before the massive wave of job losses, to discover their attitudes to job finding and being out of work. It provides no evidence for worsening health among the unemployed for that period, but it did not ask its respondents about <1symptoms,>1 just about overall health or disability. My Brighton sample, admittedly smaller, tends to report more worsening of health, although what is meant by "health" is often psychological well-being.
Other studies have surveyed panels of respondents over a period of time--though few have physiological measures of health (such as changes in blood pressure) or doctors' diagnoses, and most use self-report questionnaires of one sort or another. An exception is the famous American study by S. V. Kasl, which showed that the threat of unemployment, as well as unemployment itself, had physiologically stressful effects, including increased blood pressure. British research has made much use of the General Health Questionnaire (GHQ), which despite its name actually seems to index minor psychiatric morbidity (ill-health). This preliminary psychiatric screening instrument is regarded as a predictor of more serious psychological problems, and has been used extensively by researchers from the MRC's Social and Applied Psychology Unit at the University of Sheffield. They have consistently found large differences between comparable groups of unemployed and employed people in cross-sectional surveys, as I do in my Brighton study (even when those who lost their jobs through illness are excluded). It would be surprising if prolonged periods of psychological morbidity were not reflected in declining physical health. <2Morbidity among school-leavers>2 One test of the hypothesis that unemployment causes morbidity was performed by M. H. Barker and P. R. Jackson of the Sheffield group. They interviewed young people over 20 months, including the transition from school to work--or worklessness. At the first interview there were no GHQ differences between those who subsequently found work and those who did not, but the groups diverged on leaving school. Such interviews provide an index of "psychological malaise", but they are a long way from producing the more concrete statistics provided by medical consultation, prescription levels and mortality rates. The implications from this are several.
One is that the present level of unemployment must concern the health service (and I have said nothing about other social problems!). They strengthen the case, forcefully made by the Black report, <1Inequalities in Health>1 (Penguin, 1980), for some redirection of effort towards those most at risk. We need to identify the unemployed people who are most likely to suffer; several studies suggest that support from friends or relatives can help. My own survey in Brighton indicates that GHQ scores and reported deterioration in health are highly related to the extent to which unemployed men can maintain the social contact, collective purpose, activity, time-structure and status that Jahoda identified. Such research should help us identify which <1aspects>1 of unemployment are crucial. Some community services are making serious efforts to provide more resources to the unemployed, hampered though they are by limits on local government expenditure. Nothing in this article negates the evidence that poverty itself is a major contributor to ill-health. Improving welfare benefits to the workless--however contrary to current government policy--would undoubtedly be an important step. If unemployment has its own effects on health, however, this would not be enough. Unemployment means more than poverty. There is no reason to expect that unemployment will recede of its own accord, and Britain is far from alone in these problems. The way a society faces up to unemployment now may largely determine its social and economic health in the future. [] <2STUDENT BOOKS>2 <2New books for ivory towers>2 The student's grant includes £165 for the purchase of course books.
According to a recent survey, students buy an average of 11.33 books a year for their courses--three each of set texts and recommended books, and five further academic titles. When one considers that British academic books in 1982 cost an average of £21.76 for science and technology titles, it would seem that, contrary to popular belief, students are spending more than their grant allocation. But out of 9000 books published in science, technology and medicine during 1982 (plus the many thousands of titles in print), how does the student make his or her selection? The most important influence must be the lecturer's reading list, followed by value for money. So this year, for <1New Scientist's>1 student books issue, we invited nine people to select what they thought were the most interesting books for students published during the past year. <2Revolution in Earth science>2 ---------------------------------------------------------- Peter J. Smith THE MAJOR revolution of Earth science in the 1960s was rapidly followed by a minor revolution in Earth-science publishing, at least as far as general texts were concerned. The plain-covered wordy tomes of earlier years, largely indistinguishable in appearance (except for the use of photographs) from their Lyellian and Geikien predecessors of the century before, suddenly gave way to glossy volumes with brightly coloured covers, two-colour diagrams, larger page sizes and a higher picture-to-text ratio. The techniques of the coffee-table brigade were seeping through into academia, which was no bad thing. Unfortunately, the process soon got out of hand, for reasons having less to do with student need or the demands of a rapidly advancing subject than with professional status: it seemed that every American academic (the glossy-text phenomenon is almost exclusively American) would have to have his (seldom, if ever, her) own textbook. The volumes began appearing in their scores and have not stopped.
Hardly surprisingly, given the size of the market, most of these texts have never gone beyond their first editions. So when we come across one that has reached its sixth, we must sit up, take notice and ask why market forces have singled it out for special success. In the case of the Leet-Judson-Kauffman <1Physical Geology>1 the reason is clear; the book is a crammer and none the worse for that. The sections are short, the text is sharp and to the point, the definitions are emphasised in bold type, the diagrams are pertinent and the coverage is vast. The volume is really little more than a set of expanded notes; but what it lacks in inspiration on that score, it more than makes up by its sheer usefulness, not least as an examination primer. It's hard to imagine a better one, although Arthur Strahler's <1Physical Geology,>1 a book constructed along similar lines, at least equals it. Personally, I prefer something a bit less blatantly mechanical and perhaps slightly less glossy, in which connection I can recommend <1Earth>1 as the best of the many general texts now on offer. As this is the book I have turned to first ever since the first edition appeared in 1974, it's gratifying to note that enough people agree to take the volume into its third edition, with a fourth on the way. The tragedy of the publishing overkill in this field, of course, is that many authors are persuaded into grinding away for years to produce something that is perfectly competent but totally unnecessary. The latest victims are Allan Ludman and Nicholas Coch, whose <1Physical Geology>1 has a slight edge over its competitors in that it includes a series of rather handsome colour plates. Even these, however, are insufficient to disguise the fact that, though more than adequate, the book looks, feels and reads like a hundred others. On the other hand, it is still possible to be more original, as <1Surficial Geology>1 demonstrates.
The title and format might lead the casual observer into expecting just another general text; but closer inspection reveals that the book assumes some basic knowledge of the subject and then goes on to concentrate on those aspects of geology of most direct relevance to society--seismic and volcanic hazards, landslides, erosion, subsidence, groundwater, and so on. It's a most impressive volume concerned with topics that are seldom dealt with in detail outside research papers and monographs. It's not quite the first student text in the field, however, for there are already <1Geology and the Urban Environment>1 and <1Environmental Geology,>1 although only the latter is as comprehensive as <1Surficial Geology.>1 My only regret here is that all are from North American authors and thus quote largely North American examples. No such problem arises with <1Atlas of Igneous Rocks and their Textures,>1 a laboratory manual of several hundred thin-section photographs in colour, accompanied by notes. Nature displayed in this form beats abstract art hands down; the pictures are so breathtakingly beautiful that one can wallow in them for hours and quite forget geology. Here is a volume you could present to someone with no interest in science and expect a pleasurable reaction from any but the most insensitive. Much the same could be said of <1Atlas of Ore Minerals>1 except that the impact is reduced by the smaller and very slightly less well reproduced pictures (of polished sections here). The purpose of both volumes, however, is not aesthetic but utilitarian; it is to help students identify rocks and ores. Traditional though that activity is, few of the students of past generations who have gone into the making of the tradition can have had such impressive aids.
Modern photographic and printing techniques made both atlases feasible; but there is also a much more fundamental way in which technological innovation has changed, and is continuing to change, the study of igneous rocks. The Earth-science revolution was made possible in large part by a new-found ability to examine the oceanic crust, at first by remote sensing and later through direct access; and the techniques that revealed the importance of the ocean floors in the plate-tectonic scheme of things were also to provide ever-increasing quantities of data on the rocks hitherto concealed by the world's oceans. Given reasonable competence, therefore, the latest book about igneous rocks is almost bound to be the best, in the sense that it will contain the most up-to-date information on a fast-moving field. Just such a text is <1Igneous Rocks,>1 which succeeds admirably in synthesising both the old and new data into as coherent a picture as is possible given current levels of ignorance. That a single volume is no longer sufficient to cover everything an undergraduate should know about igneous rocks merely means that Daniel Barker's book will never completely replace others concentrating on more particular aspects of the subject. The new (second) edition of <1Petrography,>1 which also covers sedimentary and metamorphic rocks, is thus welcome for that reason alone. Over the years the first edition (1954) of this work has acquired something of the aura of a "classic" text, and one or other of the editions is never likely to stray far from the student petrographer's microscope. All the same, it has a curiously old-fashioned format, not least in its eschewal of the art of photography in favour of the traditional sketch diagrams. As time goes by it is likely to find itself under increasing pressure, for the glossymen are now moving into the petrology field too.
For example, <1Igneous and Metamorphic Petrology,>1 which does make use of photographs, succeeds by the sheer wealth of information it contains (more than twice the pages of the corresponding sections in <1Petrography).>1 So does <1Sedimentary Petrology,>1 whose pages outnumber those of its older competitor by three to one. Although bulk is no guarantee of quality, the standard of both these new petrology books is high. Whether they will emerge as market leaders will depend on whether the publishers flood the market with comparable volumes, as they have done with general geology texts. []

<2Star crop for astronomy>2
Archie Roy with Charles Boyle, John Brown, David Clarke, Robin Greene, Alastair McDonald, John Simmons and Ian Walker

MANKIND'S eternal quest for knowledge about the Universe has been fired afresh in recent years by the spectacular revolution brought about by new astronomy and space technology. The large number of astronomy books published may be varied in their presentation, topics, standards of difficulty and reader-orientation, but they have one thing in common: the author's open enthusiasm for the subject and desire to put it across. On the well-known principle that new dog and cat foods are best tested by feeding them to, respectively, dogs and cats, the vintage crop of 1982 books on astronomy was therefore fed to members of the Astronomy Department of Glasgow University, experienced in both teaching the subject and researching in it. Their ruminations after their meal are given below. The 1980s have been forecast as the Golden Decade of cometary research, seeing the first return of Halley since Mount Wilson was built and the launch of four cometary probes. Anticipating the associated upsurge of interest, J. C. Brandt and R. D. Chapman present a survey of contemporary understanding of both the nature and origin of comets, together with its historical context.
They cover most aspects of cometary science with a simplicity and lucidity that make the bulk of <1Introduction to Comets>1 readable while containing enough hardcore physics and chemistry to recommend the text at all undergraduate levels. Despite some notable omissions, such as the recent provocative work by Bill Napier and Victor Clube on interstellar origins, this introduction is a valuable addition to the rather lean cometary literature. Jupiter is no ordinary planet: it almost made the grade as a star, and a study of its wondrously complex magnetosphere--so different from the Earth's--provides valuable insights into distant astrophysical objects. The excellent <1Physics of the Jovian Magnetosphere>1 is worthy of the high standard of the Cambridge Planetary Science Series. A. J. Dessler has collected a wealth of information and knowledge in 12 articles written by 23 of the world's leading space scientists to provide an up-to-date picture of the magnetosphere of Jupiter. Ground-based data as well as the latest information from the Pioneer and Voyager spacecraft are used in developing both physical descriptions and theoretical understanding. The text, in its careful planning and execution, and the care taken in its production, is invaluable as a major reference work for planetary physicists, from which final-year undergraduate physics or astronomy students will also profit. <1The Milky Way,>1 presumably a translation of Ludwig Kuhn's earlier German text of 1978, is an attempt to give an up-to-date view of the Galaxy and to compare it with other galaxies. Kuhn discusses the Galaxy's structure and development first by describing its star population and the interstellar medium, paying attention to the intimate relationship between the two. The dynamics of the Galaxy is considered, and he discusses the evolution of the various features making up the Galaxy. The book is non-mathematical and assumes little technical knowledge on the part of the reader.
It might therefore have been recommended were it not that it would take someone already well-versed in galactic studies to interpret a text badly translated and poorly proof-read, abounding in misspellings and obscure turns of phrase. Robert Robbins and Mary Hemenway have compiled a set of practical experiments in <1Modern Astronomy: An Activities Approach.>1 The range in skill required for these experiments is, however, too large; some can be carried out by a student at home while others require specialised equipment costing thousands of pounds. The early experiments involve simple angular measurement of the coordinates of celestial objects for star positions, planetary motions and so on. Later projects deal with spectroscopy. It is unfortunate that the authors avoid mathematical formulae, which are essential for a proper understanding of the experiments. And several simple errors occur. Having said that, the book could be useful as a guide if the student were to read other texts to fill in the gaps. Of the hundreds of recent astronomical texts, few provide material of any worth on the practicalities of observation. <1Astronomical Photometry>1 is therefore virtually unique in that Arne Henden and Ronald Kaitchuck devote the whole of its close-on 400 pages to the single topic of photoelectric photometry, dealing with all the facets that need to be known and appreciated by the would-be photometrist. In its lively style, the chapters cover the basics of instrumental design--both mechanics and electronics--the standard photometric systems, and the calculations necessary to apply the corrections required to appreciate the quality of the results; all with clear exposition and laced with good advice. Undoubtedly this is an excellent text for giving the first- or second-year undergraduate the all-round flavour of the subject.
Although <1Introduction to Special Relativity>1 appears as a new title, in reality it is a heavily revised, expanded and updated version of Wolfgang Rindler's <1Special Relativity>1 (now out of print). Apart from the use of SI units, the new form and notation is compatible with current usage in general relativity, and familiarity with the material in the book would be extremely useful for the student of gravitational physics. In addition to the usual material, it contains useful chapters on electromagnetism and relativistic fluids and introduces the reader to tensor notation. Its predecessor was extremely clear in dealing with the fundamental concepts of special relativity, and although Rindler's new book is more discursive and modern in its approach (for example, its treatment of spacetime) it does lose out in succinctness. One is usually wary of textbooks which avoid the use of calculus. However, Howard Goldberg and Michael Scadron, using Newton's laws, basic electricity and magnetism, and the fundamentals of modern physics, have created a thorough introductory course in astrophysics and cosmology. Their approach is informal and <1Physics of Stellar Evolution and Cosmology>1 reads like a scientific detective story. The topics include the techniques and tools of astrophysics, the Galaxy and interstellar matter, stellar structure and evolution, degenerate states of matter, pulsars and compact X-ray sources and, finally, cosmology and big-bang physics. The book maintains an air of excitement throughout, at some points asking questions and leaving the reader to ponder until a later chapter. S. A. Kaplan has succeeded in presenting an informal yet exciting introduction to stellar structure and evolution.
Starting with a chapter which describes the observational characteristics of stars, <1The Physics of Stars>1 goes on to explain these characteristics by describing the structure of main-sequence stars, dwarfs, neutron stars and black holes. Two chapters not usually found in textbooks on stellar structure, one on variable and non-stationary stars and the other on protostars, make good reading and help the student to understand cepheid variables and the origin of stars. Kaplan uses little mathematics, and although the book does not represent a thorough course in astrophysics it will make excellent supplementary reading. Little new <1factual>1 material is to be found in <1Revealing the Universe,>1 but where the editors, J. Cornell and A. P. Lightman, do score is in their unique combination of subjects and authors, and the approach to the material. Based on popular lectures at Boston Science Museum, 13 Massachusetts astronomers present their personal slants on research problems ranging from heating of the solar corona to Einsteinian gravitation, emphasising the complex interplay between theory and observation and the problems and frustrations of research, rather than just its outcome. While the style is in places a shade facile, the collection provides, in a quick good read, a flavour of the realities of research. Jay Pasachoff has a refreshing approach in <1Astronomy: From the Earth to the Universe.>1 Throughout the text he puts considerable effort into emphasising that any conclusions we may arrive at about the Universe and the objects within it are only scientific theories. The book is well-presented with several explanatory appendices, a glossary and a very complete index. The first of the six sections is mainly historical and traces the development of astronomy from megalithic observatories through to the latest electronic observational techniques. In subsequent sections, he deals with the Solar System, the stars, stellar evolution,
the Galaxy, other galaxies and beyond. Pasachoff manages to present, in a non-mathematical way, astronomy as a science with a rich past and an active present. An ideal modern source of background material for a first-level astronomy course. []

<2Revealing the secrets of maths>2
Ian Stewart

MOST mathematicians are bored with their subject. This will outrage my colleagues, to whom mathematics is not merely one shining strand in life's rich tapestry, but the tapestry itself, with the wall that it hangs on thrown in. The statement is false, terribly so; but it is the impression that most mathematics texts manage to convey. So in judging this year's crop, I'll award points for books that convey the dreadful secret that <1mathematics is fascinating.>1 "There was once a bumper sticker that read 'remember the good old days when <1air>1 was clean and <1sex>1 was dirty?' Some of us are even old enough to remember the days when Maths was fun, not the ponderous THEOREM, PROOF, THEOREM, PROOF. . ." So says Donald Newman in <1A Problem Seminar,>1 offering his alternative: 109 problems, followed by hints if you get stuck, followed by solutions if those don't help. Another black art is revealed in D. Solow's <1How to Read and do Proofs.>1 I confess that I'm a little taken aback that a proof is something that you "do": it seems to me you build it, construct it, devise it, or cobble it together. But it has long been a mystery to me why students find proofs to be a mystery, and Solow has the answer: it's not enough to tell them "a proof is a sequence of logical steps". You might as usefully say "a symphony is a sequence of musical notes". Then we have the "get your hands dirty" philosophy, admirably pursued by R. P. Burn in <1A Pathway into Number Theory.>1 "Have you ever followed each step of a proof, and yet felt you did not understand what it was about?" The remedy is to take a juicy example and pull it to bits yourself.
Solow's book might mislead you into assuming that theorem-appreciation is best learned by appreciating theorems; but Burn looks at the raw materials that make theorems possible. Judith Gersting's <1Mathematical Structures for Computer Science>1 provides a lucid path through set theory, logic, algebraic structures, and combinatorics; she covers them all in a quarter of the usual space, showing what a mess plenty of mathematics texts manage to make. Analysis. . . Oh dear, analysis, the perennial problem. "Pupil learns calculus. Pupil goes to college and is taught analysis. Pupil becomes teacher, and teaches . . . calculus." I don't believe you can solve the problem with the analysis-made-easy approach: it is hard, and if you can't face that fact, you've had it. W. R. Parzynski and P. W. Zipse's <1Introduction to Mathematical Analysis>1 makes a better job of it than most, and it does it all in the right order, from special to general, what you might call the Reverse Bourbaki Gambit. In contrast the new edition of R. P. Boas's <1Primer of Real Functions>1 does it all exactly the wrong way round (Baire before Binomial) but redeems itself with its dedication: TO MY EPSILONS. One of the year's true originals is J. Pottage's <1Geometrical Investigations.>1 The first time I saw this I thought "Oh God, not another Socratic dialogue", but I was impressed on reading it. In hourly discussions Galileo's characters, Salviati, Simplicio and Sagredo, tackle a problem in geometry: for which figures are the ratios (area of figure : area of circle) and (perimeter of figure : perimeter of circle) equal? My only qualm is that the final answer (any shape, but get the size right) is a bit of an anticlimax and, despite the author's deftly misdirecting hand, those with a feeling for scale-changes will spot it sooner than they ought to. But this is the right way to <1do>1 geometry. Geometry breeds topology, or once did.
For years topology has been forced to labour under the analyst's yoke; all open sets and compact spaces instead of knots and Klein bottles. In contrast, P. A. Firby and C. F. Gardiner's <1Surface Topology>1 is accessible, includes vector fields and the homotopy group, and it's got pictures! So has the more advanced <1Introduction to Differential Topology>1 by Th. Brocker and K. Janich, which is superb. We often deprive our students of the ability to apply their mathematics. By this I mean using real, genuine, good, proper mathematics--as I do, or used to do--to do something <1else.>1 <1Applying Mathematics>1 by D. N. Burghes <1et al>1 tackles (with simple mathematical tools) the art of model-building. This is what applied mathematics should be about; not the fallacious misproofs of Stokes's theorem or the relentless linearisation of the nonlinear that put me off applied mathematics for 15 years. How about these for page-headings? Forestry management; Drug therapy; Rats; A replacement policy for fuel injection pumps. All in the space of 13 pages. I rank D. K. Arrowsmith and C. M. Place's <1Ordinary Differential Equations>1 very highly indeed, because it tackles a difficult and urgent problem: to equip applied mathematicians with 20th-century tools. The theory of differential equations, as usually taught, is a rag-bag of special tricks, many of which are a waste of time and are included only because they can be calculated without having to think about what's really going on. The modern qualitative theory can be fleshed out into a quantitative technique; and this book tells a coherent and deep mathematical story, packed with insights and ideas. This is how differential equations should be done. Finally, some "general interest" books, for expanding your horizons. <1The Mathematical Gardner>1 is a collection of essays in recreational mathematics in tribute to the grandmaster of the genre, the inimitable Martin Gardner.
<1The Fractal Geometry of Nature>1 reminds us that mathematics can surprise us with insights into the world in which we live; it has the most beautiful graphics I have ever seen in a mathematics book. And <1Winning Ways>1 is widely, and rightly, considered to be the most original and attractive mathematics book of the year. You can do more than <1play>1 games: you can add them, multiply them, and develop them to a remarkable depth. []

<2Developing psychological thought>2
----------------------------------------------------------
David Cohen

WHEN the consensus was that psychology was "the science of behaviour", textbooks were simpler. There was no need to confuse students with too much controversy. The sharp growth in psychology and its associated controversies in the past decade, however, means that even the most introductory book has to recognise that what is a fact to one psychologist is a nonsense to others. Most general books leave the feeling that authors are uncomfortable with this dilemma. Oh for a discipline in which everybody accepted the same facts, more or less, and students could get on with learning an agreed list. One emerging controversy concerns consciousness. For years, most psychology was written as if human beings were unconscious; in fact, it was only psychologists who were unconscious of consciousness. Now, as the rather skimpy <1Consciousness and Behaviour>1 outlines, psychologists are having to face the fact that most people don't spend most of their time behaving like witless rats. We think and, even, think about ourselves. Worrying for the once-dogmatic. Apart from controversies, psychology (in common with other subjects in search of grants) has become more diverse and "relevant". The British Psychological Society and Macmillan have collaborated to produce a series of books of "Psychology for . . .
[different professions]"; among the lucky recipients are doctors, nurses, occupational therapists, teachers and, even, hairdressers! <1Psychology for Hairdressers>1 is perhaps less "relevant", given the academic recession, than "Hairdressing for Psychologists" would be. The therapist who will listen to your problems and give you a good perm is bound to avoid the dole. If I am flippant, it's perhaps because nearly all these books are awkward about the limitations of psychology. Despite recent progress, psychology is still a subject in which there are far more questions than answers, and far more good questions than good answers. For example, we know more about how people learn nonsense syllables than about how we remember which train to catch. Instead, most books make pious noises about more research being required rather than offering a frank discussion of the areas of ignorance. <1Mastering Psychology>1 illustrates the problem; the title trumpets that psychology has been mastered. All eager students need do is plough through its well-presented 600 pages and they will emerge as masters too. The book covers much: a potted history of psychology, the development of the fetus, learning, psychotherapy and IQ. Pictures are good, the layout is snappy and there are witty illustrations. Part of the chapter on motivation deals with obesity, and the discussion of theories of why we stuff ourselves is capped by a cartoon in which a chef tells a fat blimp: "As far as I know there's no such thing as a diet cream puff." Such touches make one go on reading. Lester Lefton and Laura Valvatne even use psychology to help the student, with pep paragraphs about being motivated and a stream of "progress checks" and "self-tests" so that students can monitor their mastery. Its professional presentation, and cosy assumption that there are clear answers, is all very American. A quite opposite book is Elisabeth Valentine's <1Conceptual Issues in Psychology>1.
Without so much as one cartoon, Valentine offers a brisk introduction to such perennials as the role of introspection, free will, the relation of psychology to physiology and some newer "big questions", such as the role of humanistic psychology and computers in studying behaviour. She rehearses the outlines of arguments efficiently and is not frightened of forcing students to grapple with tough material. Disappointingly, though, in her efforts to be balanced Valentine rarely lets us see just where she stands; though she does make a strong plea for psychologists not to ape either physics or sociology. Studying human behaviour should be a science in itself, with its own distinctive methods. Despite that, the book has a curiously impersonal flavour, which is a pity. Whatever else it is, <1Psychology's Sanction for Selfishness>1 is not impersonal. I have included it among the general books because Michael and Lise Wallach offer as total a critique as I have seen recently of the assumptions underlying most psychology. From Freud, to the behaviourists, to the newfangled cognitivists, psychologists share the belief that we do things essentially for ourselves. We manoeuvre in the world constantly looking out for Number One. Psychologists have legitimised this in the most peculiar ways: Sigmund Freud claimed that people who are not selfish enough are probably pathological, and therapists now urge us to "actualise ourselves" in between bouts of learning how to assert ourselves. You fail to do so at your peril: "Me" is now a categorical imperative. In a challenging book, the Wallachs question both the empirical and moral assumptions in such theories. Though I disagree with much of the book (for I enjoy actualising and asserting myself) I recommend it as one of those rare books that puts a fresh perspective on a wide body of data. It certainly stands out by comparison with <1Psychology and People,>1 the ultimate text in the BPS/Macmillan series.
Anthony Gale and Tony Chapman have edited together many chapters, on everything from bargaining to being interviewed to death, from the other books in the "Psychology for . . ." series. Though there are good contributions by Michael Argyle on social situations and Don Bannister on personal knowledge, the book feels cobbled together. Why didn't they call it "Psychology for People"? Are people not to be trusted with too much knowledge of themselves? One of the signs of the diversity of the field is the sheer mass of specialised publications, ranging from <1Focus on Vision>1 to <1Anomalistic Psychology.>1 The latter isn't animalistic gone haywire, but a comprehensive study of anomalies, extrasensory powers, UFOs and such eternal questions as how many bent forks you can stand on the head of Uri Geller. With consciousness raising its head again, K. J. Gilhooly offers a useful review of the latest thinking on thinking--in a book aptly entitled <1Thinking.>1 He concludes (which once would have been controversial) that human beings do think--and even creatively. A further sign of how specialised textbooks have become is <1Behaviour Modification,>1 which takes 522 pages to outline for the novice how to modify behaviour and when it's not likely to work perfectly. Nothing in psychology can ever hope to work perfectly. Developmental psychology is probably the liveliest speciality just now. Once it used to be just child psychology, but we now know that we develop all the time, from womb to tomb. Grace Craig in <1Human Development>1 even includes a section on death as the last development of all, the final growth experience. She devotes the major part of the book to showing how psychology's view of children has become more rounded. Even the infant is studied as a social, emotional and cognitive creature, and one, it turns out, who is cleverer younger than psychologists allow.
<1The Handbook of Developmental Psychology,>1 edited by Benjamin Wolman, covers nearly everything one could think of--as it should, weighing in at £69 and nearly 5½ kg! I cannot claim to have read it all but it seems massively comprehensive, and certainly useful for libraries. Pity the poor student faced, last year, with something close to 2000 books that deal with psychology in some way or other. The sheer mass of words does not mean that the research is inspiring. As I've tried to suggest, many books do not face up to the very evident problems confronting psychology, but it does seem that the discipline is alive and kicking. Though perhaps it could do with kicking itself a little more. []

<2A new breed of biology books>2
----------------------------------------------------------
Garth Chapman

THREE years of undergraduate study now requires 10 to 20 paperbacks where formerly a few solid textbooks sufficed. Accordingly I have looked at examples from evolutionary biology, physiology and the study of groups of organisms or of habitats. In 1982 Darwinism was defended and neo-Darwinism reaffirmed; <1Evolutionary Principles>1 is Peter Calow's contribution to the central concept of biology. The book is divided into accounts of history, genetics, adaptation, evolution and development, and macroevolution. It covers much ground in 100 pages and 100 references, and does so clearly. Adaptation is one of the longer chapters, since it exemplifies fitness. It may be impossible to draw up a fitness league table, but it is possible to say that complexity has increased during evolution. However, the driving force which brings this about, and the relation between fitness and organised complexity, are not discussed. Although endowed with Calow's flair for seeing the wood, the book gives the impression of being rather hastily put together.
<1Population and Evolutionary Genetics: A Primer>1 by Francisco Ayala is a clear, concise account starting with Mendel, going via the population genetics of the neo-Darwinian position and concluding with speciation, macroevolution and the vital contribution of molecular substitutions. It is well illustrated and includes problems and a glossary, but has a scanty index. <1Five Kingdoms: An Illustrated Guide to the Phyla of Life on Earth>1 by Lynn Margulis and Karlene Schwartz gives 160 of its 270 pages to the Monera (bacteria), Protoctista (roughly protozoans and algae) and fungi. The remainder is for animals and plants. A comprehensive survey with this emphasis is very welcome. However logical the derivation of Protoctista as the name for a kingdom may be, I wonder if it is more likely to catch on now than it was in 1861. If the book does less than justice to large, complex organisms, one can argue that <1sub specie aeternitatis>1 the authors may have got it about right! It is difficult in one book to present an adequate account of the nervous system and to tell the reader how an animal actually carries out all the behaviour which it needs to survive and to propagate. <1Animal Behaviour: Ecology and Evolution>1 by C. J. Barnard tries to do this and can be admired for its temerity and useful list of 344 references. But one's confidence is undermined by infelicities such as "protozoans get by quite nicely with only rudimentary sense cells" and grams for kilograms in the abscissa of a figure. The author seems more at home with that part of the study in which behavioural activities are interpreted by reference to models. Such faults and inadequate figures regrettably prevent the book from attaining its worthy object without hard work on the reader's part. <1Mammal Ecology>1 by M. J. Delany does not set out specifically to connect ecology with evolution, but it does provide a great deal of information packed into its 150 pages.
The study includes a varied range of examples taken from the 4000 living species, and the author can speak from first-hand knowledge of a variety of habitats. I liked the last chapter on "Some applied problems", in which the importance of a thorough knowledge of mammal biology for controlling or conserving particular species is outlined. This chapter comes much nearer to explaining <1methods>1 of gaining knowledge and solving problems than do the other chapters. Of three books on the nervous system, G. M. Shepherd's <1Neurobiology>1 is called an introduction, but since it is a systematic study of 600 pages it is rather more than that for most of us. It has four major sections--cellular mechanisms, sensory, motor and central systems--is lavishly illustrated, and is comparative, developmental and ethological where required. The index is good but the references do not provide easy access to the research literature. <1Comparative Neurobiology>1 by Peter Mill has a lot about standard muscle and nerve physiology, nothing about development and too little about the status and functioning of the invertebrate nervous system. Perhaps the title led one to hope for more of a synthesis than is possible in the space. J. F. Stein's <1An Introduction to Neurophysiology>1 is a very good introduction because the chain of argument from observation to explanation is maintained. It is, perhaps, too man-orientated for general use but it is an excellent "reader" on a complex subject seen in its historical context. Unlike the members of other large animal phyla, all nematodes look alike except for size--at least to the common man. Their interest lies in the versatility of their design, their marvellous physiological adaptability and their ecological importance. In <1The Natural History of Nematodes>1 it is unfortunate that George Poinar Jr does not throw off the armour of nomenclature which prevents the general student from getting to grips with these creatures.
<1Modern Parasitology>1 is an uneven book and needs a standard text to go with it, according to the editor, T. E. G. Cox. The first two chapters are largely familiar but C. A. Arme's on nutrition, the editor's on immunity and R. M. Anderson's on epidemiology are useful. Strangely, I could find only one mention of a plant parasite--in a figure. I thought we were all biologists these days!

At least two books on marine ecology have surfaced. R. S. K. Barnes and R. N. Hughes's <1An Introduction to Marine Ecology>1 plunges into "the nature and global distribution of marine organisms". Assuming some knowledge of the living world, it deals with "trophic, competitive and environmental interactions . . ." with a strong, somewhat colloquial story line which provides an enticing introduction to marine ecology. Jeffrey Levinton's <1Marine Ecology>1 "aims to be a text or background reading" at degree level. It is more orthodox in arrangement and pedestrian in style but the subject is well covered, the explanations clear and the illustrations well drawn and captioned. References are plentiful and there is a good index and glossary. Barnes and Hughes for a good read, Levinton for the examinations.

Of the many more books worth seeing, <1Electrical Conduction and Behaviour in "Simple" Invertebrates>1 edited by G. Shelton (Oxford UP £40) researches the parts other neurobiologies cannot reach. <1The Senses>1 (Ed. H. B. Barlow and J. D. Mollon, Cambridge UP £30, pbk £12.50) is of wider interest than the medical field at which it is primarily aimed. <1Hibernation and Torpor in Mammals and Birds>1 by C. P. Lyman <1et al>1 (Academic £24.80) is a useful survey but has very little about birds. <1Biological Timekeeping>1 edited by J. Brady is a genuinely biological example of the S. E. B. Seminar Series (Cambridge UP £22.50, pbk £9.95) and <1Gills>1 (editors D. F. 
Houlihan <1et al,>1 Cambridge UP £27.50) comprehensively covers a subject ideally suited to comparative treatment and hitherto neglected. <1Animal Energetics>1 by A. E. Brafield and M. J. Llewellyn (Blackie £16.95, pbk £7.95) covers individual and ecological energy flows in one volume. M. J. Dring's <1The Biology of Marine Plants>1 (Arnold £7.50) is, usefully, a little inclined towards seaweeds. <1The Fundamentals of Nitrogen Fixation>1 by J. R. Postgate (Cambridge UP £20, pbk £7.95) is an important advanced synopsis. About 10 of the Institute of Biology's Studies have appeared (Arnold £2.50), taking the total to 149. []

<2Chemical investment>2
----------------------------------------
John Emsley

CHEMISTRY at degree level is broadly divided into inorganic, organic and physical, and this is reflected in the textbooks. The undergraduate chemistry student will find about five titles to choose from in each branch: these books, bought upon starting a BSc course, will serve the student throughout his or her three years. So what is on offer and what should we expect from these weighty tomes?

The model textbook should have 800-1200 pages divided into 20-30 chapters covering the major topic areas. Each chapter should have references to keynote original papers, and to books covering the subject in greater depth. To help the reader test his or her understanding there should be several problems at the end of chapters, and worked examples in the text where a general law or difficult point is discussed. Units should be SI. Good quality artwork--figures, graphs, formulae, diagrams etc--is very important in the teaching of chemistry, and a comprehensive index essential.

How do current textbooks meet this ideal? In inorganic chemistry, F. Albert Cotton and Geoffrey Wilkinson's <1Advanced Inorganic Chemistry>1 (Wiley) has dominated the market for 20 years. 
First published in 1962 and now in its 4th edition (1980), this fulfils a role as a reference text rather better than a student text, in that it has hundreds of references but no problems. Unlike other inorganic texts it is based on the periodic table, although 500 of its 1365 pages are on introductory or special topics, such as organometallics, cluster and bio-inorganic chemistry, which have developed in recent years. Despite its heavy style and dull format, it remains the best inorganic book and will serve its owner beyond the BSc stage.

What of the competition? Keith Purcell and John Kotz's <1Inorganic Chemistry>1 (Saunders) appeared in 1980 and approached the subject via inorganic concepts. Much better illustrated than Cotton and Wilkinson and with problems, it is a better teaching aid, but it over-emphasises bonding theory in a rather daunting way. More recently Therald Moeller's <1Inorganic Chemistry>1 (Wiley) has appeared. This author's 1952 work on the subject was a masterpiece: half the book devoted to concepts, half to the chemistry of the elements. The present book is only concepts. Unfortunately, like Purcell and Kotz, he has kept to the old non-SI units. He has also retained a lot of the old emphasis. In some respects this is good, when it means such topics as non-aqueous solvents are covered, but bad when the reader searches this chapter in vain for information about new solvents. Some diagrams are poorly reproduced and tables printed vertically. Even so, the book has a lot to offer and is packed with useful data.

Another update of an old text is the second edition of Bodie Douglas, Darl McDaniel and John Alexander's <1Concepts and Models of Inorganic Chemistry>1 (Wiley, 1st ed 1965). This has several good points--SI units, worked examples, problems (but no answers) and several references. Unfortunately, the subject is covered unevenly and to insufficient depth for a final-year British undergraduate. 
Nevertheless, as an introduction it is the best of the bunch.

The organic textbook market of the 1970s was dominated by the third edition of Robert Morrison and Robert Boyd's <1Organic Chemistry>1 (Allyn & Bacon, 1st ed 1959). This year the fourth edition, which is 120 pages longer, has appeared. The increase is due mainly to extra material on mechanisms and neighbouring-group effects. Previously they treated the latter as a special topic; now it is integrated into the chapters on alcohols and arenes. The chapter on alicyclics has moved forward to accompany stereochemistry, and the one on alkyl halides to introduce substitution mechanisms before the chapter on alkenes, which covers elimination reactions. These changes in emphasis are welcome but may be too late to reinstate Morrison and Boyd as the brand leader. In other respects the book has changed little and there is no attempt to deal with organic synthesis in its own right.

T. W. Graham Solomons's <1Organic Chemistry>1 (Wiley) was said to be so like Morrison and Boyd that a court case for alleged copying ensued; but Solomons was cleared. He has now produced a shorter version called <1Fundamentals of Organic Chemistry>1 (Wiley). The basic material is still there, however. Solomons's book comes near to meeting all the criteria for a student text.

Another organic textbook is <1Introduction to Organic Chemistry>1 by Andrew Streitwieser Jr and Clayton Heathcock (Collier Macmillan, 2nd ed 1981), which has an excellent approach to spectroscopy. In this book the text is accompanied by interesting asides explaining a theory or providing extra examples. Unfortunately, there are no answers to the many problems, and instead of references there is a chapter on the use of the chemical literature. But for its lack of references, which surely are essential for the chapters on spectroscopy, the book that could be said to be the best is Ralph and Joan Fessenden's <1Organic Chemistry>1 (Willard Grant, 2nd ed). 
Beautifully produced in two colours, it has worked examples, problems and answers.

All the above organic texts follow the same conventional approach to the subject, starting with the alkanes. Only one major text has attempted to break out of this mould, and that is Daniel Kemp and Frank Vellaccio's <1Organic Chemistry>1 (Worth, 1980). They begin the story of organic chemistry with methanol, follow this with other alkyl derivatives, then mechanisms of reactions. Part two is devoted to structure, part three to the hydrocarbons, and we first meet the alkanes on page 500. Visually superb, with problems and some answers, it deserves serious consideration.

Finally, physical chemistry; here the most important advance in the past 25 years has been the study of the interaction of matter and radiation: spectroscopy. How a textbook deals with this is highly relevant, and we can use this as a guide. P. W. Atkins's <1Physical Chemistry>1 (Oxford UP, 2nd ed) is undoubtedly the best and devotes 160 pages to spectroscopy. Despite minor errors, the book has a style and a use of simile which make it a pleasure to read. Even the problems have unexpected black humour, such as the last-breath-of-a-dying-man problem. Worked examples abound and the units are strictly SI.

At the other end of the spectrum, J. Philip Bromberg's <1Physical Chemistry>1 (Allyn & Bacon, 1980) has less than 20 pages on spectroscopy, of which only 4 are on NMR! The book is firmly rooted in the olden, and golden, days of classical physical chemistry and could have been written <1circa>1 1960--perhaps it was. Indeed, the bibliography has only 15 references to material of the past 10 years. The units are not wholly SI and the answers to the 800 problems are available only in a study guide (£8.75). One redeeming feature is the inclusion of many pen-portraits of famous chemists. Walter J. 
Moore's <1Physical Chemistry>1, which first appeared in 1950 and ran to five editions, has now been succeeded by his <1Basic Physical Chemistry>1 (Prentice-Hall). This is a completely rewritten text and is a determined effort to capture a first-year student market with its worked examples and lots of problems (but without answers). It has the touch of the old master. Well illustrated and easy on the mathematics, it has 100 pages on spectroscopy.

Arthur W. Adamson's <1Physical Chemistry>1 (Academic, 2nd ed 1980) has less spectroscopy (80 pages), although it has a useful table of 65 spectroscopy abbreviations. Adamson splits each topic into three parts: essential theory, commentary and notes, and special topics. The book is beautifully produced and suitable for all levels. Its high mathematical tone is in strange contrast, however, to the author's negative attitude to SI units.

Finally, Arthur Lesk's <1Introduction to Physical Chemistry>1 (Prentice-Hall) has a disappointingly small spectroscopy section (28 pages). Perhaps to compensate, it does classical physical chemistry well, and has one eye on the expanding market for physical chemistry books among life-science students. The teaching of the subject is uppermost in the author's approach, with plenty of worked examples. An unusual bonus is the list of source data.

The essential features of these books, as far as a student is concerned, are given in the table. The three best buys are probably Cotton and Wilkinson, Fessenden and Fessenden, and Atkins. Together these cost £41.15 and provide 3500 pages of chemistry. As an investment for the future they are cheap at the price. []

<2Data for computer science>2
------------------------------------------------------
by Chris Reynolds

THE past couple of years have seen an explosion in the number of computing books on the shelves of even the most humble book shop. Publishers are rushing to get on the microcomputer bandwagon. 
Few of these books, however, have any relevance to teaching undergraduate computer science, as they normally restrict themselves to a single language on a single make of computer. Anyone concerned with selecting a class book for teaching a language will face a wide choice of texts. I have selected four books on ADA, a new language initiated by the United States Department of Defense, to illustrate the variety.

The first, <1An Introduction to Ada>1 by S. J. Young, provides a solid, systematic coursebook, with adequate examples, and would be a good basis for an academically orientated second-year course. For a more pragmatic course a book such as <1Programming Embedded Systems with Ada>1 by Valerie Downes and Stephen Goldsack would be suitable. This uses a system for monitoring hospital patients to highlight the practical aspects of the language. Because ADA is a new language and not really suitable for introductory teaching, many students will have learnt other languages first. If they can be described as competent programmers, a book such as M. J. Stratford-Collins's <1Ada: A Programmer's Conversion Course>1 could be ideal.

The final ADA book I examined was a disappointment. Computer science is much more than just writing programs, and good books which combine systems understanding with language teaching are always welcome; Brian Mayoh's <1Problem Solving with Ada>1 attempts to do this. It contains sections on high-level problem solving, tortoise (sic) graphics with no reference to LOGO, and various algorithms programmed in ADA. If the combination had worked it would have been a stimulating text. Unfortunately, the book is too unstructured to be recommended.

Communication between people and computers is becoming increasingly important, and selected topics should be included in all undergraduate courses. I have therefore included three books on the "physical" interface and three on the "systems" interface. 
Visual communication, via graphics terminals, is an exciting activity, and J. D. Foley and A. Van Dam's <1Fundamentals of Interactive Computer Graphics>1 presents a mathematically difficult subject in an exciting way and deserves to become a major undergraduate text. It is excellent value, with good colour illustrations, many well-presented diagrams and an up-to-date bibliography. The reverse process involves the computer recognition of pictures. This is covered in a relatively non-mathematical way by Ramakant Nevatia's <1Machine Perception,>1 but at four times the price per page of Foley's book, students may be reluctant to buy it.

Now that anyone can buy speaking toys, such as Texas Instruments's "Speak 'n Spell", it is essential to include some reference to talking computers in any undergraduate course. Ian Witten's <1Principles of Computer Speech>1 is too detailed for general study, but could form the basis of a final-year option.

<1Computers in Society>1 by Nancy and Robert Stern is really two books in one cover. The first half is a conventional "this is how a computer works" book, explaining concepts such as binary numbers and punched cards at a secondary-school level. The second half is a useful elementary review of the impact computers are having in fields such as business, education, health and the arts. Both halves have their merits, but together their appeal will be limited to those teachers of computer science who, if lecturing on the impact of motor traffic on society, would spend half the time on the way sparking plugs work. This problem is not uncommon, as many books on computer applications, which might be suitable for undergraduate case studies, are padded out with material readily found in any good general introductory text. 
Michael Aldrich does not make this mistake in <1Videotex: Key to the Wired City.>1 This is a glossy paperback, with popular appeal, which could well form the basis of a case study for my own man/computer systems class next year. I will be backing it up with information from <1Teletext and Videotext in the United States>1 by J. Tydeman <1et al,>1 who include detailed information relating to current technology and future development of such systems. The book highlights the problem university staff have in keeping up with recent developments. Only a very few of the many references in the bibliography are earlier than 1979, yet undergraduates are already arriving at university with first-hand experience of videotext and expect their lecturers to know more!

Finally, I would like to mention three books which appeared during the year which we at Brunel are already using successfully. <1Essential Computer Mathematics,>1 by Seymour Lipschutz (in the Schaum's Outline Series in Computers), was added to our first-year reading list in the autumn. The 2nd edition of <1Cobol for Students>1 by A. Parkin came out just in time for a course which started in January. Last but not least, volume 2 of C. J. Date's <1An Introduction to Database Systems>1 appeared earlier this year. This fills the gap caused by the decision to split the book into two parts when the 3rd edition was brought out. []

<2Vintage year for biochemistry>2
--------------------------------------------------------
Anne Dell with Jean Beggs and Mario Panico
--------------------------------------------------------

THE past year was a truly vintage one for biochemistry texts--a new cell biology masterpiece, a new Lehninger, several up-to-date molecular biology texts and a number of monographs enriched the existing literature. By far the most exciting is <1Molecular Biology of the Cell,>1 which will surely become the standard work for cell biology. 
It is delightfully written and beautifully illustrated, guiding the student from simple chemicals to macromolecules to cellular organisation and finally to multicellular organisms. Parts 1 and 2 examine cellular structure and function to cover material usually taught in first-year cell biology courses, while Part 3, on the behaviour of cells in multicellular animals and plants, is valuable for more advanced students. The text of each chapter is split into sections averaging less than a page, each of which is introduced by an imaginative, succinct heading.

Albert Lehninger employs a similar presentation in his <1Principles of Biochemistry.>1 For many teachers the "old" Lehninger--<1Biochemistry>1 (2nd ed, 1975)--has been a biochemical bible, but can we be equally enthusiastic about the "new" Lehninger? Sadly, I believe, the answer is no. Lehninger reverts to the aims of the first edition, which was to provide a text suitable for American students; British students should begin first-year biochemistry with a better understanding of chemistry. But <1Principles>1 retains the first-class coverage of metabolism achieved in the earlier editions, and I unreservedly recommend the new section on "Molecular transmission of genetic information". Meanwhile, I look forward to <1Biochemistry>1 in its 3rd edition.

Molecular biology is often a difficult subject to recommend textbooks for, because of its multidisciplinary nature and rapid developments. But <1Nucleic Acid Biochemistry and Molecular Biology>1 includes fairly comprehensive coverage with emphasis on the biochemical aspects. The early chapters give a basic knowledge of nucleic acid metabolism and chemistry, with an outline of elementary genetics. Prokaryotic and eukaryotic systems have approximately equal emphasis and there is a useful chapter on recombinant-DNA techniques; but for many courses the treatment of bacteriophages and F plasmids will be inadequate. 
Throughout, the authors emphasise the importance of each topic to current research, and follow descriptions of techniques with an example of an application and a literature reference. Perceptive students may wonder why the cover features left-handed DNA helices.

R. E. Glass presents a good account in <1Gene Function>1 of the molecular biology of <1E. coli,>1 including biochemical, microbiological and genetic aspects. Plasmids and bacteriophage are well covered. This is easy and enjoyable reading and there are extensive bibliographies at the end of chapters. There are many figures, though their style is rather unattractive, and the index is extraordinarily unhelpful. But the book should be useful and popular for courses on microbial biochemistry/genetics and the molecular biology of prokaryotes.

B. Lewin's <1Genes>1 is comprehensive and admirably up to date. It introduces molecular biology from both the biochemical and genetic viewpoints, assuming little prior knowledge. The classical order of DNA, RNA and protein is essentially reversed, resulting in a logical and well-integrated text which is divided into short, well-annotated chapters in 10 parts. There is a multitude of helpful diagrams, a useful glossary and critical discussion of recent results. <1Genes>1 probably goes beyond most introductory courses and will also prove useful to advanced students. It is much easier to read than the earlier "Gene Expression" series.

Physical biochemistry is the area least well served by textbooks. Probably the best general text is the new edition of David Freifelder's <1Physical Biochemistry.>1 This retains the clear format of the first edition, but is considerably expanded with new topics ranging from metal-ion electrodes to DNA sequencing. It explains clearly most of the basic physical techniques used in biochemistry and has particularly good sections on microscopy, radioactivity, sedimentation, chromatography and electrophoresis. 
But several areas, notably nuclear magnetic resonance, are less satisfactory. Can a single author cover the diverse techniques of physical biochemistry? A multi-authored text of the quality of <1Molecular Biology of the Cell>1 is badly needed, or a series similar to "Outline Studies in Biology".

The principles and applications of mass spectrometry are adequately covered by M. E. Rose and R. A. W. Johnstone in <1Mass Spectrometry for Chemists and Biochemists.>1 Unfortunately, the book was completed too soon to reflect the enormous impact of fast atom bombardment mass spectrometry on biology. Students should consult H. R. Morris's <1Soft Ionisation Biological Mass Spectrometry>1 (Heyden, 1981) for an introduction to this new technique.

The "Outline Studies in Biology" series includes several new titles relevant to biochemistry. The main virtues of <1Muscle Contraction>1 are that C. R. Bagshaw integrates information previously scattered in general texts and reviews, and presents difficult concepts (for example, optical filtration) in an understandable manner. <1Glycoproteins>1 by R. C. Hughes also fills a definite gap in the literature. He covers concisely (if a little stolidly) our current understanding of the structure, biosynthesis and biological functions of glycoproteins. The most readable of these monographs is <1Membrane Biochemistry,>1 a perfect little book in which R. C. Hughes describes the structure, biosynthesis and assembly (but not function) of biological membranes.

M. D. Houslay and K. K. Stanley deal with a similar area (including function) at greater depth in <1Dynamics of Biological Membranes.>1 This could have been a highly recommended text, covering in an imaginative manner processes which are fundamental to living systems. Regrettably, the text is marred by careless errors.

Two other specialist texts deserve special mention. 
<1Bioenergetics: An Introduction to Chemiosmotic Theory>1 is a well-written account by David Nicholls of a subject that students usually find extremely difficult. Finally, we recommend <1Protein Purification: Principles and Practice>1 for undergraduates taking advanced protein chemistry courses. Robert Scopes presents both the principles and practical aspects in an easily comprehended, readable manner. []

<2Enhancement for physics students>2
----------------------------------------------------------
David Hurd and Daphne Jackson

FACED with the task of selecting suitable books for physics students, we were pleased to see the presence of several texts that enhance the physics literature at the undergraduate level. Among these new titles are some worthy of special note.

Nearly all undergraduate texts on special relativity attempt to display the novel physical concepts of the theory at minimal mathematical cost. W. G. Dixon's excellent account, <1Special Relativity>1 (now available in paperback), adopts a different policy. Following an elegantly presented critique of the physical foundations of the theory, he introduces the mathematics (affine tensor analysis) that is required for the full formulation of the theory. The inclusion of special relativistic accounts of thermodynamics and electromagnetism goes far to bear out the author's claim that special relativity provides "a foundation on which almost the whole of modern physical theory has been built".

Alan Walton's refreshing undergraduate text on the atomic constitution of <1The Three Phases of Matter,>1 as the title has it, is now in its second edition. Some additional topics, a degree of reorganisation and a tighter mode of exposition enhance the value of this excellent book.

Continuing developments in physics can lead to the establishment of newly-defined areas of investigation. A good example is that branch of physics for which the title "optoelectronics" has been coined. 
To this new field an attractive introduction is provided by J. Wilson and J. F. B. Hawkes, simply called <1Optoelectronics: An Introduction.>1 The wide variety of devices and techniques the authors discuss includes a substantial introductory account of lasers and a presentation of fibre optics and their application to optical communication systems.

Amid the flood of new textbooks and monographs there occasionally appear books of quite a different character which, if of suitable quality, are a valuable supplement to the more traditional undergraduate literature. From this year's output, several differing examples appear. All too often students find it difficult to appreciate links between concepts encountered in different courses, or confidently to recognise formal analogies between different contexts. John Shive and Robert Weber have tackled this problem in <1Similarities in Physics.>1 They discuss a wide variety of general concepts, showing how these recur in different branches of physics. Because it is written at first-year undergraduate level, many students could profit from this volume.

Problem-solving is an integral part of learning, and K. F. Riley has produced the useful <1Problems for Physics Students.>1 This collection of problems, divided into sections, ranges widely over physics, and a special feature is the provision of hints and part answers for the more difficult problems, which precede the final list of complete answers.

A third example is the <1Concise Encyclopedia of Solid State Physics.>1 One could well imagine a worthless volume masquerading under such a title. However, in this case the editors, R. G. Lerner and G. L. Trigg, drawing from their <1Encyclopedia of Physics,>1 have produced a valuable reference collection of mini-articles from a varied and distinguished authorship.

New books and revised editions on the quantum theory of atoms and molecules continue to appear. 
Among this year's offerings is <1The Physics of Atoms and Molecules>1 by B. H. Bransden and C. J. Joachain. Suitable for undergraduate or postgraduate use, it covers atomic and molecular structure, interaction of electromagnetic radiation with atoms, molecular spectra and collision theory. Apart from the section on applications, which is brief and disappointing, this is an excellent book.

The second edition of <1Molecular Quantum Mechanics>1 by P. W. Atkins is beautifully produced but rather idiosyncratic. Almost half the book is concerned with "applications", but these are theoretical applications of quantum mechanics, and this book is not so useful as such classic works as C. A. Coulson's <1Valence.>1 But Atkins's introduction and chapters on operators, perturbation theory and group theory are enjoyable.

P. R. Fontana aims to give, in <1Atomic Radiative Processes,>1 a unified treatment of classical and quantum theory, together with a description of atomic radiative processes. It provides a nice link between the semi-classical approach usually taught at undergraduate level and the more sophisticated quantum electrodynamics. Fontana's treatment relies heavily on the use of Fourier transforms, is very theoretical and mentions few applications.

<1Quantum Mechanics of One- and Two-Electron Atoms>1 was originally published in the "Encyclopedia of Physics" in 1956. Here H. A. Bethe and E. E. Salpeter give a detailed mathematical treatment, together with illuminating discussions, of such physical phenomena as atomic spectra, the photoelectric effect and <1bremsstrahlung.>1 It is still the best foundation at postgraduate level. The articles on photoionisation in the new Volume 31 of "Encyclopedia of Physics", <1Corpuscles and Radiation in Matter I>1, lean quite heavily on the approach of Bethe and Salpeter. The volume also contains articles on photoelectron spectroscopy and the Auger effect. 
New textbooks on nuclear and particle physics are thin on the ground. The second edition of <1Introduction to High Energy Physics>1 by D. H. Perkins requires considerable initial knowledge of quantum theory, but this does make it possible to use Feynman diagrams immediately. Although this new edition emphasises quark models, there is little on the physics of the nucleus, except for beta-decay and an old-fashioned discussion of electron-nucleus scattering. Perkins also includes some useful material for experimental physicists, such as stopping power and multiple-scattering formulae and a definition of radiation lengths.

The application of nuclear and radiation physics sees a steady increase in the number of titles. In <1Neutron Sources,>1 various authors review a wide selection of neutron sources and discuss a limited range of applications, with emphasis on nuclear energy. The production of radionuclides and their clinical applications are described by P. W. Horton in <1Radionuclide Techniques in Clinical Investigation,>1 while P. Mansfield and P. Morris describe, in <1NMR Imaging in Biomedicine,>1 the important application of the phenomenon of nuclear magnetic resonance to medical imaging. Other new books on imaging include <1Imaging with Ionizing Radiations,>1 which describes imaging theory and applications with emphasis on physics, and <1Physical and Biological Processing of Images,>1 covering aspects of image analysis, perception and pattern recognition. []

<2Where to study>2
--------------------------------------------------------------------
The Student Book 1983/84 Edited by K. Boehm and N. Wellings, <1Papermac, pp 642, £6.94>1
--------------------------------------------------------------------
Ros Herman
--------------------------------------------------------------------

IT IS EASY to forget how bewildering and nerve-racking are the choices of where to study. 
Compendia of information cannot solve all the problems associated with mapping out one's life, but they do provide basic and practical information. <1The Student Book>1 is full of facts about colleges and courses, and helpful, if terse, advice about how to apply, and how to survive if accepted.

A sensible place to start is with the section "What to study", with its brief but cogent essays by practitioners of each subject on offer. These are designed to whet rather than satisfy the appetite--but short bibliographies help would-be students to delve further. Once the subject is chosen, a neat cross-referenced list indicates which colleges offer that course. Then turn to the guides for information about atmosphere, pressure of work, facilities, and even famous alumni, of the possible places of study.

Scientists have rather a bad press in these lists, as well as tending to drag down the market value of their places of study, or so it seems to the editors. More seriously, though, the editors are not aware of where interesting, strong and innovative science departments are to be found. Even on more general matters, the amount of "inside information" varies from college to college.

On the other hand, it is extremely useful to have universities, polys and colleges compared on an equal basis, particularly as in these days of cutbacks many who in earlier years could have relied on a university place may have to hedge their bets by applying to polys too. It's also refreshing to have such a wide range of subjects in view. This is a book that all would-be students should have by their side to browse through. []

<2Question time for nuclear issues>2
---------------------------------------------------

THE nuclear debate can only heighten as the election draws near, which means, as it will mainly be between politicians, that it will get more incomprehensible. 
Coherence is not an objective, and the ins and outs are further removed from public comprehension by the nomenclature of the weaponry and the acronyms used to encapsulate the various bodies discussing disarmament around the world. It is fast becoming a business where, to participate, you need a glossary. These morbid thoughts occurred to me while watching <1Question Time>1 (5 May) on BBC1. The antagonists, all equally concerned about the incomparable danger, included Michael Foot, who is at least consistent; David Steel, whose reasonable approach to all matters has so far been unimpeded by office; Michael Heseltine, whose rationality, one might think, could be impeded by having it; and Ann Leslie, award-winning writer of the <1Daily Mail,>1 whose appearance seemed to be a gesture towards including a woman. Another death for coherence was indicated early when Heseltine referred to the Russians as "the most cold, calculating, chess-playing people on Earth". This made me lose track of his subsequent drift as I struggled to imagine how chess-playing came to have such a pejorative connotation for him. Had he been a dunce at it, or did his present situation, despite the opportunity it affords for the histrionics he so loves, make him feel like a pawn? People surely think otherwise from politicians: more simply, about horror, fear, survival? ITV's schools series on Nuclear Issues, with Peter Moth as executive producer, seems to recognise this. The film I watched (4 May) addressed itself to the government's 30-page pamphlet <1Protect and Survive.>1 It opened with a siren and a man trying to follow as much of the government's advice as he could in the time available. This involved rushing several buckets of water into his improvised shelter. 
We saw a Civil Defence exercise in Tyne and Wear where local government officials, obviously not in peak condition, rushed about and contemplated what they were going to do about the rain of megatonne weapons that was being dumped on their ratepayers. It was simple, objective, frightening. We were as vulnerable, the programme made clear, as the Japanese were in Hiroshima in 1945. The pamphlet could be seen as nought but a placebo for people whose intelligence is being seriously under-rated. The Russians have issued a more substantial document, telephone-directory size and with 240 pages. Their placebo has more meat. It gives more information and even tells you not just how to survive yourself but how to rescue others. One wonders how pupils watching this programme reacted to the statistic that, if we followed government advice, we might escape with only 20 million dead (the Russians are more optimistic: they see 90 per cent surviving). A family simulating post-strike conditions in an improvised shelter was asked how it thought it might cope with diarrhoea and vomiting and, understandably, didn't think it would cope very well. A man who finds a ready market among the ever-hopeful for his protective devices was shown with a radiation-checker which would enable families to decide who is the one with the leastest and therefore the candidate to go outside first. I found it an impressive programme and I hope it will not lead Mr Heseltine to conclude that chess sets are less harmful potentially than television sets. What I would really like to see is Sir Robin conduct one of his <1Question Times>1 among a young audience who had completed the series. He could even use the same panel again. 
[] <2FORUM>2 <2Elephants on the move>2 Catherine Caufield helps to plan a major elephant drive in Indonesia WE WERE a small group: Soeriaatmadja, the head of Indonesia's Elephant Task Force; Graham Child, the director of Zimbabwe's National Park Service; Iain Douglas-Hamilton, who has spent many years observing elephants in East Africa; and me. My own study of elephants had been restricted to reading the <1Just So>1 stories, but I was only tagging along. Soeriaatmadja's boss, the Minister of State for Development and Environment, Emil Salim, wanted advice from Child and Douglas-Hamilton on how to save a village from marauding elephants. The problem arose when the government decided to clear the forest along the Sugihan river for a colonisation scheme. The idea was part of its gigantic Transmigration Project, the biggest colonisation scheme in the world. One million people from the most crowded islands, Java and Bali, have been moved to the outer islands--Sumatra, Borneo, and Sulawesi--between 1930 and 1979. The government plans to move another 2.5 million people by next year. The high targets set by the government have meant that colonies have often been badly located and inadequately supported. When Air (meaning river) Sugihan was chosen as a transmigration site, no one noticed that a herd of elephants lived there. As the forest was cut down the elephants, over 100 of them, retreated into a small corner of forest that was left standing along the Suleh river. Striking out in search of food, the herd trampled crops, damaged buildings, and terrified the settlers. According to the local people the elephants are particularly fond of bananas, coconuts, and rice. "People are in doubt about whether to plant out next season's crops," an Indonesian official told me, "because of the elephants." Soeriaatmadja showed me pictures of the devastation caused recently by one elephant in West Sumatra who had knocked down 170 homes. Elephants are a protected species in Indonesia. 
In Air Sugihan rumours flew that the settlers were to be shifted so that the elephants could be left in peace. When he realised the scale of the problem, Salim formed a National Task Force on Elephants, and named Soeriaatmadja to head it. The task force decided to try to herd the elephants through the settlement to an as yet uncleared patch of forest 50 km to the south. Their problem was that no one knew how to go about it. In a similar situation some years ago, Rwanda had to abandon a plan to move the last 146 elephants in the country 40 km through a densely populated area because it could not find anyone in Africa who could run the drive. They immobilised the 26 babies of the herd and took them away by helicopter, and the government "very reluctantly" shot the rest of the elephants. Sri Lankan elephant handlers told the Indonesians that a 50 km drive would take a year. The Sri Lankans move their herds in stages, allowing them to move freely back and forth between intermediate stages and gradually cutting off their retreat. But time was short at Air Sugihan. The task force couldn't allow the elephants to spend a year ambling through a crowded settlement. The first meeting to discuss the settlement's elephant problem had been held in February 1982 at the environment ministry. In June the Indonesian cabinet gave its blessing to the drive. But it wanted the job completed before the presidential elections in March. With the rainy season expected to begin in December and last until February or March, that meant that the elephants would have to move in November. Child and Douglas-Hamilton were in Sumatra to see if that was feasible. But first we decided to try to see the elephants. Neither Child nor Douglas-Hamilton (nor I) had ever seen Asian elephants in the wild. We split up, with members of the task force, to look for the elephants, each small group of searchers gathering a large following from the children of the transmigrasi along the way. 
At the end of the day Child and I were cursing our luck and Douglas-Hamilton was waxing lyrical about the placidity of the beasts, who had allowed him to come within six yards of them. The task force planned to cut a track through a corridor of forest running along the edge of the settlement to the elephants' new sanctuary 50 km away. The route would take the herd over seven, as yet unbridged, canals. Two hundred soldiers, assisted by thousands of transmigrasi, were to round the elephants up over a 390 sq. km area and drive them south using firecrackers, whirring chainsaws, torches, megaphones, and tape recorders playing the sound of falling trees. The entire process was to take less than a month. Once the elephants were in their new sanctuary, called Lebong Hitam, the army would dig a trench, with one very steep side and a solar-powered electric fence on the other, to keep them from returning to their old haunts. The plan was far from foolproof. The forest corridor through which the elephants were to walk was so heavily logged over that it had disappeared in some places. In others, the way was littered with fallen logs, some four or five feet high, obstructions to both the elephants and their would-be drivers. The corridor was so close to the fields and houses of the transmigrasi that barriers were needed to prevent the elephants from crashing through. Elephants elsewhere have been known to overcome both types of barrier that the task force was planning to erect. One elephant put an electric fence out of action by dropping an uprooted tree on it. Others have used their heads and tusks as battering rams to turn a deep ditch into a gentle wrinkle in the landscape. "I've never heard of anyone driving as many as 100 elephants at one time," Douglas-Hamilton told the task force when we returned to Palembang. Still, he and Child were able to give some advice based on their experiences with African elephants--and a few words of warning. 
They suggested pushing the elephants ahead by manipulating the down-draught from helicopter blades. Child also advised putting water in the barrier moat, as is done in Africa, so that elephants butting the electric fence get a stronger shock. Both Child and Douglas-Hamilton were worried that in rushing to beat the rains the task force was not allowing itself enough time for preparation. The Indonesians wanted to start within a month, but Douglas-Hamilton and Child urged them to consider delaying so that the army could do a pilot run and the local people could be better informed about what to expect. Soeriaatmadja objected. "No, there is a momentum that would be lost with postponement. The Commander-in-Chief of the army is very ready to help now. And the people here are already so worried about the elephants that it would be a mistake to delay." Another task force member, a young Indonesian zoologist named Jack West, added, "Also, if we wait until next year and the logging goes on as it has, there will be no trees left to keep the elephants on their trail." The army moved into Air Sugihan on 15 November, 1982. Two hundred soldiers and an equal number of civilians pushed a rough track through to Lebong Hitam. Rounding up the elephants and trying to count them was a hellish job, much more difficult than anyone had foreseen because of the vast area, the confusing vegetation and terrain, and the impossibility of labelling the beasts or keeping track of them once they had been found and counted once. The round-up took two weeks, and the army found it had to move 232 elephants. Several thousand transmigrants helped drive the elephants on, but progress was slow. The animals, confused and frightened, let out wrenching bellows, and sometimes charged their drivers. The soldiers had orders not to shoot except as a last resort to save human life. It took 44 days, rather than the scheduled 20, to travel the 50 km to Lebong Hitam. 
"It was, of course, 44 hard-working days, not only for the army but also for the transmigrants who got involved," Soeriaatmadja later told me. "And it was just like a miracle because nobody, none whatsoever, got hurt during the whole operation." [] <2Strange tales from Sizewell>2 ONE OF the odder spectacles at the never-ending inquiry on the Suffolk coast is the presence of one Dr Keturah (known to all as Kitty) Little, a one-woman advocate of the case for nuclear power. Her expositions to the inquiry are given on behalf of Ridgeway Consultants. Who are they? you ask. So did we. Our man in Companies House reports that the good doctor is the sole shareholder of the consultancy (just two £1 shares have been issued) and, contrary to various Companies Acts, no accounts have been returned since 1978 (when total income was £260 in fees and £18 in donations). More entertainingly, it emerges that until 1972 Ridgeway Consultants was known as Rol-Oyo and was engaged in the business of making and selling ice cream and pastry products. Is this a subtle ruse? Does Dr Little secretly hold the franchise for ice cream sales on Sizewell beach? Does she reckon that an extra power station there will boost business? [] <2Counting citations makes sense>2 ------------------------------------------------------------------- Bernard Dixon thinks there is a lot to be said for following how research results are taken up GOOD to see (<1New Scientist>1, 28 April, p 203) that citation indexing is at last being considered seriously as an aid towards more reasonable decisions over research funding in Britain. Why has the idea been resisted for so long? And why even today are most scientists still profoundly uneasy about any such notion? Three reasons, I guess, but none of them amounts to very much. First, many critics see this as a silly game of publication counting, a bogus guide to merit in the lab. Their misapprehension is hardly surprising. 
It was no less a figure than Professor Derek de Solla Price, in his <1Little Science, Big Science,>1 who expressed one perspective on science in the following terms: "We may define a man's solidness . . . as the logarithm of his life's score of papers. The logarithm of the number of men having at least <1s>1 units of solidity will at first fall linearly with <1s,>1 then more rapidly as it approaches the fixed upper limit of 1000 papers, beyond which no man has achieved." Forget about such excesses. The body of work arising from Eugene Garfield's pioneering efforts in launching his <1Science Citation Index>1 during the early 1960s has little or nothing to do with evaluation by paper scores (or their logs). The key concept is a little more sophisticated than that. If in a particular year Dr A publishes four papers, which are cited an average of 15 times by other authors during the following five years, then they have been useful. If four offerings by Dr B all sink without trace, they have not been useful. That's all. What is interesting about Garfield and his Philadelphia-based Institute for Scientific Information is the many ways in which this elementary notion has become useful in charting the history and social intercourse of science. As a means of assessing <1quality>1 it is poles apart from crude paper tallies, which simply indicate quantity, persistence, and sometimes misguided ingenuity in rehashing work for more than one organ. Citation figures must be interpreted intelligently, but they really do highlight the distinction between research which is significant and that which is not. Sceptics may care to consider ISI's performance in, for example, indicating future Nobel laureates. Dr Garfield himself has been as cautious as anyone in making claims for his technique, not least because it does not allow one to predict which specialities the Nobel authorities will single out for acclamation. Nevertheless, the record is impressive. 
ISI's "highly cited" list threw up six of the eight 1981 science prize-winners. And on 11 October last year, the very day when Dr John Vane's award was announced, <1Current Contents>1 (vol 25, no 41, p 22) carried as its "citation classic" Vane's 1977 paper on the production of prostacyclin by human blood vessels. Perusing a recent report on the most quoted bioscience publications for 1980-81 (<1Current Contents>1 vol 26, no 10, p 5), it is impossible to resist the conclusion that those which received an immediate burst of citations represent genuine hot spots in research. "I'm sure most of the papers identified in this study will be highly cited for several years to come," says Garfield. "Citation frequency within a few years of publication is a reliable indicator of a paper's lifetime citation expectancy." Unlike those pieces of biochemistry, genetics and microbiology which other biologists have totally ignored, these reports must represent work of real value. So what of the other reasons for rejecting citation analysis as a help in deciding where the money should go? One can be dealt with quickly. It comes from critics who imagine themselves penning formidable papers which, like those of Gregor Mendel, are hidden away or ignored for many years--and do not fare too well, therefore, in ISI's listings. But these complainants are not really likely to be in the Mendel class (are they?). Finally, there is a bunch of allegations that the ISI approach can exaggerate the significance of one piece, type, or area of work as against another. Commonly instanced at this point is O. H. Lowry and a paper he wrote in 1951 which between 1961 and 1975 was referenced on 50 000 occasions--five times more than its nearest rival. It was not Nobel prize material, just a slick method of protein analysis, which many other experimenters had found immediately useful. 
But no grant-awarding body, looking at such evidence, would be misled into reaching the wrong conclusion about the place of Dr Lowry in the broader scheme of things. Neither would they be wrong-footed by another red herring--that one way of being heavily cited is to publish a demonstrably erroneous paper, which will then be demolished in all quarters. So the criticisms levelled against citation analysis as a guide to value in science distil down into a few fragmentary points that any sensible funding body would be more than capable of bearing in mind. The objectors really should reflect more on the alternatives. Peer review has been having a hard time recently, while there is still widespread anger that the UGC's devastating 1981 economies were resolved with little or no explanation of the philosophy behind them. Time, I suggest, to inject a quantitatively based qualitative element into science policy-making. [] <2Prescription for disaster>2 SPRING ushers in a less than joyful seasonal event for local authority officers. Giant hogweed (<1Heracleum mantegazzianum>1) starts to grow, aiming for an enormous and intensely irritating presence by midsummer. Councils charged with caring for the public interest start to panic at the thought of this oversized weed with its excruciating sap. One council, Lancashire, has been trying to enlist the Ministry of Agriculture's help in controlling its hogweed. Its attempts, as detailed in Easter Monday's <1Guardian,>1 might well misfire. Lancashire was reported as asking the ministry to "prescribe it under the Weeds Act 1959". Wasn't the idea to get rid of the stuff? [] <2Short words, short sight>2 ------------------------------------------------------------------ Peter Forbes on why words should not be like numbers SPELLING REFORM is a linguistic Channel tunnel. The arguments in favour are sufficient to induce the periodic sinking of test-bores, but they always jib against the stony beds of linguistic conservatism. 
But, suddenly, there are more compelling reasons than ever before for wanting to reach that elusive shore where the native words are friendly and easy to learn. Valerie Yule ("Shorter words mean faster reading", <1New Scientist>1, 9 December, 1982, p 656) argues that the anomalies of English spelling constitute a serious hindrance to the smooth progress of information technology. I admit that there is something absurd in the notion of machines built around the simple rigour of Boolean logic having to cope with words whose spelling in some cases derives from Dr Sam Johnson's incorrect etymology. I'm sure there are many for whom information technology is a way of life who despise the imperfections and impurity of the English language. Nevertheless, I think we should be wrong to take these attitudes too seriously. Information technologists have no more right to dictate the course of language than any other group. Because information technology is new and exciting, its proponents naturally exaggerate the cultural role it will take. But do they really believe that we shall have nothing better to do in the future than to sit in front of a screen for hours, plugging into <1Which?,>1 mail-order catalogues, and the index to <1Chemical Abstracts?>1 I can think of huge chunks of life, important to me, in which information technology is irrelevant: sex, the arts, sport, food and drink, home decorating. These seem to be the topics that engage most of my time, and there is no evidence that any of them is a freakish, minority interest. They are all empirical, qualitative, aesthetic, concrete activities, and they all have a rich vocabulary and indeed literature. About the only contribution information technology can make is to assist in the compilation of cricket statistics. 
And yet Valerie Yule is proposing that the glorious rich language in which these activities are carried on should be vandalised, so that every text we read will look like the worst that Fleet Street compositors can inflict on us. Why should we endure such a travesty? Valerie Yule stresses that spelling is "a human invention, open to research and development, as much as every other technological element in the spectacular and marvellous world of microprocessors, video, telecommunications--and, not least, of books". But words have a highly ambiguous status: they are human artefacts but they are also very like natural species. And while they certainly are open to research and development and to further invention, the way they evolve is as complex as the evolution of the natural world. To propose across-the-board mutilation is as dangerous as the idea of trying to perfect the genes of every species on Earth. Valerie Yule is also concerned with the difficulties of learning the language I love. I think that the vital point that is missed by psychologists and educationalists who have tried to devise techniques for learning language is that this process does not, and should not, take place in a vacuum. Children <1do>1 see words as natural objects, and the non-semantic associations Valerie Yule disapproves of ("visual, phonic, contextual, etymological, and so on") are the key to the child's interest in them. In the first attempts to write words, the letters are just shapes to be copied like any other shapes. And children love poetic rhythms, alliteration, nonsense mutations. Any child who is <1encouraged>1 to explore these aspects will learn the awkward shapes of words quite naturally. But suppress these playful associations and the child has no incentive to tackle what has now become a dull, forbidding task. It is notorious that many scientists do not love words: scientific language has already had a disastrous effect on the use of language throughout society. 
There are many people now, by no means all scientists, who habitually talk of the parameters of a situation rather than describe what is happening, of infrastructural problems when the trains are late, of the sexually dimorphic aspect of <1Homo sapiens>1 when they mean that men are generally larger than women. Once you are afflicted with this inverted mentality, in which everything that exists is a pale shadow of some abstract general principle expressed in a compound-noun sandwich, you are incapable of seeing or saying anything clearly. This became very noticeable during the Watergate saga, in which the convoluted and corrupt language mirrored perfectly the perversions of the Nixon administration. There is a basic flaw in the arguments of the information technologists that is perhaps hard to detect beneath the superficial excitement that the subject generates. The flaw--the quantitative fallacy--results from the all-pervading relativism of the age, and the attempt to replace value judgements by objective, scientific evaluation (for a time, in the 1960s, the phrase "value judgement" was <1the>1 pejorative put-down). In this moral vacuum, measuring things, giving numbers to things, gives a spurious comfort--the world is tamed and the conclusion is sanctioned by an order beyond human whim; it is value-free. So, for Valerie Yule, language would be improved if words were shorter, if they could be learnt quicker, if they could be read more easily on VDUs. I think that what she really wants is for words to be more like numbers. This dislike of words is apparent from the scorn she pours on the value of the different levels built into words. It is precisely this multi-layered richness that I wish to celebrate and protect from the ravages of the systematisers. 
The curious truth about language is that it is always perfect: when a user of the language fails to express himself clearly, it is because some existing linguistic resource is not known to him, not because such a resource does not exist. There <1are>1 new things under the Sun, both in technology and social innovations, and new words and phrases are sometimes needed. There is no problem in this: language abhors such a vacuum, and very soon there are countless neologisms vying for the new slot. What is remarkable is that a consensus is achieved by natural selection, and the neologism is usually accepted by most people fairly quickly. But these sports are introduced against a stable background. A complete change of spelling would instantly destroy a large part of what most people consider to be a fixed aspect of the world. It would be like arbitrarily changing all the colours and smells of the world. How can the orthographical redundancy Valerie Yule finds so distasteful be defended? The answer is that human beings are embellishing creatures: we have never been satisfied with the merely utilitarian. Most of the essential tools we use are decorative as well as functional. No one would choose to buy cutlery or crockery or curtains or chairs without considering their aesthetic appeal. Why then should words be shorn of their decoration? If we can't enjoy our essential tools, life becomes a grey and tedious affair. That for many people words are grey and lifeless objects is only too apparent from the way they use them, especially in print. It does not seem to be widely realised that words have shape and colour and rhythm as well as meaning. But you can't love words without appreciating these aspects, and it would be a more civilised world if more of those who have to use words extensively in their work did love them. 
[] <2Over the barrels>2 ------------------------------------------------------------------------- Debora Mackenzie may be the only journalist in Europe who doesn't want to find the missing Seveso dioxin (or some such) . . . EUROPE has been treated to high comedy for the past few weeks as governments, corporations and journalists have fallen all over each other in a mad game of find-the-barrels. Few have been laughing. Instead, there have been fear, outrage, and what the papers call "dioxin psychosis". But the only ones who should be scared, outraged or psychotic are the governments and corporations, and perhaps the journalists. The ordinary European should be laughing. This barrel business is doing the beleaguered European environment a world of good. The story so far: dioxin-laden detritus from the factory at Seveso, Italy, which polluted its surroundings in a 1976 chemical explosion, was packed in 41 barrels of the type used for nuclear wastes. But it seems that their notoriety was such that no waste disposer would take them, even though Europe disposes of some 20 000 tonnes of lethal chemicals a year, most not nearly so well wrapped. So the German waste handler Mannesmann insisted that the cargo's destination remain secret, even from those responsible for it, factory owner Hoffmann-La Roche and the Lombardy authorities. These happily agreed, provided the stuff went to a safe dump outside either of their countries. Mannesmann promptly subcontracted the job, through a firm registered in super-discreet Liechtenstein, to one Bernard Paringaux, a waste handler who has lost his licence at least once, and presumably knows how to lose a touchy cargo. He lost this one. Paringaux apparently drove it to France. Roche nonetheless assured everyone that the dioxin was safe--up until two weeks ago, when it started offering to retrieve it "if it is not". 
No one in the case will admit to knowing where the barrels are except Paringaux, who languishes in jail refusing, as a point of honour, to talk. It is speculated that he is in no hurry to be released. The disappearance was duly noted by the press (<1New Scientist,>1 vol 96, p 213), and then the issue died, until our Parisian colleagues, <1Science et Vie,>1 ran a report on the matter at the end of March. This happened to coincide with the moment Huguette Bouchardeau took over as France's new, untried environment minister. The scandal of the missing dioxin then proceeded to blaze in every newsroom in Europe. Better late than never. As waste dumps all over Europe are frantically dug up in the search, other interesting things have come to light. Dioxin was indeed found at one Rhone-Poulenc dump in France, its manager nervously explaining that it was good French dioxin, and not a nasty Italian import. The same dump, however, harboured 21 illegal tonnes of 7 per cent arsenic waste, a situation which had been revealed by the local paper a year previously, and then allowed to sit by the local authorities. The arsenic is now on its way elsewhere. Nor is the shake-up ending there, as dumps all over Europe have by turns found themselves at the end of the latest "lead". Nor are waste dumps the only things being shaken up. As Minister Bouchardeau noted, "these 41 barrels are small compared to industry's wastes. The matter won't be closed when they are found." Indeed, the way in which they could disappear beyond the ability of governments and even their owners to find them is eliciting a long, hard look at Europe's toxic waste rules. Now, a transporter must say where wastes are going only "if it is known". Their nature and source must appear on the shipping form. Our 41 drums contained "halogenated aromatic hydrocarbons"--true, dioxin is one--from Milan--well, close. 
The directive requiring this much took effect in 1980, but Italy, after some sharp reminders, only altered its laws to conform last December. This may have prompted the barrels' September departure before a resting place, apparently, had even been found. Things will change. In February the European Commission proposed a directive requiring waste producers, handlers and the governments involved to be informed of the details of each shipment, a measure with which the European Parliament, on 14 April, roundly agreed. The 21-nation Council of Europe called on 26 April for a thorough examination of waste transport rules, and plans a special meeting on the subject next January. Germany has already changed some of its rules. All have prefaced the discussion by lamenting the barrels' disappearance. But should the barrels stay missing for a while, they are not likely to be posing much of a health hazard--at least, not nearly as much as that posed by the tonnes of poison whose storage, legal or otherwise, is now going to be looked into. It seems that even the toxic cloud that rose over Seveso in 1976 has had a silver lining . . . [] <2Security is an open door>2 ------------------------------------------------------------------------- Dan Greenberg bemoans attempts by the US Department of Defense to protect scientific "secrets" IT IS doubtful that the world will be made safer by a little-noted US government ban against Libyan students studying aviation and nuclear engineering at American universities. But it is likely that if American universities tolerate this latest manifestation of the Reagan administration's medieval strategy for guarding our scientific prowess, there will eventually be considerably less to guard. 
The ban, initiated by the State Department and announced on 11 March by the Immigration and Naturalisation Service, is expected to affect fewer than 500 of the approximately 3000 Libyan students at American universities, and only those studying the proscribed subjects. But without denying grounds for concern about the outcast Gaddafi government that sent them here to acquire skills of obvious military value, there are also grounds for concern about the long arm of government regulating traffic inside the boundaries of the academic world. One need not partake of overly sentimental or nostalgic views about academic freedom and communities of scholars to recognise that open doors and free and easy communication--face to face and in print--are the fundamental difference between the aridity of Soviet science and the fabulous productivity of American science. That is the near-unanimous verdict of Soviet emigres and defectors and of Western scientists who have participated in exchange programmes with the Soviets. The difference has spawned the nice turn of phrase "security by accomplishment", rather than by secrecy--which means keeping scientifically ahead of the Soviets by remaining free to move speedily, rather than by emulating their deadening security mania. Nonetheless, the Reagan administration has been moving toward building fences around the American scientific community. It has sown confusion and anxiety among researchers by giving birth to the ambiguous concept of sensitive but unclassified research. Last year, without warning, Defense Department officials pounced on a meeting of photo-optical engineers and, invoking that concept, prohibited the release of some 100 research papers that had previously borne no restrictions. With competition for research funds now especially fierce, the Defense Department is using its bountiful budget for university-based science to "sensitise"--as one Defense official put it--university scientists to its security concerns. 
The process avoids any rough measures, simply calling for providing the department with a pre-publication copy of any paper based on research it supported. Defense officials say they have not yet forbidden any publication, but insist that they possess the power to do so if they choose. The claimed legal instrument is an acrobatic interpretation of the Export Administration Regulations and the International Traffic in Arms Regulations--under which, the security pushers insist, public disclosure of scientific knowledge can be restricted, even if it is unclassified. The promotion of restrictions extends beyond the US. The State Department, for example, has blocked the funds of scientists bound for Poland as an expression of political disapproval of the Polish regime. The ban on Libyans in aviation and nuclear studies is not going to impair Colonel Gaddafi's military plans. He can send his students elsewhere or hire specialists of other nationalities. And, by itself, the ban--though symbolically disturbing--would be of little significance in the context of American scientific and academic strength. But, of course, it does not exist by itself. It is just the latest of many ignorant assaults that are damaging a national asset envied around the world. [] <2Nuclear power for peace but not for war>2 Tam Dalyell defends a friend and attacks a bugbear I am deeply interested in the row that has been brewing for some time in the heat of the nuclear arms debate, and not only because the key figure is a friend of mine. Now the row has burst into the open round the broad shoulders of Monsignor Bruce Kent, threatening to blast the career of that redoubtable cleric by forcing him into an invidious choice between his cloth and his commitment to the Campaign for Nuclear Disarmament (CND). Who put a match to the stake, I am in no position to know. Cardinal Hume himself? The Pope? Michael Heseltine? 
The Duke of Norfolk, and other grandees of the English Roman Catholic Church? Or was it the Irish-origin tendency? Or perhaps it was Lord Rawlinson, the Lord Chancellor manqué and former Attorney General? Whoever it was, they are, I think, in error. Equally in error, in my view, are those who criticise the Duke of Edinburgh for expressing his views--which, incidentally, I do not share--that Britain should keep some nuclear weapons. I believe in freedom of expression for Dukes, Monsignors, Uncle Tom Cobley and all! Perhaps Bruce Kent's critics should take a leaf out of the book of the nuclear lobby, which goes out of its way to substantiate the case for nuclear power for civil purposes. (I never cease to marvel at the continued confusion in the public mind between nuclear power and nuclear weapons: too many people stare at me as if I were some unnaturally hybrid creature when I mention to them that I am both a member of CND and a strident advocate of nuclear power stations to produce electricity. Since nuclear power stations are no longer the easiest route to nuclear weapons, I don't see myself as a double-headed monster. At the time of the building of Calder Hall, back in the 1950s, there were military implications--but that was many moons ago.) I commend to the people who want to send Monsignor Kent to the stake the self-explanatory activity of the British Nuclear Forum. You may not have heard of it, but you ought to know about it. Its president is another Duke--Portland--and the chairman is J. C. C. Stewart, with a most effective director in Jim Corner, whose office is at St Alban's Street, London W1. 
The Forum has a Council of Management drawn from organisations with an interest in the nuclear industry, such as British Nuclear Fuels Ltd, the UKAEA, Shell Nuclear, Whessoe Heavy Engineering, Fairey Engineering, GEC, Tube Investments, Vickers, Sir Robert McAlpine, the Central Electricity Generating Board, the South of Scotland Electricity Board, Kleinwort Benson and the British Insurance Committee. Through the BNF, these worthy enterprises put the facts about electricity from nuclear power before the public through the many information channels available to it. A remarkable number of MPs attend BNF's annual autumn meeting, arranged precisely for people who do speak out and argue the toss on nuclear issues. This is quite a feat, for not even when a lunch is offered do my parliamentary colleagues turn up in droves for a morning seminar. But somehow the BNF pulls in the parliamentary crowds, as it did last autumn, when it organised a heavyweight panel including Con Allday, managing director of British Nuclear Fuels Limited, Ned Franklin, director of the National Nuclear Corporation, Sir Walter Marshall, chairman of the CEGB, and Lewis Roberts, not only director of the Atomic Energy Research Establishment at Harwell, but now also chairman of the recently formed Nuclear Industry Radioactive Waste Executive. MPs who on the whole tend to be rather casual about going to meetings organised for their benefit by pressure groups tend to go to the nuclear forum, because it is always well structured and organised, and because they can then say to constituents who work in either the nuclear industry or the nuclear supply industry that they have been, and still are, taking an interest in the decisions that affect those particular electors. This is no small matter, especially as the civil nuclear issue is coming to the centre of the political stage again, summed up in the slogan--for sloganised it has become--"Sizewell B". 
At one political meeting last month, I was abused for being at one and the same time a pro-Marketeer, an anti-devolutionist, and, horror of horrors, pro-Sizewell (AGR tendency only, I hasten to add). There are various species of Sizewell men--no, in modern political circles, Sizewell people. I belong to the species that believes in replicated nuclear power stations, but I remain obstinately unconvinced by the pressurised water reactor option. For all their virtues, I wish fewer members of the British Nuclear Forum were so hooked on the PWR. I agree with them about the need for a comprehensive UK energy policy in order to provide primary energy substitution for oil. I agree with them that economic benefits will come from long-term policies of timing and continuity. I am dubious about their confidence that British industry is already experienced in a wide range of PWR work, and has technical and manufacturing competence at levels ready to receive successfully the transfer of information specific to PWR components. I am a veritable doubting Thomas about the industry's claim that if Sizewell B goes ahead, there will be significant benefits in the long term from an increase in the British share of the world market for PWR-related products and services. If the nuclear forum wants to be as effective with the public as it is with press and politicians, it might be well advised to reflect on whether it ought to hitch itself quite so fast to the PWR bandwagon. [] <2LETTERS>2 <2Rhynchosaur battle>2 Mike Benton in his article on the rhynchosaurs (7 April, p 9) suggests that the so-called battle for survival really took place between the food-source plants and not between the animals themselves. This notion is supposedly supported by the concurrent decline of major herbivore groups with that of their food plants, but to my mind the examples he used in his article are naive and far from convincing. 
The first claim is that the dicynodont synapsids declined rapidly as the <1Glossopteris>1 flora disappeared. The second is that the rhynchosaurs themselves declined in association with the seed-fern <1Dicroidium>1 which was being ousted by the global spread of conifers. There appear to be several major faults in the reasoning that animal decline was the result of such plant disappearances. The <1Glossopteris>1 flora was only one of several widespread late Palaeozoic floras and was itself limited to the then Gondwana region. The Triassic <1Dicroidium>1 belonged to the small group of seed-plants, the Corystospermales, which were also restricted to the Gondwana region. Both <1Glossopteris>1 and <1Dicroidium>1 were trees forming forests. To suggest that the dicynodonts and the rhynchosaurs disappeared because of the decline of basically similar Gondwana arborescent plants would seem to indicate a rather simplistic view both of animal-plant relationships and of plant evolution in general. Any sizable successful herbivorous group surely cannot have been relying on one such geographically limited group of seed-plants to the exclusion of all others. There were many types of seed plants evolving throughout the late Palaeozoic and Mesozoic periods which must have been excellent sources of food. The northern hemisphere Caytoniales lasted from the Trias to the Cretaceous, the cycads and cycadeoids were both abundant in the Trias and well into the Cretaceous, and ginkgos (maidenhair trees) began their evolution in the late Palaeozoic and were very common in northern hemisphere Mesozoic floras. There must also have been a great number of ferns, lycopods and horsetails existing as understorey plants but also in isolated dense swards. There was not the wide-scale domination of conifers hinted at by Benton, which somewhat invokes a picture of foodless, dark, plantation-style forests. 
Mike Benton's ideas are reminiscent of those of Tony Swain and Gillian Cooper-Driver, who proposed that dinosaurs became extinct through their dietary requirements. They suggested that the development of the synthesis of alkaloids and cyanogenic glycoside precursors in the early angiosperms made them unpalatable to the dinosaurs, effectively starving them into extinction. Both authors appear to belittle the great range of plants that were living at the times of their animals and the ways in which these plants were constantly evolving and migrating. <1Barry Thomas>1 <1Biological Sciences>1 <1Goldsmiths' College>1 <1London>1 <2Causative thoughts>2 The epidemic of the new disease AIDS offers clear evidence for Rupert Sheldrake's hypothesis of formative causation. That recent ideas about immunosuppressive drugs, autoimmune syndromes and tumour viruses should join together to express themselves as a disease seems only to be expected. The prevalence of the disease in the gay community may also be related to the upsurge of interest in and propaganda about homosexuality, while its first appearance in the US correlates with the American investment in paramedical research, ensuring that these ideas are at a higher concentration there than elsewhere. Perhaps Sir Fred Hoyle, as an authority on epidemiology, might be invited to see whether the origin of the epidemic may not lie close to the laboratories of the National Institutes of Health. Maybe we shall soon see legislation, necessary for the health of the nation, aimed at preventing individuals from harbouring dangerous thoughts. <1D. W. Ewer>1 <1London SW7>1 <2Robots to blame?>2 The arguments in Patrick Jenkin's article ("The unemployed cannot blame automation", 24 February, p 526) are identical to the way I saw things when I was first involved in computer-controlled machine tools over 20 years ago. 
In a company which can make use of new technology, jobs are certainly put at risk by resisting, or less than enthusiastically supporting, new technology. The crucial factor is whether technology provides new jobs in new activities at a higher rate than it eliminates them in older industry. A survey of the electronics industry in this respect would, I believe, show a net loss. All the new products which can be envisaged, and most of the new services which can be imagined, are as likely to be automated as the old. The age of the robot-making robots is already with us. The forces which drive technological development have a close parallel in biological evolution--a somewhat irresistible phenomenon. There seems to be no escape from the fact that work substitutes will be needed to preserve society as we know it, carried out in such a way as to preserve particular countries' economic viability. <1H. G. Hayes>1 <1Hockney Engineers>1 <1Leeds>1 <2Compulsory fluoride>2 Melbourne dental surgeon Geoffrey Smith ends his excellent article (5 May, p 286) with the statement: "Now, at last, scientists appear to be taking a long, hard look at fluoridation . . ." When I joined the battle against fluoridation in 1962, my opposition was wholly on the grounds that it was a treatment of people's bodies without their consent, or the consent of parents in the case of children--a form of "compulsory mass medication". When in 1963 I read <1The American Fluoridation Experiment>1 by F. B. Exner and G. L. Waldbott, I was left in no doubt that besides being unethical, fluoridation was both ineffective as a prophylactic treatment and a needless risk, at least to some people, in the long term. 
What I found amazing, to the extent of keeping me occupied combating fluoridation during the past 20 years on a more-than-full-time basis, was that the sales promotion experts employed to build the fluoridation bandwagon have been able for so long to lull most scientists and politicians into accepting this method of diluting and dumping waste fluorides at public expense. (According to the Royal College of Physicians, fluorides used in the fluoridation of public water supplies would be "derived from sources that would otherwise have been discharged to the sea as waste"--see <1Fluoride, Teeth and Health>1, the report of the RCP.) The one and only good thing that fluoridation has done is to provide an illustration of the urgent need for legislation to prevent the use of public money, the machinery of government, and the time of public servants for promoting private interests at the public's expense. A sound Freedom of Information Act might be the answer to this problem. <1P. Clavell Blount>1 <1Chairman>1 <1National Anti-Fluoridation Campaign>1 <2IQ potential>2 If Richard Herrnstein ("IQ encounters with the press", 28 April, p 230) has been treated as shabbily as he says by American newspapers, that is unforgivable. If Richard Heber, former director of the Milwaukee project, is in jail for embezzlement, this places a suspicion on the reliability of his claims of having substantially raised the IQs of economically deprived children. But if Herrnstein believes that there is a conspiracy afoot to discount totally the influence of inheritance on intelligence, this is nonsense. Since this century began, few psychologists outside T. D. Lysenko's Russia have denied that intelligence bears a very direct correlation to inheritance. Environment is the soil and water that permits it to sprout and grow, or sees to it that it doesn't. 
We are born with all the potentiality for intelligence we are ever going to have, and there is not all that much we can do except to achieve as much of that potential as the world and its intelligence experts will permit us. The real issue, therefore, is not so much concerned with this or that IQ score and the chances of raising it as it is with the ingenuous, and often disingenuous, attempts to quantify the inheritance:learning ratio. Hans Eysenck, for instance, sees it in terms of 85:15 in favour of inheritance, or sometimes 80:20; he's not quite certain which. The point Herrnstein's "anti-hereditarians" are trying to make deals not with the obvious genetic roots of an individual's intelligence but, first, with the exact <1meaning>1 of this thing we call "intelligence" and, secondly, with the impossibility of separating and then quantifying the <1amount>1 of intelligence which is inherited and the amount which is subject to change. You cannot do this. There are simply too many variables, too many unknowns in the equation. Eysenck wrote in <1New Scientist>1 (vol 94, p 803) that "the age of six . . . is about the youngest age at which IQ tests can be properly applied". By that age there is no rational way to disentangle what has been inherited from what has been learned. Unless we can devise an IQ test that can be administered to an embryo <1in utero>1, there is no way of deciding what are the limits of a person's intelligence potential. That being so, teachers like me have the duty to do all we can to prod a child to the limit of that potential; but as we don't really know what that potential is, and no IQ test can tell us, we are obliged to keep prodding. An exhausting process for both child and teacher, but the only process I know that treats the child as what he is: a developing and developable human being, and not a number that says 103 or 129 or 86 or whatever. 
There is no royal road to teaching, and teachers are rightly highly suspicious of IQ experts who tell them there is. <1Ralph Estling>1 <1Ilminster>1 <1Somerset>1 <2Lefthanded count>2 Last year Stan Gooch said that among the Chinese lefthandedness runs at 18 per cent, whereas the figure for Westerners (which we presume to mean Europeans) is only 9 per cent (Letters, vol 95, p 258). Every March we invigilate university examinations, and to relieve the tedium we regularly count the lefthanders. Normally we don't keep a record of the results, but now we have some hard facts. This year we found that out of 1176 students (about 10 per cent of the university population) only 37 were lefthanded (about 3.1 per cent). There are two problems with assuming that our figure applies to the whole Chinese population. First, we "measured" lefthandedness by looking at which hand each student wrote with. This is only a restricted definition of lefthandedness. Secondly, the student population may not be representative of the population as a whole. But we do not believe that any bias in the sample would account for the difference between Gooch's figure of 18 per cent and ours of 3 per cent. Would Stan Gooch please produce some evidence to prove his figures? <1F. J. Seward>1 <1I. S. Giles>1 <1National University of Singapore>1 <2Waste not, want not>2 John Gordon is right to say in "Save our wastelands" (21 April, p 130) that derelict land provides opportunities, but to suggest that research and reclamation effort have not adequately responded to this is very far from the truth. However, there is still much to be done. Far too many reclaimed sites, far from being well-manicured, still present problems of maintenance in a condition that enables them to be utilised and enjoyed. 
In addition, derelict land was, until recently, being created faster than it could be reclaimed. Although there are areas of derelict land that should be retained for some of the reasons given by John Gordon, and research sponsored by the Department of the Environment has sought to identify these for appropriate use, there are still far larger areas that, because of the many constraints of the substrates that need to be overcome, are eyesores, health hazards and potentially lethal risks to the communities that have to live surrounded by them. The fact that generations of hard-working and poorly-paid people have shown enormous resilience and good humour in coping with substandard housing, polluted air, lifeless local rivers and streams and degraded surroundings does not mean that efforts to improve air quality, the quality of rivers and, no less, the quality of the surrounding landscape should not continue to be made with all the imagination and expertise the community has at its disposal. We are still a long way from moving from "the polluted pay" to "the polluters pay" principle. <1M. J. Chadwick>1 <1Derelict Land Reclamation Research Unit>1 <1University of York>1 Yes, derelict land has played and does play its part in the conservation of rare plants and animals in Britain. But one has to be careful: a piece of derelict land can be like a magnet in attracting further dereliction. I could take John Gordon to many areas where disrespect for the environment has been virtually institutionalised after an initial piece of industrial dereliction has blotted the landscape. <1John Palmer>1 <1Acomb, York>1 <2What's wrong?>2 Last week's article "What's wrong with German science?" (p 277) should have contained an acknowledgement of the generous help of Dr Christoph Schneider, now at the Wissenschaftsrat (Science Council) in Köln. 
[] <2ARIADNE>2 BACK in 1979 I carried on a bit about superstitions about earwigs, the whole thing being started up by a film about a man driven off his trolley by one of the insects boring into his brain; shouts of "rubbish" at the screen did no good. Professor Sharp, of the Memorial University of Newfoundland, writes that he has just run across the earwig after having acquired five years' back numbers of this magazine, and to say that the Anglo-Saxons had a word for it, as we are all too well aware from listening to conversations between small children. It is, he says, a case of a dropped "e". The original word was earsewig. Drop the first letter and you get an insect that wiggles its arse, which seems plain enough. Professor Sharp points out that the English bird with a white tail is called a wheatear. The same explanation applies. I do not know what the Anglo-Saxons called a rabbit, a candidate for such description, I should have thought. Perhaps they did not want to confuse birds with animals, leaving the term whitearse exclusively avian. [] ON ONE of the few days with sunshine recently I was shambling through Cavendish Square in the West End of London. A motor cycle was parked near the pavement and a traffic light. Stretched out on it, feet on the handlebars and head on the pillion seat or carrier, was a young man, fast asleep, with his hands under his head. The motor cycle was a bright red. It belonged to the Post Office. On its side it said "Express Service". [] IT IS more than possible that I am missing a point, showing a lack of imagination, remaining in the Middle Ages and have my feet firmly stuck in the mud when I am under the impression they are just on the ground. But I cannot see what I would do with a home computer, one of which I am continually being urged to buy by manufacturers and retailers, and by drops in prices of the things that come as near as a toucher to opening up my cheque book. 
I do not want to play games in colour on my TV; I can do household accounts with a piece of paper and a pencil; I can learn languages more easily from a book and a tape recorder; and if I want information I can look it up and read it far faster than I can get it from a screen, and I can turn back with a flip of the finger and read it again. I do not care about information being able to reach my home VDU so fast that the Encyclopaedia Britannica can get there in five seconds. I cannot use it that quickly. The paradox about all this information explosion, or whatever it is called, is that the speed of its distribution is so high while the actual receiving of it by a human being is necessarily slow, and far less efficient than by other methods, such as reading printed marks on paper. You can carry those about, too. Try folding a VDU in half and stuffing it in your pocket. Perhaps you can wrap fish and chips in computer output, but that, it seems to me, is the only asset I can think of for ordinary domestic life. Now to return to my dusty library at the top of the ivory tower. [] I LIKE to keep an eye on advertisements in the classier American magazines for several reasons. One is that I can be prepared for what is going to be available here sooner or later and so be able to combat it. Another is to marvel at the pointlessness of many a sophisticated achievement and to be superior about it, the last, or one of the last, indulgences of a European. It is an agreeable feeling, the nearest you can get to living one of the parts in a Henry James novel. To report from the front: there is now a gadget that can be fitted to your refrigerator that makes a noise like a pig every time the door is opened. The advertisement describes the noise as oink, oink. This is claimed to be a discouragement to people who eat too much. The development is a refinement of the gadget that had some off-putting remark to make instead. 
It is more powerful, bordering on the insulting and, above all, jokey and juvenile and so, presumably, appealing. You can also overcome one of the worst problems in contemporary life, that of cold coffee. The Mug Mate is a miniature hot plate, a "personal cup heater", and it will keep coffee, tea or soup hot to the last drop. Nor need you suffer a moment longer than you should, for if you send the money to the firm on a coupon demanding that it "rushes" you the Mug Mate, it will "ship the same day" it gets your order. Anyone on the verge of being clapped into the loony bin as one more pathetic case of cold-coffee dementia can telephone the firm free and be saved. [] Unlike most extruded synthetic fibres, duck down has a branched and ramifying structure which makes it an unrivalled heat insulator. Daedalus is now imitating it polymerically. He recalls that ingenious nylon synthesis in which the two reagents are dissolved separately in two immiscible solvents. When the solvents are mixed together, a nylon film forms at their interface. So Daedalus is squirting one such solution through small round nozzles into the other. Each liquid jet is instantly "coated" with nylon, giving a fine hollow-tube fibre, with reagent still inside. What happens next is rather cunning. Daedalus is choosing his solvents such that the nylon is preferentially permeable to the one outside. It will diffuse rapidly into the fibre, generating a high internal osmotic pressure. Soon the fibre will rupture at its weakest point; internal and external reagents will meet and film over to form a "bud" on the fibre. Being weak, it will again burst osmotically, and reform further on . . . so a "branch" will grow out from the fibre. (The pretty "silica garden" grows its fronds in this way.) These branches will, by the same process, branch in their turn, and the fibre will ramify towards an incredibly complex microstructure. 
DREADCO's "Fluffy Fluff" will be the warmest, softest and most heat-insulating material ever known. Made into sweaters, socks, duvets or curtains, Fluffy Fluff will be wonderfully cosy. In towels and nappies and swabs it will be incredibly absorbent. But best of all, it is self-renovating. When your Fluffy Fluff sweater begins to wear thin at the elbows, just immerse it once more in the solutions. First the "internal" solution will slowly permeate into the fibres, then the "external" one. The osmotic process will then commence anew. It will selectively build up the thinned fibres, across which osmotic diffusion will be most rapid. New buds will ramify out from the ends of any broken fibres. In a little while the garment will be completely renovated. []