<2COMMENT>2

<2Do robots produce redundancies?>2

CONGRATULATIONS TO the 600 Group on the official opening of SCAMP--Six Hundred's Computer Aided Manufacturing Project. We've waited at least five years to see the project. It's good to know that a British firm (in this case the largest producer of centre lathes in the world) will be able to supply the technology needed to build automatic production lines (this issue, p 632). And, like it or not, Britain will eventually have to automate its manufacturing.

The 600 Group made, or commissioned British firms to make, all the equipment in SCAMP. Admittedly the Fanuc robots are Japanese, but in future they will be made under licence and sold under the name "600 Fanuc". If the Department of Industry had not made a direct grant to the company it is unlikely that SCAMP would exist. Flexible manufacturing systems (production lines--combining microelectronics and mechanical engineering--that can make small batches of components cheaply) are not yet economic in this country. Labour is still relatively cheap and electronics is an expensive technology in Britain. It is known that SCAMP cost £3 million but no one will say what percentage of that figure is public money.

By all accounts the launch of SCAMP was a lively affair. There was the unforgettable moment when an "operative" booted a stalled robot back into action. And there was the 600 Group's chairman, Sir Jack Wellings, in a very bullish mood. He said that flexible manufacturing systems like SCAMP meant jobs for his workers and for any company that installed the technology. The Secretary of State for Industry was just as ebullient. "Automation can save jobs", he told the assembled press and industrialists.

Neither man went into any detailed argument about how robots will create jobs. Undoubtedly there are sound bases for the statements--considering the public's apprehension about automation and the taxpayer's money spent on SCAMP. Both the DoI and the 600 Group were keeping their information on the effects of robots on employment to themselves. For example, no one would even attempt a guess at how many people, using conventional equipment, it would take to match SCAMP's productivity. One journalist who persisted on the point was referred to playfully as "a Chartist or a Luddite" by one press officer, who added that there was also the argument that the 600 Group should worry about staying in business and not about unemployment.

Stimulating arguments aside, there are two points of view about robots and, indeed, the related field of information technology (this issue, p 634 onwards): either they do or they do not destroy jobs. There are many to argue the first point. There are few who are prepared to argue the second, and that is a point of view we should hear more about. []

<2Fast-reactor slowdown>2

WHEN HUNDREDS of nuclear engineers came together for what should have been a festive shindig last week, there was universal gloom. The Secretary of State for Energy, Mr Nigel Lawson, had not fooled them with his praise for the fast reactor two days earlier; for his action, to delay the work all but indefinitely, spoke louder than his words. The nuclear establishment's response to Mr Lawson's invitation to prepare a new R&D programme for fast reactors <1(New Scientist,>1 2 December, p 547) will almost certainly be that without another large reactor there can be no R&D programme. Mr Lawson is right to ask the nuclear industry to spell out a new R&D effort.
R&D must give way to demonstration, which should mean showing that the technology works. The days for R&D are over: we need demonstration, and so far the Prototype Fast Reactor has demonstrated only that the non-nuclear bits of the technology don't work particularly well. For PFR cannot yet reliably turn nuclear heat into the steam needed to generate electricity. It may be boring for nuclear experts to have to worry about plumbing, albeit sophisticated plumbing, but they must prove that they can get the simple bits right before Mr Lawson gives them a new reactor to play with. After all, a large fast reactor could cost not much less than £2000 million to build. If the country's nuclear experts really are as bright as they would like us to think, they should be bright enough to come up with a convincing case for spending so much money, or to find cheaper ways of demonstrating the technology. []

<2Britain faces up to information technology>2

<2Computers, microchips and related hardware are undoubtedly destroying jobs throughout Britain. But in theory, just as many could be produced--if the country creates the right conditions for growth>2

<2Peter Marsh>2

"It's a tragedy that all those millions of pounds of investment are not going to create many jobs. But that, I am afraid, is the way of the world." The speaker was Martin Cracknell, chief executive of Glenrothes Development Corporation in Scotland. A man at the sharp end of the drive to put information technology to work in Britain, Cracknell was commenting on plans by a big American semiconductor firm to expand its operations in the town.

Glenrothes is a new town 25 km to the north of Edinburgh. The development corporation has wide powers to attract new industries, for instance with grants or by building cheap factories, and it is particularly keen on electronics. As Cracknell explains, electronics in general and information technology in particular represent areas of growth in an otherwise depressed industrial scene. "Our goal", he says, "is to make sure the town is in good social and economic shape so that it is ready for the next century."

But as Cracknell knows only too well, the new industries based on electronics have little effect in reducing unemployment. In the past year, five electronics firms have either expanded or set up new factories in Glenrothes, creating extra work for a couple of hundred people. But as a result of other companies either closing down or reducing their workforce, employment in this area of industry has actually fallen. Added to this, firms in more traditional industries such as mechanical engineering or construction have made people redundant. As a result, the unemployment rate in Glenrothes stands at around 12 per cent and, despite its good record at attracting high-technology companies, in the words of one corporation official, "the town has to run hard to keep standing still".

The experience of Glenrothes is mirrored by Livingston, another Scottish new town. There, 16 per cent of the workforce is jobless, a figure that the town's development corporation wants to reduce by attracting small, electronics-based firms. But the policy has its imperfections. "When the old industries die they are usually operating near to peak employment," explains James Wilson, the corporation's chief executive. "The new firms start small and will not reach a peak for several years." Furthermore, the number of jobs created per pound of investment is often startlingly low.
Nippon Electric of Japan has recently opened a £50 million semiconductor factory in Livingston that will, by 1985, when the plant is operating at full stretch, employ only 800 people. A few months ago the plant recruited its first workers--the men and women, mainly aged under 25, whose minds and fingers are agile enough to turn out microscopic silicon chips by the hundreds of thousands. Attracted by newspaper publicity (the company did not bother to advertise) 3000 people applied for the 20 jobs that were available.

The Scottish new towns constitute a microcosm of what is happening over the whole of Britain. All over the country, the mass-production industries such as cars and shipbuilding are either trimming their operations or shedding workers (sometimes both). And within these industries, the pattern of work is shifting. Routine, manual operations, which in many cases machines can do, are dying out and being replaced by skilled, technically orientated jobs which often involve either computers or the channelling of information by some other mechanism from one place to another.

A second trend is that the gap in activity left by the older industries is being filled, at least partially, by new, small companies, many of which make or use electronic products. Firms in the area of information technology can be included here as, strictly, IT constitutes the subsection of the electronics industry that covers anything to do with computers, telecommunications or office equipment.

In Britain, the IT industry is small by world standards, producing £3000 million worth of goods per year out of a world total of £50 000 million. (This compares with the total UK electronics industry, which includes instruments, components and consumer electronics. In 1980 the industry sold goods worth £7700 million.) Though production is steadily increasing, Britain's IT firms provided work for 186 000 people in 1980, 20 000 fewer than 10 years previously. The difference is explained by more efficient ways of making goods, to a large extent brought about by computers and electronics techniques themselves.

The effect of information technology goes far beyond those firms that make electronic products. A more crucial issue concerns the use of the technology by other firms and organisations that have nothing to do with electronics. By making the flow of information between people or machines more effective, or by automating a production process (for instance by introducing a computerised technique in a factory), IT can increase the efficiency or simply the flexibility of many organisations. Along the way, people may lose their jobs as machines replace, at least partially, the skills of human workers.

But to blame new technology for unemployment is simplistic. A study by the Institute of Employment Research at Warwick University showed that between 1985 and 1990 Britain can expect to lose a further 340 000 jobs directly as a result of microelectronic innovations. But the report said that, in theory at least, more jobs would be created by what economists call "compensation effects". For instance, an extra demand at home or abroad for goods made either cheaper or better by electronics will add to employment. So will the investment in capital goods and engineering skills needed to modernise outmoded factories. Further, as a result of a generally healthy economy, employment in service and other industries outside manufacturing firms could increase.
All told, the Warwick researchers said that the compensation effects could increase employment by 420 000--more than cancelling out the destruction of jobs in factories or offices that take on board the new technologies.

A glimpse of how these "compensation effects" could work their way through the economy is given in another study, also by the Warwick institute, on the change in employment patterns over the next 10 years. The forecasts follow broadly similar lines to a projection for the US made by the American government's Bureau of Labor Statistics (p 635). Thus in both countries people in white-collar jobs--managers, administrators, scientists and so on--will grow in numbers, while jobs in manual occupations will either increase more slowly or decline. (The main difference between the two studies is that the American forecasters assume a fairly high level of growth in total employment while the British researchers are more pessimistic, predicting that employment over the 10 years will fall by 3 per cent.)

The work by the planners on both sides of the Atlantic shows the theory of how information technology could have a positive effect on a nation's economy. Always assuming (perhaps dangerously) that the world moves out of a slump, unemployment would not increase more than marginally. But as the bosses of the Scottish development corporations discovered, all this is only theory. Much depends not only on the ability of Britain's domestic IT industry to provide customers with good products but on how other organisations respond to using the technology.

On both counts, the outlook is hardly promising. According to a report last year for the National Enterprise Board by PA International, a firm of consultants, Britain's IT industry suffers from serious shortcomings. Although no one can deny the impressive growth of small electronics firms that make, for instance, personal computers (see "The great British small computer", p 639), the consultants said the IT industry:
* is unimaginative in designing new products
* lacks marketing ability
* has insufficient skills in distributing goods
* does not sell enough overseas
* lacks enough big companies with comprehensive product and marketing strategies.

The trade deficit in information technology in 1980 was £300 million, with imports accounting for almost half the IT goods that Britain consumes. In their report, the researchers from PA International warned that this deficit could grow to £1000 million. They fear that unless Britain's IT producers improve, the country could be swamped with electronics goods from abroad, straining Britain's economy.

On applications, a survey by the Policy Studies Institute showed that only 49 per cent of industry is seriously considering microelectronic technology or using it already. The study points out: "There are uncomfortably strong reasons for supposing that industry in Britain has not adopted microelectronics to the fullest extent possible and that in consequence it is already tending to fall behind our leading competitors." Kenneth Baker, the minister for information technology, says that in an ideal world virtually all of manufacturing industry should be in the microelectronics camp. In an effort to make this happen, Baker next year plans to hand out £173 million to industries either using or making IT products, an increase of some £120 million compared to 1979.

Two-fifths of the firms in Britain that supply electronics equipment are foreign.
In particular, British industry buys a high proportion of its integrated circuits or "chips"--the components at the heart of virtually all electronic machinery--from abroad.

Most memory and logic chips used in industry are what are known as "standards". They are analogous to the family saloons that car production lines turn out by the thousand. A customer might buy a large quantity of identical "standards" to fit into its products, confident that it can alter the exact function of the components by programming them. British firms have left the business of supplying these kinds of chips mainly to companies from the US and Japan.

By contrast, the leading British electronics firms rely on making what are called "specials". Manufacturers of "specials" custom-build their chips to suit particular applications, in the same way as, for instance, a boat manufacturer might tailor his products differently depending on whether the customer plans an ocean voyage or an afternoon on the Norfolk Broads. Ferranti, Plessey and GEC are all successful in this area of chip-making, although because the market is limited, sales are comparatively low. World sales in 1980 of custom-built logic chips came to only some £100 000, about one-twentieth of the comparable figure for "standards". All three firms, however, plan to expand in this area, leaving the state-backed Inmos as the only British supplier of high-volume standard chips (Box A on p 638).

Opinions differ as to whether Britain's lack of chip-making capabilities has much effect on the rest of industry. Scotland has played host to several big American semiconductor firms (including Motorola, General Instrument and National Semiconductor) that have set up plants making standards. Ken Smith, electronics director of the Scottish Development Agency, thinks that these firms have helped to promote the growth of other electronics firms in the region and that the fact that they are foreign-owned is unimportant. Scotland has been one of Britain's growth areas in electronics (Box B, p 639).

On the other hand, Malcolm Juleff, the managing director of a small firm in Derbyshire called Micro Image Technology, says Britain has been "pathetic" in making chips. Juleff's company sells chemicals and other products to semiconductor firms, and he is annoyed that most of his big customers are American. "Making chips is the key to electronics," he says. "It is the business that controls the industry." If UK companies had recognised in the 1960s or early 1970s the need to enter the high-volume chip business, Juleff thinks that the whole of the British electronics industry would be in a better state.

Elsewhere in Britain, few companies make hardware that directly serves the semiconductor and electronics industry, for instance for testing or production. An exception is Cambridge Instruments, based in Cambridge, which produces electron-beam equipment used in making chips. Over the next few years, more of the company's products (mainly scientific instruments) will be purpose-built for chip firms, according to Bill Henderson, the firm's marketing director. In Scotland, with its concentration of electronics firms, an observer might expect a cluster of companies supplying the electronics and semiconductor business with equipment. But the Scottish Development Agency says manufacturers are reluctant to enter this activity because they fear it will be too complex for them to manage.
Britain's IT firms, in company with any others that use electronic equipment, rely on universities and other institutions for a supply of skilled people. Whether Britain is producing enough of these people is debatable. A report by the Manpower Services Commission this summer said that within the next three years industry "does not predict any major shortages of electronics staff." On the other hand, the survey by the Policy Studies Institute reported that skill shortages are stopping many firms from using the new technology. According to the survey, if 22 000 electronics engineers were suddenly to appear overnight, they would all get jobs in British industry.

Still more alarmist about the flow of skills and knowledge into the IT industry from the country's research base was a committee that reported to the Department of Industry a couple of months ago. A group of electronics specialists chaired by John Alvey, the director of technology at British Telecom, said the government should contribute the lion's share of a £350 million programme over five years to coordinate IT research in universities and industry. This kind of investment, said the committee, is needed to keep Britain competitive with other nations like Japan and the US where IT developments are proceeding apace. Kenneth Baker indicated recently that he thinks such a plan should go ahead; if he can persuade his government colleagues that the scheme could make a real contribution to economic growth it could start up by the spring.

Just like any other technology, IT is intrinsically neither good nor bad for Britain. Everything depends on how the country adapts itself to using information technology. And there is nothing inevitable about the way IT will shape the country. Britain is quite at liberty to ignore the technology and pursue its own unique way of doing things. But then it would have to face up to the fact that, by comparison with much of the rest of the world, it would grow steadily poorer with no chance of arresting that trend until well into the next century. []

<2The great British small computer>2

<2A host of small British firms sells small computers and this activity consumes a great deal of the country's talent in electronics engineering. But will these little companies ever become big?>2

<2John Lamb>2

Spectrum, Jupiter Ace, Oric, Lynx and Dragon--the names represent probably the most promising part of Britain's information-technology industry. They are the names of microcomputers produced by a new breed of electronics entrepreneurs. The new companies, many of them under a year old and employing no more than a couple of dozen people, base their computers on processor chips imported from the US. Another segment of the same emerging industry is involved in an equally fast-moving activity--developing the software that provides small computers for homes, offices and schools with instructions.

Take, for instance, Intelligence (UK), a company in Wimbledon, south London, run by 28-year-old Ashley Ward. Until 1979, Ward worked as a racing driver. Then he took a job with an American firm that sold time on its computers to outsiders. He originally planned to earn enough cash selling computers to pay for the time he spent on the race track. That idea changed when he grew interested in a piece of computer software called Visicalc, with which people can produce and manipulate tabular information on a computer screen.
After failing to persuade his employer to produce a similar program, Ward developed one of his own, in partnership with an American programmer called Rusty Luring. Since February last year, Ward has sold 3600 copies of the program, called Micromodeller, mainly to run on Apple computers. With annual sales running at about £1 million, Intelligence (UK) now owns two subsidiaries--Herbvale, which produces software, and Merton Electronics, which makes electronic components. The company also exports programs to owners of computers in the United States.

One reason for Ward's success is the way he has promoted his products. He has emphasised public relations to some effect: a single article in a national newspaper brought 1000 enquiries about Micromodeller. In this sense Ward is not typical of the people who run the new electronics firms.

The British Technology Group, formed by a merger of the National Enterprise Board and the National Research Development Corporation, has invested in 11 small firms in electronics. "We are looking for people with entrepreneurial flair," said David Jones, who directs the board's efforts in the area of small companies. "We have to be sure that they have the right product and that they will be able to sell it. One of the weaknesses we find is that marketing is the skill most frequently lacking."

Another trend is that many of the new firms actually like being small. Eddie Bleasdale, managing director of Bleasdale Computer Systems, sums up: "We never want to become a big company because it is just not our scene. We want to specialise in high-reliability systems. We want people to come to us and ask us to develop systems to order."

Bleasdale already has an impressive track record. The firm has scored a hit in adapting to its own computer a set of American software called Unix. The software is called an operating system; it controls the basic "housekeeping" activities of a computer in a particularly efficient way. Bleasdale's own computer, called the BDC 60, is a "16-bit machine". It handles chunks of data in a format 16 bits wide; most computers, by contrast, can digest data in only 8-bit streams. This factor, together with the Unix operating system, makes the machine particularly powerful.

Five workers assemble Bleasdale's computers in a building in Lutterworth, Leicestershire, in which, four decades ago, Frank Whittle developed the jet engine. Eventually Bleasdale plans to increase the workforce to between 30 and 50.

Following the trend of many new industries, firms that make computers often spring up near each other. Britain contains three areas that vie for the title of a "mini" Silicon Valley; they are centred on Bristol, Reading and Cambridge.

Cambridge is the home of the British small computer. It was here that Sinclair Research, headed by Clive Sinclair, Britain's best known electronics entrepreneur, designed the ZX80, ZX81 and Spectrum computers. Other computer firms with particularly idiosyncratic names--Acorn, Torch, Camputers and Jupiter Cantab--have also set up in the city. Acorn, run by Herman Hauser and Chris Curry, pipped Sinclair to supply the BBC with a computer for its TV series, <1The Computer Programme>1, which taught people to use the machines. Advance orders alone ran to 12 000; ICL was signed up as the manufacturer but could not make the equipment quickly enough. Acorn also partly owns Torch Computers, which sells a £2500 machine for business users.
Like the computers made by Acorn, this can be plugged into what is called a local area network--a system of wires and terminals that connects small computers within a building. Local area networks are a Cambridge speciality. It all started in the 1970s when the Cambridge University Computer Laboratory designed a network called the Cambridge Ring.

One of the first enterprises to exploit the ring network is Xionics, based in London. Like Eddie Bleasdale, Xionics's chairman, Mike Bevan, is keen that his firm should concentrate on new products, rather than spend a lot of time setting up mechanisms for producing and selling goods in massive numbers. So Xionics does not get involved in selling its computer network, which it calls Xinet; instead it has handed responsibility for this to a distributor. The network comprises a storage and control device which is linked to a ring of terminals and computers. Xionics had an easier birth than most small companies: it managed to persuade BP, its first customer, to put up £80 000 toward the cost of developing Xinet. Since this first order in 1979, Xionics has sold networks to ICI, Allied Breweries, and the Cabinet Office. "We win every selection procedure hands down on technical grounds," boasts Bevan. "Our only problem has been our size and credibility." Next year Xionics plans to quadruple production at its factory in Letchworth and to set up deals to sell the products in Europe and the Middle East.

In Uxbridge, west London, another British company is making its mark in local area networks, but in a rather different way. An electronics specialist called Colin Crook set up Zynar three years ago as a subsidiary of the Rank leisure group. Zynar's main business is in importing Cluster One, a network made by Nestar, a Californian firm. Unlike Xinet, the network does not link machines made by more than one company. Instead the system connects computers made by only one firm--Apple--so letting them share expensive disc files, printers and so on.

Many of the small British information technology firms began life as importers and distributors of American-made systems. Comart, of St Neots near Cambridge, started as a mail-order business, selling computers in kit form to electronics enthusiasts. After problems in obtaining components, Comart struck out on its own, producing its Comart Communicator, a small business computer. The system is what computer salesmen call "modular": extra memory, storage, and processor chips can be added to fit the particular job the computer is expected to do. Comart expects to sell products worth £11 million this year, double the figure for 1981-82. David Broad, Comart's founder, explains that the company finances new investments mainly from sales. "The American way of doing things is to have an idea and then persuade someone to back it. Europeans tend to favour the entrepreneur who is self-funding; there is not the same confidence in new ventures over here."

One firm that may have bucked that trend is Star Computers, which has raised some £770 000 by selling shares on the Stock Exchange. Founded by David Blechner and Jack Schumann in 1973, the firm started out as a computer bureau, hiring time on its computers to customers who used them for their own jobs. Blechner likens the operation of a computer bureau to a laundry. "We took in other people's dirty washing," he says. "Then five years ago we woke up and decided to sell washing machines instead." Star is what is known as a systems house.
It buys minicomputers from the American firm Data General, adds financial software, and then sells the package to accountants. Some of Star's customers in the world of accountancy have even set themselves up as dealers.

A company with similar beginnings to Star but which has taken a different direction is ACT, based in Birmingham. ACT was a computer bureau and systems house until five years ago when it started selling an American machine called the ADDS. After this Roger Foster, the firm's chairman, decided to expand into selling software for microcomputers. In 1979 ACT bought Petsoft, a company that publishes programs for the Pet computer made by Commodore. This year, the firm has diversified even more--with a powerful American-built computer called Sirius, together with a set of business programs called Pulsar. So far ACT has sold 5000 of the computers, earning £8 million in the past six months.

A crucial question is whether Britain's small information-technology companies will ever grow big. In other words, will they form the ICLs and Ferrantis of the future? The omens are hardly promising. Of a total of 250 types of small computer on sale in Britain (a "small" computer is defined as costing under £10 000), about 30 are home-grown. The ratio is low, even though it is higher than in other areas of information technology; for instance, in minicomputers, British firms are heavily outnumbered by foreign companies. A more healthy statistic is that British companies account for one in three of the firms selling programs for microcomputers in this country.

Yet while companies like Sinclair and Acorn sell more computers in Britain than many of their overseas rivals, the products of the UK firms are relatively cheap, so the industry's overall sales are not high. The American companies continue to dominate the higher-priced end of the market. For instance, Apple last year sold in Britain computers worth £42.5 million while Sinclair managed sales of just £8.6 million. It seems certain that the number of suppliers will also decrease once the well-oiled sales machines of IBM and Digital Equipment swing into action next year. Both firms have announced small computers and plan big sales campaigns. IBM is already planning to make its machines in a factory in Scotland.

The disadvantage that besets British firms is that they cannot sell enough products in the UK to generate the cash needed to promote their wares overseas and develop new products. The fledgling computer companies also have to contend with the generally poor track record of Britain's financial institutions in backing high-technology firms with the kind of cash needed to turn little firms into big ones. Added to this, many of the firms do not even want to become big; they started small and like it that way. Given this set of circumstances, it could be that the new wave of information technology firms will never turn into a real breaker, but be seen in a few years as just a ripple on the pond.
[]

<2Battling IT out on the factory floor>2

<2Engineers often introduce into factories new electronic machinery not because it is better than the old methods but to give management the upper hand over the workers>2

<2Barry Wilkinson>2

At the heart of attempts by firms in Britain to introduce modern technology is a battle for control of the shopfloor between management and factory worker. As a result, the precise direction of innovation by the companies that introduce new technology often has little to do with improved efficiency or better production, but depends on the different political interests of the managers and workers involved.

Microelectronic applications to production machinery usually come in the form of control devices; so, on the factory floor at least, new technology is essentially bound up with control over the pace of production and the quality of output. By setting or programming these devices, a worker can predetermine the movement of machinery. This eliminates the need for an operator at the machine itself to intervene continually in the production process. As a result a company may have a golden opportunity to remove its operators from their central role in controlling production.

Most people's view of how technology alters the workplace barely takes into account this shift in the emphasis of who controls production. In the conventional view of technological change in factories, computer-controlled machinery inevitably sweeps through industry because of the enormous competitive advantages. Firms that fail to take advantage go bust. And the people on the receiving end of technological advance, so the wisdom goes, simply have to adjust to the consequences. You can't stand in the way of progress!

If this picture held true, then one would expect it to be confirmed in the real world of industry. In particular, one would expect to find two things. First, the motivations of managers in introducing new technology would be simply <1economic>1--to increase efficiency, productivity and so on. Secondly, the social consequences of the technology would be inevitable, following directly from the logic of the change. More skills might be needed here, fewer there; some jobs might become redundant, but on the other hand others would probably be created. In my case studies this picture almost invariably did <1not>1 hold true. Motivations other than economic ones were often predominant.

In my research, I investigated the introduction of microelectronically controlled machines in engineering firms involved in what is called batch production. In this, goods are made not in continuous runs but in bursts of, for instance, 20 components at a time. In between batches, machinery is reset so that the next run of components is slightly different from the first. In the past, batch production has been difficult to automate because traditional control devices were neither cheap nor flexible enough to cater for the need for frequent readjustments of machinery. With modern microelectronic devices, however, automation is no longer only to be seen in factories that make identical goods in continuous streams, but is spreading to batch production.

In one case study, an engineering company decided to automate a plating line. In plating, components are dropped into various solutions to coat them with a metal such as zinc or aluminium.
Under the company's old system, three men worked on each plating line. The men loaded parts onto a system of fixtures and controlled the fixtures as they dipped the parts into vats of different solutions. The workers also unloaded the "plated" components at the end of the process. In the new system, an electronic device controls the mechanisms holding the parts. The only jobs left (besides that of setting the equipment) are loading and unloading. So the new lines would each need only two workers, the increase in productivity paying for the new control system within a couple of years.

In terms of the conventional wisdom, the motivation for the new technology appears uncomplicated--improved production and so on. Yet during interviews with the managers involved, I found that other factors were at work. The managers complained that the workers were lazy and unreliable. They would take long breaks, and would sometimes work slowly, leaving components in vats for longer than necessary. In the eyes of the supervisors, "automation" was a way of gaining greater control over the pace and quality of output. Further, managers tried to consolidate their new powers by putting the new controls well beyond the reach of any shop-floor worker--on the other side of the factory, where only managers and engineers were permitted.

Yet by no means everything went according to the management's plans. Initially, the new system would break down as a result of faults in the computer programs. So the platers had to use a manual override to get any production at all. Even after the programs had been proven, the workers continued to use the override, so thwarting the aims of their supervisors and retaining an element of control. In a sense, then, little had changed, as the managers had to resort to the old method of cajoling the workforce to extract maximum effort.

The episode may seem to illustrate sheer bloody-mindedness, but the platers' action was not unreasonable from their point of view. Some took a pride in their work and explained that by using the override they could get a better product. In their view, the automatic machinery was unreliable and the management was wrong to attempt to wrest controls from the shop-floor. As events turned out, therefore, the management might have done better to realise that there are competing definitions as to what constitutes "efficiency". In other words, the views of the shop-floor workers on what constitutes good production may have held some truth.

In another factory, this time a machine shop, managers were divided on whether to allow shop-floor workers some aspects of control over setting computerised tools. Over several years, the company had introduced these new machines onto the shop floor. The machines included equipment with which an operator could do programming by editing the computer tapes that white-collar staff provided. The tapes contain instructions that guide the machines' cutting heads to shape components. Even though producing the tapes was primarily the responsibility of the computer staff, the machine operators insisted on correcting faults in the programming with the editing equipment. Sometimes they even tried embellishments that improved the tools' performance. Though most managers recognised the remarkable achievements of the machinists, the programmers were unhappy that blue-collar workers remained in control.
However, the superintendents and foremen who worked on the factory floor, many having themselves been skilled machinists, sided with the operators and consequently nothing happened to change the balance of power. The computer programmers, however, remained disgruntled. These working practices had developed quite informally, and had become "institutionalised" in a set of unwritten rules that in the end failed to satisfy anyone completely. Unwritten rules may be a precarious basis on which to run a factory.

What should have happened, perhaps, was for all the actors in this particular drama to have realised the rival groups' strengths and weaknesses and for senior management to have developed control procedures that divided up responsibilities for the production process in an equitable way. For instance, management could have recognised right from the outset that the machinists had a vital role to play and not left this to be argued about in a battle with the computer staff. Had the management sided with the latter, it would have had to stop the operators from "tampering" with the machine controls--perhaps by withholding a key that switches on the editing equipment.

Such a tactic of isolating one group of workers from the control of technology was tried by another company I visited, which made goods from rubber moulds. As in the case of the plating company, operators had grown familiar with a new set of moulding machinery soon after it had been installed. By adjusting the controls, the workers had managed to "work the system" to their own advantage. Managers became alarmed as, against all expectations, the operators managed to control the machinery just as well as their supervisors and demanded higher wages. The episode threw into chaos the normal wage bargaining consultations and only after several months of discussions was a new system of work organisation finally thrashed out. But managers told me that they had learned from the experience. They planned that the next wave of new machinery would be installed in an area away from the shop-floor where engineers would try out working practices so that shop-floor workers would simply not have the chance to gain the upper hand.

Forgetting that the shop-floor is a "contested terrain", and that workers' aspirations for control are frequently translated into attempts to create desired working practices, is to miss a vital aspect of changes in production technology. But to expose and come to terms with these aspects is often difficult. After all, it may be in the interests of some people, especially those who introduce the new technology, not to admit to the existence of the real forces that shape the direction of technical change. A manager may have to justify, perhaps to a director who knows nothing about shopfloor politics, a particular piece of new machinery. If the manager can dress up the equipment as a way to increase efficiency, then he is far more likely to win his case than if he presents the hardware as a means of winning a political battle. []

<2The right way to wire up the nation>2

<2Information technology is impotent without a versatile and effective mechanism for channelling the electronic signals that encode the information. That is why Britain should base its efforts to cable up the country on optical fibre, rather than on an obsolescent technology>2

The issue for Britain now is not <1whether>1 the country should be cabled, but how.
Decisions made over the next three months seem certain to shape the future of work and play in Britain until well into the next century. The cable will be based either on conventional coaxial links made from copper or on the newer optical fibres that have much higher "bandwidth"--in other words they can carry higher volumes of the signals that represent communications traffic. The system will handle not only telephone calls and data messages but other signals that need high bandwidth, for instance those that encode TV pictures or large stores of computerised information. Furthermore, cable may make it possible for people to interchange information on a rapid two-way basis. For example, high-bandwidth cable could be vital in services in which people "dial up" their banks from home to conduct financial transactions, perhaps with organisations at the other end of the country or even abroad.

The Department of Industry will decide by March what kind of cable system should be laid and who is to provide it. Work will then start as soon as possible so that major conurbations at least are cabled by 1986. Once workers have installed Britain's new cable, no one expects that the country will want to replace it--at least not for a long time. The disruption caused by a second spell in which roads and pavements are dug up would be too great. That is why the imminent decisions are so vital. Unfortunately the technical, political and commercial issues are anything but clear-cut. All the signs are that, with the tight time-scale, Britain will reach a compromise decision which could be bad for the nation and bad for the future of information technology.

In June 1981, prime minister Margaret Thatcher appointed a panel to advise her government on all matters relating to information technology. In March this year, in its first report, the panel said Britain should be cabled in time for the country's satellite TV service due to start in 1986. The future of cable is intrinsically linked with satellite TV. A consortium of British Aerospace, GEC-Marconi and British Telecom is to operate the satellite, which will beam programmes from space directly to small "dishes" on houses and in back gardens. Initially the satellite will carry two TV channels (out of a maximum of five permitted by international allocation), both provided by the BBC. One channel will be a subscription service, mainly feature films and major sporting events. The other channel will draw on TV programmes transmitted around the world and will be financed entirely from licence fees, which will have to increase as a result.

The government's information-technology advisers pointed out that British householders will not find it as easy to receive satellite signals as many believe. The aerial, a 0.9 metre dish, must point at a satellite hovering in space between 17° and 28° above the horizon. It must "see" the satellite with an accuracy of half a degree and be unobscured by trees or roof tiles. As many as a quarter of British houses may be unable to erect a dish that can be lined up with the satellite with enough accuracy for good reception. This is why cable and satellite technology must advance together; reception centres in parts of Britain with a good view of the satellite could pick up the signals and relay them by cable into nearby homes.

In the US, cable TV is already well established. Reception from conventional ground-based transmission equipment is so bad that many families can watch television only if they receive signals piped in by cable.
Some British homes already rely on cable TV. These are in new towns like Milton Keynes that prohibit TV aerials, or in valleys that cannot receive signals at all. Both in Britain and in the US, cable feeds usually supply more programmes than are locally available via radio waves.

Apart from a few technical experiments, cable services the world over rely on coaxial copper cable similar to the wires that connect the TVs in people's living rooms to the aerials on the roof. As the price of copper increases, coaxial cable becomes more expensive. Also, like any other electrical wiring, especially if it is run underground, coaxial cable can be affected by damp. Further, and most importantly, coaxial cable has a relatively small bandwidth, which limits the range of radio frequencies and so the number of channels that it can carry. Each British TV channel takes up about 8 MHz of bandwidth (whereas in the US the TV channels are narrower and the pictures consequently less clear). Many of the coaxial links in the ground today can carry only a handful of different channels at the same time. Modern coaxial systems, of the kind that American firms are supplying, can carry around 50 channels. But even this is stretching the technology.

An alternative approach--optical fibre--has much to recommend it. Optical fibre is made of very thin glass, which is flexible and so transparent that it can transmit light with very little attenuation. If this light is modulated, or switched on and off to create pulses, an optical fibre can carry audio and video signals just like a coaxial cable. Because light waves have a high frequency, modulated light can carry signals of wide bandwidth. A bundle of light fibres, which together forms a cable smaller in diameter than a coaxial wire, can carry several hundred TV channels--more than even the most hardened television addict will ever require. The bandwidth of such a system is wide enough to cope with high-definition television, which requires around 30 MHz, instead of 8 MHz, per channel. And an optical fibre can easily accommodate two-way communication.

Optical fibre promises also to be cheap, because it is made of sand. It is immune from damp and electrical interference. But a snag is that the fibre requires an opto-electronic converter where the subscriber wants to plug into the cable. This unit converts light pulses into electrical signals and vice versa.

Mrs Thatcher's panel of advisers recommended against linking homes with optical fibres on the grounds of cost. The group estimated that bringing coaxial links into the average British home would cost between £200 and £300. The comparable figure, it said, for optical fibres was nearly £2000. But the advisers subsequently admitted that they had based this estimate solely on figures supplied by British Telecom. These, in turn, had derived from an experiment in wiring up 18 homes in Milton Keynes--hardly a representative scheme. In truth, the price of the solid-state lasers and photoreceptors needed in the termination hardware is continually falling. For a few years it would undoubtedly be more expensive to cable Britain with optical-fibre links directly into the home. But by the mid-1980s, mass production will have reduced the cost of solid-state opto-electronics to insignificant levels. In their report, the prime minister's advisers did not come out against optical fibres altogether.
They suggested a mixed system in which the trunk cables that connect towns and neighbourhoods are fibre, leaving coaxial wires to carry signals into houses and offices from a series of switching stations each serving about 100 subscribers.

In more technical terms, this approach differs from the tree-and-branch system of cabling, the type used in the US. In a tree-and-branch network, all the input signals are fed into a main trunk and branches successively split off to serve each subscriber individually. So each link in the system has to handle the whole range, and bandwidth, of services provided. Instead of tree and branch, the panel recommended the newer concept of a switched, or star, system. In a star network, trunk lines carry all the signals to switching points which route selected services to individual subscribers. Each subscriber in a star system selects the signals he wants by keying instructions into a control pad.

With star switching, people would find it relatively easy to connect devices such as microprocessors and business computers. With such equipment, staff could work from home instead of commuting to offices. But in advising that the switched cable links into homes should be of narrow-bandwidth coaxial cable rather than optical fibre, the information-technology panel was being short-sighted, according to some observers. Its approach would drastically limit the volume of information that the cable system can carry.

In yet another report, Lord Hunt of Tanworth ignored the vital issue of technical standards for a future cabling system. In October, Hunt reported to William Whitelaw, the home secretary, on the consequences for Britain of the new technology and how it should be implemented. Hunt commented that his inquiry "was not concerned with the technology to be used in cabling the country, for example whether coaxial or fibre optic cable should be used". According to Lord Hunt, "decisions on the technical side don't affect the services on screen".

But at least one minister expected Hunt to take a different line. On 20 April, Kenneth Baker, the minister for information technology, discussed in a lecture in Birmingham the standards for a cable system. He said: "Lord Hunt will be looking at these and many other areas in which we expect there to be a lively public debate." Baker then went on to emphasise the need for a wide-bandwidth system to cope with services such as home shopping, banking, burglar alarms, fire alarms, and message transmission. "Broad-band cable means much more than an increase in the number of TV channels," he said. But John Butcher, a junior industry minister, said a few weeks ago: "The Department of Industry is aware of the danger that future cable development should not be 'boxed in' by overspecified, futuristic or inflexible requirements for cable systems. We are determined to see the earliest possible introduction of cable and this means that national requirements should be minimal."

Kenneth Baker cleared up some of the confusion in his statement to the House of Commons last week. He said that the government wished companies that get involved in cabling to emphasise the more advanced "star" systems. Even if operators choose to base their networks on the tree-and-branch system, they will be required to lay their ducts in such a way that the star technique can be accommodated later on. And operators of star networks will be offered longer franchises than those who supply the older technology.
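The channel arithmetic behind these bandwidth claims is simple enough to sketch. In the fragment below, a minimal illustration rather than an engineering calculation, the 8 MHz and 30 MHz per-channel figures are those quoted earlier; the usable bandwidths assumed for coaxial cable and for fibre are hypothetical round numbers, chosen only to reproduce the orders of magnitude mentioned in the article, not measured values.

```python
# Rough channel-capacity arithmetic for the two cable technologies discussed above.
# Per-channel figures (8 MHz, 30 MHz) come from the article; the usable bandwidths
# assumed for each medium are illustrative guesses, not quoted specifications.

CHANNEL_MHZ = 8          # ordinary British TV channel (article figure)
HDTV_CHANNEL_MHZ = 30    # high-definition channel (article figure)

# Assumed usable bandwidth per medium, in MHz (hypothetical round numbers).
media_mhz = {
    "modern coaxial cable": 400,    # consistent with "around 50 channels"
    "single optical fibre": 3000,   # order-of-magnitude assumption only
}

for medium, bandwidth in media_mhz.items():
    ordinary = bandwidth // CHANNEL_MHZ
    high_def = bandwidth // HDTV_CHANNEL_MHZ
    print(f"{medium}: ~{ordinary} ordinary or ~{high_def} high-definition channels")
```

The assumed 400 MHz reproduces the "around 50 channels" quoted for modern coaxial systems; fibre's advantage comes simply from its far wider usable spectrum, which is also what leaves headroom for two-way services.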
All this, of course, presupposes that private firms do the lion's share of the cabling--the digging of ducts, provision of the wiring and installation of equipment in homes. But in the wings, British Telecom wants to get in on the act. The corporation (currently state-owned but soon to be privatised) already has considerable experience with optical-fibre technology, and argues that it should be allowed to play a large role in cabling Britain.

Now that Hunt's committee has ducked all the technical issues and government ministers cannot agree on a policy, officials inside the Department of Industry are trying to decide how tightly the government should control the type of cable system laid, what technology to recommend and who should be allowed to do what. With the short time available and the conflicting issues so confused, it seems that the department will recommend a typical British compromise. Under this, British Telecom would lay optical-fibre trunk lines, leaving private enterprise free to install the switched links to interface both with the trunk lines and with domestic TV sets. If such a system is adopted, it may keep TV viewers happy for a few years, but it will prove a handicap to the growth of information technology and a source of discontent within a very few years. []

<2Shorter words mean faster reading>2

<2Spelling is at the very heart of information technology but, so far in IT Year, it has been largely ignored. The technique of removing surplus letters from English words, for example, makes the language easier to learn, quicker to read and may help to reduce eye strain>2

English spelling needs to be updated. Its unnecessary difficulties are an economic burden, contribute to educational and social problems and make communications less efficient than they should be. The British government has spent £1.2 million in 1982 to promote information technology and overlooked its essential base--words.

Spelling is often regarded as a totem to be preserved by patriots, rather than a technology, because it was invented thousands of years ago. It is a method of writing down a language, and it is commonly confused with the language itself. English spelling was standardised in its present form only a little more than 200 years ago. The guiding principles then of etymology and precedent would not be acceptable today. The weaknesses of that scholarly approach are obvious--for example the lack of rationale for -<1ible>1 and -<1able>1 or the problems over deciding between -<1ease,>1 -<1eese,>1 -<1ese,>1 -<1eze,>1 -<1eeze,>1 -<1ieze>1 and -<1eize.>1

Spelling reform has been an area for armchair arguers and most theories are untested. However, research in cognitive psychology, cybernetics and comparative linguistics is beginning to shake old assumptions and to provide guidelines for investigation. Studies at Aberdeen University are testing how literate adults and learners could benefit from better spelling. Four aspects are being investigated: removal of surplus letters; the most useful forms of sound-symbol relationships; whether English spelling could follow its own rules consistently; and how everything in print could stay readable.

One early finding has immediate practical value. Text with surplus and misleading letters dropped from the spelling can speed adult reading and help backward readers to make fewer mistakes, within minutes. This is not the same as shorthand, where omitted letters must be reconstituted for reading. "Surplus" can be determined by experiment.
In practice it means "serving no apparent purpose for either pronunciation or meaning", as in: <1acomodation, miselaneus, reserch, lern, caotic, literat, hav, giv, wer, ar, stil, wil, tho, thro, qik, jump, skipd, anser.>1

Shortened spelling is being tested on word-processor screens. The possibility of reducing words here could be very useful as televised text and microfiche present problems with eye fatigue and space. Cutting out surplus letters reduces the effort of reading and words become closer to compact "pictures", like Japanese and some Eastern alphabets which have been claimed to allow faster reading than English. Because longer words are usually more difficult for learners, the interests of learners and users may coincide. The aim is to remove clutter, whereas the random omission of letters familiar to readers of well-known newspapers is a trend to be discouraged.

Why try to improve English spelling? There are more than 260 ways to spell the 18 to 20 basic English vowel sounds, and there are another 226 forms for the 23 consonants. Some theorists claim that English spelling is a triumph of English muddling-through that achieves the best possible compromise between conflicting needs. It is claimed to be simple for users and hard for learners, to aid readers and to hinder writers, and to speed the reading of the able while condemning the less able to semi-literacy. English spelling is also said to suit native speakers and to limit the international use of the language and its role as an educational medium in polyglot countries. The differing needs might not be irreconcilable with an improved system.

Most claims for present spelling do not hold, or do not apply consistently. It is a tautology to say that English spelling is excellent because the reader can jump around for clues on so many different levels (semantic, visual, phonic, contextual, etymological, and so on). The reason that so many clues are needed is that they are chaotic and unreliable. Teachers often call reading "a psycholinguistic guessing game".

Fashions in teaching come and go because none has solved the problem of English spelling. Coercive drilling produced too many "mindless reciters" with little time for other learning. Visual rote-learning by "look and say" methods puts a greater burden on the less able. Phonetics (sounding out words) is too unreliable. Even the best mixed methods are time-consuming and fallible. Today, with children turning to television for effortless, alternative entertainment, a distressing and accelerating trend in the teaching of reading is to sugar-coat the pill. Books are filled with pictures rather than text, and with trivial content and banal style, to make them "easier to read".

A spelling system should be reasonable, so that beginners can use their intelligence in learning to read, and learn in 18 months, as in some European languages, instead of taking three years plus, as in Britain and the US. Children and overseas students should be able to learn new words without constant assistance or confinement to a restricted vocabulary.

In the future, "visible language" may be revolutionised--perhaps by language-crossing symbols without the problems of Chinese. In the interim, English spelling could be improved, retaining the Roman alphabet, for continuity, perhaps with modifications. It is a handicap having 26 Latin letters for forty-odd English sounds. It is untrue that the only way to improve spelling is to return to the principle of "spelling as you speak" based on phonemics.
This is easier to learn and can be learnt at a young age. The primary schools' experiments of the 1960s with the initial teaching alphabet (ITA) proved this. But there are no comparative tests to see whether any other form of spelling might be better still, without the disadvantages of ITA's divergence from familiar typefaces and its problems for those who have difficulty distinguishing all the speech sounds required. The "natural spelling" of beginners in many cases tends to condense rather than elaborate. Adults can read "spelling as you speak", although Dr John Beech of the New University of Ulster found that an afternoon was not enough to reach normal reading speeds in one well-known version, World English Spelling. Standardising a phonemic spelling could be controversial because of the variety of spoken English. Further, the visual links with present spelling and other European languages may be weakened. Identical spellings for words that sound the same are not a real problem. Context made the meaning clear for the following words with multiple meanings in the previous few paragraphs: <1book, original,>1 <1form, still, familiar, type, tends, well, lost, sounds, can,>1 <1found, links, present.>1 Another approach is to apply consistently the supposed advantages of present spelling. Beech found that, when the most common rules of our inconsistent system are applied consistently in a piece of fiction, adults regained their normal reading speed after reading some 6000 words. Beech's idea of consistent rules could be developed for even easier reading and learning. One advantage claimed for present spelling is that related words keep a similar appearance even if there are minor sound changes in, for example, <1divide, division; anxious, anxiety.>1 Beech claims that this aids fast readers skimming for meaning as well as helping learners with new words. In fact, English spelling abandons this principle more often than it keeps it. For example: <1speak, speech>1 or <1fly, flies, flight, flew.>1 One interesting possibility is to extend, when possible, the convention that a vowel sound can be distinguished by the presence or absence of a following "e". That is, instead of <1finite, infinite, finish, final>1; or, in a "phonemic" version, <1fieniet, infinit, finish, fienal;>1 both meaning and sound could be clear in <1finite, infinit, finish, finel.>1 Reading failure can have many causes. Often schoolchildren in their early teens, who struggle through a reading test with a 15 per cent error rate, reduce errors to 2 per cent and visibly gain confidence on first meeting a rule-based spelling. What they can read without any introduction is certainly possible for literate adults to adapt to.

<2Foreigners do better with fewer words>2

One aim of experiments is to search for a compact spelling which can show sound and meaning as directly as possible, meet the needs of all who learn and use it, and is still close to existing spelling which has, so to speak, been "cleaned up". Shortened spelling is often read faster than conventional spelling by beginners, backward readers and foreigners. Readers who rely heavily on conventional visual rote-learning may adjust more slowly. The "rule-based" spelling currently being tested is about 20 per cent slower to read for the first 1200 words, although some backward readers and beginners immediately improve speed or accuracy.
A comparison with pictures may explain why it could be more effective to modify exact sound-symbol relationships with rule-based conventions in spelling. Phonetic representation of specific speech sounds is like a photograph of a specific person. Phonemes for the sounds in a particular language are like a sketch of a type of person; a conventionalised drawing of a man can represent any human being anywhere. A "diaphone" (the ways in which a phoneme can be pronounced) is somewhat similar. For example, in "banana", the letter "a" is a convention representing three different sounds all with an "a" quality about them. They are easy to read and spell for this word, and variant pronunciations of "banana" are recognisable across cultures. In popular belief, adults would take years or generations to adapt to spelling change, although in this decade vaster changes have taken place in everything else. But in fact readers can adapt to shortened spelling within minutes, regardless of their own views or suspicions. The introduction of change may be most acceptable, and often unnoticeable, through new media such as video screens. It can co-exist as alternative spellings during transition--indeed, a hundred or so words have been simplified this century when dictionaries have listed alternative spellings and the public has opted for the easier way of arranging the letters. It is a wasteful evasion of the problem to build English dictionaries into computers to correct people's spelling. The same computer techniques could be more usefully employed in standardising the printed word during transition, and the public would become acclimatised through everyday reading rather than need any special training. There is a bandwagon effect that becomes apparent once initiatives are taken. Spelling must be identified as a human invention, open to research and development, as much as every other technological element in the spectacular and marvellous world of microprocessors, video, telecommunications--and, not least, of books. []

<2When the planning has to stop>2
--------------------------------------------------------------------------
<2France has grandiose plans to ensure that information technology will buoy up the nation's economy>2
<2and improve the quality of life. But the country is learning painfully that planning is not everything>2

Undeterred by past failures, government planners in France are once again designing an ideal information-technology world. Faced with France's multicoloured vision of IT, observers could be forgiven for thinking other nations are still preoccupied with the small-screen, black-and-white version. IT will solve France's problems, protect the nation from cultural invasion and give the man in the street a new cultural <1joie de vivre>1--or so the planners say. Yet despite the nation's visionaries, reality is vastly different. IBM with its #18000 million turnover does far better at IT than France Inc. with its #400000 million gross domestic product. One reason is that IBM backs its unofficial motto--THINK--with action. The French, on the other hand, have so far stopped short at thinking. René Descartes, the 17th-century French philosopher, gave the world the expression <1Cogito, ergo sum>1 (I think, therefore I am) and endowed the French with an invincible confidence in their very own way of looking at the world. Unfortunately, French planners seem to have adopted the philosophy <1Cogito, ergo est>1 (thinking enough about something will make it happen).
Just as the idea of a voyage was more satisfying than the trip itself for the poet Baudelaire, the Romantic tradition in France is strong enough for the idea of a well thought-out plan to be more important than its execution. "French industry is more interested in production than in sales. We must correct this tendency, which is almost a cultural problem," Louis Gallois, a leading civil servant, told a conference last month, so revealing that planners have at last caught a whiff of their inbuilt handicap. Despite grand plans for the "informatisation of society", the partnership of government and industry which operated under the government of Valéry Giscard d'Estaing left the French information-technology industry in a mess--at least compared with American or Japanese firms. "Ambitious but realistic" targets in the nation's IT balance of payments were achieved, but not maintained. Last year's deficit was about #125 million (although this compares well with the comparable figure for the UK, which is almost three times higher) and this year the deficit will probably be about the same. The nation's main data-processing firm, Cii-Honeywell Bull, will on its own lose around #90 million this year. Like Britain's ICL, it was formed from several disparate companies offering different types of computer. In the past few years it has suffered from serious errors in planning and a lack of investment. The current government and Jean-Pierre Brulé, the sacked head of Cii-Honeywell Bull, blame Saint Gobain, the firm's controlling shareholder, for taking more money out of the company than it put in. But not only Cii-Honeywell Bull failed to make the grade. Many of the once-vaunted French "péri-informatique" companies (those making small computers and associated equipment) also came a cropper. Some of those which did worst benefited from funds under France's 1976 peri-informatics plan. Notable amongst these was Thomson CSF's minicomputer subsidiary, a company called <1SEMS.>1 In 1976, SEMS was the world's fourth largest maker of minicomputers. But since then it has failed to hold its own against big American firms such as DEC--both in sales and in technical performance. Moreover, the government's decision to make SEMS responsible for developing minicomputers added to the problems of Cii-Honeywell Bull. Refusing to be excluded from this growing sector, the company started to make and sell "minis" developed by one of its shareholders, the American firm Honeywell. The move cut across the efforts of SEMS, and stopped it achieving the sales it had hoped for. If industry and government planners scored low marks during the past decade, they were in good company. The autocratic French postal and telecommunications administration (PTT) muffed several of its biggest projects. After saying it would produce a #200 facsimile-transmission unit designed to bring electronic mail to the person in the street, the PTT has finally acknowledged that the project is a non-starter. Five or six years ago, the PTT also announced grandiose plans for a nationwide network of videoconference centres. Some 200 were due to have started by the late 1970s; with the centres, business people in towns around France could have held conferences over telecommunications links with counterparts on the other side of the country. But only a few centres ever opened their doors. Even the PTT's "electronic telephone directory" turned out to cost twice as much as the #50 planned.
With the directory, householders can find out telephone numbers by pressing keys on a terminal rather than consulting the operator. The chances of the administration installing 34 million terminals by 1992 (its original target) seem remote. Officially announced as available from October this year, the cheap terminal (now called Minitel) is still unheard of in some official PTT outlets in Paris. The PTT also boasted it would install 20 000 kilometres of optical fibre in Biarritz. The final total will probably be one-eighth of this figure. In consumer electronics, the French have had a more than usually protected market because of the SECAM colour television standard. This makes it difficult for foreign firms to sell TVs in France. Thomson has managed to hold its own on home soil, but it has done lamentably in selling domestic video equipment. Now nationalised and backed by government money, the firm may buy its way into video technology and markets. The company recently developed a #300 microcomputer, though this compares poorly with cheaper devices made by Clive Sinclair, Britain's leading electronics entrepreneur. The programming side of IT has also been a weakly child. French television and film output is of poor quality and has achieved little success worldwide. On the other hand, American and British imports have done well. Despite efforts by French publishers, very few firms sell software for the increasing numbers of computer owners in schools and at home. The catalogue of failures continues. The one major attempt at international cooperation has turned sour. Saint Gobain took a substantial share in Olivetti to try to cement a tie between Cii-Honeywell Bull and the big Italian multinational. Olivetti has so far refused to play ball, effectively pocketing Saint Gobain's money, which could have gone to its own computer company. As well as failing to make the top grade in the IT industry, France seems to have fallen short of its ambitions in using the technology. The lukewarm reception among the public for the electronic telephone directory is but one example of how the French people are not always ready to react according to the government's plans. The country is behind Britain in using microcomputers. Apple claims that Britain accounts for 35 per cent of the small computers sold in Europe, with Germany and France level at around 20 per cent. Office automation is another area where the French may be lagging. Louis Nauges, a commentator on the electronics industry, says that Europe as a whole is ten years behind the US in using automation in the office. France is particularly bad; a Frost & Sullivan study shows that word-processing software sales in France are less than one-third those in the UK. The last of the big sectors where France has missed the boat, at least for the present, is robotics. With some exceptions (notably Renault), French industry has failed to instal modern manufacturing equipment. To make up for a late start, the government this year announced it would make available #125 million of grants to push industry into the robot age. What about the impact of IT on society? This is where French theorists have been very active. Bringing out reports on IT has been something of a growth industry. Witness the Tricot report on data processing and civil liberties in 1975, the Nora-Minc document on the informatisation of society in 1978 and, two years later, the Madec report on IT.
At one time, politicians said that, with data processing, firms would be able to set up offices in remote areas of France, well away from Paris. This could help the nation to spread economic development more evenly. But despite the existence of tariffs for data communications that are independent of distance, the great decentralisation phenomenon has yet to be observed. France scores a lot higher than the UK in legislation to protect individuals against the misuse of data that could threaten privacy. Yet the country's performance in this area hardly lives up to the promises made when the relevant laws were passed in 1978. Several months ago, France's interior minister announced a new file on people suspected of being terrorists. He conveniently forgot that before he could do this, the law obliged him to seek the opinion of France's data protection commission. There are a few bright stars in the French IT firmament. The most substantial is the nation's success in telecommunications. Ironically, the firm which achieved success in this area--CIT-Alcatel--was the most bitter opponent of Giscard d'Estaing. The company will be the only nationalised information-technology firm to make a profit this year. Software for large machines and computer services in general (computer bureaux, for example) add up to an important and successful industry; computer companies sell more equipment in France than in any other part of Europe. Ironically again, French software and computer-services companies have benefited little from direct government spending. They claim, indeed, that the government's outlay on software and services is lower in France than in many European countries, where public spending in this area is traditionally high. Finally, as a country with a competent military industry, the nation has a strong hand in military and complex civil electronics such as radar and broadcasting. Thomson CSF and Matra have excelled here. The other side of the coin is that Thomson, in particular, has had problems in selling to private firms; its sales force has, so it seems, learnt the right way of buttering up only government clients. President Mitterrand's government (which has still to finalise its IT plans after more than 18 months' deliberation) will, so it says, build on French strengths and remedy the weaknesses. The principal weapon will be money. Under the electronics plan, government and industry (including foreign firms in France like IBM) should spend #12000 million over the next five years. Just where the cash will be spent is not yet clear, though presumably much will finance R&D, and the development of new products and sales strategies. Louis Gallois points out that France's state programmes to develop telecommunications and nuclear energy consumed similar amounts of money during the 1970s--and largely succeeded. Jean-Claude Hirel, the head of the government's electronics division, cites the lack of spending as one reason why the previous electronics plans went wrong. Hirel says the new strategy will be better because companies will be told much more clearly what to do. Also, the government will place great store on encouraging firms to be competitive. "It's no good having a protected market," insists Hirel. "There won't be any promises of state purchases." One of the biggest handicaps faced by France's planners is the scepticism of private enterprise.
Companies do not much care for the interventionist policies of Mitterrand's government, including its decision to nationalise all the large IT companies. For the planners' part, they know that they must come up with good results to make up for the inadequacies of the previous strategies. But so far, the government has not dared to predict when good results may come. []

<2What whaling means to the Japanese>2
---------------------------------------------------------------------------
<2Japan's reluctance to abandon whaling has deep cultural and religious roots. If it is regarded>2
<2simply as oriental perversity the Japanese are unlikely to change their ways>2
<2Christopher Moreby>2

Japan, the leading whaling nation, seems set to resign from the International Whaling Commission. When the recent IWC annual meeting voted to ban all commercial whaling (25 for, 7 against) the Japanese delegation and press predictably condemned the decision. The need to catch whales goes back a thousand years or more in Japanese history. Whale meat made up for the lack of other sources of protein in the Japanese diet. In contrast, Western whaling nations needed the whale oil, although it is true that products from coastal whaling in Europe included meat as early as the 11th century. When Buddhism was first introduced into Japan in the 6th century, Buddhist tenets forbade the taking of life and hence eating meat from terrestrial animals, so preventing the development of a land source of animal protein. This religious ban compounded the chronic shortage of grazing land. However, Buddhist teaching was not applied to fish, which became the main source of protein in the Japanese diet; and whales were regarded as fish. When eating houses renamed forbidden deer and boar meat "mountain whale", the Japanese knew that eating whale meat was likely to stay. Japan now also utilises other parts of the whale's body: blubber for food; oil for many things, including boot polish; whalebone in all kinds of articles; teeth in handicrafts; skin for leather; and various internal organs that are put to assorted uses. At first, the Japanese simply took stranded whales for food. Then they began catching such coastal species as the great right and grey whales from rowing and sailing boats, using simple spears and harpoons. When coastal stocks began to dwindle Japan looked to species frequenting deeper waters, such as the rorquals (and others). The only trouble was, the whales sank when killed. So the Japanese developed, earlier and independently from the West, the idea of netting them to prevent sinking. Used first on the Kii Peninsula (south of Osaka), probably in the 17th century, the technique was simple. Lookouts on outlying islands warned fishing villages of approaching whales, whereupon great numbers of men in small boats gave chase. The boats surrounded the whales and drove them into nets, where they became enmeshed and were rendered helpless by harpoon thrusts. When Japan opened up to Western technology in the latter half of the 19th century its whaling fleets were smaller than their Western counterparts. But the Japanese soon realised the commercial potential of whaling and, adopting steamships and the newly-invented harpoon gun, their fleets grew. Japanese whaling greatly expanded, especially in the 1930s. By this time they were taking species such as the blue, sei and sperm that are found in the deeper oceans, and were whaling as far afield as the Antarctic and Atlantic oceans.
The Japanese whaling fleet contained only two factory ships before the Pacific War, but after 1945 the number increased to seven. The whale catches also increased until a peak in 1965, when the Japanese caught no fewer than 26 986 whales. Then the catches dropped suddenly. In 1966 the catch was 22 784; in 1970, 16 887; in 1976, 9632; and in 1979, 4618. Other whaling nations recognised the destruction of whale stocks and set up the International Whaling Commission in 1946 to control international whaling, just when Japan was developing her whaling industry to full commercial efficiency. Two whaling nations that had been among the largest, the United States and Britain, gave up whaling, and others have recently given up or are showing signs of doing so. At the recent IWC annual meeting, Iceland and Brazil, both previously die-hard whaling nations, abstained from voting for a total ban on whaling, but did not vote for a continuance. However, seven countries still refuse to vote for a total prohibition, including Japan, which is the leading protestor and which today has the largest whaling fleet. Only one Japanese factory ship remains in operation, and one Japanese whaling company is left, accounting for only a fraction of the income that once was earned from whale products. But Japan is usually quick to rationalise--that is, eliminate--her declining industries. So why does she continue so vociferously to flout IWC moratoria, and adopt such a recalcitrant attitude towards an industry that is so obviously in decline? Three possible explanations can be advanced. The first, given by the Japanese delegation to IWC meetings in recent years, is that whale meat is still a large part of the Japanese diet, just as chicken and pork are part of the European diet. But is this true? In 1978 whale meat accounted for only 1.5 per cent of total Japanese meat consumption; in 1970 it was 9 per cent. Per capita consumption of whale meat has dropped from 2 kilograms in 1960 to 0.63 kg in 1978. This fall in consumption is due partly to reduced demand and partly to increasing prices, in turn resulting from reduced catches. Whale meat in Japan is now a luxury item for wealthy businessmen, or for special occasions such as weddings and religious festivals. A second possible explanation is political. The ruling Liberal Democratic Party (LDP) owes much of its support to the rural electorate, including the fishing communities. The LDP fears that an attempt to phase out whaling, even for the theoretically temporary period of a moratorium, would lose it the support of the fishing communities. Even though fewer than three thousand people are now engaged in whaling, directly and indirectly, this loss of support might trigger a loss of confidence among other rural people and jeopardise the LDP's increasingly frail hold on government, an office it has held for more than 25 years. But probably the most important explanation of Japan's failure to conform to IWC policies is its continuing aversion to the demands of external agencies. This has its origin in the late 19th and early 20th centuries, when Japan realised it needed natural resources from overseas in order to industrialise effectively to compete with the West. Japan has few natural resources of its own, and any threat to those it does have is seen as a matter of life and death to the Japanese people.
In the 1930s, when the Japanese perceived that the Americans, British, Chinese and Dutch were threatening their supplies of oil and raw materials, they coined the term "ABCD encirclement", and used it eventually as an excuse to start the Pacific War. Today, Japan sees the whale issue as another example of the West imposing its own values on a vulnerable country. Indeed, opposition to whale conservation campaigns in Japan is based on appeals to nationalism. For one or all of these reasons, Japan is reluctant to accept the scientific evidence and recommendations of the IWC scientific committee, attacks the morality of the commission, and accuses the conservation lobby of racial prejudice. The Japanese claim, first, that because the science of whale stocks is uncertain, the benefit of the doubt must be given to the whalers. Secondly, they say there is no scientific evidence that whales are near extinction. And thirdly they argue that a model formulated by Japanese scientists to estimate the numbers of sperm whales, which is based on the age of whales at sexual maturity, offers better evidence about whale stocks than the model developed by the International Institute for Environment and Development, which uses the length of whales; yet the Japanese model does not fit the data as well as the IIED model. They also refuse to accept a recommendation of a zero quota for coastal sperm whales because, they say, this particular whale species is caught entirely within the 200-mile (320 km) exclusive economic zone, and is therefore free from IWC restrictions. When the IWC was established in 1946, its purpose was to work for the resuscitation of an industry that was declining because stocks had been over-exploited during the previous decades, rather than to conserve existing whale stocks. The Japanese whalers feel that the present IWC is more concerned with emotionalist conservation for its own sake than with preserving whale stocks for commercial exploitation. Thus they see no legal or moral obligation to accept any decision of the IWC on proposed quotas and moratoria. Hence their rejection of the sperm whale moratorium at the 1980-81 IWC meetings. Unfortunately, the Japanese attitude towards continuing whaling in the face of stern worldwide opposition implies that the Japanese people are not conservation minded. Both at IWC meetings and in published articles, conservationists have implied that the national temperament of the Japanese is both selfish and anti-conservationist. Yet although Japan is self-interested (like every other nation) it is not anti-conservationist. Japan now has 27 national parks, covering just over 5 per cent of the total land area, and over 50 quasi-national parks. These are scattered throughout the country and include most of the variety of habitats in Japan. Each year, millions of people visit these picturesque areas. A government environmental agency investigates many environmental problems such as conservation and pollution. There is even a whale research institute in Tokyo which contributes to a better understanding of whale ecology and whaling techniques. An explosive harpoon is being developed, not only to spoil less meat but also to kill whales quickly and humanely. The kind of aggression shown by organisations such as Greenpeace, which harasses whaling operations at sea, only raises the ire of whaling nations like Japan, and exacerbates the problem.
What are needed are peaceful campaigns aimed at the whaling communities via public demonstrations; Japan is not averse to such measures, as the countrywide anti-nuclear demonstrations show. Anti-whaling notices in the press and on television by, for example, the Japanese branches of Friends of the Earth and the World Wildlife Fund would help this campaign. An anti-whaling campaign could centre on, for example, alternatives to whale products, as listed in the <1Whale Manual>1 published by Friends of the Earth. Other approaches might originate from within the IWC itself. First, the IWC could concentrate on supporting whaling research, such as developing methods to assess the age and numbers of whale stocks accurately. If whaling scientists from all nations pooled their knowledge instead of battling with each other, then perhaps the mutual distrust between Japanese and non-Japanese whaling scientists might disappear. Cooperation in the conservation of whales could begin. Secondly, pragmatic politics could play a larger part in the thinking of the IWC. A compromise might be agreed with whaling nations who dissent from voting for moratoria by allowing them a more restricted quota than originally proposed. At subsequent meetings efforts could be made to reduce this quota further. Such an allowance was decided at the 1980-81 IWC annual meetings, when Japan was allowed to catch 890 sperm whales per year off the Japanese coast, where stocks are still high, despite the 1981 IWC ban on catching sperm whales. On the other hand, Japan must not assume that membership of the IWC means a green light to catch unlimited numbers of whales. The Japanese recognise that the whales in the oceans are not a limitless resource, and that these animals need time to breed and mature. Japanese whalers, as well as those from other whaling nations, must learn to catch whales selectively, and to spare immature whales and females in the breeding season. If Japan resigns from the IWC and refuses to catch whales sensibly, the United States might conceivably impose a trade embargo. After the trade wars with the West over cars and electronic goods, who can tell what that might mean for a country so dependent on trade? []

<2Molecular drive: a third force in evolution>2
-------------------------------------------------------------------------
<2Evolution may not be prompted solely by the reproductive excesses of a few strikingly superior>2
<2individuals. A newly-described mode of evolution, termed "molecular drive", causes the genetic>2
<2make-up of all individuals of a population to change in unison>2

A new and widespread mode of evolution has been described by a research group led by Dr Gabriel Dover, at the University of Cambridge. He argues that, even in the absence of natural selection or genetic drift, a population can go through a process of evolutionary change. Moreover, individuals in a population will change in unison, with very little genetic variation between them during the period of change. This unexpected cohesive mode of evolution he and his colleagues, E. Coen, T. Strachan and S. Brown, have called "molecular drive" (<1Nature,>1 vol 299, p 111). Evolutionists, up to now, have considered only two processes that could spread mutations of genes to all individuals in a population and so create evolutionary change.
The first and foremost is natural selection; individuals that are particularly well-adapted to their environment reproduce more than their fellows and so copies of their genes become more common. This idea dates, at least in principle, from Charles Darwin's <1The Origin of Species,>1 of 1859. More recently, population geneticists have recognised a second evolutionary force: genetic drift. Drift is a random process of genetic change due to "sampling error". By chance certain genes will become relatively common in a population simply because large numbers of gametes, which may contain other genes, are wasted. Dover is now proposing a third force that can genetically transform a species; transforming not because of the external forces of selection or drift, but from within. In Dover's molecular drive model, the genetic changes still arise through mutation. However, in contrast to the spread of classical gene mutations by selection and drift, Dover points out that some mutations can also be spread by internal mechanisms of "turnover" in DNA. This turnover, and its evolutionary consequences, occurs only in genes that have several copies within the same individual, so-called "families" of genes. Families of genes with multiple copies are numerous in all animals and plants studied so far. They can have from a handful of copies to several hundred, often scattered on many different chromosomes. In almost all families, the copies are unexpectedly similar to each other within each species and yet different between species. Good examples of this are the genes that give rise to the histones (proteins that make up the chromosomal superstructure), the ribosomal RNA, the immunoglobulins and many others. The extent to which genes within a family are similar to one another (that is, the degree of homogeneity in a family) depends upon a balance between the rate of homogenisation, the rate of mutation and the selective forces that also act on the family. Mutation (the alteration of a gene) and selection are well established. Homogenisation--the force keeping the genes in a family similar to each other--is more subtle; but the way in which it can occur can now be explained by a variety of molecular mechanisms which ensure, willy-nilly, that one variant copy eventually replaces all other copies. Turnover is a consequence of frequent molecular exchanges between copies of a family. Such exchanges can cause fluctuations in the prevalence of a particular variant copy, and are an unexpected consequence of the chemical and physical properties of DNA. In one type of exchange, called gene conversion, two initially slightly different copies end up sharing the DNA sequence of one of them. One copy converts the other. The direction of conversion is often random, but sometimes one variant copy persistently converts the other. Either random fluctuations in the direction of conversion, or a bias in the direction of conversion, eventually ensures that all copies of a gene family in all individuals are of the same type. The process of unequal exchange also enables one variant of a gene to supplant, eventually, an originally much more abundant alternative. In normal "crossing-over", two chromosomes exchange an exactly equivalent segment of their DNA. But if the exchange is unequal, one chromosome ends up with an extra copy of the segment and the other without the particular segment.
This leads to random fluctuations in the numbers of alternative variants making up gene families and, as in gene conversion, one variant copy may replace all the others throughout a sexual population. Another mechanism, called transposition, can also increase the frequency of one variant through the genome. Segments of DNA can make extra copies of themselves which then move off and re-insert on completely different chromosomes. Random gain-and-loss of variants through such jumping mechanisms can eventually homogenise a family in all individuals. Dover's crucial insight was the surprising realisation that the slow, gradual homogenisation of a gene family by these internal mechanisms could bring about a more or less synchronous genetic change in a sexually reproducing population. In each individual at any moment during the period of change, a similar proportion of the copies of a gene family will have been replaced by a new variant copy. This genetic cohesion of a population arises because, while the very slow processes of homogenisation are at work (among copies of a family on the same and different chromosomes), the chromosomes in a population are being mixed at each generation. The slow homogenisation of large families could take many thousands of generations; but chromosomes are constantly shared out among the members of a population. During meiosis (the cell divisions that create the gametes), chromosomes are randomly assorted. And during fertilisation, gametes meet up and fuse at random. So at every generation, each individual tends to get a comprehensive mix of the chromosomes that are available in the population. This random mixing of chromosomes is so much more rapid than the observed slow rates of the DNA turnover mechanisms responsible for family homogenisation that the genetic makeup of a family tends to be very similar in all individuals. The genetic cohesiveness of a population may have some surprising evolutionary consequences. A mutant copy, spread throughout a gene family, may alter some aspects of the animal's form or physiology (phenotype). But the change in phenotype would be gradual and concurrent in all individuals; the genetic variation and differences in fitness between individuals in the population would be kept to a minimum. Many of the genes in a family might have to be converted to a new variant before any significant change in the phenotype occurs--but by that time all individuals would have a similar proportion of the new variant and so would be physically very similar. If the change in phenotype is deleterious and selection acts against it, then many individuals would suffer the same fate. But if selection positively encourages the change or is neutral to it, most or all the individuals in a population would gradually change from one phenotypic form to another. It is this change "in unison" that is such a distinctive feature of molecular drive. Dover points out that while non-Mendelian segregation of single-copy genes can accelerate the increase in the frequency of a favoured mutation, this process would not create the cohesive pattern of genetic change that occurs when similar processes operate within a gene family whose members are distributed on two or more chromosomes. Such a cohesive transformation under molecular drive could initiate new modes of biological organisation. Dozens of gene families have been described with precise effects on the phenotype of organisms.
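The cohesion argument--that slow, biased turnover within a gene family, combined with rapid chromosome mixing at every generation, keeps all individuals at a similar stage of change--can be illustrated with a small simulation. The sketch below is not Dover's own model; it is a minimal toy written in Python for this article, and its parameters (population size, family size, conversion rate and bias) are assumptions chosen only for illustration. It follows a diploid population in which gene conversion is biased towards a new variant copy and offspring draw whole chromosomes from randomly chosen parents; the average frequency of the new variant climbs, while the differences between individuals stay far smaller than the change the population as a whole is undergoing.

import random
import statistics

# Toy parameters (assumptions for illustration, not taken from the article)
POP_SIZE = 200          # diploid individuals
COPIES_PER_CHROM = 10   # gene-family copies on each homologous chromosome
START_FREQ = 0.05       # initial frequency of the new variant copy
CONVERSION_RATE = 0.05  # chance per individual per generation of one conversion event
BIAS = 0.8              # probability that a conversion favours the new variant
GENERATIONS = 2000
random.seed(1)

def make_chromosome():
    # 0 = old variant copy, 1 = new variant copy
    return [1 if random.random() < START_FREQ else 0 for _ in range(COPIES_PER_CHROM)]

population = [[make_chromosome(), make_chromosome()] for _ in range(POP_SIZE)]

def convert(individual):
    # Biased gene conversion: one copy overwrites another, favouring the new variant.
    copies = individual[0] + individual[1]
    if 0 < sum(copies) < len(copies) and random.random() < CONVERSION_RATE:
        winner = 1 if random.random() < BIAS else 0
        chrom = random.randrange(2)
        losers = [i for i, c in enumerate(individual[chrom]) if c != winner]
        if losers:
            individual[chrom][random.choice(losers)] = winner

def offspring():
    # Random mating: one homologue drawn from each of two randomly chosen parents.
    mother, father = random.choice(population), random.choice(population)
    return [list(random.choice(mother)), list(random.choice(father))]

for gen in range(GENERATIONS):
    for ind in population:
        convert(ind)                                     # slow, biased turnover
    population = [offspring() for _ in range(POP_SIZE)]  # fast chromosome mixing
    if gen % 400 == 0:
        freqs = [sum(c for chrom in ind for c in chrom) / (2 * COPIES_PER_CHROM)
                 for ind in population]
        print(f"gen {gen:4d}: mean new-variant fraction {statistics.mean(freqs):.2f}, "
              f"spread between individuals {statistics.pstdev(freqs):.2f}")

Because each offspring draws whole homologues from random parents, any local excess of the new variant is quickly shared around the population; that is why the between-individual spread printed by the sketch stays small even as the mean climbs towards fixation, which is the "change in unison" the article describes.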
Future studies could show that genetic changes in gene families could have effects on any aspect of phenotype, from sexual behaviour to morphology, and might also influence the behaviour of chromosomes and the expression of genes. The unexpectedly cohesive genetics of molecular drive could permit long-term changes in biological organisation which might lead to both pre-mating and post-mating barriers, and the formation of new species--not within a population, for the change is occurring in unison, but between different, separate populations. Molecular drive, of course, will continually interact with the forces of natural selection and genetic drift, Dover emphasises. But evolution, it now seems, must be a complex outcome of all three modes of genetic change: adaptive (selection), accidental (drift) and the cohesively-driven processes of change that constitute molecular drive. []