On 25 July 2000, an Air France Concorde bound for New York crashed in flames shortly after takeoff from Charles de Gaulle Airport outside Paris, killing all its passengers. Officials were quick to point out that the crash was the first of a Concorde since the supersonic plane went into commercial operation in 1976. But the Concorde has been flown much less than, say, the Boeing 747. Further, there have been quite a few troubles with the Concorde in the past. For example, between 1979 and 1981, on four separate occasions, tyres blew out as the planes were taking off. Due to the high stresses of supersonic flight, sections of the tail have fallen off on several occasions. Over the last 15 years, there have been at least four emergency landings.
A week prior to that, on 17 July, a Boeing 737 belonging to Alliance Air crashed in flames into an apartment block near Patna airport. Though tragic, this crash doesn't come as a big surprise given India's poor air safety record. In the 1990s alone, there were at least three major civilian air crashes. India's Comptroller and Auditor General reported in 1997 that there had been 187 accidents and 2729 incidents involving Indian Air Force (IAF) aircraft between April 1991 and March 1997, resulting in the loss of 147 airplanes and 63 pilots.
Such aircraft accidents have obviously prompted increased attention to safety, leading to design improvements and safety features. Nevertheless, accidents have continued. In studying the safety of airplanes and other hazardous technologies, several sociologists and organisation theorists have come to a pessimistic conclusion: serious accidents are inevitable with complex, high-technology systems.
Charles Perrow of Yale University, who coined the term ‘normal accidents’ to describe such accidents, identifies two structural features of many hazardous technologies — ‘interactive complexity’ and ‘tight coupling’ — which make them highly accident-prone regardless of the intent of their operators. According to Perrow, ‘complex interactions are those of unfamiliar sequences, or unplanned and unexpected consequences, and either not visible or not immediately comprehensible’. Tight coupling means that ‘there is no slack or buffer or give between two items; what happens in one directly affects what happens in the other’.
In addition to these structural factors, normal accident theorists also point to conflicting interests both within organisations and between organisations and the broader political community, which make accidents more probable while making it unlikely that organisations will learn the appropriate lessons from accidents.
Normal accident theory has been put to severe tests and has generally been successful. Scott Sagan of Stanford University, in an important and wide-ranging study of nuclear weapon systems in the United States, identified a number of close calls, and concluded that while the risk of a serious nuclear weapons accident on any given day may be low, in the long run such an accident is extremely likely.
Analysing the explosion of the Challenger space shuttle in January 1986, Diane Vaughan of Boston College pointed out that while the disaster was the result of a mistake, what is important to remember is not that individuals in organisations make mistakes, but that mistakes themselves are ‘socially organised and systematically produced’. The origins of the accident ‘were in routine and taken-for-granted aspects of organisational life that created a way of seeing that was simultaneously a way of not seeing’.
Perhaps the best illustration of a hazardous technology that displays the structural and political problems that normal accident theorists point to is the nuclear reactor. The technology is highly complex, with different components interacting in non-linear, unfamiliar ways. The time scales involved in different processes are very short; operations can quickly spin out of control. Compounding the problem is the secrecy and control maintained by the institutions that construct and operate these reactors.
Even so, the nuclear establishments of the world have persisted in claiming that the probability of reactor accidents is very low. South Asia's authorities are no exception. Last year, following the Tokaimura accident in Japan, India's Atomic Energy Commission (AEC) chairman claimed: ‘There is no possibility of any nuclear accident in the near or distant future in India. We have 150 reactor years of safe operation.’
It is worth contrasting this with earlier similar pronouncements. Just three years before the Chernobyl accident, writing in the Bulletin of the International Atomic Energy Agency (Vol. 25, June 1983), the head of IAEA's safety division claimed: ‘The design feature of having more than 1000 individual primary circuits increases the safety of the reactor system—a serious loss of coolant accident is practically impossible…the safety of nuclear power plants in the Soviet Union is assured by a very wide spectrum of measures…’ But on 26 April 1986, Unit 4 of the Chernobyl reactor went out of control and exploded, releasing an immense amount of radioactivity into the atmosphere. Practically every country in the northern hemisphere received some radioactive fallout. Between 100,000 and 150,000 hectares of agricultural land had to be abandoned. Estimates of worldwide deaths resulting from the radioactive contamination vary from a few hundred to tens of thousands.
What makes the assurance offered by the Indian AEC chairman even more absurd is that at the time of the Chernobyl accident, the Soviet Union had over 1000 reactor years of experience. The confidence is also misplaced because there have been several accidents over the course of India's nuclear history: examples include the fire at Narora, multiple heavy water leaks at Kalpakkam, the collapse of the containment at Kaiga and flooding of the pumps at Kakrapar. It was only sheer luck that none of these resulted in major catastrophes.
With secrecy written into its mandate through the 1962 Atomic Energy Act, the Indian Department of Atomic Energy has been able to hide unpleasant facts from public scrutiny to a greater extent than its counterparts in most other countries. In part, the secrecy reflects the close connection between nuclear power production and nuclear weapons development. But it also serves to cover up accidents, safety violations and poor performance.
India is not alone in continuing the expansion of nuclear power even in the face of these risks. In South Asia, Pakistan is also following the same path. And Bangladesh has recently announced its intention to start a nuclear power programme.
In Pakistan, the one power reactor that has been in operation near Karachi has had a poor track record. Between 1972 and 1997, the plant was shut down for an average of about 55 days each year due to equipment failure or ‘human error’. More dangerous is the reactor coming up at Chashma.
Pakistani physicists Zia Mian and A H Nayyar identify three concerns with the Chashma reactor. First, Chashma is located in a seismically active zone. Second, the reactor is a replica of the Chinese Qinshan reactor, which suffered an accident in 1998. Third, this is the first time that China is indigenously manufacturing various reactor components. Pakistan, in other words, will be the guinea pig for this design and its components. Mian and Nayyar estimate that given the high population density of South Asia, an accident would cause somewhere between 5,000 and 33,000 deaths. Efforts to delay, if not prevent, the start-up of Chashma have so far not succeeded.
Nuclear reactor designs have, of course, been modified to incorporate lessons from the Chernobyl accident and earlier accidents. Nevertheless, no reactor, not even the so-called ‘inherently safe’ reactor, is wholly risk free. A 1990 study by the Union of Concerned Scientists concluded: ‘As a general proposition, there is nothing “inherently” safe about a reactor. Regardless of the attention to design, construction, operation, and management of nuclear reactors, there is always something that could be done (or not done) to render the reactor dangerous. The degree to which this is true varies from design to design, but we believe that our general conclusion is correct.’
The risk of accidents does not necessarily mean that we should abandon a technology but it should certainly cause concern and lead to the exploration of safer alternatives. Prior to this, however, is the requirement that the pursuit of nuclear power, or any other hazardous technology, should be done democratically with the informed consent of the potentially affected populations. The first step towards a democratic debate is an honest assessment of the risks involved. Unfortunately, with nuclear establishments, as with the purveyors of other hazardous technologies, that step seems the hardest.