Accident theorists have suggested that the complexity of nuclear power plants makes it difficult to anticipate everything that might go wrong, and that when accidents do occur they can spread quickly, making safety hard to attain. Nevertheless, operational safety levels vary, and some nuclear plants around the world have operated with relatively high reliability. Organization theorists have studied what such high-performing plants have in common, and through in-depth field studies in nuclear power plants they have identified common features that they argue are essential for reliable operation.
Characteristics of high reliability operations
They find that in nuclear reactors operating at good levels of reliability, political elites and organizational leaders place a high priority on safety in design and operations, and operators are confident that this is so. There is an atmosphere of openness and responsibility in which all individuals feel responsible for every detail of operations they can observe, and feel free to point out their observations without fear. Reliable backups exist both in technical operations and in the management of personnel, which often prevents failures from escalating. At the same time, there is a standing belief that present levels of safety are not enough, so the guard is never let down: such organizations continually explore what could go wrong, learning not only from their own mistakes but also from others'.
In India, the Department of Atomic Energy's (DAE) operations do not satisfy these characteristics. While detailed descriptions of what actually goes on inside India's nuclear facilities are not possible because of the lack of access, there is some evidence about the DAE's actions leading up to accidents and its responses to them.
Lacking a culture of accountability
In 2003, an accident at the Kalpakkam Atomic Reprocessing Plant (KARP) exposed workers to extremely high radiation doses (280-420 mSv). The cause is said to have been a valve failure that allowed highly radioactive waste to enter a tank containing waste of lower radioactivity. At the time of the accident, about five years after the plant was commissioned, no monitors had been installed to check radiation levels in that area, nor were there any mechanisms to detect the valve failure. Workers therefore had no way of knowing that the sample they went in to collect was emitting high levels of radiation. The accident was recognized only after the sample had been taken to a different room and processed.
Finally, some months later, when the union resorted to a strike, the management transferred some of the key workers involved in the agitation and gave notice to others. This had the desired effect: two days later, all the striking workers returned to work. The BARC Director's response was that "If the place was not safe, they would not have joined back".
Organization theorists point out that high reliability is demanding, and therefore precarious, in systems that are structurally prone to accidents, because of competing priorities and the difficulty of justifying safety efforts whose direct outcomes are often unclear. The DAE makes little such effort, and safety receives low priority at the highest levels. Workers do not have control over their immediate environments, and problems cannot be raised openly in the DAE's facilities. Nor is it clear that the organizations concerned learn the important lessons. Even after the leak at KARP was made public, the DAE continued to deny the causes of the problems and instead blamed the workers for a situation over which they had no control.
The absence of independent regulation
Despite the DAE's claims that it operates under strict regulatory supervision, this is not so. The Atomic Energy Regulatory Board (AERB) reports to the Atomic Energy Commission (AEC), which is headed by the secretary of the DAE. The Chairman of the Nuclear Power Corporation (NPC) is also a member of the AEC. Thus both the DAE and the NPC exercise considerable administrative power over the AERB. In practice, this means that the AERB sometimes plays down the significance of accidents. In March 1999, for example, heavy water leaked from the second unit of the MAPS reactor near Madras. The AERB dismissed the incident, saying that "the release to the environment is maintained well within the limits specified by the AERB." An independent scientist, however, estimated that the radioactivity released was several times the permitted 300 curies per day per reactor, and perhaps even exceeded the discharge limit of ten times the daily quota, far higher than the AERB claimed.
Too much secrecy
The lack of independent regulation is compounded by the difficulty of obtaining information about a program whose details are often shrouded in secrecy. Information about the 2003 accident at KARP became public only when employee association members leaked it to the media, after several unsuccessful attempts to have safety problems addressed by the management. M R Srinivasan, a former head of the DAE, has called upon the organisation to "adopt an enlightened policy of keeping the public informed at all times about safety aspects of its installations". But secrecy persists, and the DAE often justifies it on grounds such as national security.
Such an argument doesn't stand up to scrutiny. The weapons potential of India's reactors can be inferred from the design details and the operating records submitted to the IAEA, all of which are publicly available. However, making such information public could weaken the DAE's own claims of safe operation and possibly fuel public concern about how its facilities are being operated.
Organization theorists often point to the importance of feedback and learning in maintaining high levels of reliability, especially in systems such as nuclear reactors where the problems that can occur are not always known in advance. Openness therefore has instrumental benefits for the attainment of safety, and is also required if safe operation is to be demonstrated. By contrast, the practice of secrecy in India makes it likely that problems are kept under wraps until a mishap brings them to light. It also violates the right of affected communities to informed participation in choices that are instead made on their behalf.