
THE BULLETIN OF THE ATOMIC SCIENTISTS

Nuclear safety lessons from Japan’s summer earthquake

BY ASHWIN KUMAR AND M. V. RAMANA | 4 DECEMBER 2007

On July 16, 2007, an earthquake with a magnitude of somewhere between 6.6 and 6.8
struck Japan. Its epicenter was about 16 kilometers north of the Kashiwazaki-Kariwa
Nuclear Power Plant (KKNPP), the biggest such plant in the world. The known results of
the earthquake include a fire and leaks of radioactivity. However, news of damage to the
reactors continues to emerge, the most recent being the discovery of a jammed control
rod in Unit-7. Though there was no major release of radioactivity, the many failures and
unanticipated events that occurred at the reactor after the earthquake have important
implications for nuclear safety worldwide.

To start, the Japanese nuclear establishment never anticipated the magnitude of the
earthquake. Under Japan’s old guidelines, which formed the basis of the KKNPP design,
the seismic hazard for each nuclear site was defined in terms of two intensities, termed S1
and S2. (See “Status Report on Seismic Re-Evaluation”.) The S1 earthquake, referred to
as the “maximum design earthquake,” is less intense and determined by historical events
and current and past fault activity. The S2 earthquake, called the “extreme design
earthquake” and supposedly an impossibility, is derived from seismo-tectonic structures
and active faults. These requirements were believed to provide a “sufficient range of
earthquakes to assure reactor safety for any potential earthquake shaking.” (See “A
Developing Risk-Informed Design Basis Earthquake Ground Motion Methodology”.) But
clearly the S2 design earthquake wasn’t extreme enough: The peak ground acceleration of
the July 16 earthquake was about two-and-a-half times the value assumed for the S2
earthquake.
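
To make that comparison concrete, here is a minimal sketch in Python that computes how far a recorded peak ground acceleration exceeds a design-basis value; the specific numbers are assumed placeholders chosen only to reproduce the roughly two-and-a-half-fold exceedance described above, not TEPCO's published figures.

    # Minimal sketch: how far did the recorded peak ground acceleration (PGA)
    # exceed the S2 design basis? The values below are assumed placeholders,
    # not TEPCO's published figures.

    def exceedance_ratio(recorded_gal, design_gal):
        """Return the factor by which the recorded PGA exceeds the design PGA."""
        return recorded_gal / design_gal

    design_s2_pga_gal = 270.0   # assumed S2 design-basis PGA, in gal (cm/s^2)
    recorded_pga_gal = 680.0    # assumed PGA recorded at the plant, in gal

    ratio = exceedance_ratio(recorded_pga_gal, design_s2_pga_gal)
    print(f"Recorded PGA exceeded the S2 design basis by a factor of {ratio:.1f}")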

After the recent earthquake, Takashi Nakata at the Hiroshima Institute of Technology and
Yasuhiro Suzuki at Nagoya University analyzed the data in the Tokyo Electric Power
Co.’s (TEPCO) license application and concluded that it indicated a fault five times
longer than the one TEPCO identified. Between 1979 and 1985, TEPCO found four small
faults off the coast of Kashiwazaki-Kariwa, but it concluded that they were either inactive
or unimportant. However, Nakata and Suzuki believe that three of these small faults
constitute a single 36-kilometer-long fault, which is probably active as well.
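
Fault length matters because the largest earthquake a fault can produce scales with its rupture length. As a rough illustration (not a calculation from Nakata and Suzuki or TEPCO), the Python sketch below applies the widely used Wells and Coppersmith (1994) empirical relation between surface rupture length L in kilometers and moment magnitude, M = 5.08 + 1.16 log10(L), to an assumed short segment of about 7 kilometers and to the 36-kilometer fault the two researchers infer.

    import math

    # Rough illustration using the Wells & Coppersmith (1994) relation for all
    # fault types: M = 5.08 + 1.16 * log10(L), with L the surface rupture
    # length in km. The 7-km segment length is an assumed placeholder; 36 km
    # is the combined fault length inferred by Nakata and Suzuki.

    def magnitude_from_rupture_length(length_km):
        """Estimate moment magnitude from surface rupture length in km."""
        return 5.08 + 1.16 * math.log10(length_km)

    for label, length_km in [("assumed short segment", 7.0),
                             ("combined 36-km fault", 36.0)]:
        m = magnitude_from_rupture_length(length_km)
        print(f"{label}: L = {length_km:.0f} km -> M ~ {m:.1f}")

Under these assumptions, the longer fault corresponds to roughly magnitude 6.9 rather than about 6.1, which is consistent with the size of the July 16 earthquake and with why underestimating fault length leads to underestimating the design-basis ground motion.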

Such differences in conclusions suggest that there were organizational pressures to ignore
inconvenient data or to interpret it in ways that supported vested interests–in this case,
building the reactor at a specific site. A similar example (albeit from a different
technological arena) is the 1986 Challenger space shuttle explosion. In her 1996 book,
The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA,
sociologist Diane Vaughan observed that the ultimate origins of the accident “were in
routine and taken-for-granted aspects of organizational life that created a way of seeing
that was simultaneously a way of not seeing.”

The July earthquake also points to how actual accidents could result in unexpected modes
of failure. Nuclear reactors are based on complex interactive technologies, operating at
high temperatures or pressures, with tightly coupled events occurring at a rapid pace, and
therefore, prone to accidents. (See Charles Perrow’s 1984 book, Normal Accidents:
Living with High-Risk Technologies.) By simultaneously affecting large parts of a nuclear
power plant, earthquakes increase the possibility of accidents. The Japanese earthquake
damaged KKNPP’s switchyards and the water piping for the fire-suppression systems. It also
caused a fire when electrical equipment near a transformer slipped and the separated
cables short-circuited; there are reports that leaking oil was involved in this fire.
Some events were unexpected: For example, underground electric cables were pulled
down by ground subsidence, creating a large opening in the outer wall of the reactor’s
basement–a so-called “radiation-controlled area” that must be completely shut off from
the outside. According to a TEPCO official, “It was beyond our imagination that a space
could be made in the hole on the outer wall for the electric cables.”

Finally, the earthquake showed how emergency plans that look great on paper can fail
when disaster strikes, with KKNPP managers admitting that “disaster-prevention
measures did not function successfully.” For example, the fire-extinguishing system at
one reactor couldn’t be used effectively because of pipe damage: its water hose could spray
water only about a meter rather than the normal dozens of meters.
Since the plant didn’t have chemical fire extinguishers, workers had no choice but to
watch the fire burn. The earthquake also knocked out a hotline to the local fire station.
Meanwhile, Plant Director Akio Takahashi was told of the fire immediately after the
earthquake, but didn’t dispatch the facility’s firefighters because he thought management
would have already done so–an example of the human error even high-level officials are
capable of during stressful moments. When it was finally notified, the fire brigade struggled
to reach the plant because of damage elsewhere in the area. So although the transformer fire
was detected at 10:15 a.m., firefighters didn’t start fighting it until more than an hour
later. Takahashi also admitted to problems with the facility’s primary firefighting system
and with cooperation between related organizations.

Throughout, TEPCO’s primary aim seemed to be to quell fear rather than accurately
report facts. For example, TEPCO employees knew about the radioactive leak by 12:50 p.m., but the
company didn’t publicly report it until 8:28 p.m. Similarly, the initial estimates of
radioactivity TEPCO released were significantly smaller than the final figure. If this was
the case during a relatively major accident that was displayed on television screens
around the world, it’s easy to imagine the paucity of information nuclear authorities
would make available during a smaller accident.

The prevention of accidents at nuclear facilities depends on both technological and
organizational factors, and both are effective only if they perform as designed. Though
there wasn’t a large-scale release of radioactivity, events at
KKNPP after the July earthquake demonstrate both technological and organizational
failures. Discussions about nuclear safety should begin by acknowledging the possibility
of such failures.

Copyright © 2009 Bulletin of the Atomic Scientists. All Rights Reserved.
Source URL (retrieved on 04/15/2009 – 16:21):
http://www.thebulletin.org/node/168
