
History

Origins
See also: Nuclear fission#History

The pursuit of nuclear energy for electricity generation began soon after the discovery in the early 20th century that radioactive elements, such as radium, released immense amounts of energy, according to the principle of mass–energy equivalence. However, harnessing such energy was impractical, because intensely radioactive elements were, by their very nature, short-lived (high energy release is correlated with short half-lives). Even so, the dream of harnessing "atomic energy" remained strong, even though it was dismissed by such fathers of nuclear physics as Ernest Rutherford as "moonshine." The situation changed in the late 1930s with the discovery of nuclear fission.
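The scale involved can be illustrated with a back-of-the-envelope calculation from mass–energy equivalence (a sketch using standard textbook values, not figures from this article's sources):

% Mass–energy equivalence: a tiny mass defect yields enormous energy.
% Illustrative textbook values only, not drawn from this article's sources.
\[
  E = mc^2
\]
% Converting just one gram of mass entirely into energy:
\[
  E = (10^{-3}\,\mathrm{kg}) \times (3 \times 10^{8}\,\mathrm{m/s})^2
    = 9 \times 10^{13}\,\mathrm{J},
\]
% roughly the energy released by 20 kilotons of TNT.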

In 1932, James Chadwick discovered the neutron, which was immediately recognized as a potential tool for nuclear experimentation because of its lack of an electric charge. Experimentation with bombardment of materials with neutrons led Frédéric and Irène Joliot-Curie to discover induced radioactivity in 1934, which allowed the creation of radium-like elements at a far lower cost than natural radium. Further work by Enrico Fermi in the 1930s focused on using slow neutrons to increase the effectiveness of induced radioactivity. Experiments bombarding uranium with neutrons led Fermi to believe he had created a new, transuranic element, which he dubbed hesperium.
Constructing the core of B-Reactor at Hanford Site during the Manhattan Project.

But in 1938, German chemists Otto Hahn[25] and Fritz Strassmann, along with Austrian physicist Lise Meitner[26] and Meitner's nephew, Otto Robert Frisch,[27] conducted experiments with the products of neutron-bombarded uranium, as a means of further investigating Fermi's claims. They determined that the relatively tiny neutron split the nucleus of the massive uranium atoms into two roughly equal pieces, contradicting Fermi. This was an extremely surprising result: all other forms of nuclear decay involved only small changes to the mass of the nucleus, whereas this process—dubbed "fission" as a reference to biology—involved a complete rupture of the nucleus. Numerous scientists, Leo Szilard among the first, recognized that if fission reactions released additional neutrons, a self-sustaining nuclear chain reaction could result. Once this was experimentally confirmed and announced by Frédéric Joliot-Curie in 1939, scientists in many countries (including the United States, the United Kingdom, France, Germany, and the Soviet Union) petitioned their governments for support of nuclear fission research, just on the cusp of World War II.
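Szilard's insight can be made quantitative with a simple sketch (illustrative standard values, not figures from the sources cited above): each fission of uranium-235 releases on average about 2.4 neutrons, so if the effective multiplication factor k, the average number of those neutrons that go on to induce a further fission, exceeds 1, the number of fissions grows geometrically from generation to generation:

% Geometric growth of a chain reaction (illustrative sketch, not from this article's sources).
% k is the effective neutron multiplication factor; k > 1 means supercritical.
\[
  N_n = N_0 \, k^n
\]
% Example: starting from a single fission with k = 2, after 80 generations
% N_{80} = 2^{80} \approx 1.2 \times 10^{24} fissions,
% comparable to fissioning roughly half a kilogram of uranium-235.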

In the United States, where Fermi and Szilard had both emigrated, this led to the creation of the first man-made reactor, known as Chicago Pile-1, which achieved criticality on December 2, 1942. This work became part of the Manhattan Project, which built large reactors at the Hanford Site (formerly the town of Hanford, Washington) to breed plutonium for use in the first nuclear weapons, which were used on the cities of Hiroshima and Nagasaki. A parallel uranium enrichment effort was also pursued.
The first light bulbs ever lit by electricity generated by nuclear power, at EBR-I at what is now Idaho National Laboratory.

After World War II, the prospects of using "atomic energy" for good, rather than simply for war, were greatly advocated as a reason not to keep all nuclear research controlled by military organizations. However, most scientists agreed that civilian nuclear power would take at least a decade to master, and the fact that nuclear reactors also produced weapons-usable plutonium created a situation in which most national governments (such as those in the United States, the United Kingdom, Canada, and the USSR) attempted to keep reactor research under strict government control and classification. In the United States, reactor research was conducted by the U.S. Atomic Energy Commission, primarily at Oak Ridge, Tennessee, Hanford Site, and Argonne National Laboratory.

Work in the United States, United Kingdom, Canada, and USSR proceeded over the course of the late 1940s and early 1950s. Electricity was generated for the first time by a nuclear reactor on December 20, 1951, at the EBR-I experimental station near Arco, Idaho, which initially produced about 100 kW. Nuclear marine propulsion was also heavily researched in the US, with a test reactor developed by 1953. (Eventually, the USS Nautilus, the first nuclear-powered submarine, would put to sea in 1955.) In 1953, US President Dwight Eisenhower gave his "Atoms for Peace" speech at the United Nations, emphasizing the need to develop "peaceful" uses of nuclear power quickly. This was followed by the 1954 Amendments to the Atomic Energy Act, which allowed rapid declassification of U.S. reactor technology and encouraged development by the private sector.
Early years
Calder Hall nuclear power station in the United Kingdom was the world's first nuclear power station to produce electricity in commercial quantities.[28]
The Shippingport Atomic Power Station in Shippingport, Pennsylvania was the first commercial reactor in the USA and was opened in 1957.

On June 27, 1954, the USSR's Obninsk Nuclear Power Plant became the world's first nuclear power plant to generate electricity for a power grid, and produced around 5 megawatts of electric power.[29][30]

Later in 1954, Lewis Strauss, then chairman of the United States Atomic Energy Commission (U.S. AEC, forerunner of the U.S. Nuclear Regulatory Commission and the United States Department of Energy) spoke of electricity in the future being "too cheap to meter".[31] Strauss was referring to hydrogen fusion[32][33]—which was secretly being developed as part of Project Sherwood at the time—but Strauss's statement was interpreted as a promise of very cheap energy from nuclear fission. The U.S. AEC itself had issued far more conservative testimony regarding nuclear fission to the U.S. Congress only months before, projecting that "costs can be brought down... [to]... about the same as the cost of electricity from conventional sources..." Significant disappointment would develop later on, when the new nuclear plants did not provide energy "too cheap to meter."

In 1955 the United Nations' "First Geneva Conference", then the world's largest gathering of scientists and engineers, met to explore the technology. In 1957 EURATOM was launched alongside the European Economic Community (the latter is now the European Union). The same year also saw the launch of the International Atomic Energy Agency (IAEA).

The world's first commercial nuclear power station, Calder Hall at Sellafield, England, was opened in 1956 with an initial capacity of 50 MW (later 200 MW).[28][34] The first commercial nuclear generator to become operational in the United States was the Shippingport Reactor (Pennsylvania, December 1957).

One of the first organizations to develop nuclear power was the U.S. Navy, for the purpose of propelling submarines and aircraft carriers. The first nuclear-powered submarine, USS Nautilus (SSN-571), was put to sea in December 1954.[35] Two U.S. nuclear submarines, USS Scorpion and USS Thresher, have been lost at sea. Several serious nuclear and radiation accidents have involved nuclear submarines.[11][9] The Soviet submarine K-19 reactor accident in 1961 resulted in 8 deaths, and more than 30 other people were over-exposed to radiation.[10] The Soviet submarine K-27 reactor accident in 1968 resulted in 9 fatalities and 83 other injuries.[11]

The United States Army also had a nuclear power program, beginning in 1954. The SM-1 Nuclear Power Plant, at Ft. Belvoir, Virginia, was the first power reactor in the US to supply electrical energy to a commercial grid (VEPCO), in April 1957, before Shippingport. The SL-1 was a United States Army experimental nuclear power reactor which underwent a steam explosion and meltdown in 1961, killing its three operators.[36]
Development
History of the use of nuclear power (top) and the number of active nuclear power plants (bottom).

Installed nuclear capacity rose relatively quickly at first, from less than 1 gigawatt (GW) in 1960 to 100 GW in the late 1970s and 300 GW in the late 1980s. Since the late 1980s, worldwide capacity has risen much more slowly, reaching 366 GW in 2005. Between around 1970 and 1990, more than 50 GW of capacity was under construction (peaking at over 150 GW in the late 1970s and early 1980s); in 2005, around 25 GW of new capacity was planned. More than two-thirds of all nuclear plants ordered after January 1970 were eventually cancelled.[35] A total of 63 nuclear units were cancelled in the USA between 1975 and 1980.[37]
Washington Public Power Supply System Nuclear Power Plants 3 and 5 were never completed.

During the 1970s and 1980s rising economic costs (related to extended construction times largely due to regulatory changes and pressure-group litigation)[38] and falling fossil fuel prices made nuclear power plants then under construction less attractive. In the 1980s (U.S.) and 1990s (Europe), flat load growth and electricity liberalization also made the addition of large new baseload capacity unattractive.

The 1973 oil crisis prompted countries that had relied heavily on oil for electricity generation, such as France (39%) and Japan (73%), to invest in nuclear power.[39][40] Today, nuclear power supplies about 80% and 30% of the electricity in those countries, respectively.

Some local opposition to nuclear power emerged in the early 1960s,[41] and in the late 1960s some members of the scientific community began to express their concerns.[42] These concerns related to nuclear accidents, nuclear proliferation, the high cost of nuclear power plants, nuclear terrorism and radioactive waste disposal.[43] In the early 1970s, there were large protests about a proposed nuclear power plant in Wyhl, Germany. The project was cancelled in 1975, and the anti-nuclear success at Wyhl inspired opposition to nuclear power in other parts of Europe and North America.[44][45] By the mid-1970s anti-nuclear activism had moved beyond local protests and politics to gain a wider appeal and influence, and nuclear power became an issue of major public protest.[46] Although it lacked a single co-ordinating organization and did not have uniform goals, the movement's efforts gained a great deal of attention.[47] In some countries, the nuclear power conflict "reached an intensity unprecedented in the history of technology controversies".[48]

In France, between 1975 and 1977, some 175,000 people protested against nuclear power in ten demonstrations.[49] In West Germany, between February 1975 and April 1979, some 280,000 people were involved in seven demonstrations at nuclear sites. Several site occupations were also attempted. In the aftermath of the Three Mile Island accident in 1979, some 120,000 people attended a demonstration against nuclear power in Bonn.[49] In May 1979, an estimated 70,000 people, including then governor of California Jerry Brown, attended a march and rally against nuclear power in Washington, D.C.[50][51] Anti-nuclear power groups emerged in every country that has had a nuclear power programme. Some of these anti-nuclear power organisations are reported to have developed considerable expertise on nuclear power and energy issues.[52]

Health and safety concerns, the 1979 accident at Three Mile Island, and the 1986 Chernobyl disaster played a part in stopping new plant construction in many countries,[53][54] although the public policy organization Brookings Institution suggests that new nuclear units have not been ordered in the U.S. because of soft demand for electricity, and cost overruns on nuclear plants due to regulatory issues and construction delays.[55]

Unlike the Three Mile Island accident, the much more serious Chernobyl accident did not increase regulations affecting Western reactors, since the Chernobyl reactors were of the problematic RBMK design used only in the Soviet Union, which, for example, lacked "robust" containment buildings.[56] Many of these reactors are still in use today. However, changes were made in both the reactors themselves (use of low enriched uranium) and in the control system (prevention of disabling safety systems) to reduce the possibility of a repeat accident.

An international organization was created to promote safety awareness and the professional development of operators in nuclear facilities: the World Association of Nuclear Operators (WANO).

Opposition in Ireland and Poland prevented nuclear programs there, while Austria (1978), Sweden (1980) and Italy (1987, influenced by Chernobyl) voted in referendums to oppose or phase out nuclear power. In July 2009, the Italian Parliament passed a law that cancelled the results of an earlier referendum and allowed the immediate start of the Italian nuclear program.[57] One Italian minister even called the nuclear phase-out a "terrible mistake".
