The Fusion Reactor Materials Program in the United States began in the mid-1970s, in a world sensitized to energy production needs by the 1973 Arab Oil Embargo. In the program's early stages, materials problems were recognized to be as important as any of the other challenges facing this new energy technology. Of particular concern were the effects of irradiation on the behavior of materials to be used in the first-wall and blanket structure. Today, although the economic and political realities that define energy needs have changed, fusion energy remains an important part of our vision for a better future. Global environmental effects such as the “greenhouse effect” from increased atmospheric CO2 and acid rain, coupled with safety concerns about fission reactors, make fusion very attractive as a clean and safe energy source.
Over the past 10 years, the range of candidate materials for structural application in fusion reactors has expanded to include new martensitic and ferritic steels with reduced activation characteristics, vanadium alloys, ceramics, and composite materials. However, because austenitic stainless steels are easily fabricated, nonmagnetic, reasonably strong and tough, commercially available, and familiar, they remain candidate materials for fusion. They have also been extensively studied for use in various fission reactor systems. The austenitic and stainless properties of Fe-Cr-Ni alloys and steels were discovered during 1900–1915, and commercial grades such as AISI types 304 and 316 stainless steels were developed prior to World War II. The effects of irradiation on materials have been studied since 1942, when Eugene P. Wigner anticipated that fission fragments and energetic neutrons would damage crystalline materials by displacing atoms to create point defects (vacancies and interstitials).