A little more than 20 years ago we learned how to direct photons, electrons, or ions of a few keV or lower energy onto a surface and measure the energy or mass distribution of backscattered or emitted species. By arranging conditions so that the bombarding or detected species is strongly attenuated by matter, the information obtained is restricted to no more than the outermost several atomic layers. We then learned how to infer the composition, chemical state, and (sometimes) the molecular structure of the surface from these spectra. So began the era of “alphabet soup.” The ensuing decades brought several things:
■ The commercial availability of major surface spectroscopies. Even if not totally user friendly, they aren't outright hostile and work for most people on most days. Instrument makers seeking increased sales volume will continue to improve their ease of operation and reliability.
■ The scrutiny of most materials and structures. There is reliable literature on the very large majority of applications. We know where to begin interpretation and what the major pitfalls are. We have little excuse for the egregious errors we sometimes still make.
■ Many good research groups. When something really new comes along (e.g., the high temperature superconductors), the correct understanding of the data is worked out quickly enough to use it as a tool, guiding research and development.
■ The seeming reluctance, outside certain specialized areas, of the materials community to integrate surface spectroscopies into anything other than basic research. Expense is often cited as a barrier, but this is hardly credible when the daily cost of top-notch spectroscopy is no more than twice a typical industrial researcher's internal billing rate.