
1 - Introduction

Published online by Cambridge University Press:  05 March 2013

Christopher J. Roy
Affiliation:
Virginia Polytechnic Institute and State University

Summary

This chapter briefly sketches the historical beginnings of modeling and simulation (M&S). Although claiming the beginning of anything is simply a matter of convenience, we will start with the stunning invention of calculus. We then discuss how the steadily increasing performance and decreasing costs of computing have been another critical driver in advancing M&S. Contributors to the credibility of M&S are discussed, and the preliminary concepts of verification and validation are mentioned. We close the chapter with an outline of the book and suggest how the book might be used by students and professionals.

Historical and modern role of modeling and simulation

Historical role of modeling and simulation

For centuries, the primary method for designing an engineered system has been to improve the successful design of an existing system incrementally. During and after construction, the system would be tested gradually in a number of ways. The first tests would usually be done during the building process in order to begin to understand the characteristics and responses of the new system. The new system was commonly a change in the old system's geometry, materials, fastening techniques, or assembly techniques, or a combination of these changes. If the system was intended for some new environment, such as a longer bridge span, a taller structure, or operation at higher speeds, it was always tested first in environments where an experience base already existed. Often, during the building and testing process, design or assembly weaknesses and flaws were discovered and modifications to the system were made. Sometimes a catastrophic failure of a monumental project would occur and the process would start over, occasionally after attending the funeral of the previous chief designer and his apprentices (DeCamp, 1995). In ancient times, chief designers understood the consequences of a major design failure; they had skin in the game.

Publisher: Cambridge University Press
Print publication year: 2010

References

Ashill, P. R. (1993). Boundary flow measurement methods for wall interference assessment and correction: classification and review. Fluid Dynamics Panel Symposium: Wall Interference, Support Interference, and Flow Field Measurements, AGARD-CP-535, Brussels, Belgium, AGARD, 12.1–12.21.
Blottner, F. G. (1990). Accurate Navier–Stokes results for the hypersonic flow over a spherical nosetip. Journal of Spacecraft and Rockets, 27(2), 113–122.
Boehm, B. W. (1981). Software Engineering Economics, Saddle River, NJ, Prentice-Hall.
Bradley, R. G. (1988). CFD validation philosophy. Fluid Dynamics Panel Symposium: Validation of Computational Fluid Dynamics, AGARD-CP-437, Lisbon, Portugal, North Atlantic Treaty Organization.
Chapman, D. R., Mark, H., and Pirtle, M. W. (1975). Computers vs. wind tunnels. Astronautics & Aeronautics, 13(4), 22–30.
Cosner, R. R. (1994). Issues in aerospace application of CFD analysis. 32nd Aerospace Sciences Meeting & Exhibit, AIAA Paper 94-0464, Reno, NV, American Institute of Aeronautics and Astronautics.
DeCamp, L. S. (1995). The Ancient Engineers, New York, Ballantine Books.
Dwoyer, D. (1992). The relation between computational fluid dynamics and experiment. AIAA 17th Ground Testing Conference, Nashville, TN, American Institute of Aeronautics and Astronautics.
Edwards, P. N. (1997). The Closed World: Computers and the Politics of Discourse in Cold War America, Cambridge, MA, The MIT Press.
Harlow, F. H. and Fromm, J. E. (1965). Computer experiments in fluid dynamics. Scientific American, 212(3), 104–110.
Kahneman, D. and Tversky, A., Eds. (2000). Choices, Values, and Frames, Cambridge, UK, Cambridge University Press.
Kirby, R. S., Withington, S., Darling, A. B., and Kilgour, F. G. (1956). Engineering in History, New York, NY, McGraw-Hill.
Lynch, F. T., Crites, R. C., and Spaid, F. W. (1993). The crucial role of wall interference, support interference, and flow field measurements in the development of advanced aircraft configurations. Fluid Dynamics Panel Symposium: Wall Interference, Support Interference, and Flow Field Measurements, AGARD-CP-535, Brussels, Belgium, AGARD, 1.1–1.38.
Marvin, J. G. (1988). Accuracy requirements and benchmark experiments for CFD validation. Fluid Dynamics Panel Symposium: Validation of Computational Fluid Dynamics, AGARD-CP-437, Lisbon, Portugal, AGARD.
Mehta, U. B. (1991). Some aspects of uncertainty in computational fluid dynamics results. Journal of Fluids Engineering, 113(4), 538–543.
NaRC (1986). Current Capabilities and Future Directions in Computational Fluid Dynamics, Washington, DC, National Research Council.
Neumann, R. D. (1990). CFD validation – the interaction of experimental capabilities and numerical computations. 16th Aerodynamic Ground Testing Conference, AIAA Paper 90-3030, Portland, OR, American Institute of Aeronautics and Astronautics.
Oberkampf, W. L. (1994). A proposed framework for computational fluid dynamics code calibration/validation. 18th AIAA Aerospace Ground Testing Conference, AIAA Paper 94-2540, Colorado Springs, CO, American Institute of Aeronautics and Astronautics.
Oberkampf, W. L. and Aeschliman, D. P. (1992). Joint computational/experimental aerodynamics research on a hypersonic vehicle: Part 1, experimental results. AIAA Journal, 30(8), 2000–2009.
Roache, P. J. (2004). Building PDE codes to be verifiable and validatable. Computing in Science and Engineering, 6(5), 30–38.
Tetlock, P. E. (2005). Expert Political Judgment: How Good Is It? How Can We Know?, Princeton, NJ, Princeton University Press.
Top500 (2008). 32nd Edition of TOP500 Supercomputers.
Tversky, A. and Kahneman, D. (1992). Advances in prospect theory: cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5(4), 297–323.

  • Introduction
  • William L. Oberkampf, Christopher J. Roy, Virginia Polytechnic Institute and State University
  • Book: Verification and Validation in Scientific Computing
  • Online publication: 05 March 2013
  • Chapter DOI: https://doi.org/10.1017/CBO9780511760396.002
