
12 - Model accuracy assessment

from Part IV - Model validation and prediction

Published online by Cambridge University Press:  05 March 2013

Christopher J. Roy
Affiliation:
Virginia Polytechnic Institute and State University

Summary

As has been discussed in a number of chapters, particularly Chapter 10, Model validation fundamentals, and Chapter 11, Design and execution of validation experiments, model accuracy assessment is the core issue of model validation. Our intent in model accuracy assessment is to critically and quantitatively determine the ability of a mathematical model, and its embodiment in a computer code, to simulate a well-characterized physical process. We are, of course, only interested in well-characterized physical processes that are useful for model validation. How critical and quantitative the model accuracy assessment can be depends on (a) how extensively the experimental data set explores the important model input quantities that affect the system response quantities (SRQs) of interest; (b) how well characterized the important model input quantities are, based on measurements in the experiments; (c) how well characterized the experimental measurements and the model predictions of the SRQs of interest are; (d) whether the experimental measurements of the SRQs were available to the computational analyst before the model accuracy assessment was conducted; and (e) if the SRQs were available to the computational analyst, whether they were used for model updating or model calibration. This chapter will explore these difficult issues both conceptually and quantitatively.

We begin the chapter by discussing the fundamental elements of model accuracy assessment. As part of this discussion, we review traditional and recent methods for comparing model results and experimental measurements, and we explore the relationship between model accuracy assessment, model calibration, and model prediction. Beginning with the engineering society definitions of terms given in Chapter 2, Fundamental concepts and terminology, the perspective of this book is to segregate these activities as much as possible. There is, however, an alternative perspective in the published literature holding that all of these activities should be combined. We briefly review this alternative perspective and the associated approaches, and contrast them with approaches that segregate these activities.
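To make the idea of a quantitative comparison between model results and experimental measurements concrete, the following is a minimal, illustrative sketch rather than any method taken from this book: a mean-difference metric that estimates model error as the difference between the mean of repeated experimental measurements of an SRQ and the model prediction, with a Student-t confidence interval reflecting experimental uncertainty. The function name `validation_metric`, the small t-table, and the data are all hypothetical.

```python
import statistics

# Two-sided 95% Student-t critical values, keyed by sample size n
# (i.e., n - 1 degrees of freedom). Small illustrative lookup only.
T95 = {3: 4.303, 4: 3.182, 5: 2.776, 6: 2.571,
       7: 2.447, 8: 2.365, 9: 2.306, 10: 2.262}

def validation_metric(measurements, prediction):
    """Estimated model error (measurement mean minus prediction) and a
    95% confidence interval on that error due to measurement scatter."""
    n = len(measurements)
    e_bar = statistics.mean(measurements) - prediction   # estimated error
    s = statistics.stdev(measurements)                   # sample std. dev.
    half_width = T95[n] * s / n ** 0.5                   # CI half-width
    return e_bar, (e_bar - half_width, e_bar + half_width)

# Hypothetical data: five repeated measurements of an SRQ vs. one prediction.
err, (lo, hi) = validation_metric([10.2, 9.8, 10.5, 10.1, 9.9],
                                  prediction=9.5)
```

A nonzero estimated error whose confidence interval excludes zero suggests a model deficiency larger than the experimental scatter; an interval that straddles zero means the comparison is inconclusive at that measurement uncertainty.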

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2010


