# Methodologies for Treating Model Uncertainty and Discretization Error in Modeling and Simulation of Physical Systems

Thursday, September 16, 1999 - 10:10am - 10:50am

Lind 409

Kenneth Alvin (Sandia National Laboratories)

Numerical simulation has become an integral part of the design, qualification, and certification process for nearly all mechanical systems. Given the explosive growth in computational resources at all levels of science and engineering, modeling and simulation is assuming an ever greater responsibility in the design process, while physical testing is often relegated to a secondary role. Increasing the reliability of mathematical models requires that we gain a greater understanding of the simplifying assumptions employed in the model, and of the influence of potential modeling errors or uncertainties on the response of the model. We can decompose the elements of a mathematical model into a vector of physical parameters used in forming equations, such as material response parameters or geometric dimensions, and the model structure, which includes the form of the mathematical equations in their differential and algebraic forms. Variability in the physical parameters can be addressed by many existing computational propagation methods, which are designed to propagate distributions on the parameters through a fixed model structure in order to estimate statistics of interest on the model response quantities.
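As a minimal sketch of such parameter propagation (not the authors' specific method), a Monte Carlo approach samples the uncertain physical parameters, runs each sample through the fixed model structure, and computes statistics on the response. The model, parameter distributions, and numerical values below are illustrative assumptions:

```python
import numpy as np

# Hypothetical model structure: tip deflection of a cantilever beam
# under an end load. The formula (the model structure) is held fixed;
# only the physical parameters E (Young's modulus) and L (length)
# carry uncertainty here. P and I are treated as known constants.
def tip_deflection(E, L, P=1000.0, I=1e-6):
    return P * L**3 / (3.0 * E * I)

rng = np.random.default_rng(0)
n = 100_000

# Assumed parameter distributions (illustrative values only).
E = rng.normal(200e9, 10e9, n)   # Pa
L = rng.normal(1.0, 0.01, n)     # m

# Propagate the samples through the fixed model structure and
# estimate statistics of interest on the response quantity.
y = tip_deflection(E, L)
print(f"mean deflection = {y.mean():.6e} m, std = {y.std():.3e} m")
```

Any propagation scheme (Latin hypercube sampling, polynomial chaos, reliability methods) plays the same role; the key limitation the abstract highlights is that the model structure itself is never varied.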

Uncertainty and error inherent in the model structure are a more pernicious and (unfortunately) more frequently ignored aspect of uncertainty analysis. The assumptions and approximations employed are typically made out of necessity, and yet estimating their effect requires that those assumptions be changed in some way. The usual approach to this seemingly intractable problem is model validation; that is, the use of physical experiments to show that the given modeling errors are sufficiently small to make useful predictions for the problem of interest. Unfortunately, validation of the model structure itself is still primarily qualitative, even when statistical techniques are employed over a physical parameter space. Furthermore, established methods for estimating certain sources of model error, such as that due to discretization, are often neglected in validation and prediction studies. Hence, extending the conclusions of a validation exercise to other applications of the same model structure involves an unquantified level of risk.

We present some initial investigations into the problem of quantifying the effects of model uncertainty and error, through both probabilistic and non-probabilistic approaches. In both cases, variations or biases in model structure are coupled into traditional uncertainty analysis in order to estimate higher-order uncertainty or bias in the statistics computed from the uncertainty analysis. Some sources of error, such as the effect of discretizing partial differential equations, can be modeled in an explicit non-probabilistic fashion. Other sources of error, such as simplifying assumptions, may be difficult or impossible to model and/or bound. In such cases, it can be more effective to model the error as a worst-case bias of some assumed magnitude, and then estimate the effects of this bias on the uncertainty analysis. If the results of the uncertainty analysis are stable in the presence of an assumed conservative error bound, we have demonstrated increased reliability of the model. Finally, some model structure elements can be treated as uncertainties and assessed through Bayesian inference. The Bayesian Model Averaging (BMA) approach of Draper appears promising for treatment of alternative forms of submodels in the empirical modeling of material response, as well as in the metamodeling and model reduction techniques that are frequently employed within uncertainty and error estimation algorithms.
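One standard way to model discretization error explicitly and non-probabilistically (offered here as an illustrative sketch, not necessarily the authors' technique) is Richardson extrapolation: compare solutions at two mesh resolutions and use the known convergence order of the scheme to bound the error. The quadrature problem below stands in for a PDE discretization:

```python
import numpy as np

# Richardson extrapolation for discretization-error estimation.
# The trapezoid rule here is a stand-in for any discretized operator
# with known convergence order p.
def trapezoid(f, a, b, n):
    # Composite trapezoid rule on n subintervals of [a, b].
    x = np.linspace(a, b, n + 1)
    y = f(x)
    h = (b - a) / n
    return h * (y[0] / 2.0 + y[1:-1].sum() + y[-1] / 2.0)

f = np.sin
a, b = 0.0, np.pi            # exact integral of sin on [0, pi] is 2
coarse = trapezoid(f, a, b, 16)   # mesh size h
fine = trapezoid(f, a, b, 32)     # mesh size h/2
p = 2                             # trapezoid rule is second order

# Estimated discretization error in the fine-grid solution,
# and the extrapolated (error-corrected) value.
err_fine = (fine - coarse) / (2**p - 1)
extrapolated = fine + err_fine
print(f"fine = {fine:.8f}, estimated error = {err_fine:.2e}")
print(f"extrapolated = {extrapolated:.8f}")
```

The estimated error is an explicit, deterministic quantity that can be carried into the uncertainty analysis alongside the parametric variability.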
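The worst-case bias argument can be sketched in a few lines: shift the model response by an assumed conservative bias bound and check whether the statistic of interest remains stable. The response distribution, threshold, and bias magnitude below are all illustrative assumptions, not values from the study:

```python
import numpy as np

# Sketch of the worst-case bias stability check: an unmodeled
# structural error is represented as a bias of assumed magnitude
# delta on the model output, pushed in the unfavorable direction.
rng = np.random.default_rng(1)
n = 200_000

# Nominal model response under parameter uncertainty alone
# (illustrative distribution).
response = rng.normal(1.0, 0.1, n)
threshold = 1.4      # hypothetical failure threshold
delta = 0.02         # assumed conservative bound on structural bias

p_nominal = np.mean(response > threshold)
p_worst = np.mean(response + delta > threshold)  # bias toward failure

print(f"P(fail) nominal = {p_nominal:.2e}, worst-case = {p_worst:.2e}")
# If p_worst stays acceptably close to p_nominal, the conclusion of
# the uncertainty analysis is robust to the assumed structural error.
```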

* Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under contract DE-AC04-94AL85000.
