Date of Award
Master of Science (MS)
Juang, Charng-Hsein
Khan, Abdul A.
As prediction of the performance and behavior of complex engineering systems shifts from a primarily empirical approach to the use of complex physics-based numerical models, the role of experimentation is evolving toward calibrating, validating, and quantifying the uncertainty of those models. These experiments are often expensive, making it important to select experimental settings that calibrate the numerical model efficiently with a limited number of experiments. The aim of this thesis is to reduce the experimental resources required to reach predictive maturity in complex numerical models by (i) aiding experimenters in determining optimal settings for experiments, and (ii) aiding model developers in assessing the predictive maturity of numerical models through a new, more refined coverage metric.

Numerical model predictions entail uncertainties, caused primarily by imprecisely known input parameter values, and biases, caused primarily by simplifications and idealizations in the model. Hence, calibration of numerical models involves not only updating parameter values but also inferring the discrepancy bias, an empirically trained error model. Training this error model throughout the domain of applicability becomes possible when experiments conducted at varying settings are available. Of course, for the trained discrepancy bias to be meaningful and the numerical model to be predictively mature, the validation experiments must sufficiently cover the operational domain; otherwise, poor training of the discrepancy bias and overconfidence in model predictions may result. Coverage metrics therefore quantify the ability of a set of validation experiments to represent the entire operational domain.

This thesis is composed of two peer-reviewed journal articles. The first article focuses on the optimal design of validation experiments.
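The two-step calibration idea described above, updating a parameter value and then training an error model on the remaining discrepancy, can be sketched in a minimal one-dimensional example. This is a generic illustration, not the thesis's method: the model `model`, the hidden `true_process`, and the choice of a quadratic polynomial as the discrepancy model are all assumptions made here for demonstration.

```python
import numpy as np

# Hypothetical 1-D illustration: the "numerical model" f is missing a smooth
# bias present in the true process, so updating the parameter alone is not enough.
def model(x, theta):
    return theta * x  # simplified physics model with one uncertain parameter

def true_process(x):
    return 2.0 * x + 0.3 * x**2  # hidden truth: model form plus a bias term

rng = np.random.default_rng(0)
x_exp = np.linspace(0.0, 4.0, 9)                        # experimental settings
y_exp = true_process(x_exp) + rng.normal(0.0, 0.01, x_exp.size)

# Step 1: update the parameter value by least squares against the experiments.
theta_hat = np.sum(x_exp * y_exp) / np.sum(x_exp * x_exp)

# Step 2: train a discrepancy (error) model on the residuals; here a quadratic
# polynomial stands in for an empirically trained bias model.
residuals = y_exp - model(x_exp, theta_hat)
bias_coeffs = np.polyfit(x_exp, residuals, deg=2)

def corrected_prediction(x):
    """Calibrated model prediction plus the trained discrepancy bias."""
    return model(x, theta_hat) + np.polyval(bias_coeffs, x)
```

Because the experiments at varying settings span the interval [0, 4], the residual trend is learnable there; outside that range the trained bias extrapolates, which is exactly the overconfidence risk that motivates coverage metrics.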
The ability to improve the predictive maturity of a plasticity material model is assessed for several index-based and distance-based batch sequential design selection criteria through a detailed analysis of discrepancy bias and coverage. Furthermore, the effects of experimental uncertainty, complexity of the discrepancy bias, and initial experimental settings on the performance of each criterion are evaluated. Lastly, a technique that integrates index-based and distance-based selection criteria, both exploiting the available knowledge of the discrepancy bias and exploring the operational domain, is evaluated. This article was published in Structural and Multidisciplinary Optimization in 2013.

The second article focuses on developing a coverage metric. Four characteristics of an exemplary coverage metric are identified, and the ability of coverage metrics from the literature to satisfy the four criteria is evaluated. No existing coverage metric is found to satisfy all four criteria. As a solution, a new coverage metric is proposed that exhibits satisfactory performance on all four criteria. The performance of the proposed coverage metric is compared to that of existing coverage metrics in an application to the plasticity material model as well as to a high-dimensional Rosenbrock function. This article was published in Mechanical Systems and Signal Processing in 2014.
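To make the notion of a coverage metric concrete, here is a minimal distance-based sketch. It is not the metric proposed in the thesis: the helper `coverage_score`, its normalization by the domain diagonal, and the toy two-dimensional domain (chosen to resemble the input space of a Rosenbrock test function) are all assumptions made for illustration.

```python
import numpy as np

def coverage_score(experiments, bounds, n_samples=10_000, seed=0):
    """Generic distance-based coverage sketch: sample the operational domain
    and measure how far, on average, the nearest validation experiment lies.
    Scores near 1 mean the experiments blanket the domain; lower scores mean
    large unexplored regions."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    candidates = rng.uniform(lo, hi, size=(n_samples, lo.size))
    # Distance from each candidate point to its nearest experiment.
    d = np.linalg.norm(candidates[:, None, :] - experiments[None, :, :], axis=2)
    nearest = d.min(axis=1)
    # Normalize by the domain diagonal so the score is dimensionless.
    diagonal = np.linalg.norm(hi - lo)
    return 1.0 - nearest.mean() / diagonal

# Toy 2-D operational domain, e.g. the input space of a Rosenbrock function.
bounds = [(-2.0, 2.0), (-1.0, 3.0)]
sparse = np.array([[0.0, 1.0]])                      # a single experiment
dense = np.array([[-1.0, 0.0], [0.0, 1.0], [1.0, 2.0],
                  [-1.0, 2.0], [1.0, 0.0]])          # experiments spread out
```

Comparing `coverage_score(dense, bounds)` against `coverage_score(sparse, bounds)` shows the spread-out design scoring higher, which is the qualitative behavior any coverage metric must exhibit; the thesis evaluates candidate metrics against four more demanding criteria than this sketch satisfies.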
Egeberg, Matthew, "Optimal Design of Validation Experiments for Calibration and Validation of Complex Numerical Models" (2014). All Theses. 1893.