This book addresses the experimental calibration of best-estimate numerical simulation models. The results of measurements and computations are never exact. Therefore, knowing only the nominal values of experimentally measured or computed quantities is insufficient for applications, particularly since the respective experimental and computed nominal values seldom coincide. In the author's view, the objective of predictive modeling is to extract "best estimate" values for model parameters and predicted results, together with "best estimate" uncertainties for these parameters and results. To achieve this goal, predictive modeling combines imprecisely known experimental and computational data, which calls for reasoning on the basis of incomplete, error-rich, and occasionally discrepant information.
The customary methods used for data assimilation combine experimental and computational information by minimizing an a priori, user-chosen, "cost functional" (usually a quadratic functional that represents the weighted errors between measured and computed responses). In contrast to these user-influenced methods, the BERRU (Best Estimate Results with Reduced Uncertainties) Predictive Modeling methodology developed by the author relies on the thermodynamics-based maximum entropy principle, eliminating the need to minimize user-chosen functionals and thus generalizing the "data adjustment" and "4D-VAR" data assimilation procedures used in the geophysical sciences. The BERRU predictive modeling methodology also provides a "model validation metric" which quantifies the consistency (agreement/disagreement) between measurements and computations. This "model validation metric" (or "consistency indicator") is constructed from parameter covariance matrices, response covariance matrices (measured and computed), and response sensitivities to model parameters.
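For concreteness, the two constructs just mentioned can be written in a generic form; the notation used below (parameter vector α with nominal values α⁰ and covariance matrix C_α, measured responses r^m with covariance matrix C_m, computed responses r^c(α), and sensitivity matrix S ≡ ∂r^c/∂α) is illustrative only and is not the book's exact notation. A typical user-chosen quadratic cost functional, and a chi-square-type consistency indicator built from the same covariance and sensitivity information, take the forms
\[
Q(\boldsymbol{\alpha}) =
\left[\mathbf{r}^{m} - \mathbf{r}^{c}(\boldsymbol{\alpha})\right]^{\top}\mathbf{C}_{m}^{-1}\left[\mathbf{r}^{m} - \mathbf{r}^{c}(\boldsymbol{\alpha})\right]
+ \left[\boldsymbol{\alpha} - \boldsymbol{\alpha}^{0}\right]^{\top}\mathbf{C}_{\alpha}^{-1}\left[\boldsymbol{\alpha} - \boldsymbol{\alpha}^{0}\right],
\]
\[
\chi^{2} = \mathbf{d}^{\top}\left(\mathbf{S}\,\mathbf{C}_{\alpha}\,\mathbf{S}^{\top} + \mathbf{C}_{m}\right)^{-1}\mathbf{d},
\qquad
\mathbf{d} \equiv \mathbf{r}^{c}(\boldsymbol{\alpha}^{0}) - \mathbf{r}^{m}.
\]
In this generic setting, a value of χ² comparable to the number of measured responses signals consistency between measurements and computations, whereas much larger values flag discrepant data.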
Traditional methods for computing response sensitivities are hampered by the "curse of dimensionality," which makes them impractical for applications to large-scale systems that involve many imprecisely known parameters. Reducing the computational effort required to calculate the response sensitivities accurately is therefore paramount, and the comprehensive adjoint sensitivity analysis methodology developed by the author shows great promise in this regard, as demonstrated in this book. After discarding inconsistent data (if any) using the consistency indicator, the BERRU predictive modeling methodology provides best-estimate values for the predicted parameters and responses, along with reduced best-estimate uncertainties (i.e., smaller predicted standard deviations) for the predicted quantities. Applying the BERRU methodology yields optimal, experimentally validated, "best estimate" predictive modeling tools for designing new technologies and facilities, while also improving on existing ones.
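The reduction of the predicted uncertainties can likewise be illustrated, in the same generic notation as above, by the familiar data-adjustment (generalized least-squares) update formulas, which the maximum-entropy-based BERRU framework generalizes; the expressions below are a sketch, not the book's derivation:
\[
\boldsymbol{\alpha}^{be} = \boldsymbol{\alpha}^{0}
- \mathbf{C}_{\alpha}\mathbf{S}^{\top}\left(\mathbf{S}\,\mathbf{C}_{\alpha}\,\mathbf{S}^{\top} + \mathbf{C}_{m}\right)^{-1}\mathbf{d},
\qquad
\mathbf{C}_{\alpha}^{be} = \mathbf{C}_{\alpha}
- \mathbf{C}_{\alpha}\mathbf{S}^{\top}\left(\mathbf{S}\,\mathbf{C}_{\alpha}\,\mathbf{S}^{\top} + \mathbf{C}_{m}\right)^{-1}\mathbf{S}\,\mathbf{C}_{\alpha}.
\]
Because the subtracted term in the best-estimate covariance matrix C_α^be is positive semidefinite, the best-estimate standard deviations can never exceed the prior ones, which is the sense in which the predicted uncertainties are "reduced."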