Overview on: Statistics in uncertainty quantification and uncertainty quantification in statistics, by Dr. Amy Braverman.
Traditionally, the UQ community has focused on quantifying uncertainties in the predictions made by deterministic models. Statistics provides an alternative formalism for modeling unknown processes using data. These two approaches represent the ends of a spectrum of knowledge about the processes of interest, and one or the other may therefore be more appropriate depending on the problem. However, we are not restricted to just one or the other: hybrid approaches are possible, and indeed are already used for some problems. In addition, statistical tools are heavily used in UQ, and various applied mathematical tools are incorporated into statistical computing. The point is that the endeavor known as “uncertainty quantification” is much broader than the way it is traditionally understood, and it should be reframed to acknowledge that reality. Not only will this increase synergies among existing techniques and lead to more powerful approaches to a wider range of problems, but the increased diversity of viewpoints will also benefit the intellectual environment of our community. This presentation will discuss these issues in the hope of stimulating a conversation about how we can move in this direction.
Overview on: Roles of sensitivity analysis and uncertainty quantification for science and engineering models, by Dr. Ralph Smith.
This presentation will focus on the use of sensitivity analysis and uncertainty quantification for applications arising in science and engineering.
First, pertinent issues will be illustrated in the context of weather and climate modeling, applications utilizing smart materials for energy harvesting, biology models, radiation source localization in an urban environment, and simulation codes employed for nuclear power plant design. These examples will demonstrate that the basic UQ goal is to ascertain the uncertainties inherent in parameters, initial and boundary conditions, experimental data, and the models themselves, in order to make predictions with quantified and improved accuracy.
The use of data to improve the predictive accuracy of models is central to uncertainty quantification, and we will discuss the use of Bayesian techniques to construct distributions for model inputs. Specifically, the focus will be on algorithms that are both highly robust and efficient to implement. The discussion will subsequently turn to the use of sensitivity analysis to isolate critical model inputs and reduce model complexity. This will include both local sensitivity analysis, based on derivatives of the response with respect to model parameters, and variance-based techniques, which determine how uncertainties in responses are apportioned to uncertainties in parameters. The presentation will conclude with a discussion detailing the manner in which model discrepancy must be addressed to construct time-dependent models that can adequately predict future events. An important aspect of this presentation is that all concepts will be illustrated with a suite of both fundamental and large-scale examples from biology and engineering.
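As a concrete illustration of the variance-based techniques mentioned above, the following minimal Python sketch estimates first-order and total-effect Sobol' indices by Monte Carlo pick-and-freeze sampling for the classical Ishigami test function. The test function, sample size, and estimators (Saltelli-style) are illustrative assumptions for this abstract, not the specific algorithms of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def ishigami(x, a=7.0, b=0.1):
    # Classical three-parameter test function used to benchmark sensitivity methods.
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

d, n = 3, 100_000
# Two independent sample matrices on [-pi, pi]^3.
A = rng.uniform(-np.pi, np.pi, size=(n, d))
B = rng.uniform(-np.pi, np.pi, size=(n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # "pick-and-freeze": vary only parameter i
    fABi = ishigami(ABi)
    S_i = np.mean(fB * (fABi - fA)) / var      # first-order index (Saltelli-type)
    T_i = 0.5 * np.mean((fA - fABi)**2) / var  # total-effect index (Jansen-type)
    print(f"x{i+1}: first-order ~ {S_i:.3f}, total ~ {T_i:.3f}")
```

The printed indices show how the output variance is apportioned to each input, which is precisely the information used to isolate critical model inputs and reduce model complexity.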
Overview on: Inverse problems: integrating data with PDE-based models under uncertainty, by Dr. Noemi Petra.
Recent years have seen tremendous growth in the volumes of observational and experimental data. In this context, one fundamental question is: how do we extract knowledge from this data? When the data correspond to observations of physical systems (represented by mathematical models), this knowledge-from-data problem is fundamentally an inverse problem. This presentation aims to introduce the mathematical and computational aspects of inverse problems governed by partial differential equations, particularly modern developments that emphasize the quantification of uncertainty in the inverse solution within the framework of Bayesian inference. The concepts introduced in this talk will be demonstrated using hIPPYlib, the Inverse Problem Python library. hIPPYlib is an extensible software framework for the solution of large-scale deterministic and Bayesian inverse problems governed by partial differential equations with possibly infinite-dimensional parameter fields, which become high-dimensional after discretization. hIPPYlib overcomes the prohibitive cost of Bayesian inversion for this class of problems by implementing state-of-the-art scalable algorithms for PDE-based inverse problems that exploit the structure of the underlying operators, notably the Hessian of the log-posterior. The fast and scalable (with respect to both parameter and data dimensions) algorithms in hIPPYlib allow us to address critical questions in applying numerical simulations to potentially large-scale problems.
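To make the role of the Hessian concrete, here is a minimal finite-dimensional sketch, in plain NumPy rather than the hIPPYlib API, of a linear-Gaussian Bayesian inverse problem. In this setting the Hessian of the negative log-posterior is exactly the posterior precision, and the MAP point solves a single linear system. The forward operator, dimensions, noise level, and prior are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical smoothing forward operator F: parameter m -> data d (ill-posed).
n_param, n_data = 50, 20
x = np.linspace(0, 1, n_param)
t = np.linspace(0, 1, n_data)
F = np.exp(-((t[:, None] - x[None, :]) ** 2) / (2 * 0.05**2))  # blurring kernel
F /= F.sum(axis=1, keepdims=True)

m_true = np.sin(2 * np.pi * x)                 # "true" parameter field (assumed)
sigma = 0.01                                   # noise standard deviation (assumed)
d = F @ m_true + sigma * rng.standard_normal(n_data)

# Gaussian prior N(0, Gamma_pr) and noise N(0, sigma^2 I): the negative
# log-posterior is J(m) = 0.5*||F m - d||^2 / sigma^2 + 0.5*m^T Gamma_pr^{-1} m.
Gamma_pr_inv = 10.0 * np.eye(n_param)          # prior precision (assumed)

# Hessian of J = posterior precision; exploiting its structure (here formed
# explicitly, at scale approximated matrix-free) is what makes Bayesian
# inversion tractable for PDE-governed problems.
H = F.T @ F / sigma**2 + Gamma_pr_inv
m_map = np.linalg.solve(H, F.T @ d / sigma**2)  # MAP point: one Newton step
Gamma_post = np.linalg.inv(H)                   # posterior covariance (exact here)
print("MAP relative error:", np.linalg.norm(m_map - m_true) / np.linalg.norm(m_true))
print("max posterior std:", np.sqrt(np.diag(Gamma_post)).max())
```

For nonlinear, large-scale problems the same quantities cannot be formed explicitly; frameworks such as hIPPYlib instead apply the Hessian matrix-free and exploit its low-rank-plus-prior structure, which is the scalability point emphasized above.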
Spotlight on: Spatial statistics and uncertainty quantification in remote sensing and climate science, by Dr. Emily Kang.
This presentation will discuss two examples of uncertainty quantification in climate and environmental sciences. The first part of the talk will be on the development of new multivariate spatial statistical models to enhance climate projections by combining climate model output and observational data. Emphasis will be put on the importance of UQ for spatial and spatio-temporal processes and the need for new mathematical and statistical methods to better understand climate model outputs. The second part of the talk will introduce a statistical emulator designed for remote sensing applications. This emulator is built using dimension reduction and classical Gaussian process regression, and it can be extended and compared to machine learning methods for constructing surrogate models.
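As a rough sketch of the emulator construction described above (dimension reduction followed by Gaussian process regression), the following Python fragment reduces high-dimensional synthetic "spectra" with PCA and fits one GP per retained component using scikit-learn. The toy forward model, kernel, and number of components are illustrative assumptions, not the configuration used in the talk.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(2)

# Synthetic stand-in for an expensive forward model: a 2-D input theta
# produces a high-dimensional output "spectrum" y(theta).
wav = np.linspace(0, 1, 200)
def forward(theta):
    return np.exp(-((wav - theta[0]) ** 2) / 0.02) + theta[1] * wav

theta_train = rng.uniform(0.2, 0.8, size=(80, 2))
Y_train = np.array([forward(t) for t in theta_train])    # shape (80, 200)

# Step 1: dimension reduction of the output space.
pca = PCA(n_components=5).fit(Y_train)
scores = pca.transform(Y_train)                          # shape (80, 5)

# Step 2: an independent GP per retained component, mapping theta -> score.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gps = [GaussianProcessRegressor(kernel=kernel, normalize_y=True)
       .fit(theta_train, scores[:, j]) for j in range(scores.shape[1])]

# Emulate at a new input: predict scores, then map back to the full spectrum.
theta_new = np.array([[0.5, 0.4]])
scores_new = np.column_stack([gp.predict(theta_new) for gp in gps])
y_emulated = pca.inverse_transform(scores_new)
print("emulation error:", np.linalg.norm(y_emulated[0] - forward(theta_new[0])))
```

Because each GP works in the low-dimensional score space rather than on the full output, this design keeps training and prediction cheap even when each spectrum has hundreds of channels, which is the motivation for combining dimension reduction with GP regression in remote sensing surrogates.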
Spotlight on: Optimization problems governed by PDEs under the influence of random variables, by Dr. Maria Strazzullo.
Parametric optimal control problems aim to narrow the gap between collected data and mathematical models, enabling more reliable and accurate simulations. These problems are common in various scientific contexts; however, their computational complexity often limits their applicability, especially in uncertain parametric settings that require repeated evaluations over many parameter values. Realistic problems typically involve parameters affected by uncertainty due to scattered or missing data and noisy measurements.
This presentation will focus on optimization problems governed by parametric partial differential equations under the influence of random variables. To estimate statistical quantities, such as moments of the solutions, Monte Carlo estimators are employed. These estimators average over numerous optimal solutions computed for different realizations of the random parameters. This approach requires many simulations for different parameters, making standard discretization techniques infeasible due to their time-consuming and computationally expensive nature. To address this challenge, we employ reduced-order models (ROMs). These models exploit the parametric structure of the problem, identifying a low-dimensional representation known as the reduced space. Through Galerkin projection onto this reduced space, the problem can be solved faster without compromising accuracy. These strategies accelerate standard statistical analysis techniques. Specifically, we focus on weighted ROMs (wROMs), tailored reduction strategies based on prior knowledge about the distribution of the random variables. Enhancing the reduced model with this distributional information further accelerates simulations for new parametric instances, surpassing the capabilities of standard ROMs. This presentation will start with an introductory overview of optimal control for parametric PDEs and standard ROMs. Then, it will delve into wROMs and explore their applications in environmental sciences and convection-dominated flows, in both steady and time-dependent settings.
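To fix ideas, here is a minimal Python sketch of the weighted-ROM ingredients on a toy parametric problem (a 1-D reaction-diffusion equation with a random reaction coefficient): snapshots are weighted by the parameter density before POD, the reduced operator comes from Galerkin projection, and the ROM then accelerates a Monte Carlo estimate of the mean solution. The model, distribution, basis size, and sample counts are illustrative assumptions; this sketches the general strategy, not the wROM algorithms from the talk.

```python
import numpy as np
from scipy.stats import beta

# Toy parametric problem: -u'' + mu*u = 1 on (0,1), u(0)=u(1)=0, finite
# differences; A(mu) = A0 + mu*I is affine in the parameter.
n = 200
h = 1.0 / (n + 1)
A0 = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
      - np.diag(np.ones(n - 1), -1)) / h**2
f = np.ones(n)
solve_full = lambda mu: np.linalg.solve(A0 + mu * np.eye(n), f)

# Random parameter mu ~ 50 * Beta(2, 5) (assumed distribution).
dist = beta(2, 5)
pdf = lambda mu: dist.pdf(mu / 50.0) / 50.0

# Weighted POD: each training snapshot is scaled by the square root of its
# quadrature weight times the parameter density, so the basis targets
# accuracy where the distribution puts its mass (the wROM idea).
mus = np.linspace(0.0, 50.0, 60)
w = pdf(mus) * (mus[1] - mus[0])
S = np.column_stack([np.sqrt(wj) * solve_full(mj) for mj, wj in zip(mus, w)])
V = np.linalg.svd(S, full_matrices=False)[0][:, :5]      # 5 POD modes

# Galerkin projection: precompute the small reduced affine operators once.
A0_r, I_r, f_r = V.T @ A0 @ V, V.T @ V, V.T @ f          # I_r = identity here
solve_rom = lambda mu: V @ np.linalg.solve(A0_r + mu * I_r, f_r)

# Monte Carlo estimate of E[u]: cheap reduced solves vs. full-order reference
# on the same samples, so the difference reflects ROM error only.
samples = 50.0 * dist.rvs(size=500, random_state=3)
u_mean_rom = np.mean([solve_rom(mu) for mu in samples], axis=0)
u_mean_ref = np.mean([solve_full(mu) for mu in samples], axis=0)
print("relative difference vs. full-order mean:",
      np.linalg.norm(u_mean_rom - u_mean_ref) / np.linalg.norm(u_mean_ref))
```

Each reduced solve involves a 5-by-5 system instead of a 200-by-200 one, which is the mechanism by which ROMs, and distribution-aware wROMs in particular, make many-query Monte Carlo statistics affordable.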