
15 Deep learning in high dimension: ReLU neural network expression for Bayesian PDE inversion

From the book Optimization and Control for Partial Differential Equations

  • Joost A. A. Opschoor, Christoph Schwab and Jakob Zech

Abstract

We establish dimension-independent expression rates by deep ReLU networks for certain countably parametric maps, so-called (b, ε, X)-holomorphic functions. These are mappings u : [−1, 1]^ℕ → X, with X a Banach space, that admit analytic extensions to certain polyellipses in each of the input variables. Parametric maps of this type occur in uncertainty quantification for partial differential equations with uncertain inputs from function spaces, upon the introduction of bases. For such maps, we prove (constructive) expression rate bounds by families of deep neural networks, based on multilevel polynomial chaos expansions. We show that (b, ε, X)-holomorphy implies summability and sparsity of coefficients in generalized polynomial chaos (gPC) expansions. This, in turn, implies deep neural network expression rate bounds. We apply the results to Bayesian inverse problems for partial differential equations with distributed uncertain inputs from Banach spaces. Our results imply the existence of “neural Bayesian posteriors” emulating the posterior densities with expression rate bounds that are free from the curse of dimensionality and limited only by the sparsity of certain gPC expansions. We prove that the neural Bayesian posteriors are robust in the large data or small noise asymptotics (e.g., [42]) and can be emulated in a noise-robust fashion.
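
To fix ideas, the following is a minimal sketch, in notation assumed here rather than taken verbatim from the chapter, of the two objects the abstract names: the gPC expansion of the parametric map and the polyellipses that quantify its holomorphy.

% Sketch (assumed notation): gPC expansion of u : [-1,1]^N -> X over finitely
% supported multi-indices nu, with tensorized Legendre polynomials L_{nu_j}
% and X-valued coefficients c_nu.
\[
  u(y) \;=\; \sum_{\nu \in \mathcal{F}} c_\nu \prod_{j \ge 1} L_{\nu_j}(y_j),
  \qquad
  \mathcal{F} \;=\; \bigl\{ \nu \in \mathbb{N}_0^{\mathbb{N}} :
    \text{only finitely many } \nu_j \neq 0 \bigr\}.
\]
% Holomorphy in the variable y_j is quantified by analytic extension of u to a
% Bernstein ellipse of semiaxis sum rho_j > 1 (the polyellipse is the product
% of such ellipses over j):
\[
  \mathcal{E}_{\rho_j} \;=\;
  \Bigl\{ \tfrac{1}{2}\,(z + z^{-1}) \;:\; 1 \le |z| \le \rho_j \Bigr\}
  \subset \mathbb{C}.
\]
% (b, eps, X)-holomorphy ties the admissible sequence (rho_j) to a summable
% sequence b and a tolerance eps; the resulting summability of (||c_nu||_X)
% is the coefficient sparsity behind the dimension-independent ReLU
% expression rate bounds stated above.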

© 2022 Walter de Gruyter GmbH, Berlin/Boston