Abstract
We establish dimension-independent expression rates by deep ReLU networks for certain countably parametric maps, so-called (b, ε, X)-holomorphic functions. These are mappings [−1, 1]^ℕ → X, with X a Banach space, that admit analytic extensions to certain polyellipses in each of the input variables. Parametric maps of this type occur in uncertainty quantification for partial differential equations with uncertain inputs from function spaces, upon the introduction of bases. For such maps, we prove (constructive) expression rate bounds by families of deep neural networks, based on multilevel polynomial chaos expansions. We show that (b, ε, X)-holomorphy implies summability and sparsity of coefficients in generalized polynomial chaos (gPC) expansions, which in turn implies deep neural network expression rate bounds. We apply the results to Bayesian inverse problems for partial differential equations with distributed uncertain inputs from Banach spaces. Our results imply the existence of “neural Bayesian posteriors” emulating the posterior densities, with expression rate bounds that are free from the curse of dimensionality and limited only by the sparsity of certain gPC expansions. We further prove that the neural Bayesian posteriors are robust in the large-data or small-noise asymptotics (e.g., [42]): the concentrating posterior densities can be emulated in a noise-robust fashion.
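For orientation, a commonly used formulation of (b, ε, X)-holomorphy from the quantified-holomorphy literature is sketched below; the constant C and the Bernstein-ellipse notation E_{ρ_j} are assumed here for illustration and need not match the notation used in the body of the paper. A map u : [−1, 1]^ℕ → X is (b, ε, X)-holomorphic if there exists C > 0 such that, for every sequence ρ = (ρ_j)_{j≥1} with ρ_j > 1,

\[
  \sum_{j \ge 1} b_j\,(\rho_j - 1) \;\le\; \varepsilon
  \quad\Longrightarrow\quad
  u \ \text{extends holomorphically to}\ \mathcal{E}_\rho := \bigotimes_{j \ge 1} \mathcal{E}_{\rho_j},
  \qquad
  \sup_{z \in \mathcal{E}_\rho} \|u(z)\|_X \;\le\; C,
\]

where E_{ρ_j} ⊂ ℂ denotes the Bernstein ellipse with foci ±1 and semi-axis sum ρ_j, so that E_ρ is the polyellipse referred to above.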