In this paper we deal with Nesterov acceleration and show that it speeds up Landweber iteration when applied to linear ill-posed problems. It is proven that, if the exact solution satisfies a source condition x† ∈ R((T*T)^μ), then optimal convergence rates are obtained if μ ≤ 1/2 and if the iteration is terminated according to an a priori stopping rule. If μ > 1/2 or if the iteration is terminated according to the discrepancy principle, only suboptimal convergence rates can be guaranteed. Nevertheless, the number of iterations for Nesterov acceleration is always much smaller if the dimension of the problem is large. Numerical results confirm the theoretical ones.
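As a minimal sketch of the two methods compared in the abstract: plain Landweber iteration updates x_{k+1} = x_k + ω T*(y − T x_k), while the Nesterov-accelerated variant first forms an extrapolated point with a momentum term before applying the same Landweber step. The momentum weight (k − 1)/(k + 2) below is one standard choice, and the diagonal toy operator with singular values j^{-2} is an illustrative assumption, not an example from the paper.

```python
import numpy as np

def landweber(T, y, omega, n_iter):
    """Classical Landweber iteration: x_{k+1} = x_k + omega * T^T (y - T x_k)."""
    x = np.zeros(T.shape[1])
    for _ in range(n_iter):
        x = x + omega * T.T @ (y - T @ x)
    return x

def nesterov_landweber(T, y, omega, n_iter):
    """Landweber iteration with Nesterov momentum:
         z_k     = x_k + (k - 1)/(k + 2) * (x_k - x_{k-1})
         x_{k+1} = z_k + omega * T^T (y - T z_k)
    """
    x_prev = np.zeros(T.shape[1])
    x = np.zeros(T.shape[1])
    for k in range(1, n_iter + 1):
        z = x + (k - 1.0) / (k + 2.0) * (x - x_prev)
        x_prev = x
        x = z + omega * T.T @ (y - T @ z)
    return x

# Hypothetical ill-posed test problem: diagonal operator with rapidly
# decaying singular values sigma_j = j^{-2}, exact solution of all ones.
n = 50
sigma = np.arange(1, n + 1, dtype=float) ** -2.0
T = np.diag(sigma)
x_true = np.ones(n)
y = T @ x_true                      # exact (noise-free) data

omega = 1.0                         # admissible step size, since ||T||^2 = 1
x_land = landweber(T, y, omega, n_iter=50)
x_nest = nesterov_landweber(T, y, omega, n_iter=50)

res_land = np.linalg.norm(y - T @ x_land)
res_nest = np.linalg.norm(y - T @ x_nest)
```

After the same number of iterations, the accelerated variant typically leaves a noticeably smaller residual on such problems, which illustrates the speed-up claimed in the abstract; with noisy data one would instead stop early via the discrepancy principle.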
© 2017 Walter de Gruyter GmbH, Berlin/Boston