Based on the idea of using different preconditioners in FGMRES, the multipreconditioned GMRES (MPGMRES), in which two or more preconditioners are applied simultaneously, was proposed by Greif et al. [23]. Assume there are *k* preconditioners *M*_{1}, …, *M*_{k}, *k* ≥ 2. At the start, for the initial residual *r*_{0}, we set *v*_{1} = *r*_{0}/∥*r*_{0}∥_{2} and

$$\begin{array}{c}{\displaystyle {Z}_{1}=({M}_{1}^{-1}{v}_{1},\dots ,{M}_{k}^{-1}{v}_{1})\in {\mathcal{R}}^{n\times k},}\end{array}$$(9)

such that the first iterate is computed as *x*_{1} = *x*_{0} + *Z*_{1}*y*_{1}, where the vector *y*_{1} ∈ 𝓡^{k} is chosen to minimize ∥*b* – 𝓐*x*_{1}∥_{2}. From (9), it is easy to see that using all preconditioners simultaneously enlarges the space in which the solution is sought. Similar to FGMRES, the multipreconditioned GMRES (MPGMRES) algorithm is given as follows.
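To make this first step concrete, here is a minimal NumPy sketch. The test matrix, right-hand side, and the two Jacobi-type preconditioners are illustrative assumptions, not from the source; the sketch builds *Z*_{1} as in (9) and solves the small *k*-dimensional least-squares problem for *y*_{1}.

```python
import numpy as np

# Illustrative data: a small nonsymmetric matrix A and right-hand side b.
rng = np.random.default_rng(0)
n, k = 8, 2
A = np.diag(np.arange(1.0, n + 1)) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
x0 = np.zeros(n)

# Two hypothetical preconditioners, applied via their inverses:
# M1^{-1} is Jacobi, M2^{-1} a shifted Jacobi variant (illustrative choices).
M_inverses = [
    lambda v: v / np.diag(A),
    lambda v: v / (np.diag(A) + 1.0),
]

r0 = b - A @ x0
v1 = r0 / np.linalg.norm(r0)

# Z1 = (M1^{-1} v1, ..., Mk^{-1} v1), eq. (9): an n-by-k search block.
Z1 = np.column_stack([Minv(v1) for Minv in M_inverses])

# x1 = x0 + Z1 y1 with y1 minimizing ||b - A x1||_2,
# i.e. a k-dimensional least-squares problem for y1.
y1, *_ = np.linalg.lstsq(A @ Z1, r0, rcond=None)
x1 = x0 + Z1 @ y1

print(np.linalg.norm(b - A @ x1), np.linalg.norm(r0))
```

Since *y*_{1} = 0 already reproduces *r*_{0}, the minimizer can only decrease the residual norm; with *k* > 1 the minimization is over a *k*-dimensional space rather than the single direction used by standard GMRES.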

Compared with FGMRES, the search space increases at each iteration, and the relation (7) is replaced by:

$$\begin{array}{c}{\displaystyle \mathcal{A}{\stackrel{~}{Z}}_{m}={\stackrel{~}{V}}_{m+1}{\stackrel{~}{H}}_{m},}\end{array}$$(10)

where

$$\begin{array}{c}{\displaystyle {\stackrel{~}{Z}}_{m}=({Z}_{1},\dots ,{Z}_{m}),\phantom{\rule{2em}{0ex}}{\stackrel{~}{V}}_{m+1}=({V}_{1},\dots ,{V}_{m+1}),}\end{array}$$

and

$$\begin{array}{c}{\displaystyle {\stackrel{~}{H}}_{m}=\left(\begin{array}{cccc}{H}_{1,1}& {H}_{1,2}& \cdots & {H}_{1,m}\\ {H}_{2,1}& {H}_{2,2}& \cdots & {H}_{2,m}\\ & \ddots & \ddots & \vdots \\ & & {H}_{m,m-1}& {H}_{m,m}\\ & & & {H}_{m+1,m}\end{array}\right),}\end{array}$$

in which the matrices *V*_{j+1} and *H*_{j+1,j} (1 ≤ *j* ≤ *m*) are computed by the QR factorization in line 8 of the algorithm. Note that the matrices *Z*_{j} and *V*_{j+1} each have *k*^{j} columns, *j* = 1, …, *m*. Thus the matrix *V͠*_{m+1} has

$$\begin{array}{c}{\displaystyle {\theta}_{m}=\sum _{j=0}^{m}{k}^{j}=\frac{{k}^{m+1}-1}{k-1}}\end{array}$$(11)

columns, the matrix *Z͠*_{m} has *θ*_{m} – 1 = (*k*^{m+1} – *k*)/(*k* – 1) columns, and the size of the upper Hessenberg matrix *H͠*_{m} is *θ*_{m} × (*θ*_{m} – 1).
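The growth (11) is easy to check numerically; the short sketch below (with illustrative values of *k* and *m*) compares the summation with the closed form and shows how quickly the search space grows:

```python
# theta_m = sum_{j=0}^{m} k^j: number of columns of V~_{m+1}, eq. (11);
# Z~_m then has theta_m - 1 columns.
def theta(k: int, m: int) -> int:
    return sum(k**j for j in range(m + 1))

# The closed form (k^{m+1} - 1)/(k - 1) agrees term by term, and shows
# that the MPGMRES search space grows exponentially with the step m.
for k in (2, 3):
    assert all(theta(k, m) == (k**(m + 1) - 1) // (k - 1) for m in range(10))
    print(k, [theta(k, m) for m in range(5)])
# -> 2 [1, 3, 7, 15, 31]
# -> 3 [1, 4, 13, 40, 121]
```

This exponential growth is the practical drawback of the full MPGMRES: already for *k* = 2 the basis doubles at each step, which motivates the selective variants discussed in [23].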

Let 𝓟_{d} = 𝓟_{d}(*X*_{1}, …, *X*_{k}) be the space of polynomials in the *k* matrix variables *X*_{1}, …, *X*_{k} of degree at most *d*. Then at the *j*-th step of the MPGMRES, the approximate solution can be represented as:

$$\begin{array}{c}{\displaystyle {x}_{j}={x}_{0}+\sum _{i=1}^{k}{\omega}_{j}^{i}({M}_{1}^{-1}\mathcal{A},\dots ,{M}_{k}^{-1}\mathcal{A}){M}_{i}^{-1}{r}_{0},}\end{array}$$(12)

where *ω*_{j}^{i} ∈ 𝓟_{j–1}(*X*_{1}, …, *X*_{k}); see [23] for details. Furthermore, from (12), the corresponding residual can be computed as:

$$\begin{array}{}{\displaystyle {r}_{j}={r}_{0}-\sum _{i=1}^{k}{\omega}_{j}^{i}(\mathcal{A}{M}_{1}^{-1},\dots ,\mathcal{A}{M}_{k}^{-1})\mathcal{A}{M}_{i}^{-1}{r}_{0}}\\ {\displaystyle \phantom{\rule{1em}{0ex}}=\sum _{i=1}^{k}\left({\tau}_{i}-{\omega}_{j}^{i}(\mathcal{A}{M}_{1}^{-1},\dots ,\mathcal{A}{M}_{k}^{-1})\mathcal{A}{M}_{i}^{-1}\right){r}_{0}}\\ {\displaystyle \phantom{\rule{1em}{0ex}}=\sum _{i=1}^{k}{\beta}_{j+1}^{i}(\mathcal{A}{M}_{1}^{-1},\dots ,\mathcal{A}{M}_{k}^{-1}){r}_{0},}\end{array}$$(13)

where *β*_{j+1}^{i} ∈ 𝓟_{j}(*X*_{1}, …, *X*_{k}), and the scalars *τ*_{i} satisfy

$$\begin{array}{}{\displaystyle \sum _{i=1}^{k}{\beta}_{j+1}^{i}(0,\dots ,0)=\sum _{i=1}^{k}{\tau}_{i}=1,}\end{array}$$

and

$$\begin{array}{}{\displaystyle \frac{{\mathrm{\partial}}^{j}{\beta}_{j+1}^{i}}{\mathrm{\partial}{X}_{s}^{j}}=0,\phantom{\rule{1em}{0ex}}1\le i,s\le k,\phantom{\rule{1em}{0ex}}i\ne s,}\end{array}$$

which implies that, in the matrix polynomial *β*_{j+1}^{i}, only the *i*-th variable may attain the highest degree *j*; the degrees of all other variables are at most *j* – 1. From (13), the following result is established.
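As a small worked illustration of this structure (a concrete case, not taken from [23]), consider the first step *j* = 1 with *k* = 2 preconditioners. The polynomials *ω*_{1}^{i} reduce to scalar constants *c*_{i} (the entries of *y*_{1}), and (12)–(13) become

$$x_1 = x_0 + c_1 M_1^{-1} r_0 + c_2 M_2^{-1} r_0,\qquad r_1 = \beta_2^1(\mathcal{A}M_1^{-1},\mathcal{A}M_2^{-1})\,r_0 + \beta_2^2(\mathcal{A}M_1^{-1},\mathcal{A}M_2^{-1})\,r_0,$$

with

$$\beta_2^1(X_1,X_2) = \tau_1 - c_1 X_1,\qquad \beta_2^2(X_1,X_2) = \tau_2 - c_2 X_2,\qquad \tau_1 + \tau_2 = 1:$$

each *β*_{2}^{i} has degree 1 in its own variable *X*_{i} and degree 0 in the other, exactly as the derivative condition above requires.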

#### Theorem 3.1

*Let* *r*_{0} *be the initial residual of the linear system (6)*, *and let* *r*_{j} *be the residual at the* *j*-*th step of the MPGMRES with* *k* *preconditioners* *M*_{1}, …, *M*_{k}. *Then*:

$$\begin{array}{}{\displaystyle \frac{\parallel {r}_{j}\parallel}{\parallel {r}_{0}\parallel}\le \underset{\begin{array}{c}{\beta}_{j+1}^{i}\in {\mathcal{P}}_{j}({X}_{1},\dots ,{X}_{k}),1\le i\le k\\ \sum _{i=1}^{k}{\beta}_{j+1}^{i}(0,\dots ,0)=1,\frac{{\mathrm{\partial}}^{j}{\beta}_{j+1}^{i}}{\mathrm{\partial}{X}_{s}^{j}}=0,i\ne s\end{array}}{min}\parallel \sum _{i=1}^{k}{\beta}_{j+1}^{i}(\mathcal{A}{M}_{1}^{-1},\dots ,\mathcal{A}{M}_{k}^{-1})\parallel .}\end{array}$$(14)

Therefore, it is important to find an optimal combination of the different preconditioners when they are applied simultaneously at each iteration [23, 24, 25]; this remains an open research problem.
