
Parallel Quasi-Monte Carlo Methods for Linear Algebra Problems

V. Alexandrov, E. Atanassov and I. Dimov

In this paper we propose an improved quasi-Monte Carlo method for solving linear algebra problems. We show that using low-discrepancy sequences improves both the convergence rate and the CPU time of the algorithm. Two parallelization schemes based on the Message Passing Interface (MPI), with static and dynamic load balancing, are proposed. The dynamic scheme is well suited to Grid computing environments.
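The abstract does not spell out the estimator itself. As a minimal illustrative sketch, not the authors' implementation, the Python below assumes the standard Monte Carlo treatment of a system x = Ax + b via Neumann-series random walks, with the pseudorandom numbers replaced by a Halton low-discrepancy sequence. The names `halton` and `qmc_solve`, the walk length, the number of walks, and the test matrix are all illustrative assumptions.

```python
import numpy as np

def halton(n_points, dim):
    """First n_points of a dim-dimensional Halton low-discrepancy sequence."""
    primes = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
    seq = np.empty((n_points, dim))
    for d in range(dim):
        base = primes[d]
        for i in range(n_points):
            f, value, k = 1.0, 0.0, i + 1
            while k > 0:
                f /= base
                value += f * (k % base)
                k //= base
            seq[i, d] = value
    return seq

def qmc_solve(A, b, n_walks=10_000, walk_len=10):
    """Quasi-Monte Carlo estimate of x solving x = A x + b (assumes ||A|| < 1).

    Each Halton point drives one walk through the Neumann series
    x = b + A b + A^2 b + ...; the walks are averaged per component.
    """
    n = len(b)
    # Transition probabilities proportional to |A_ij|, row-normalised.
    P = np.abs(A) / np.abs(A).sum(axis=1, keepdims=True)
    cum_P = np.cumsum(P, axis=1)
    points = halton(n_walks, walk_len)

    x = np.zeros(n)
    for i in range(n):                      # estimate every component x_i
        acc = 0.0
        for w in range(n_walks):
            state, weight, score = i, 1.0, b[i]
            for step in range(walk_len):
                u = points[w, step]
                nxt = min(np.searchsorted(cum_P[state], u), n - 1)
                weight *= A[state, nxt] / P[state, nxt]
                score += weight * b[nxt]
                state = nxt
            acc += score
        x[i] = acc / n_walks
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = 0.1 * rng.random((5, 5))            # contraction, so the series converges
    b = rng.random(5)
    print("QMC estimate:", qmc_solve(A, b))
    print("Exact       :", np.linalg.solve(np.eye(5) - A, b))
```

In the parallel setting described in the abstract, the independent walks would be the natural unit of work to distribute across MPI processes, either in fixed blocks per process (static load balancing) or handed out on demand (dynamic load balancing, which suits heterogeneous Grid resources).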

Published Online: 2008-05-09
Published in Print: 2004-12

© de Gruyter 2004
