[mvapich-discuss] MVAPICH2-GDR Issue with MPI_Allreduce

Subramoni, Hari subramoni.1 at osu.edu
Thu Jan 23 14:48:30 EST 2020


Hi All,

We followed up on this issue offline and it has been resolved. The updated RPM is available from the MVAPICH download page.

Best,
Hari.

From: mvapich-discuss-bounces at cse.ohio-state.edu <mvapich-discuss-bounces at mailman.cse.ohio-state.edu> On Behalf Of Herten, Andreas
Sent: Wednesday, January 8, 2020 5:34 AM
To: mvapich-discuss at cse.ohio-state.edu <mvapich-discuss at mailman.cse.ohio-state.edu>
Cc: Alvarez, Damian <d.alvarez at fz-juelich.de>; Breuer, Thomas <t.breuer at fz-juelich.de>; Markus Schmitt <mschmitt at pks.mpg.de>; Hater, Thorsten <t.hater at fz-juelich.de>
Subject: [mvapich-discuss] MVAPICH2-GDR Issue with MPI_Allreduce

Dear MV2 Support,

We see an issue when calling MPI_Allreduce on GPU memory buffers: the reduction produces wrong results, or the program even segfaults.
I wrote up a more detailed description including a minimal reproducing example and some of our experiments here:
                https://gist.github.com/AndiH/b929b50b4c8d25137e0bfee25db63791

Is this a bug?

New year’s greetings,

-Andreas
—
NVIDIA Application Lab
Jülich Supercomputing Centre
Forschungszentrum Jülich, Germany
+49 2461 61 1825


