[mvapich-discuss] MVAPICH2 HPL performance issue

Mikhail Posypkin mposypkin at gmail.com
Wed Apr 2 10:55:34 EDT 2014


Thank you very much! This recipe resolved the issue!


2014-04-02 13:31 GMT+04:00 Panda, Dhabaleswar <panda at cse.ohio-state.edu>:

>  I believe you are using the MPI+OpenMP (hybrid) version of HPL. You need to
> disable MVAPICH2's CPU affinity: by default each MPI process is pinned to a
> single core, so all of its OpenMP threads end up sharing that one core.
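>
>  For example, with the mpirun_rsh launcher the setting can be passed at
> launch time (a sketch only; the host file name and the process/thread
> counts are placeholders, e.g. one MPI process per 16-core node with 16
> OpenMP threads each):
>
>    # disable MVAPICH2 core binding so the OpenMP threads can use all cores
>    mpirun_rsh -np 2 -hostfile hosts MV2_ENABLE_AFFINITY=0 OMP_NUM_THREADS=16 ./xhpl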
>
> Please take a look at the following sections (6.16 and 6.17) and FAQ entry
> 9.1.4 of the MVAPICH2 2.0rc1 user guide:
>
>
> http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-2.0rc1.html#x1-780006.16
>
>
> http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-2.0rc1.html#x1-1130009.1.4
>
>
> Let us know if this resolves the issue or not.
>
> Thanks,
>
> DK
>
>
>  ------------------------------
> From: mvapich-discuss [mvapich-discuss-bounces at cse.ohio-state.edu] on
> behalf of Mikhail Posypkin [mposypkin at gmail.com]
> Sent: Wednesday, April 02, 2014 3:42 AM
> To: mvapich-discuss at cse.ohio-state.edu
> Subject: [mvapich-discuss] MVAPICH2 HPL performance issue
>
>   Dear colleagues,
>
>  MVAPICH2 1.9 and MVAPICH2 2.0rc demonstrate very poor performance on the
> HPL Linpack benchmark from netlib.org. The running time of the HPL test
> compiled with MVAPICH2 is at least 10 times greater than for the same test
> compiled with OpenMPI. I tried 16 and 32 processes on two 16-core servers.
> Surprisingly, even if I run the 'xhpl' executable from the command line on
> the cluster front-end server (1 MPI process), the difference in performance
> is the same (approximately 10 times). I assume we built it with the wrong
> flags or made some other installation error. Could you please help us
> resolve this issue?
>
>  All the best,
> Mikhail
>

