[mvapich-discuss] [SPAM] Re: mvapich and gromacs4.6

hpc at lzu.edu.cn
Sun Nov 17 20:20:28 EST 2013


Dear Sreeram Potluri


   Thank you for your response; your understanding is right. I installed gromacs4.6 with GPU support against mvapich2-2.0a, using the Intel compiler to build mvapich2. But when I run gromacs across two nodes, it shows this error:

Error on node 0, will try to stop all the nodes
Halting parallel program mdrun_mpi on CPU 0 out of 2

gcq#319: "Your Country Raised You, Your Country Fed You, and Just Like Any Other Country it Will Break You" (Gogol Bordello)

[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0

-------------------------------------------------------
Program mdrun_mpi, VERSION 4.6.3
Source code file: /home/*/gromacs-4.6.3/src/gmxlib/gmx_detect_hardware.c, line: 349

Fatal error:
Incorrect launch configuration: mismatching number of PP MPI processes and GPUs per node.
mdrun_mpi was started with 1 PP MPI process per node, but you provided 2 GPUs.
For more information and tips for troubleshooting, please check the GROMACS
website at http://www.gromacs.org/Documentation/Errors
-------------------------------------------------------


Error on node 1, will try to stop all the nodes
Halting parallel program mdrun_mpi on CPU 1 out of 2

gcq#5: "We Can Dance Like Iggy Pop" (Red Hot Chili Peppers)

[cli_1]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 1
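The error text itself points at the immediate fix: GROMACS 4.6 wants the number of PP MPI ranks per node to match the number of GPUs it is told to use, so either start two ranks per node or restrict mdrun to one GPU with -gpu_id. A sketch of the first option, assuming hostnames node1/node2, an mpirun_rsh launch, and a standard .tpr input (all hypothetical names):

```shell
# Two PP MPI ranks per node, one per GPU; -gpu_id 01 maps rank 0 -> GPU 0
# and rank 1 -> GPU 1 on each node. MV2_USE_CUDA=1 enables MVAPICH2's
# GPU support at run time.
mpirun_rsh -np 4 -hostfile hosts MV2_USE_CUDA=1 \
    mdrun_mpi -gpu_id 01 -s topol.tpr

# The hosts file lists each node twice so two ranks land on each node:
#   node1
#   node1
#   node2
#   node2
```

Alternatively, keeping one rank per node and passing -gpu_id 0 would tell mdrun to use only the first GPU, which also satisfies the check.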


 

-----Original Message-----
From: "sreeram potluri" <potluri.2 at osu.edu>
Sent: 2013-11-17 08:09:57 (Sunday)
To: hpc at lzu.edu.cn
Cc: mvapich-discuss <mvapich-discuss at cse.ohio-state.edu>
Subject: [SPAM] Re: [mvapich-discuss] mvapich and gromacs4.6


Dear Yang, 


To make sure I understand you correctly: you are seeing an issue when running Gromacs across two or more GPUs on the same node, using MVAPICH2?


MVAPICH2 does have support for GPU-GPU MPI communication on multi-GPU nodes. Can you give us some more information about the issue?


- the configure options you used to build the library
- the command you are using for the run
- a backtrace, if it's a hang, or the error output, if it's a failure
- the configuration of the nodes


This will help us see what could be going on. 
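For reference, a CUDA-enabled MVAPICH2 build of this era is typically configured along these lines (the prefix, CUDA install path, and Intel compiler names are assumptions, not taken from the thread):

```shell
# --enable-cuda builds in support for MPI communication from GPU buffers;
# --with-cuda points at the CUDA toolkit install (path is hypothetical).
./configure --prefix=/opt/mvapich2-2.0a \
    --enable-cuda --with-cuda=/usr/local/cuda \
    CC=icc CXX=icpc F77=ifort FC=ifort
make -j8 && make install

# GPU support must also be enabled at run time:
#   export MV2_USE_CUDA=1
```

If --enable-cuda was omitted from the build, multi-GPU runs that communicate from device memory will fail even though single-GPU runs may work.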


Best
Sreeram Potluri



On Fri, Nov 15, 2013 at 7:28 AM, <hpc at lzu.edu.cn> wrote:
Dear All

   I want to run gromacs4.6 over an InfiniBand network. I installed mvapich2-2.0a, and gromacs4.6 supports GPUs, but when I run mpirun with mvapich, I found it only uses one GPU; if I use two or more GPUs, gromacs does not run. Has anyone met this problem? openmpi supports multi-GPU, so I do not know whether mvapich only supports one GPU, or whether my install process has a problem. I would very much appreciate any suggestions. Thanks!

Best wishes


yang

_______________________________________________
mvapich-discuss mailing list
mvapich-discuss at cse.ohio-state.edu
http://mailman.cse.ohio-state.edu/mailman/listinfo/mvapich-discuss




