[mvapich-discuss] Fatal error in MPI_Init
Jonathan Perkins
perkinjo at cse.ohio-state.edu
Mon Nov 14 15:41:07 EST 2011
On Tue, 2011-11-15 at 00:29 +0530, Bhargava Ramu Kavati wrote:
> Hi,
> I am trying to run OSU MPI test apps with MVAPICH2-1.7 on my
> Infiniband setup (two machines connected back-to-back). I see the
> error at an early stage i.e during MPI_Init() with the command
>
> "mpirun_rsh -np 2
> -hostfile /root/hostfile_test /root/ofed_pkgs/mvapich2-1.7/osu_benchmarks/osu_latency"
>
>
> The complete error thrown is
> [cli_0]: aborting job:
> Fatal error in MPI_Init:
> Other MPI error
>
> [test1:mpispawn_0][readline] Unexpected End-Of-File on file descriptor
> 5. MPI process died?
> [test1:mpispawn_0][child_handler] MPI process (rank: 0, pid: 15261)
> exited with status 1
> [test1:mpispawn_0][mtpmi_processops] Error while reading PMI socket.
> MPI process died?
>
> [cli_1]: aborting job:
> Fatal error in MPI_Init:
> Other MPI error
>
> [test2:mpispawn_1][readline] Unexpected End-Of-File on file descriptor
> 5. MPI process died?
> [test2:mpispawn_1][mtpmi_processops] Error while reading PMI socket.
> MPI process died?
> [test2:mpispawn_1][child_handler] MPI process (rank: 1, pid: 16453)
> exited with status 1
>
> Can you please tell me which OFED-specific parameters are
> checked/queried during MPI_Init()?
>
> Thanks & Regards,
> Ramu
> _______________________________________________
> mvapich-discuss mailing list
> mvapich-discuss at cse.ohio-state.edu
> http://mail.cse.ohio-state.edu/mailman/listinfo/mvapich-discuss
Ramu:
Can you please reconfigure using the --enable-g=dbg option? After
rebuilding, you can also set the MV2_DEBUG_CORESIZE and
MV2_DEBUG_SHOW_BACKTRACE environment variables as described at the
following link.
http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-1.8_alpha1.html#x1-1050009.1.10
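For reference, the rebuild and debug run might look something like the
following. This is only a sketch: the source and hostfile paths are taken
from the original command in the thread, while the configure invocation and
any extra configure flags on your system are assumptions.

```shell
# Rebuild MVAPICH2 with debug support (path assumed from the thread).
cd /root/ofed_pkgs/mvapich2-1.7
./configure --enable-g=dbg      # add your other configure flags here
make && make install

# Re-run the benchmark with core dumps and backtraces enabled;
# mpirun_rsh accepts ENV=VALUE pairs before the executable.
mpirun_rsh -np 2 -hostfile /root/hostfile_test \
    MV2_DEBUG_CORESIZE=unlimited \
    MV2_DEBUG_SHOW_BACKTRACE=1 \
    /root/ofed_pkgs/mvapich2-1.7/osu_benchmarks/osu_latency
```

With these set, a failing MPI_Init should leave a core file and print a
backtrace that narrows down where initialization aborts.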
--
Jonathan Perkins