[mvapich-discuss] mvapich218a1p1 Error while reading PMI socket

bright.yang at vaisala.com
Wed Nov 23 11:23:30 EST 2011


Hi,

 

  I got this error message while running mpirun_rsh. I saw some old
messages about the same error, but I could not find a solution. The
command and its output are below:

$ mpirun_rsh -np 24 -hostfile hosts57 wrf.exe

  [heavily interleaved stdout from the 24 ranks, each printing
  "starting wrf task <N> of 24", before the job aborted]

[compute-0-5.local:mpispawn_0][readline] Unexpected End-Of-File on file descriptor 10. MPI process died?
[compute-0-5.local:mpispawn_0][mtpmi_processops] Error while reading PMI socket. MPI process died?
[compute-0-7.local:mpispawn_1][readline] Unexpected End-Of-File on file descriptor 7. MPI process died?
[compute-0-7.local:mpispawn_1][mtpmi_processops] Error while reading PMI socket. MPI process died?
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 15, pid: 5330) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 22, pid: 5337) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 23, pid: 5338) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 19, pid: 5334) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 13, pid: 5328) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 12, pid: 5327) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 20, pid: 5335) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 18, pid: 5333) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 17, pid: 5332) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 16, pid: 5331) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 21, pid: 5336) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 5, pid: 11421) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 7, pid: 11423) exited with status 1
[compute-0-7.local:mpispawn_1][child_handler] MPI process (rank: 14, pid: 5329) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 1, pid: 11417) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 4, pid: 11420) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 0, pid: 11416) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 3, pid: 11419) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 6, pid: 11422) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 11, pid: 11427) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 10, pid: 11426) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 9, pid: 11425) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 2, pid: 11418) exited with status 1
[compute-0-5.local:mpispawn_0][child_handler] MPI process (rank: 8, pid: 11424) exited with status 1
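
(For reference, hosts57 is a plain mpirun_rsh hostfile. The real file is not
shown here; a minimal sketch of a layout consistent with the rank placement in
the log above, assuming the classic format of one hostname per process slot,
would be:

compute-0-5.local
compute-0-5.local
  [compute-0-5.local listed 12 times in total]
compute-0-7.local
compute-0-7.local
  [compute-0-7.local listed 12 times in total]

i.e. ranks 0-11 on compute-0-5 and ranks 12-23 on compute-0-7.)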

 

Thanks.

 

Bright Yang

Vaisala, Boulder
