[mvapich-discuss] Error in Executing GAMESS with MPE

vaibhav dutt vaibhavsupersaiyan9 at gmail.com
Thu Feb 9 14:03:55 EST 2012


Hi,

With the MVAPICH2 1.7 release, I get an error like:

[cli_2]: readline failed
[cli_0]: readline failed
[cli_3]: readline failed
Fatal error in MPI_Init_thread: Error message texts are not
available[cli_0]: aborting job:
Fatal error in MPI_Init_thread: Error message texts are not available
[cli_4]: readline failed
[cli_7]: readline failed
[compute-0-1.local:mpispawn_1][read_size] Unexpected End-Of-File on file
descriptor 13. MPI process died?
[compute-0-10.local:mpispawn_2][read_size] Unexpected End-Of-File on file
descriptor 13. MPI process died?
[compute-0-11.local:mpispawn_3][read_size] Unexpected End-Of-File on file
descriptor 13. MPI process died?
[cli_2]: readline failed
Fatal error in MPI_Init_thread: Error message texts are not
available[cli_2]: aborting job:
Fatal error in MPI_Init_thread: Error message texts are not available
[cli_1]: readline failed
[cli_5]: readline failed
[cli_6]: readline failed
[compute-0-10.local:mpispawn_2][handle_mt_peer] Error while reading PMI
socket. MPI process died?
[compute-0-10.local:mpispawn_2][child_handler] MPI process (rank: 16, pid:
21624) terminated with signal 2 -> abort job
[compute-0-1.local:mpispawn_1][handle_mt_peer] Error while reading PMI
socket. MPI process died?
[compute-0-1.local:mpispawn_1][child_handler] MPI process (rank: 9, pid:
624) terminated with signal 2 -> abort job
[compute-0-11.local:mpispawn_3][handle_mt_peer] Error while reading PMI
socket. MPI process died?
[compute-0-11.local:mpispawn_3][child_handler] MPI process (rank: 24, pid:
26775) terminated with signal 2 -> abort job
RC=1

The 1.6 version works fine when I use MPE to profile the NAS benchmarks,
but it starts giving this error as soon as I try to profile GAMESS.
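
Since the failure happens inside MPI_Init_thread, a minimal standalone
test built and linked the same way as the MPE-instrumented GAMESS should
show whether MPE initialization itself is at fault. Here is a sketch (the
-llmpe -lmpe link line and the MPI_THREAD_MULTIPLE request are my
assumptions, not taken from the actual GAMESS build):

    /* init_test.c - minimal MPI_Init_thread check, reduced from the
     * failing case. Build it the same way as the instrumented GAMESS;
     * with MPE logging that is typically something like
     *   mpicc init_test.c -o init_test -llmpe -lmpe
     * (assumed; substitute whatever link line your MPE build uses). */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int provided, rank;

        /* The requested thread level here is a guess; GAMESS/DDI may
         * ask for a different one. */
        MPI_Init_thread(&argc, &argv, MPI_THREAD_MULTIPLE, &provided);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        printf("rank %d: provided thread level = %d\n", rank, provided);
        MPI_Finalize();
        return 0;
    }

If this small test already dies the same way under 1.7, the problem is
in the MPE/MVAPICH2 combination rather than in anything GAMESS-specific.
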
On Wed, Feb 8, 2012 at 4:57 PM, Krishna Kandalla <
kandalla at cse.ohio-state.edu> wrote:

> Hi Vaibhav,
>          Thank you for your post. Could you please indicate which
> version of MVAPICH2 you are currently using? We have verified that
> some of the basic tests work fine with MPE support in both the 1.7
> and the latest 1.8a2 releases. If you have not yet tried either of
> these versions, please give them a try.
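>
> One quick way to confirm the version: MVAPICH2's mpi.h defines an
> MVAPICH2_VERSION string (assuming your build has that macro), so a
> tiny program such as the sketch below prints it; running "mpiname -a"
> from the MVAPICH2 install should report the same information.
>
>     #include <stdio.h>
>     #include <mpi.h>
>
>     int main(void)
>     {
>     #ifdef MVAPICH2_VERSION
>         printf("MVAPICH2 version: %s\n", MVAPICH2_VERSION);
>     #else
>         printf("mpi.h does not define MVAPICH2_VERSION\n");
>     #endif
>         return 0;
>     }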
>
> Thanks,
> Krishna
>
> On Wed, Feb 8, 2012 at 4:40 PM, vaibhav dutt
> <vaibhavsupersaiyan9 at gmail.com> wrote:
> > Hi,
> >
> > I am trying to profile the quantum chemistry software GAMESS with MPE.
> > GAMESS executes just fine with MVAPICH2 without MPE.
> >
> > But now when I try to execute GAMESS with MPE on 4 nodes (8 cores per
> node),
> > I get an error like:
> >
> > [cli_2]: readline failed
> > [cli_4]: [cli_1]: readline failed
> > [cli_5]: readline failed
> > [cli_6]: readline failed
> > [cli_0]: readline failed
> > Fatal error in MPI_Init_thread: Error message texts are not
> > available[cli_0]: aborting job:
> > Fatal error in MPI_Init_thread: Error message texts are not available
> > readline failed
> > [cli_7]: readline failed
> > [cli_2]: readline failed
> > Fatal error in MPI_Init_thread: Error message texts are not
> > available[cli_2]: aborting job:
> > Fatal error in MPI_Init_thread: Error message texts are not available
> > [cli_3]: readline failed
> > handle_mt_peer: fail to read...: Success
> > handle_mt_peer: fail to read...: Success
> > handle_mt_peer: fail to read...: Success
> > RC=1
> >
> > Can somebody help me with this?
> >
> > Thanks
> >
> >
>

