[mvapich-discuss] mvapich2-1.6rc1 problem (fwd)

Jonathan Perkins perkinjo at cse.ohio-state.edu
Mon Dec 6 15:37:01 EST 2010


Can you give the output of `mpiname -a' with the mvapich2-1.5 version
as well?  I'm wondering about the --with-pm selection.  It seems that
you should not be setting this to remshell if you're trying to use
mpirun_rsh.
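
For example, assuming the default process-manager selection is what you
want for mpirun_rsh (worth double-checking against the MVAPICH2 user
guide; this is a sketch based on the options quoted below, not a
verified recipe), the same configure invocation with the remshell line
simply dropped would look like:

$MPI_SRC/configure \
  --with-rdma=gen2 \
  --with-ib-libpath=/usr/lib64 \
  --enable-fast \
  --enable-threads \
  --enable-debuginfo \
  --enable-shared \
  --enable-f77 \
  --enable-f90 \
  --enable-cxx \
  --enable-romio \
  --without-mpe \
  --prefix=$MPI_INSTALL/$CVER \
  > configure_$CVER.log 2>&1

followed by the usual make and make install, and a rebuild of the test
program against the new install.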

On Mon, Dec 6, 2010 at 2:11 PM, Bryan Putnam <bfp at purdue.edu> wrote:
> On Mon, 6 Dec 2010, Jonathan Perkins wrote:
>
>> I'm not sure what is causing your problem.  I've built your program
>> with mvapich2-1.6rc1 on a couple of our systems and things are running
>> fine.  Did you give any special configuration options during your
>> build of mvapich2-1.6rc1?  Did you recompile your application with
>> 1.6rc1?  What does `mpiname -a' output?
>
> Hi,
>
> Yes, I recompiled the code using the newly built mvapich2-1.6rc1.
>
> The output of "mpiname -a" is:
>
> steele-adm 1069% mpiname -a
> MVAPICH2 1.6rc1 2010-11-12 ch3:mrail
>
> Compilation
> CC: icc -O3 -O3 -fpic -DNDEBUG
> CXX: icpc -O3 -O3 -fpic -DNDEBUG
> F77: ifort  -O3 -fpic -DNDEBUG -L/usr/lib64
> F90: ifort  -O3 -fpic -DNDEBUG
>
> Configuration
> --with-rdma=gen2 --with-ib-libpath=/usr/lib64 --enable-fast
> --enable-threads --enable-debuginfo --enable-shared --enable-f77
> --enable-f90 --enable-cxx --enable-romio --with-pm=remshell --without-mpe
> --prefix=/apps/rhel5/mvapich2-1.6rc1/64/ib-intel-11.1.072
>
>
> My configure options are:
>
> $MPI_SRC/configure \
>  --with-rdma=gen2 \
>  --with-ib-libpath=/usr/lib64 \
>  --enable-fast \
>  --enable-threads \
>  --enable-debuginfo \
>  --enable-shared \
>  --enable-f77 \
>  --enable-f90 \
>  --enable-cxx \
>  --enable-romio \
>  --with-pm=remshell \
>  --without-mpe \
>  --prefix=$MPI_INSTALL/$CVER \
>  > configure_$CVER.log 2>&1
>
>
> Note that although this build was done using intel-11.1.072, I'm seeing
> the same problem with gcc-4.4.0 builds. My mvapich2-1.5 build is still
> working correctly.
>
> Thanks,
> Bryan
>
>
>> > On Mon, Dec 6, 2010 at 10:52 AM, Bryan Putnam <bfp at purdue.edu> wrote:
>> >
>> >
>> > I've found that one of the standard "ping pong" programs is failing to run
>> > with mvapich2-1.6rc1. This code runs successfully with mvapich2-1.5. I'm
>> > seeing the problem on both IB and iWARP systems.
>> >
>> >
>> > steele-d000 1004% mpirun_rsh -hostfile $PBS_NODEFILE -np 2 ./pp2
>> > MPI process (rank: 0) terminated unexpectedly on steele-d000.rcac.purdue.edu
>> > Exit code -5 signaled from steele-d000
>> > MPI process (rank: 1) terminated unexpectedly on steele-d001.rcac.purdue.edu
>> >
>> > The C code is attached.
>> >
>> > Thanks for taking a look,
>> >
>> > Bryan
>> >
>>
>>
>>
>> --
>> Jonathan Perkins
>> http://www.cse.ohio-state.edu/~perkinjo
>>
>
>
>
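
To rule out a stale binary or a launcher/library mismatch while testing
the pp2 case quoted above, one sanity check (a sketch only; the prefix
below is taken from your configure options, and the source file name
pp2.c is assumed) is to make sure both the compiler wrapper and
mpirun_rsh come from the same 1.6rc1 install before rebuilding and
rerunning:

export PATH=/apps/rhel5/mvapich2-1.6rc1/64/ib-intel-11.1.072/bin:$PATH
which mpicc mpirun_rsh        # both should resolve to the 1.6rc1 install
mpicc -o pp2 pp2.c
mpirun_rsh -hostfile $PBS_NODEFILE -np 2 ./pp2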



-- 
Jonathan Perkins
http://www.cse.ohio-state.edu/~perkinjo


