[mvapich-discuss] Segfault when building HDF5 with MVAPICH2 2.1rc1 on SLES11 SP3
Jonathan Perkins
perkinjo at cse.ohio-state.edu
Fri Feb 6 09:50:05 EST 2015
Hi Matt. I'm not sure what is going on, but as a first step can you
share the output of `mpiname -a' from your MVAPICH2 build? It may also
help to add the -show option to the mpicc command that builds the
executable that is failing with the segmentation fault and send that
output as well.
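
For example (a sketch, assuming the mpicc/mpiname on your PATH are the
MVAPICH2 2.1rc1 wrappers; the link line below is copied from your log):

    mpiname -a        # prints the MVAPICH2 version and configure options
    # -show prints the underlying compiler/linker command without running it:
    mpicc -show -std=c99 -O3 -fPIC -o H5make_libsettings H5make_libsettings.o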
On Fri, Feb 06, 2015 at 08:53:04AM -0500, Thompson, Matt (GSFC-610.1)[SCIENCE SYSTEMS AND APPLICATIONS INC] wrote:
> MVAPICH Discuss,
>
> I have an issue that is weirdly specific. When I try to build either
> HDF5-1.8.12 or the latest stable HDF5-1.8.14 (with --enable-parallel) with
> MVAPICH2 2.1rc1 on SLES 11 SP3, the HDF5 build fails with a segfault:
>
> >libtool: link: mpicc -std=c99 -O3 -fPIC -o H5make_libsettings H5make_libsettings.o -L/discover/swdev/USER/Baselibs/TmpBaselibs/GMAO-Baselibs-4_0_6-FixHDF5/x86_64-unknown-linux-gnu/ifort/Linux/lib /discover/swdev/USER/Baselibs/TmpBaselibs/GMAO-Baselibs-4_0_6-FixHDF5/x86_64-unknown-linux-gnu/ifort/Linux/lib/libsz.a -lz -ldl -lm
> >LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo -lm | \
> > sed -e 's/-L/:/g' -e 's/ //g'`" \
> > ./H5make_libsettings > H5lib_settings.c || \
> > (test $HDF5_Make_Ignore && echo "*** Error ignored") || \
> > (rm -f H5lib_settings.c ; exit 1)
> >/bin/sh: line 4: 1838 Segmentation fault (core dumped) LD_LIBRARY_PATH="$LD_LIBRARY_PATH`echo -lm | sed -e 's/-L/:/g' -e 's/ //g'`" ./H5make_libsettings > H5lib_settings.c
>
> I can be fairly certain of that specificity because I've tried the
> following combinations (all with Intel 15.0.0.090):
>
> MVAPICH2 2.1rc1 on SLES 11 SP1: Works
> MVAPICH2 2.1rc1 on SLES 11 SP3: FAIL
> Intel MPI 5.0.1.135 on SLES 11 SP1: Works
> Intel MPI 5.0.1.135 on SLES 11 SP3: Works
> MPT 2.11 on SLES 11 SP3: Works
>
> I've also tried without --enable-parallel:
>
> No Parallel HDF5 on SLES 11 SP3: Works
>
> though in that case the C compiler is gcc rather than icc (since the
> build doesn't go through mpicc, which points to icc).
>
> Other than that, everything else is the same in each environment.
>
> I also tried compiling with -O0 -g -traceback and got the same failure.
> Looking at the core in gdb:
>
> >(gdb) backtrace
> >#0 0x00002aaaabfe0802 in _int_free () from /lib64/libc.so.6
> >#1 0x00002aaaabfe3b5c in free () from /lib64/libc.so.6
> >#2 0x00002aaaaf70c35d in ?? () from /lib64/libnss_sss.so.2
> >#3 0x00002aaaaf70c6f0 in ?? () from /lib64/libnss_sss.so.2
> >#4 0x00002aaaaf70a275 in _nss_sss_getpwuid_r () from /lib64/libnss_sss.so.2
> >#5 0x00002aaaac00fb2c in getpwuid_r@@GLIBC_2.2.5 () from /lib64/libc.so.6
> >#6 0x00002aaaac00f37f in getpwuid () from /lib64/libc.so.6
> >#7 0x0000000000401993 in print_header () at H5make_libsettings.c:185
> >#8 0x0000000000401d3a in main () at H5make_libsettings.c:290
>
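> A minimal reproducer along these lines might help isolate it (a sketch:
> it exercises the same getpwuid() call that the backtrace shows crashing
> inside libnss_sss; pwtest.c is just a throwaway name, and the flags
> mirror the failing link line):
>
>     cat > pwtest.c <<'EOF'
>     #include <stdio.h>
>     #include <pwd.h>
>     #include <unistd.h>
>     #include <sys/types.h>
>
>     int main(void)
>     {
>         /* same libc lookup the backtrace shows print_header() doing */
>         struct passwd *pwd = getpwuid(getuid());
>         printf("user: %s\n", pwd ? pwd->pw_name : "(unknown)");
>         return 0;
>     }
>     EOF
>     mpicc -std=c99 -O0 -g -o pwtest pwtest.c && ./pwtest
>
> If that also segfaults when linked with MVAPICH2's mpicc on SP3 but not
> when built with plain icc, that would point at something the MPI link
> pulls in interacting badly with the sssd NSS module.
>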
> From this testing it seems it isn't the compiler, it isn't *just* the
> operating system, and it isn't *just* the MPI stack, but rather the
> combination of MVAPICH2 2.1rc1 and SLES 11 SP3. This cropped up because
> part of the supercomputer I work on has transitioned to SLES 11 SP3,
> and I hit it while rebuilding some libraries to diagnose other issues.
>
> I've asked better computer engineers than I am here to look into this
> as well, but I was wondering whether anyone on this list might know why
> this one combination fails while the others succeed, or has seen
> something similar?
>
> Matt
> --
> Matt Thompson SSAI, Sr Software Test Engr
> NASA GSFC, Global Modeling and Assimilation Office
> Code 610.1, 8800 Greenbelt Rd, Greenbelt, MD 20771
> Phone: 301-614-6712 Fax: 301-614-6246
--
Jonathan Perkins