[mvapich-discuss] Segmentation fault when compiling with Intel 15, MVAPICH2 2.1, and OpenMP
Jonathan Perkins
perkinjo at cse.ohio-state.edu
Wed Oct 21 18:26:32 EDT 2015
Hi Craig. We haven't seen this issue with the Intel 2016 compilers. Can
you try them out?
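(As a sanity check, not part of the original exchange: with MVAPICH2's
MPICH-derived wrappers, the compiler the wrapper actually invokes can be
confirmed with:

# mpif90 -show
# ifort --version
)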
On Wed, Oct 21, 2015 at 1:07 PM Craig Tierney - NOAA Affiliate <
craig.tierney at noaa.gov> wrote:
> Jonathan,
>
> Are there any updates to this issue? Intel's last response to this was on
> 5/13 and they say it is an mvapich2 issue, not Intel.
>
> Craig
>
> On Thu, Sep 10, 2015 at 3:23 PM, Jonathan Perkins <
> perkinjo at cse.ohio-state.edu> wrote:
>
>> Hi Craig.
>>
>> There has been extensive discussion about this issue between TACC and
>> Intel. To the best of our knowledge, the issue has been localized to
>> the Intel compiler. It seems to affect versions 15.0.2 and up.
>>
>> On Thu, Sep 10, 2015 at 4:07 PM, Craig Tierney - NOAA Affiliate <
>> craig.tierney at noaa.gov> wrote:
>>
>>>
>>> Hello,
>>>
>>> I am having a problem building and running codes with Intel 15.X and
>>> MVAPICH2 2.1 that use OpenMP. The following code shows the problem:
>>>
>>> program test
>>>     use mpi
>>>     integer :: ierr, iprov
>>>     real(8) :: a
>>>     call mpi_init_thread(MPI_THREAD_FUNNELED, iprov, ierr)
>>>     call random_number(a)
>>>     write(*,*) "hello"
>>>     call mpi_finalize(ierr)
>>> end program test
>>>
>>> If this is compiled as:
>>>
>>> # mpif90 -openmp -O3 test.f90 -o test
>>>
>>> And run, the following error message is generated.
>>>
>>> # mpiexec.hydra -np 2 ./test
>>>
>>>
>>> ===================================================================================
>>> = BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
>>> = PID 17905 RUNNING AT fe8
>>> = EXIT CODE: 11
>>> = CLEANING UP REMAINING PROCESSES
>>> = YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
>>>
>>> ===================================================================================
>>> YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault
>>> (signal 11)
>>> This typically refers to a problem with your application.
>>> Please see the FAQ page for debugging suggestions
>>>
>>> When you get the stack trace, you see:
>>>
>>> (gdb) bt
>>> #0 0x000000000040b0c4 in init_resource ()
>>> #1 0x000000000040b02a in reentrancy_init ()
>>> #2 0x000000000040af48 in for__reentrancy_init ()
>>> #3 0x00002aaaab329115 in for_rtl_init_ () from
>>> /home/admin/software/apps/mvapich2/2.1-intel/lib/libmpi.so.12
>>> #4 0x0000000000403249 in main ()
>>>
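>>> (For reference, a backtrace like the one above can be captured by
>>> enabling core dumps before the run and loading the resulting core
>>> file into gdb; the exact core file name varies by system:
>>>
>>> # ulimit -c unlimited
>>> # mpiexec.hydra -np 2 ./test
>>> # gdb ./test core -ex bt -ex quit
>>> )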
>>> Intel believes this is a mvapich2 issue, not a compiler issue.
>>>
>>>
>>> https://software.intel.com/en-us/forums/intel-fortran-compiler-for-linux-and-mac-os-x/topic/540673
>>>
>>> Thanks,
>>> Craig
>>>
>>>
>>> _______________________________________________
>>> mvapich-discuss mailing list
>>> mvapich-discuss at cse.ohio-state.edu
>>> http://mailman.cse.ohio-state.edu/mailman/listinfo/mvapich-discuss
>>>
>>>
>>
>>
>>
>> --
>> Jonathan Perkins
>> http://www.cse.ohio-state.edu/~perkinjo
>>
>
>
More information about the mvapich-discuss mailing list