[Mvapich-discuss] Using MVAPICH2 in a Singularity container

vru.inbri at yahoo.co.uk vru.inbri at yahoo.co.uk
Thu Feb 18 14:00:02 EST 2021


When using a different number of processes, the error becomes:
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Does that help?
    On Thursday, 18 February 2021, 19:38:48 CET, vru.inbri--- via Mvapich-discuss <mvapich-discuss at lists.osu.edu> wrote:  
 
 Hi 
I built a Singularity container with Ubuntu, the GNU compilers, and MVAPICH2 2.3.5.
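For reference, the definition file looks roughly like this (the package list, download URL, and configure options below are illustrative, not an exact copy of what I used):

    Bootstrap: docker
    From: ubuntu:20.04

    %post
        # Build toolchain plus the InfiniBand userspace libraries (assumed package set)
        apt-get update
        apt-get install -y build-essential gfortran wget libibverbs-dev librdmacm-dev
        # Build MVAPICH2 2.3.5 from source with the GNU compilers
        wget http://mvapich.cse.ohio-state.edu/download/mvapich/mv2/mvapich2-2.3.5.tar.gz
        tar xzf mvapich2-2.3.5.tar.gz
        cd mvapich2-2.3.5
        ./configure --prefix=/usr/local
        make -j4 && make install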
When I try to run it on our cluster, it fails with errors like:

Fatal error in PMPI_Waitall: Other MPI error, error stack:
PMPI_Waitall(419)..................: MPI_Waitall(count=7, req_array=0x55d7f03d0290, status_array=0x55d7f03b4e50) failed
MPIR_Waitall_impl(248).............:
MPIDI_CH3I_Progress(285)...........:
handle_read(1350)..................:
handle_read_individual(1408).......:
MPIDI_CH3I_MRAIL_Parse_header(1502): Control shouldn't reach here in prototype, header %d (errno 71)
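In case the launch method matters, the job is started in the usual hybrid way, with the host-side MPI launching the processes and Singularity executing the binary inside the container; roughly like this (image name, process count, and application path are placeholders):

    mpiexec -np 8 singularity exec mvapich2.sif /opt/app/my_app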
As a test, I also installed the same OS, compilers, and libraries in an empty virtual machine (directly, without using Singularity), and there everything works without problems.
Does this make any sense to you?
Vru
_______________________________________________
Mvapich-discuss mailing list
Mvapich-discuss at lists.osu.edu
https://lists.osu.edu/mailman/listinfo/mvapich-discuss
  