[mvapich-discuss] Allreduce time when using MPI+OpenMP is too large compared to when using MPI alone

Sarunya Pumma sarunya at vt.edu
Fri Mar 3 16:19:48 EST 2017


Hi,

I have observed strangely high MPI_Allreduce times in my MPI + OpenMP
program. The program calls a simple MPI_Allreduce that exchanges a 318 kB
message 5000 times, as shown below:

for (int i = 0; i < iter; i++) {
    MPI_Allreduce(msg_s, msg_r, count, MPI_FLOAT, MPI_SUM, MPI_COMM_WORLD);
}

I measured only the total time that the program spent in the loop.
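
A self-contained sketch of the benchmark is below; the buffer size
(count = 79500 floats, roughly 318 kB) and the barrier placed before the
timer are illustrative rather than the exact values from my code:

#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    /* Illustrative sizes: 79500 floats * 4 bytes ~= 318 kB, 5000 iterations */
    const int count = 79500;
    const int iter  = 5000;

    float *msg_s = malloc(count * sizeof(float));
    float *msg_r = malloc(count * sizeof(float));
    for (int i = 0; i < count; i++) msg_s[i] = 1.0f;

    /* Barrier so every rank enters the timed loop together */
    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();

    for (int i = 0; i < iter; i++) {
        MPI_Allreduce(msg_s, msg_r, count, MPI_FLOAT, MPI_SUM, MPI_COMM_WORLD);
    }

    double t1 = MPI_Wtime();
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0) printf("Total Allreduce loop time: %f s\n", t1 - t0);

    free(msg_s);
    free(msg_r);
    MPI_Finalize();
    return 0;
}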

I have compiled and run the program with 4 different options.

*1) MPI with 16 procs*
Compiled with icc without the -openmp flag
Ran on multiple machines. Each machine had 16 processes

*2) MPI with 1 proc*
Compiled with icc without the -openmp flag
Ran on multiple machines. Each machine had 1 process

*3) OMP + MPI with 1 proc, 16 threads*
Compiled with icc with -openmp
Ran on multiple machines. Each machine had 1 process with 16 threads

*4) OMP + MPI with MV2_ENABLE_AFFINITY=0*
Compiled with icc with -openmp
Ran on multiple machines. Each machine had 1 process with 16 threads. Set
the environment variable MV2_ENABLE_AFFINITY=0
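
For the hybrid builds (options 3 and 4), the MPI initialization looks
roughly like the sketch below; the requested thread level shown here
(MPI_THREAD_FUNNELED, i.e. only the master thread calls MPI) is just for
illustration:

#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv) {
    int provided;
    /* MPI_THREAD_FUNNELED is an illustrative choice: only the master thread
       makes MPI calls; the Allreduce loop sits outside any parallel region. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    if (rank == 0)
        printf("Requested MPI_THREAD_FUNNELED, got level %d, %d OpenMP threads\n",
               provided, omp_get_max_threads());

    /* ... compute phase uses #pragma omp parallel, then the Allreduce loop ... */

    MPI_Finalize();
    return 0;
}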

The results are shown in the attached graph:

[image: Inline image 2]

Both *OMP + MPI* configurations tend to have a larger Allreduce time than
*MPI with 16 procs* as the number of machines grows. I expected the
*OMP + MPI* time to be close to the *MPI with 1 proc* time, since in both
cases there is only one process per machine doing the communication, but
the *OMP + MPI* time is about two times higher than *MPI with 1 proc*
beyond 8 machines.

Is there anything that I can set to make the Allreduce time for *OMP + MPI*
better?

Thank you very much

Best,
Sarunya
(Graph attachment: Pasted image at 2017_03_03 04_09 PM.png, <http://mailman.cse.ohio-state.edu/pipermail/mvapich-discuss/attachments/20170303/f396d42f/attachment-0001.png>)

