[mvapich-discuss] osu benchmarks over tcp

Devendar Bureddy bureddy at cse.ohio-state.edu
Sun Aug 14 16:21:40 EDT 2011


Hi Hoot

Thanks for using the OSU benchmarks.  This issue is not with the
benchmarks themselves.  It seems you are using the Open MPI library, so
you may want to raise this question on the Open MPI mailing list.
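
For reference, the "connect() to 192.168.122.x" errors below come from
Open MPI's TCP BTL choosing the wrong interface.  A common workaround
on that side (an Open MPI detail, not MVAPICH2, and the interface names
below are only guesses at your setup) is to restrict the TCP BTL
explicitly:

  # Sketch: limit Open MPI's TCP BTL to the 10GigE interface
  # (replace "eth1" with whatever interface carries 10.10.10.x)
  mpirun -np 2 -hostfile hosts --mca btl_tcp_if_include eth1 osu_bibw

  # Or exclude loopback plus the libvirt bridge that typically owns
  # the 192.168.122.x addresses (interface names are assumptions)
  mpirun -np 2 -hostfile hosts --mca btl_tcp_if_exclude lo,virbr0 osu_bibw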

You can also configure MVAPICH2 in TCP/IP mode to test over a 10GigE
interface.  Please take a look at the following sections of the
MVAPICH2 user guide for details:
http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-1.7rc1.html#x1-180004.9
http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-1.7rc1.html#x1-190004.10
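
As a rough sketch of what those sections describe (paths, interface
names, and launcher options here are assumptions for illustration),
MVAPICH2 can be built with the TCP/IP-based Nemesis channel and then
run the same way you ran the hello test:

  # Sketch: build MVAPICH2 with the Nemesis TCP/IP channel
  ./configure --with-device=ch3:nemesis
  make && make install

  # Run the benchmark over the hostfile as before; with the Hydra
  # launcher, -iface can pin traffic to a given interface (assuming
  # eth1 is the 10GigE port on your nodes)
  mpirun -np 2 -hostfile hosts -iface eth1 osu_bibw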

Please let us know if you see any issues with MVAPICH2.

-Devendar

On Sun, Aug 14, 2011 at 10:19 AM, Hoot Thompson <hoot at ptpnow.com> wrote:
> I'm trying to run a simple test over a 10GigE interface and for some reason
> it's trying to use eth0 (192.x addresses) whereas the intended host
> addresses are 10.10.10. and 10.10.10.2. A simple mpirun test works as
> expected. Is there a configuration I'm missing?
>
> Thanks in advance,
>
> Hoot
>
> hoot at u1-1104:~/osu$ mpirun -np 2 -hostfile hosts /home/hoot/hello
> hello, world
> hello, world
>
>
> hoot at u1-1104:~/osu$ mpirun -np 2 -hostfile hosts osu_bibw
> # OSU MPI Bi-Directional Bandwidth Test v3.3
> # Size     Bi-Bandwidth (MB/s)
> [u2-1104][[23864,1],1][btl_tcp_endpoint.c:638:mca_btl_tcp_endpoint_complete_connect]
> connect() to 192.168.122.133 failed: No route to host (113)
> [u1-1104][[23864,1],0][../../../../../../ompi/mca/btl/tcp/btl_tcp_endpoint.c:638:mca_btl_tcp_endpoint_complete_connect]
> connect() to 192.168.122.12 failed: No route to host (113)
>
