[mvapich-discuss] Problem in running mpi processes on TCP/IP

Devesh Sharma devesh28 at gmail.com
Tue Apr 8 02:14:51 EDT 2008


Hello Matthew,

Thanks for replying. I have already tried this, but I don't see any data
traffic on the ib0 interface (my visual indication is the LED blinking),
even when I run Pallas after booting mpd in the same way!
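For what it is worth, here is how I am checking for traffic, using the
Linux per-interface byte counters (a rough check; the benchmark binary
name below is a placeholder for whatever your Pallas build produces):

    # bytes received on ib0 before the run
    cat /sys/class/net/ib0/statistics/rx_bytes
    # run the benchmark across the two nodes
    mpiexec -n 2 ./PMB-MPI1 PingPong
    # if traffic really went over ib0, this counter should have grown
    cat /sys/class/net/ib0/statistics/rx_bytes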

Secondly, on the node where mpd was booted, mpdtrace -l shows the IP of
ibnode1 and the IP of node2 (and vice versa on the other node). What is
the reason for this?
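For completeness, the hostname mapping on both nodes looks roughly like
this in /etc/hosts (the addresses below are placeholders, not my real
ones):

    192.168.1.1   node1     # eth0
    192.168.1.2   node2     # eth0
    192.168.2.1   ibnode1   # ib0 (IPoIB)
    192.168.2.2   ibnode2   # ib0 (IPoIB)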


On Mon, Apr 7, 2008 at 10:29 PM, Matthew Koop <koop at cse.ohio-state.edu>
wrote:

> Devesh,
>
> In addition to specifying ibnode1 and ibnode2 in the mpd.hosts, can you
> add --ifhn=ibnode1 to the mpdboot command (assuming you are starting
> mpdboot on ibnode1).
>
> mpdboot -n 2 --ifhn=ibnode1
>
> Let us know if this helps. Thanks.
>
> Matt
>
> On Mon, 7 Apr 2008, Devesh Sharma wrote:
>
> > Hello all,
> >
> > I am trying to run an MPI job between two nodes using TCP/IP. Each
> > node has two Ethernet interfaces, eth0 and ib0 (the latter using
> > IPoIB). I want to run my job over the ib0 interface. The hostnames
> > for the two interfaces are different: node1 and node2 for eth0, and
> > ibnode1 and ibnode2 for ib0. But when I run mpdboot with ibnode1 and
> > ibnode2 specified in the mpd.hosts file, mpdtrace still shows node1
> > and node2!
> > How can I select ib0 as the communication interface?
> >
> > -Devesh
> >
>
>
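P.S. For reference, the exact sequence I am running (mpd.hosts lists the
ib0 hostnames, and --ifhn is added as Matt suggested):

    cat mpd.hosts
    # ibnode1
    # ibnode2
    mpdboot -n 2 -f mpd.hosts --ifhn=ibnode1
    mpdtrace -l
    # local entry shows ibnode1's IP, but the remote entry shows node2's IP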