[mvapich-discuss] Plans for MPMD support for mpirun_rsh

Gus Correa gus at ldeo.columbia.edu
Fri Dec 11 14:20:27 EST 2009


Hi Jonathan, list

Thank you, Jonathan, for showing alternative ways to launch
MPMD programs with MVAPICH2.
I haven't tried them yet, but I will.
I was under the impression that there was no MPMD mechanism
for MVAPICH2 at all, so it is great news that there are ways to do it.

However, at least from the standpoint of system administration,
there are advantages to using mpirun_rsh instead of mpd
to launch programs.
Moreover, MVAPICH2 seems to be migrating towards mpirun_rsh,
or am I mistaken about that?
Hence the request for extending the mpirun_rsh
capability to handle MPMD.
Note that both Craig and I administer clusters with
multiple users (large clusters with many users, in Craig's
case, I suppose).

Likewise, here I prefer to use the OSC mpiexec to launch programs
compiled with MPICH2 rather than managing the mpd daemon population.
This simplifies my system administration (and user education) work.
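For instance, under Torque the OSC mpiexec picks up the node list
directly from the batch job, so a job script needs only the launch
line itself, along the lines of (paths and counts illustrative):

    mpiexec -np 32 ./model

whereas the mpd route obliges every job (or user) to boot and tear
down the daemon ring as well:

    mpdboot -n 4 -f mpd.hosts
    mpiexec -np 32 ./model
    mpdallexit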

Many thanks again,
Gus Correa
---------------------------------------------------------------------
Gustavo Correa
Lamont-Doherty Earth Observatory - Columbia University
Palisades, NY, 10964-8000 - USA
---------------------------------------------------------------------


Jonathan Perkins wrote:
> On Fri, Dec 11, 2009 at 12:23:38PM -0500, Gus Correa wrote:
>> Hi Jonathan, Craig, list
> 
> Hi Gus, comments inline.
> 
>> I second Craig's question and request for MPMD support.
>> That would be really great for Earth Science applications,
>> maybe for other areas as well.
>>
>> For instance, the whole atmosphere/ocean/climate community
>> relies on MPMD for most modern coupled climate models.
>> Typically you have five or more executables working in MPMD mode:
>> atmosphere, ocean, ice, land, etc., coordinated by a flux coupler.
>> Here is one example:
>> http://www.ccsm.ucar.edu/models/ccsm3.0/
>> Other computational frameworks in Earth Science follow the same
>> paradigm and require MPMD.
>>
>> MPMD is particularly convenient on Linux clusters
>> because it keeps the executables, stacks, heaps, etc.
>> reasonably small, whereas an SPMD executable containing
>> the code for all component models tends to be much larger
>> and may even lead to memory paging, which is disastrous for MPI.
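>> As a rough sketch of that alternative (all component names here
>> are invented), the monolithic SPMD binary links every component
>> together and branches on rank, so each process carries the code
>> and static data of all models:
>>
>>     #include <mpi.h>
>>
>>     /* hypothetical component entry points, one per model */
>>     void run_atmosphere(void);
>>     void run_ocean(void);
>>     void run_coupler(void);
>>
>>     int main(int argc, char **argv)
>>     {
>>         int rank;
>>         MPI_Init(&argc, &argv);
>>         MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>>         if (rank < 16)            /* ranks 0-15: atmosphere   */
>>             run_atmosphere();
>>         else if (rank < 48)       /* ranks 16-47: ocean       */
>>             run_ocean();
>>         else                      /* remaining ranks: coupler */
>>             run_coupler();
>>         MPI_Finalize();
>>         return 0;
>>     }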
> 
> Agreed.
> 
>> We are currently restricted to using OpenMPI and MPICH2 for
>> these MPMD models, which are at the forefront of our research efforts.
>> It is a pity that so far we cannot use MVAPICH2 to test and
>> run these MPMD programs, although we use MVAPICH2
>> for smaller and less challenging SPMD programs, with excellent results.
>>
>> Since MPICH2 supports MPMD through the "mpiexec -configfile" option,
>> I wonder how difficult it would be to port
>> this capability to MVAPICH2 as well.
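>> For example, a config file with one component per line, and the
>> corresponding launch (names and process counts illustrative):
>>
>>     $ cat model.conf
>>     -n 16 ./atmosphere
>>     -n 32 ./ocean
>>     -n 8 ./ice
>>     -n 8 ./land
>>     -n 4 ./coupler
>>     $ mpiexec -configfile model.conf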
> 
> This limitation only exists in the current incarnation of mpirun_rsh.
> You can use mpd/mpiexec to launch mvapich2.  Both mpiexec and mpirun_rsh
> are built by default when building mvapich2-1.4, so you can use
> mvapich2 for these applications in the same manner that you use mpich2.
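> As a rough sketch (hosts and counts illustrative), using the
> colon-separated MPMD syntax of mpiexec:
> 
>     mpdboot -n 4 -f mpd.hosts
>     mpiexec -n 16 ./atmosphere : -n 32 ./ocean : -n 4 ./coupler
>     mpdallexit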
> 
>> Thank you very much,
>> Gus Correa
>> ---------------------------------------------------------------------
>> Gustavo Correa
>> Lamont-Doherty Earth Observatory - Columbia University
>> Palisades, NY, 10964-8000 - USA
>> ---------------------------------------------------------------------
>>
>>
>>
>> Jonathan Perkins wrote:
>>> On Mon, Dec 07, 2009 at 03:52:18PM -0700, Craig Tierney wrote:
>>>> From what I can tell in the documentation, mpirun_rsh
>>>> is the favored launch mechanism.  However, I don't see
>>>> a way to launch MPMD applications.  If I read
>>>> the docs correctly (that it isn't supported), are there
>>>> plans to add this to mpirun_rsh?
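>>>> (For reference, mpirun_rsh today takes a single executable,
>>>> e.g. something like:
>>>>
>>>>     mpirun_rsh -np 32 -hostfile hosts ./model
>>>>
>>>> with no way that I can see to name several binaries.)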
>>> Yes, we plan to add this in a future mvapich2 release.
>>>
>>>> Craig
>>>>
>>>> -- 
>>>> Craig Tierney (craig.tierney at noaa.gov)
> 


