[mvapich-discuss] Shell Environment Variables and mpirun_rsh

Thompson, Matt (GSFC-610.1)[SCIENCE SYSTEMS AND APPLICATIONS INC] matthew.thompson at nasa.gov
Fri Apr 19 09:09:05 EDT 2013


On 04/19/2013 08:55 AM, Jonathan Perkins wrote:
> On Fri, Apr 19, 2013 at 08:33:51AM -0400, Thompson, Matt (GSFC-610.1)[SCIENCE SYSTEMS AND APPLICATIONS INC] wrote:
>> All,
>>
>> I just had a question about the use of environment variables with
>> MVAPICH2 1.8.1 as stated in the MVAPICH2 User's Guide in regards to
>> mpirun_rsh. I understand that the best way is to pass them in on the
>> command-line directly before the executable. But the User's Guide
>> also says:
>>
>>> Alternatively, you may also place environmental variables in your
>>> shell environment (e.g. .bashrc). These will be automatically picked
>>> up when the application starts executing.
>>
>> I've tried this by doing this:
>>
>>    setenv MV2_ENABLE_AFFINITY    0
>>    setenv MV2_CPU_BINDING_POLICY scatter
>>    setenv MV2_CPU_BINDING_LEVEL  socket
>>    mpirun_rsh -hostfile $PBS_NODEFILE -np $PROCS ./GEOSgcm.x
>
> In the blurb above, "shell environment" should read "shell personal
> initialization file".  Your example doesn't work because mpirun_rsh
> does not forward your environment to the remote processes.
>
> It looks like you're using some version of CSH.  Given what the blurb
> intended, you would want to place the setenv statements in your .cshrc
> or .tcshrc file.  That works because the new shells launched for your
> remote processes use those rc files to initialize their environment.

Ahh, I see. Thanks for helping out; that makes more sense. I guess I 
wouldn't have thought to use .tcshrc, since the tcsh script we run with 
is launched via tcsh -f... so we don't pollute the script's environment!
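
For the record, I take it the guide intended something like this sitting 
in ~/.tcshrc on the cluster nodes (just a sketch on my part, untested):

   # ~/.tcshrc -- sourced by the shells mpirun_rsh spawns on each node,
   # so these settings reach the remote MPI processes even though the
   # local job script itself runs under tcsh -f
   setenv MV2_ENABLE_AFFINITY    0
   setenv MV2_CPU_BINDING_POLICY scatter
   setenv MV2_CPU_BINDING_LEVEL  socket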

> Since you're using mvapich2-1.8.1, I suggest you put your environment
> settings for mvapich2 in a ~/.mvapich2.conf.
> http://mvapich.cse.ohio-state.edu/support/user_guide_mvapich2-1.9rc1.html#x1-490006.3

Nice. I hadn't seen that, but that's a useful hint. Is this equivalent 
to using the '-config' option for mpirun_rsh? I.e., could I put those 
variables in a file next to my script and do "-config mvapich2envs.conf"?
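
That is, something along the lines of this, assuming -config takes the 
same VAR = value format as ~/.mvapich2.conf (my guess, untested):

   # mvapich2envs.conf (hypothetical name), kept next to the run script
   MV2_ENABLE_AFFINITY = 0
   MV2_CPU_BINDING_POLICY = scatter
   MV2_CPU_BINDING_LEVEL = socket

   mpirun_rsh -config mvapich2envs.conf -hostfile $PBS_NODEFILE -np $PROCS ./GEOSgcm.x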

> MV2_ENABLE_AFFINITY = 0
> MV2_CPU_BINDING_POLICY = scatter # no effect because affinity is disabled
> MV2_CPU_BINDING_LEVEL = socket # no effect because affinity is disabled

Ah, whoops. I just looked back and realized I didn't pass in 
MV2_ENABLE_AFFINITY=0 as I thought. I've been using OpenMP too much 
lately... and that sucker is pretty much a default option, so I always 
type it. And now I know I can just put it in my .mvapich2.conf and never 
forget it again!

> If you move to mvapich2-1.9rc1 you can take advantage of the new -export
> option of mpirun_rsh where the environment is forwarded for you.
>
> setenv MV2_ENABLE_AFFINITY    0
> setenv MV2_CPU_BINDING_POLICY scatter
> setenv MV2_CPU_BINDING_LEVEL  socket
> mpirun_rsh -export -hostfile $PBS_NODEFILE -np $PROCS ./GEOSgcm.x
>

Looks like a good reason to upgrade! We have been holding off because of 
a weird string-broadcast issue MVAPICH2 1.9a had with our code 
(sometimes it'd broadcast Fortran NULL characters at the end as padding, 
sometimes \377 chars...). I'll have it installed and give it a try.

Thanks,
Matt Thompson

-- 
Matt Thompson, PhD     SSAI, Sr Software Test Engr
NASA GSFC, Global Modeling and Assimilation Office
Code 610.1, 8800 Greenbelt Rd, Greenbelt, MD 20771
Phone: 301-614-6712              Fax: 301-614-6246

