[mvapich-discuss] Re: [mvapich] Announcing the release of MVAPICH2 1.8.1

Devendar Bureddy bureddy at cse.ohio-state.edu
Sat Sep 29 11:16:26 EDT 2012


Hi Wang Riping

It seems your setup is missing the libibumad library, which is
required to build MVAPICH2 1.8.1. Can you please retry after
installing this library?
We have removed this dependency in MVAPICH2 1.9a, so you should be
able to install 1.9a without libibumad by adding the extra
--disable-mcast configuration option.
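
For example, on Fedora something like the following should work (the
exact package names and library path are assumptions; they may differ
on your distribution):

```shell
# Install the umad library and its headers (needed by configure)
sudo yum install -y libibumad libibumad-devel

# Re-run configure for 1.8.1, pointing at the IB libraries if they
# live in a non-default location such as /usr/lib64
./configure --with-ib-libpath=/usr/lib64

# Or, with MVAPICH2 1.9a, skip the libibumad dependency entirely
./configure --disable-mcast
```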

-Devendar

On Sat, Sep 29, 2012 at 4:02 AM, Riping Wang <wang.riping.81 at gmail.com> wrote:
> Hi,
>
> I have this error when configure mvapich2 1.8.1 on Fedora 14 64 bit machine.
> '''
> configure: error: 'libibumad not found. Did you specify --with-ib-libpath=?'
> configure: error: channels/mrail configure failed
> configure: error: src/mpid/ch3 configure failed
> '''
> Could you help me to solve it?
>
> Thank you very much.
> WANG Riping
> 2012.9.29
>
>
>
> On 29 September 2012 11:30, Dhabaleswar Panda <panda at cse.ohio-state.edu>
> wrote:
>>
>> The MVAPICH team is pleased to announce the release of MVAPICH2 1.8.1.
>> This is a bug-fix release compared to MVAPICH2 1.8.
>>
>> Bug-fixes in MVAPICH2 1.8.1 (since the MVAPICH2 1.8 GA release) are listed
>> below:
>>
>>     - Fix issue in intra-node knomial bcast
>>     - Handle gethostbyname return values gracefully
>>     - Fix corner case issue in two-level gather code path
>>     - Fix bug in CUDA events/streams pool management
>>     - Fix in GPU device pointer detection
>>         - Thanks to Brody Huval from Stanford for the report
>>     - Fix issue in selecting CUDA run-time variables when running on
>> single
>>       node in SMP only mode
>>     - Fix ptmalloc initialization issue when MALLOC_CHECK_ is defined in
>> the
>>       environment
>>         - Thanks to Mehmet Belgin from Georgia Institute of Technology for
>> the
>>           report
>>     - Fix memory corruption and handle heterogeneous architectures in
>> gather
>>       collective
>>     - Fix issue in detecting the correct HCA type
>>     - Fix issue in ring start-up to select correct HCA when MV2_IBA_HCA is
>>       specified
>>     - Fix SEGFAULT in MPI_Finalize when IB loop-back is used
>>     - Fix memory corruption on nodes with 64-cores
>>         - Thanks to M Xie for the report
>>     - Fix hang in MPI_Finalize with Nemesis interface when ptmalloc
>>       initialization fails
>>         - Thanks to Carson Holt from OICR for the report
>>     - Fix memory corruption in shared memory communication
>>         - Thanks to Craig Tierney from NOAA for the report and testing the
>>           patch
>>     - Fix issue in IB ring start-up selection with mpiexec.hydra
>>     - Option for selecting non-default gid-index in a loss-less fabric
>> setup
>>       in RoCE mode
>>     - Improved error reporting
>>     - Option to disable signal handler setup
>>
>> Most of these bug-fixes are also available with MVAPICH2 1.9a release.
>> MVAPICH2 users are strongly requested to upgrade their 1.8 installations to
>> 1.8.1 or 1.9a.
>>
>> For downloading MVAPICH2 1.8.1, associated user guide, quick start guide,
>> and accessing the SVN, please visit the following URL:
>>
>> http://mvapich.cse.ohio-state.edu
>>
>> All questions, feedback, bug reports, hints for performance tuning,
>> patches and enhancements are welcome. Please post them to the mvapich-discuss
>> mailing list (mvapich-discuss at cse.ohio-state.edu).
>>
>> Thanks,
>>
>> The MVAPICH Team
>> _______________________________________________
>> mvapich mailing list
>> mvapich at cse.ohio-state.edu
>> http://mail.cse.ohio-state.edu/mailman/listinfo/mvapich
>
>
>
>
> --
> ******************************************************************************
> WANG Riping
> Ph.D student,
> Institute for Study of the Earth's Interior,Okayama University,
> 827 Yamada, Misasa, Tottori-ken 682-0193, Japan
> Tel: +81-858-43-3739(Office), +81-858-43-1215(Inst)
> E-mail: wang.riping.81 at gmail.com
> ******************************************************************************
>
>
>
>
> _______________________________________________
> mvapich-discuss mailing list
> mvapich-discuss at cse.ohio-state.edu
> http://mail.cse.ohio-state.edu/mailman/listinfo/mvapich-discuss
>



-- 
Devendar