[Mvapich] Announcing the release of MPI4Dask 0.1

Panda, Dhabaleswar panda at cse.ohio-state.edu
Mon Jan 25 22:14:58 EST 2021


This release announcement from the OSU HiBD team is relevant to users of the MVAPICH2-GDR library for GPU clusters, so it is being forwarded to the MVAPICH list.

Thanks, 

The MVAPICH Team

====================================

The OSU High-Performance Big Data (HiBD) team is pleased to announce
the first release of MPI4Dask 0.1, which is a custom version of the
Dask Distributed library with support for a high-performance MPI
communication backend. The MPI device in MPI4Dask uses mpi4py over the
MVAPICH2-GDR library and targets modern HPC clusters built with GPUs
and high-performance interconnects.
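
For readers who want to see how such a program is driven, the following
is a minimal sketch of starting a Dask computation through Dask-MPI. It
assumes the standard dask_mpi and dask.distributed APIs; the
protocol="mpi" argument is an assumption about how the MPI communication
device is selected, so please consult the MPI4Dask user guide for the
exact configuration.

    # Minimal sketch, assuming the standard Dask-MPI and Dask Distributed
    # APIs. The protocol="mpi" argument is an assumption about how the MPI
    # device is selected; see the MPI4Dask user guide for exact settings.
    from dask_mpi import initialize
    from dask.distributed import Client
    import dask.array as da

    # Rank 0 runs the scheduler, rank 1 continues with this client code,
    # and the remaining MPI ranks become Dask workers.
    initialize(protocol="mpi")

    client = Client()  # connects to the scheduler started by initialize()

    # User-level Dask code is unchanged; only the transport differs.
    x = da.random.random((10000, 10000), chunks=(1000, 1000))
    print("mean =", x.mean().compute())

The script would then be started with the usual MPI launcher, for
example "mpirun -np 8 python dask_demo.py" (the script name is just an
example); launcher options for MVAPICH2-GDR and GPU support will vary,
so refer to the user guide.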

The first release of the MPI4Dask package is equipped with the following
features:

* MPI4Dask 0.1:

    - Based on Dask Distributed 2021.01.0
    - Compliant with user-level Dask APIs and packages
    - Support for MPI-based communication in Dask for clusters of GPUs
      - Implements point-to-point communication co-routines
      - Efficient chunking mechanism for large messages (see the sketch
        after this list)
    - Built on top of mpi4py over the MVAPICH2-GDR library
    - Supports starting execution of Dask programs using Dask-MPI
    - Tested with
      - Mellanox InfiniBand adapters (FDR and EDR)
      - Various multi-core platforms
      - NVIDIA V100 and Quadro RTX 5000 GPUs
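
As an illustration of the co-routine and chunking items above, here is a
simplified sketch, not the actual MPI4Dask code, of how chunked
point-to-point transfers can be expressed as asyncio co-routines on top
of mpi4py non-blocking calls. The function names and the chunk size are
assumptions made for this example.

    # Illustrative sketch only (not the MPI4Dask implementation): chunked
    # point-to-point transfers written as asyncio co-routines over mpi4py
    # non-blocking calls. Names and the chunk size are assumptions.
    import asyncio
    from mpi4py import MPI

    CHUNK_SIZE = 4 * 1024 * 1024  # assumed 4 MB chunks for large messages

    async def wait_for(request):
        # Poll a non-blocking MPI request without blocking the event loop.
        while not request.Test():
            await asyncio.sleep(0)

    async def send_chunked(comm, buf, dest, tag=0):
        # Send a large buffer as a sequence of fixed-size chunks.
        view = memoryview(buf)
        for offset in range(0, len(view), CHUNK_SIZE):
            req = comm.Isend(view[offset:offset + CHUNK_SIZE],
                             dest=dest, tag=tag)
            await wait_for(req)

    async def recv_chunked(comm, buf, source, tag=0):
        # Receive into a pre-allocated writable buffer, chunk by chunk.
        view = memoryview(buf)
        for offset in range(0, len(view), CHUNK_SIZE):
            req = comm.Irecv(view[offset:offset + CHUNK_SIZE],
                             source=source, tag=tag)
            await wait_for(req)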

To download the MPI4Dask 0.1 package and the associated user
guide, please visit the following URL:

http://hibd.cse.ohio-state.edu

Sample performance numbers for MPI4Dask on various benchmarks are
available under the `Performance' tab of the above website.

All questions, feedback and bug reports are welcome. Please post to
mvapich-discuss at lists.osu.edu.

Thanks,

The High-Performance Big Data (HiBD) Team
http://hibd.cse.ohio-state.edu

PS: The number of organizations using the HiBD stacks has crossed 335
(from 37 countries). Similarly, the number of downloads from the HiBD
site has crossed 39,000.  The HiBD team would like to thank all its
users and organizations!!

