[Mvapich-discuss] Announcing the release of MPI4Dask 0.2

Panda, Dhabaleswar panda at cse.ohio-state.edu
Sun Mar 28 22:18:06 EDT 2021


This release announcement from the OSU HiBD team is relevant to the users of the MVAPICH2, MVAPICH2-X
and MVAPICH2-GDR libraries. Thus, this announcement is being forwarded to the MVAPICH-discuss list.

Thanks,

The MVAPICH Team

==============================================================

The OSU High-Performance Big Data (HiBD) team is pleased to announce
the release of MPI4Dask 0.2, which is a custom version of the Dask
Distributed library with support for a high-performance MPI
communication backend. The MPI device in MPI4Dask uses mpi4py over the
MVAPICH2, MVAPICH2-X, or MVAPICH2-GDR library and targets modern HPC
clusters built with CPUs, GPUs, and high-performance interconnects.
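
As a quick sanity check of the mpi4py-over-MVAPICH2 stacking described
above, the underlying MPI library can be identified from a short
script. This is a minimal, hypothetical sketch using only standard
mpi4py calls (it is not part of the MPI4Dask package):

    # check_mpi.py -- hypothetical helper: confirm mpi4py is bound to
    # an MVAPICH2-family library (standard mpi4py calls only).
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    if comm.Get_rank() == 0:
        # Returns the MPI_Get_library_version() string, which names
        # the underlying implementation (e.g., it mentions MVAPICH2).
        print(MPI.Get_library_version())
    print("rank %d of %d" % (comm.Get_rank(), comm.Get_size()))

Run, for example, with MVAPICH2's launcher: mpirun_rsh -np 2 host1
host2 python check_mpi.py (launcher names and options vary across the
MVAPICH2, MVAPICH2-X, and MVAPICH2-GDR installations).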

This release of the MPI4Dask package is equipped with the following
features:

* MPI4Dask 0.2 (Features since the 0.1 release are indicated as NEW):

    - Based on Dask Distributed 2021.01.0
    - Compliant with user-level Dask APIs and packages
    - Support for MPI-based communication in GPU-based Dask applications
    - (NEW) Support for MPI-based communication in CPU-based Dask applications
    - Implements point-to-point communication co-routines
    - Efficient chunking mechanism implemented for large messages
    - (NEW) Built on top of mpi4py over the MVAPICH2, MVAPICH2-X, and
      MVAPICH2-GDR libraries
    - Supports starting execution of Dask programs using Dask-MPI (see
      the sketch after this list)
    - Tested with
      - (NEW) CPU-based Dask applications using NumPy arrays and Pandas data frames
      - (NEW) GPU-based Dask applications using CuPy and cuDF
      - Mellanox InfiniBand adapters (FDR and EDR)
      - Various multi-core platforms
      - NVIDIA V100 and Quadro RTX 5000 GPUs
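
To illustrate the Dask-MPI launch path noted in the feature list,
below is a minimal, hypothetical sketch of a CPU-based Dask program
started through Dask-MPI. It uses only standard Dask, Dask-MPI, and
NumPy-backed Dask array APIs; the exact options for enabling the MPI
backend should be taken from the MPI4Dask user guide:

    # demo.py -- hypothetical sketch of a Dask program launched via
    # Dask-MPI (standard Dask/Dask-MPI APIs only; see the user guide
    # for MPI4Dask-specific launch options).
    from dask_mpi import initialize
    from dask.distributed import Client
    import dask.array as da

    # Rank 0 becomes the scheduler, rank 1 runs this client code, and
    # the remaining MPI ranks become Dask workers.
    initialize()
    client = Client()  # connects to the scheduler started above

    # A simple CPU-based computation on a chunked 2-D array.
    x = da.random.random((20000, 20000), chunks=(2000, 2000))
    print("sum =", (x + x.T).sum().compute())

A run on an MVAPICH2 cluster would then look something like:
mpirun_rsh -np 8 host1 ... host8 python demo.py.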

To download the MPI4Dask 0.2 package and the associated user
guide, please visit the following URL:

http://hibd.cse.ohio-state.edu

Sample performance numbers for MPI4Dask on various benchmarks can
be viewed under the 'Performance' tab of the above website.

All questions, feedback, and bug reports are welcome. Please post to
mvapich-discuss at lists.osu.edu.

Thanks,

The High-Performance Big Data (HiBD) Team
http://hibd.cse.ohio-state.edu

PS: The number of organizations using the HiBD stacks has crossed 340
(from 38 countries). Similarly, the number of downloads from the HiBD
site has crossed 39,400.  The HiBD team would like to thank all its
users and organizations!!



