Following the instructions to deploy RAPIDS in Dataproc:

I was able to successfully launch a Dataproc cluster with RAPIDS v23.12 stable and CUDA 11.8 (both passed via the metadata flags --rapids-version and --cuda-version).
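For reference, a minimal sketch of the launch command I used. Only the rapids-version and cuda-version metadata keys are the ones mentioned above; the cluster name, region, machine types, image version, accelerator type, and the other metadata keys (gpu-driver-provider, rapids-runtime) are illustrative and may differ for your setup:

```bash
# Sketch of the cluster launch; values other than rapids-version/cuda-version are placeholders.
REGION=us-central1

gcloud dataproc clusters create rapids-test \
  --region "${REGION}" \
  --image-version 2.0-ubuntu18 \
  --master-machine-type n1-standard-8 \
  --worker-machine-type n1-standard-8 \
  --worker-accelerator type=nvidia-tesla-t4,count=2 \
  --initialization-actions "gs://goog-dataproc-initialization-actions-${REGION}/gpu/install_gpu_driver.sh,gs://goog-dataproc-initialization-actions-${REGION}/rapids/rapids.sh" \
  --metadata gpu-driver-provider=NVIDIA,rapids-runtime=DASK,rapids-version=23.12,cuda-version=11.8 \
  --optional-components JUPYTER \
  --enable-component-gateway
```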
The install_gpu_driver.sh script pulls packages for Ubuntu 18.04 and has outdated CUDA and GPU driver versions; it installs RAPIDS v22.12 and CUDA 11.2 by default and doesn't include CUDA 12 as an option.
The latest RAPIDS 24.02 is only compatible with Ubuntu 20.04 and 22.04 (with CUDA 11.8 or 12), so the install scripts need to be updated accordingly with the newer drivers as well.
To test RAPIDS libraries in the notebook environment, we need to edit the rapids.sh script to activate the conda environment (dask-rapids) and register it as a kernel in JupyterLab/Notebook.
For now, users have to manually activate the environment and register the dask-rapids kernel from a terminal in Jupyter (see the sketch below).
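As a stopgap, something like the following (run from a Jupyter terminal, or appended to rapids.sh) makes the environment show up as a kernel. The conda install prefix /opt/conda/miniconda3 is an assumption and may differ on a given Dataproc image:

```bash
# Activate the RAPIDS conda environment created by the init action.
# /opt/conda/miniconda3 is an assumed install prefix; adjust for your image.
source /opt/conda/miniconda3/etc/profile.d/conda.sh
conda activate dask-rapids

# ipykernel may not be installed in the environment yet.
conda install -y ipykernel

# Register the environment as a kernel so it appears in the JupyterLab launcher.
python -m ipykernel install --user --name dask-rapids --display-name "dask-rapids"
```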
Alternatively, users can set dataproc:conda.env.config.uri, which is the absolute path to a Conda environment YAML config file located in Cloud Storage; that file is used to create and activate a new Conda environment on the cluster. But this option is redundant, because you first have to export the conda environment into a .yaml file.
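For completeness, that route looks roughly like this; the bucket, file names, and cluster flags are placeholders:

```bash
# Export the existing environment to YAML and stage it in Cloud Storage
# (bucket and object names are placeholders).
conda env export -n dask-rapids > dask-rapids.yaml
gsutil cp dask-rapids.yaml gs://my-bucket/envs/dask-rapids.yaml

# Point the cluster at the YAML via the cluster property.
gcloud dataproc clusters create rapids-test \
  --region us-central1 \
  --properties "dataproc:conda.env.config.uri=gs://my-bucket/envs/dask-rapids.yaml"
  # (plus the GPU/initialization flags shown above)
```

Which illustrates the redundancy: the dask-rapids environment already has to exist somewhere to be exported before the cluster can consume it.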
CC @jacobtomlinson