Commit d42e2aa

Iterate on "installing from tarballs" documentation. (#813)

* Explain what a "release tarball" is
* Rework install instructions
* Show how to test an installation in a bit more detail
* Adjust section headings, pulling "Using Dockerfiles" out into its own section instead of burying it

I'm not touching the `install_rocm_from_artifacts.py` section yet... that's probably going to be most flexible and convenient, but it needs some quality-of-life changes to the script that I'll want us to make together with docs changes.

1 parent: 03c70bd

RELEASES.md (1 file changed: 100 additions, 28 deletions)

@@ -23,9 +23,10 @@ Table of contents:

- [Using PyTorch Python packages](#using-pytorch-python-packages)
- [Installing from tarballs](#installing-from-tarballs)
  - [Installing release tarballs](#installing-release-tarballs)
  - [Installing per-commit CI build tarballs manually](#installing-per-commit-ci-build-tarballs-manually)
  - [Installing tarballs using `install_rocm_from_artifacts.py`](#installing-tarballs-using-install_rocm_from_artifactspy)
  - [Using installed tarballs](#using-installed-tarballs)
- [Using Dockerfiles](#using-dockerfiles)

## Installing releases using pip

@@ -231,36 +232,93 @@ instructions in the AMD ROCm documentation.

## Installing from tarballs

Standalone "ROCm SDK tarballs" are assembled from the same
[artifacts](docs/development/artifacts.md) as the Python packages that can be
[installed using pip](#installing-releases-using-pip), but without the
additional wrapper Python wheels or utility scripts.
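
For contrast with the pip route, a wheel-based install looks roughly like this
(an illustrative sketch; the package extras and the `gfx110X-dgpu` index URL
are assumptions here, so defer to the
[pip section](#installing-releases-using-pip) for the authoritative command):

```bash
# Hypothetical example; pick the index matching your GPU family
python -m pip install "rocm[libraries,devel]" \
  --index-url https://rocm.nightlies.amd.com/v2/gfx110X-dgpu/
```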

### Installing release tarballs

Release tarballs are automatically uploaded to both
[GitHub releases](https://github.com/ROCm/TheRock/releases) and AWS S3 buckets.
The S3 buckets do not yet have index pages.

| Release page | S3 bucket | Description |
| ------------ | --------- | ----------- |
| [`nightly-tarball`](https://github.com/ROCm/TheRock/releases/tag/nightly-tarball) | [therock-nightly-tarball](https://therock-nightly-tarball.s3.amazonaws.com/) | Nightly builds from the `main` branch |
| [`dev-tarball`](https://github.com/ROCm/TheRock/releases/tag/dev-tarball) | [therock-dev-tarball](https://therock-dev-tarball.s3.amazonaws.com/) | ⚠️ Development builds from project maintainers ⚠️ |
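
Since the buckets have no index pages, one way to browse them is with the
[AWS CLI](https://aws.amazon.com/cli/) (a sketch; `--no-sign-request` skips
the need for AWS credentials):

```bash
# List the nightly tarballs available in the public S3 bucket
aws s3 ls s3://therock-nightly-tarball/ --no-sign-request
```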

After downloading, simply extract the release tarball into place:

```bash
mkdir therock-tarball && cd therock-tarball
# For example...
wget https://github.com/ROCm/TheRock/releases/download/nightly-tarball/therock-dist-linux-gfx110X-dgpu-6.5.0rc20250610.tar.gz

mkdir install
tar -xf *.tar.gz -C install
```

### Installing per-commit CI build tarballs manually

<!-- TODO: Hide this section by default?
Maybe move into artifacts.md or another developer page. -->

Our CI builds artifacts at every commit. These can be installed by "flattening"
them from the expanded artifacts down to a ROCm SDK "dist folder" using the
`artifact-flatten` command from
[`build_tools/fileset_tool.py`](https://github.com/ROCm/TheRock/blob/main/build_tools/fileset_tool.py).

1. Download TheRock's source code and set up your Python environment:

   ```bash
   # Clone the repository
   git clone https://github.com/ROCm/TheRock.git
   cd TheRock

   # Init Python virtual environment and install Python dependencies
   python3 -m venv .venv && source .venv/bin/activate
   pip install -r requirements.txt
   ```

1. Find the CI workflow run that you want to install from. For example, search
   through recent successful runs of the `ci.yml` workflow for `push` events on
   the `main` branch
   [using this page](https://github.com/ROCm/TheRock/actions/workflows/ci.yml?query=branch%3Amain+is%3Asuccess+event%3Apush),
   choosing a build that took more than a few minutes (documentation-only
   changes skip building and uploading). A scripted alternative using the
   GitHub CLI is sketched after this list.

1. Download the artifacts for that workflow run from S3 using either the
   [AWS CLI](https://aws.amazon.com/cli/) or the
   [AWS SDK for Python (Boto3)](https://aws.amazon.com/sdk-for-python/):

   <!-- TODO: replace URLs with cloudfront / some other CDN instead of raw S3 -->

   ```bash
   export LOCAL_ARTIFACTS_DIR=~/therock-artifacts
   export LOCAL_INSTALL_DIR=${LOCAL_ARTIFACTS_DIR}/install
   mkdir -p ${LOCAL_ARTIFACTS_DIR}
   mkdir -p ${LOCAL_INSTALL_DIR}

   # Example: https://github.com/ROCm/TheRock/actions/runs/15575624591
   export RUN_ID=15575624591
   export OPERATING_SYSTEM=linux  # or 'windows'
   aws s3 cp s3://therock-artifacts/${RUN_ID}-${OPERATING_SYSTEM}/ \
     ${LOCAL_ARTIFACTS_DIR} \
     --no-sign-request --recursive --exclude "*" --include "*.tar.xz"
   ```

1. Flatten the artifacts:

   ```bash
   python build_tools/fileset_tool.py artifact-flatten \
     ${LOCAL_ARTIFACTS_DIR}/*.tar.xz -o ${LOCAL_INSTALL_DIR}
   ```
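
As an alternative to browsing the Actions page in step 2, the
[GitHub CLI](https://cli.github.com/) can list candidate runs. This is a
sketch assuming `gh` is installed and authenticated; the run ID in its output
is the `RUN_ID` used in step 3:

```bash
# List recent successful push-event runs of the ci.yml workflow on main
gh run list --repo ROCm/TheRock --workflow ci.yml \
  --branch main --event push --status success --limit 10
```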

### Installing tarballs using `install_rocm_from_artifacts.py`

<!-- TODO: move this above the manual `tar -xf` commands? -->

This script installs ROCm community builds produced by TheRock from a
developer/nightly tarball, a specific CI runner build, or an existing
installation of TheRock. It is used by CI and can also be run locally.

Examples:

@@ -289,19 +347,33 @@ By default for CI workflow retrieval, all artifacts (excluding test artifacts) w

### Using installed tarballs

After installing (downloading and extracting) a tarball, you can test it by
running programs from the `bin/` directory:

```bash
ls install
# bin  include  lib  libexec  llvm  share

# Now test some of the installed tools:
./install/bin/rocminfo
./install/bin/test_hip_api
```

You may also want to add the install directory to your `PATH` or set other
environment variables like `ROCM_HOME`.
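
For example, a minimal sketch (assuming the `install` directory from the steps
above; which variables your tools actually read may vary):

```bash
export ROCM_HOME="$PWD/install"
export PATH="$ROCM_HOME/bin:$PATH"
rocminfo  # now resolves without the ./install/bin/ prefix
```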

## Using Dockerfiles

We publish [Docker](https://www.docker.com/) images with packages preinstalled
for your convenience. See
https://github.com/orgs/ROCm/packages?repo_name=TheRock for the full list.

| Package | Description |
| ------- | ----------- |
| [`therock_build_manylinux_x86_64`](https://github.com/ROCm/TheRock/pkgs/container/therock_build_manylinux_x86_64) | Container for our CI/CD pipelines<br>(This does _not_ include ROCm or PyTorch but can be used to build them) |
| [`therock_pytorch_dev_ubuntu_24_04_gfx942`](https://github.com/ROCm/TheRock/pkgs/container/therock_pytorch_dev_ubuntu_24_04_gfx942) | Ubuntu with PyTorch for ROCm gfx942 |
| [`therock_pytorch_dev_ubuntu_24_04_gfx1100`](https://github.com/ROCm/TheRock/pkgs/container/therock_pytorch_dev_ubuntu_24_04_gfx1100) | Ubuntu with PyTorch for ROCm gfx1100 |
| [`therock_pytorch_dev_ubuntu_24_04_gfx1151`](https://github.com/ROCm/TheRock/pkgs/container/therock_pytorch_dev_ubuntu_24_04_gfx1151) | Ubuntu with PyTorch for ROCm gfx1151 |
| [`therock_pytorch_dev_ubuntu_24_04_gfx1201`](https://github.com/ROCm/TheRock/pkgs/container/therock_pytorch_dev_ubuntu_24_04_gfx1201) | Ubuntu with PyTorch for ROCm gfx1201 |
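
Until proper how-to docs land (see the TODO below), here is a minimal sketch
of running one of these images; the `latest` tag and the device flags are
assumptions based on common ROCm container usage:

```bash
# Pull and run the gfx942 PyTorch dev image from GitHub Container Registry,
# passing through the GPU device nodes that ROCm needs on the host
docker run -it \
  --device=/dev/kfd --device=/dev/dri \
  ghcr.io/rocm/therock_pytorch_dev_ubuntu_24_04_gfx942:latest
```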

<!-- TODO: how-to's for using the dockerfiles -->

 (0)