Commit

Merge pull request #317 from AIM-Harvard/patch0001
Patch0001
surajpaib authored Feb 6, 2025
2 parents 860b90b + a7ae531 commit db64439
Showing 8 changed files with 571 additions and 130 deletions.
3 changes: 2 additions & 1 deletion Makefile
```diff
@@ -1,6 +1,7 @@
 #* Variables
 SHELL := /usr/bin/env bash
-PYTHON := python
+PYTHON := python3
+python := python3
 PYTHONPATH := `pwd`
 
 #* Docker variables
```
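This change replaces the bare `python` interpreter name with `python3`, since `python` is absent or points at Python 2 on many systems. A quick sanity check of what the Makefile's recipes will now invoke (a sketch; the exact path printed depends on your machine):

```shell
# Show where 'python3' resolves and confirm its major version;
# the Makefile's recipes will use this interpreter.
command -v python3
python3 -c 'import sys; print(sys.version_info.major)'
```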
8 changes: 4 additions & 4 deletions docs/replication-guide/baselines.md
@@ -10,7 +10,7 @@ By default, we configure this for Task 1. You can adapt this for Task 2 and Task

You can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/baselines/supervised_training/supervised_random_init.yaml
+lighter fit --config_file=./experiments/baselines/supervised_training/supervised_random_init.yaml
```

## Transfer learning
@@ -20,7 +20,7 @@ This baseline is only used for Task 2 and Task 3 as we use the random init basel

You can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/baselines/supervised_training/supervised_finetune.yaml
+lighter fit --config_file=./experiments/baselines/supervised_training/supervised_finetune.yaml
```

## Med3D / MedicalNet
@@ -31,7 +31,7 @@ We have provided re-implementations of Med3D to fit into our YAML workflows at `

You can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/baselines/med3d/finetune.yaml
+lighter fit --config_file=./experiments/baselines/med3d/finetune.yaml
```

## Models Genesis
@@ -42,5 +42,5 @@ We have provided re-implementations of Models Genesis to fit into our YAML workf

You can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/baselines/models_genesis/finetune.yaml
+lighter fit --config_file=./experiments/baselines/models_genesis/finetune.yaml
```
2 changes: 1 addition & 1 deletion docs/replication-guide/fm_adaptation.md
@@ -11,7 +11,7 @@ By default, we configure this for Task 1. You can adapt this for Task 2 and Task

You can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/adaptation/fmcib_finetune.yaml
+lighter fit --config_file=./experiments/adaptation/fmcib_finetune.yaml
```

## Adaptation through linear evaluation
4 changes: 2 additions & 2 deletions docs/replication-guide/inference.md
@@ -6,7 +6,7 @@ In this section, we detail how features (from the FM and pre-trained models) and

In order to extract features from our models, you can use the following, (at root folder location)
```diff
-lighter predict --config_file ./experiments/inference/extract_features.yaml
+lighter predict --config_file=./experiments/inference/extract_features.yaml
```

!!! note
@@ -33,7 +33,7 @@ This will pull all the models from hugging face. Following this you can use any
These can be run using (at root folder location)

```diff
-lighter predict --config_file ./experiments/inference/get_predictions.yaml
+lighter predict --config_file=./experiments/inference/get_predictions.yaml
```
As with the previous YAMLs, please follow the 'Note:' tags to place appropriate data paths and change relevant parameters. This YAML is to be used if you want to get target predictions from the models.
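The 'Note:' tags referred to here are inline comments inside the experiment YAMLs marking values you must edit before running. A hypothetical fragment illustrating the pattern — the key names below are invented for illustration and the real `get_predictions.yaml` keys may differ:

```yaml
# Hypothetical sketch of a Note-tagged config fragment; not the actual file.
predict:
  dataset:
    # Note: replace with the path to your own data split
    path: /path/to/your/dataset.csv
  trainer:
    # Note: adjust to match your available GPUs
    devices: 1
```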

2 changes: 1 addition & 1 deletion docs/replication-guide/reproduce_fm.md
@@ -51,5 +51,5 @@ Now you can start training by running this in the root code folder,
```diff
-lighter fit --config_file ./experiments/pretraining/fmcib_pretrain.yaml
+lighter fit --config_file=./experiments/pretraining/fmcib_pretrain.yaml
```
6 changes: 3 additions & 3 deletions fmcib/utils/idc_helper.py
@@ -215,14 +215,14 @@ def process_series_dir(series_dir: Path):

```python
elif len(seg_files) != 0:
    dcmseg2nii(str(seg_files[0]), str(series_dir), tag="GTV-")

# Build the main image NIfTI
try:
    series_id = str(list(series_dir.glob("CT*.dcm"))[0]).split("_")[-2]
except IndexError:
    logger.warning(f"No 'CT*.dcm' file found under {series_dir}. Skipping.")
    return None

dicom_image = DcmInputAdapter().ingest(str(series_dir), series_id=series_id)
nii_output_adapter = NiiOutputAdapter()
nii_output_adapter.write(dicom_image, f"{series_dir}/image", gzip=True)
```
@@ -231,7 +231,7 @@ def process_series_dir(series_dir: Path):
```python
logger.warning(f"No RTSTRUCT or SEG file found in {series_dir}. Skipping.")
return None

# Read the image (generated above)
image_path = series_dir / "image.nii.gz"
if not image_path.exists():
    logger.warning(f"No image file found at {image_path}. Skipping.")
```
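The guarded lookup above derives the series ID from a `CT*.dcm` filename and skips the series when no such file exists. A standalone sketch of that pattern, working on the filename alone — the `CT_<series-id>_<suffix>.dcm` naming here is an assumption for illustration:

```python
import tempfile
from pathlib import Path

def extract_series_id(series_dir: Path):
    """Second-to-last underscore-separated token of the first CT*.dcm
    filename, or None when the directory has no matching file
    (mirroring the IndexError guard in process_series_dir)."""
    try:
        return list(series_dir.glob("CT*.dcm"))[0].name.split("_")[-2]
    except IndexError:
        return None

with tempfile.TemporaryDirectory() as tmp:
    series_dir = Path(tmp)
    # Empty directory: the guard returns None instead of raising
    assert extract_series_id(series_dir) is None
    (series_dir / "CT_1.2.840.113_0.dcm").touch()
    print(extract_series_id(series_dir))  # → 1.2.840.113
```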
669 changes: 553 additions & 116 deletions poetry.lock

Large diffs are not rendered by default.

7 changes: 5 additions & 2 deletions pyproject.toml
```diff
@@ -51,9 +51,9 @@ pydantic = "1.10.13"
 lightly = "1.4.19"
 werkzeug = "^3.0.1"
 urllib3 = ">=2.2.2"
-aiohttp = "^3.9.1"
+aiohttp = "^3.11.12"
 pip = "^24.0"
-tornado = ">=6.4.1"
+tornado = "^6.4.2"
 wandb = "^0.16.3"
 mpmath = "1.3.0"
 zipp = ">=3.19.1"
@@ -64,6 +64,9 @@ scikit-learn = ">=1.5.0"
 requests = ">=2.32.2"
 mkdocs-material = ">=9.5.32"
 certifi = ">=2024.07.04"
+sqlalchemy = "^2.0.37"
+gunicorn = "^23.0.0"
+jinja2 = "^3.1.5"
 
 [tool.poetry.group.dev.dependencies]
 black = "^24.4.2"
```
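The `aiohttp` and `tornado` bumps use Poetry's caret operator, which allows upgrades up to (but excluding) the next release of the leftmost non-zero version component: `^3.11.12` means `>=3.11.12,<4.0.0`, while `^0.16.3` means `>=0.16.3,<0.17.0`. A small sketch of that rule; `caret_upper_bound` is my own illustrative helper, not part of Poetry:

```python
def caret_upper_bound(constraint: str) -> str:
    """Exclusive upper bound implied by a Poetry caret constraint.

    The leftmost non-zero component is bumped and everything to its
    right is zeroed, matching Poetry's documented caret semantics.
    """
    parts = [int(p) for p in constraint.lstrip("^").split(".")]
    for i, part in enumerate(parts):
        if part != 0:
            bumped = parts[:i] + [part + 1] + [0] * (len(parts) - i - 1)
            return ".".join(str(x) for x in bumped)
    # Degenerate all-zero version: bump the last component.
    return ".".join(str(x) for x in parts[:-1] + [parts[-1] + 1])

print(caret_upper_bound("^3.11.12"))  # → 4.0.0  (aiohttp stays below 4)
print(caret_upper_bound("^6.4.2"))    # → 7.0.0  (tornado stays below 7)
print(caret_upper_bound("^0.16.3"))   # → 0.17.0 (a 0.x pin fixes the minor)
```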
