Add support for using custom Environments and Strategies #608


Open · wants to merge 11 commits into main

Conversation

amorehead
Contributor

What does this PR do?

Adds the ability to use custom (e.g., Lightning Fabric) Environments (e.g., SlurmEnvironment) and Strategies (e.g., DeepSpeedStrategy) during model training or evaluation.
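For readers unfamiliar with how a Hydra-based template turns YAML entries into Strategy or Environment objects, here is a minimal, self-contained sketch of `_target_`-style instantiation. The `instantiate` helper below is a simplified stand-in for `hydra.utils.instantiate` (not this PR's actual code), and the `collections.Counter` target is a stdlib placeholder so the sketch runs without Lightning installed; a real config would target something like `lightning.pytorch.strategies.DDPStrategy`.

```python
# Minimal sketch (not this PR's exact code) of Hydra-style "_target_"
# instantiation, the mechanism that lets a YAML config name a Strategy
# or Environment class to build.
import importlib


def instantiate(cfg: dict):
    """Build the object named by cfg["_target_"], passing the remaining
    keys as keyword arguments (a simplified hydra.utils.instantiate)."""
    module_path, _, cls_name = cfg["_target_"].rpartition(".")
    cls = getattr(importlib.import_module(module_path), cls_name)
    kwargs = {k: v for k, v in cfg.items() if k != "_target_"}
    return cls(**kwargs)


# A stdlib class stands in for a Lightning strategy class here.
cfg = {"_target_": "collections.Counter", "a": 2, "b": 3}
obj = instantiate(cfg)
print(type(obj).__name__)  # Counter
```

With this pattern, swapping in an advanced strategy is purely a config change: only the `_target_` string and its keyword arguments differ.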

Before submitting

  • Did you make sure the title is self-explanatory and the description concisely explains the PR?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you test your PR locally with the pytest command?
  • Did you run the pre-commit hooks with the pre-commit run -a command?

Did you have fun?

Make sure you had fun coding 🙃 ⚡

@codecov-commenter

codecov-commenter commented Oct 7, 2023

Codecov Report

Attention: 29 lines in your changes are missing coverage. Please review.

Comparison: base (bddbc24) at 83.24% vs. head (27fd562) at 78.60%.

Files                        Patch %   Missing lines
src/__init__.py              37.50%    10 ⚠️
src/train.py                 60.86%     9 ⚠️
src/eval.py                  50.00%     8 ⚠️
src/models/mnist_module.py   50.00%     2 ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #608      +/-   ##
==========================================
- Coverage   83.24%   78.60%   -4.65%     
==========================================
  Files          11       12       +1     
  Lines         376      430      +54     
==========================================
+ Hits          313      338      +25     
- Misses         63       92      +29     

☔ View full report in Codecov by Sentry.

@giladturok

giladturok commented Aug 1, 2024

@amorehead Can you explain your rationale behind adding these additional configs? Super curious to know your thought process.

Is it related to PyTorch Lightning's use of a "strategy", defined here: https://lightning.ai/docs/pytorch/stable/extensions/strategy.html?

And some example strategies:

[Screenshot: table of example Lightning strategies]

@amorehead
Contributor Author

amorehead commented Aug 1, 2024

Hey, @gil2rok. Yes, Lightning's strategy class is what I had in mind here. I wanted to make it possible with this template to use any arbitrary (e.g., advanced) training strategy such as DeepSpeed with only a few lines of code changes.

@giladturok

> Hey, @gil2rok. Yes, Lightning's strategy class is what I had in mind here. I wanted to make it possible with this template to use any arbitrary (e.g., advanced) training strategy such as DeepSpeed with only a few lines of code changes.

Because this pull request has not been accepted, how did you use these advanced strategies? Did you just manually add them to the PyTorch Lightning Trainer arguments?

callbacks=callbacks,
logger=logger,
plugins=plugins,
strategy=strategy,
@amorehead
Contributor Author
As you can see here, now one can specify an optional strategy for Lightning to use e.g., via OmegaConf YAML config files.
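To illustrate what "optional" can mean here, a hedged sketch under stated assumptions (the `Trainer` class below is a stub standing in for `lightning.pytorch.Trainer`, and the plain `cfg` dict stands in for a loaded OmegaConf YAML config; this is not the template's actual code): the strategy kwarg is forwarded only when the config sets it, so Lightning's default strategy applies otherwise.

```python
# Hypothetical sketch of optional-strategy wiring; Trainer is a stub
# standing in for lightning.pytorch.Trainer so the example is runnable
# without Lightning installed.
class Trainer:
    def __init__(self, **kwargs):
        self.kwargs = kwargs


def build_trainer(cfg: dict, callbacks=None, logger=None, plugins=None):
    extra = {}
    if cfg.get("strategy") is not None:
        # Forward the kwarg only when configured, so the (stubbed)
        # Trainer can fall back to its default strategy otherwise.
        extra["strategy"] = cfg["strategy"]
    return Trainer(callbacks=callbacks, logger=logger, plugins=plugins, **extra)


with_ddp = build_trainer({"strategy": "ddp"})
without = build_trainer({})
print("strategy" in with_ddp.kwargs, "strategy" in without.kwargs)  # True False
```

Keeping the kwarg absent (rather than passing `strategy=None`) is the safer design, since it leaves the downstream default-selection logic untouched.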

@amorehead
Contributor Author
@gil2rok

@amorehead
Contributor Author

@ashleve, may I ask you to review these changes when you get a chance? I've been using them in my own forked projects for a couple of years now, and I think the community would benefit from having them available by default.

@giladturok

giladturok commented Apr 19, 2025 via email

3 participants