
Adding backwards optimization of initial conditions as a trainer class #203


Open
wants to merge 7 commits into main

Conversation

jsschreck (Collaborator)

Keeping it minimal for now.

I will eventually add a documentation page for this, but I wanted to get it in front of eyes other than mine to double-check that I have implemented it correctly according to the Hakim/Vonich paper: https://arxiv.org/pdf/2504.20238.
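For reviewers unfamiliar with the approach, here is a minimal sketch (not the code in this PR) of what backwards optimization of initial conditions amounts to: the forecast model's weights are frozen and gradient descent is run on the initial state itself, backpropagating the forecast error through the rollout back to the initial condition. The names `model`, `x0`, `targets`, and `optimize_initial_condition` are illustrative, not identifiers from this branch.

```python
import torch

def optimize_initial_condition(model, x0, targets, n_iters=100, lr=1e-2):
    """Adjust the initial state x0 so the model rollout matches `targets`.

    model   : frozen forecast model mapping state -> next state
    x0      : initial state tensor, e.g. (batch, channels, lat, lon)
    targets : list of verification states, one per rollout step
    """
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)               # weights stay fixed

    # Optimize the state, not the weights.
    x0 = x0.clone().detach().requires_grad_(True)
    optimizer = torch.optim.Adam([x0], lr=lr)

    for _ in range(n_iters):
        optimizer.zero_grad()
        state = x0
        loss = 0.0
        for target in targets:                # roll the model forward
            state = model(state)
            loss = loss + torch.nn.functional.mse_loss(state, target)
        loss.backward()                       # backprop through the rollout to x0
        optimizer.step()

    return x0.detach()
```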

jsschreck requested review from djgagne and WillyChap on July 8, 2025.
djgagne (Collaborator) commented Jul 18, 2025

From the CI check:

tests/test_models.py:8: in <module>
    from credit.models import load_model
credit/models/__init__.py:6: in <module>
    from credit.models.crossformer import CrossFormer
credit/models/crossformer.py:12: in <module>
    from credit.models.unet_attention_modules import load_unet_attention
E   ModuleNotFoundError: No module named 'credit.models.unet_attention_modules'
=========================== short test summary info ============================
ERROR tests/test_models.py
