
Conversation

@RohitP2005 (Contributor) commented Jan 14, 2025

Description

This pull request implements deprecation decorators that wrap both a function's arguments and the function itself, and adds the -W flag so that deprecation warnings are treated as errors.

Related to: #2028
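For illustration only (this sketch is not the PR's actual code; the names deprecated_function and deprecated_argument are hypothetical), decorators of this kind can be written as:

import functools
import warnings


def deprecated_function(message):
    # Hypothetical decorator: warn every time the wrapped function is called.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__name__} is deprecated: {message}",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)

        return wrapper

    return decorator


def deprecated_argument(old_name, message):
    # Hypothetical decorator: warn when a deprecated keyword argument is passed.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if old_name in kwargs:
                warnings.warn(
                    f"Argument '{old_name}' is deprecated: {message}",
                    DeprecationWarning,
                    stacklevel=2,
                )
            return func(*args, **kwargs)

        return wrapper

    return decorator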

Type of change

  • New feature (non-breaking change that adds functionality)
  • Optimization (back-end change that speeds up the code)
  • Bug fix (non-breaking change which fixes an issue)
  • Deprecation (non-breaking change that deprecates functionality with warnings)

Key checklist:

  • No style issues: $ pre-commit run --all-files (or $ nox -s pre-commit) (see CONTRIBUTING.md for how to set this up to run automatically when committing locally)
  • All tests pass: $ python -m pytest -W error (or $ nox -s tests) (ensure warnings are treated as errors to catch deprecation notices)
  • The documentation builds: $ python -m pytest --doctest-plus src (or $ nox -s doctests)

You can run integration tests, unit tests, and doctests together using $ nox -s quick.

Further checks:

  • Code is commented, particularly in hard-to-understand areas, including deprecation warnings and helper functions.
  • Tests added for any new functionality or to prove that the deprecation warnings are effective.
  • Confirm that any borrowed code from Astropy is appropriately attributed according to its license.

@RohitP2005 (Contributor, Author)

Previous discussions: #4745 (comment)

@kratman (Contributor) left a comment

I think one of the main reasons for doing this is to track the date of the deprecation warnings. This does not seem to be in these changes.

Realistically the warnings work as-is, but we want to automate their removal. Right now I use git-blame to remove old warnings every 6 months or so. The extra decorators and such are just additional overhead unless we use them for something better than the current warnings.

@kratman (Contributor) commented Jan 14, 2025

Original issue request: "Replace existing deprecation warnings/errors with deprecation package so tests fail after a few releases."

@RohitP2005 (Contributor, Author) commented Jan 14, 2025

Okay, I misunderstood: I thought the deprecation package was not required and that we needed a pythonic way instead. Now I understand, @kratman; thank you for the clarification.

I will go ahead with the deprecation package in my next commit.

Apologies for the confusion.

@RohitP2005 (Contributor, Author)

@kratman, I've implemented the deprecation package for four of the existing deprecation warnings. Could you please guide me on any potential improvements?

Regarding renamed and deprecated arguments, it seems the deprecation package doesn't natively handle these cases. Do you have any suggestions on how to approach this? I considered writing custom decorators to manage them. Does that sound like a good plan?
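For reference, a minimal sketch of how the deprecation package's decorator is applied; the function name and version numbers below are placeholders, not PyBaMM's actual values:

import deprecation


@deprecation.deprecated(
    deprecated_in="24.1",
    removed_in="25.1",
    details="Use the replacement API instead.",
)
def old_method():
    # placeholder for a deprecated function
    pass

Calling old_method() then emits a warning (a subclass of DeprecationWarning) that includes the deprecated_in/removed_in versions.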

@codecov bot commented Jan 16, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 97.65%. Comparing base (1b84266) to head (71a5197).
Report is 1 commit behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #4757      +/-   ##
===========================================
- Coverage    98.57%   97.65%   -0.93%     
===========================================
  Files          304      304              
  Lines        23656    23672      +16     
===========================================
- Hits         23320    23117     -203     
- Misses         336      555     +219     


@kratman (Contributor) commented Jan 17, 2025

Does the package you use allow for tests to fail when the deprecation is old?

@RohitP2005 (Contributor, Author)

Does the package you use allow for tests to fail when the deprecation is old?

No, the deprecation package does not natively support automatically failing tests when a deprecation is old.

@agriyakhetarpal (Member) left a comment

I think I agree with the general idea here, and the @deprecated decorator looks neat. If there's no way to make the deprecated functions fail directly, that's fine – we should proceed with your suggestion in the PR description:

adds the -W flag so that deprecation warnings are treated as errors

I don't see this change here, but yes, please feel free to configure pytest with this, as the deprecation warnings would then be caught in the tests if they use it. Then, you can fix any pending deprecation warnings unrelated to your changes that show up in the test suite. If there are a bunch of them and they are not straightforward to fix, it is also okay to filter them using the filterwarnings configuration option and open a new issue about removing them to work on later.

See also: https://docs.pytest.org/en/stable/how-to/capture-warnings.html#controlling-warnings
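For example, a minimal sketch of such a configuration in pyproject.toml; the ignored message below is a placeholder, not a real warning from the test suite:

[tool.pytest.ini_options]
filterwarnings = [
    "error",  # escalate all warnings to errors, equivalent to pytest -W error
    # placeholder entry: ignore a known upstream warning until it is fixed
    # (later entries take precedence over earlier ones)
    "ignore:placeholder upstream message:DeprecationWarning",
]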

Regarding deprecating a specific argument in a function, yes, adding a custom decorator for this sounds great. If you think it will require a bit of experimentation, I'm happy to help review it in a follow-up PR (you may edit the PR description to mark this PR as related to #2028, but not fixing it). Limiting the scope of this PR sounds good to me to get it in.

Thanks!

@RohitP2005 (Contributor, Author)

You can mark this PR as a draft until pybtex is sorted out!

Since the tests are failing as expected due to the deprecation warnings, how should I deal with them?

@agriyakhetarpal (Member)

Okay, after looking at the CI logs more closely, I'm not sure I follow, as the pybtex issue no longer shows up?

For the BPX tests, we should file an upstream issue about the Pydantic deprecation (or better, a PR fixing it) and ignore the warning in the filters here for the time being until it is fixed and makes it to a release.

For the deprecation warnings in the tests that are coming from PyBaMM's deprecated APIs, those are the actual ones we need to resolve. Usually, we deprecate methods/arguments/etc. by adding a DeprecationWarning, but we never remove the deprecated methods/arguments in the tests that might be using them. This PR has been a step forward towards catching such uses, so that they become hard errors instead of showing up as warnings. In some tests, we are catching and thereby testing if the appropriate DeprecationWarnings are raised when calling a deprecated method/argument – these are the ones that are useful, and should be kept.
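As a minimal, self-contained sketch of that pattern (old_method here is a stand-in, not a real PyBaMM API):

import warnings

import pytest


def old_method():
    # stand-in for a deprecated API that emits a warning when called
    warnings.warn("old_method is deprecated", DeprecationWarning, stacklevel=2)


def test_old_method_warns():
    # pytest.warns captures and asserts the warning, so it does not
    # escalate to a hard error under -W error / filterwarnings = ["error"]
    with pytest.warns(DeprecationWarning):
        old_method()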

I hope this is clear!

@RohitP2005 (Contributor, Author)


Understood, that clarifies everything.

@RohitP2005 (Contributor, Author)

Hey @agriyakhetarpal, I fixed the tests with deprecated functions, but I need help with the test_bpx.py file.

@RohitP2005 (Contributor, Author)

It is currently ignored in the pytest configuration.

@agriyakhetarpal (Member) left a comment

Thanks, @RohitP2005! Just a few more comments to go here. Also, there's another warning in the doctests that we need to ignore, see: https://github.com/pybamm-team/PyBaMM/actions/runs/14079804552/job/39430386846?pr=4757

Comment on lines 70 to 108
def test_symbol_auxiliary_domains(self):
    a = pybamm.Symbol(
        "a",
        domain="test",
        auxiliary_domains={
            "secondary": "sec",
            "tertiary": "tert",
            "quaternary": "quat",
        },
    )
    assert a.domain == ["test"]
    assert a.secondary_domain == ["sec"]
    assert a.tertiary_domain == ["tert"]
    assert a.tertiary_domain == ["tert"]
    assert a.quaternary_domain == ["quat"]
    assert a.domains == {
        "primary": ["test"],
        "secondary": ["sec"],
        "tertiary": ["tert"],
        "quaternary": ["quat"],
    }

    a = pybamm.Symbol("a", domain=["t", "e", "s"])
    assert a.domain == ["t", "e", "s"]
    with pytest.raises(TypeError):
        a = pybamm.Symbol("a", domain=1)
    b = pybamm.Symbol("b", domain="test sec")
    with pytest.raises(pybamm.DomainError, match="All domains must be different"):
        b.domains = {"primary": "test", "secondary": "test"}
    with pytest.raises(pybamm.DomainError, match="All domains must be different"):
        b = pybamm.Symbol(
            "b",
            domain="test",
            auxiliary_domains={"secondary": ["test sec"], "tertiary": ["test sec"]},
        )

    with pytest.raises(NotImplementedError, match="auxiliary_domains"):
        a.auxiliary_domains

Member:

May I know why this test was removed?

Member:

@RohitP2005, this conversation is still relevant, as I'm not sure why the test was removed. It should be reinstated, or we need a valid reason for removing it.

Member:

Again, this conversation is still relevant – I'd like to know why we need to remove this test.

@RohitP2005 (Contributor, Author)

Hey @agriyakhetarpal, this issue has been open for a while, and I have made the changes as requested. Kindly trigger the CI so that we can merge it ASAP.

@RohitP2005 (Contributor, Author)

@agriyakhetarpal I just need a little help: the tests pass successfully on my local machine, and when I tried silencing the warnings, the doctests also ran without failures.

Is there something I am missing that is causing the CI to fail?

@agriyakhetarpal (Member) commented Apr 11, 2025

It fails on my end; could you please check whether pytest is picking up the configuration from pyproject.toml?
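(For reference: recent pytest versions report which configuration file they picked up, e.g. a configfile: pyproject.toml entry next to rootdir at the top of the test session output, which is a quick way to verify this.)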

@RohitP2005 (Contributor, Author)

@agriyakhetarpal I just need to confirm that CI passes with this commit; kindly trigger the CI runs.

@RohitP2005 (Contributor, Author)

Hey @agriyakhetarpal, I just pinned setuptools to an older version to handle the deprecated pkg_resources warning. Would that be a feasible solution?

@agriyakhetarpal (Member)

Yes, I'm okay with it if you can't ignore the warning in any other way. Thanks!

@RohitP2005 (Contributor, Author)

@agriyakhetarpal I guess the PR is ready to merge once it passes CI.

@agriyakhetarpal (Member) left a comment

Thanks, @RohitP2005. This is inching closer to a merge now. I still have a few more comments:

"-nauto",
"-vra",
"-ra",
"--showlocals",
Member:

This conversation is still relevant, as I'm not sure why we need to include this option with this PR – is it related to the changes? It's useful, but if it doesn't need to be included for the main changes here, please consider adding it in another PR.

"--showlocals",
"--strict-config",
"--strict-markers",
"--ignore=tests/unit/test_parameters/test_bpx.py",
Member:

Rather than ignoring the file completely, let's skip/xfail the relevant tests in that file. As I previously mentioned: if you can't address/capture the warnings coming out of them, please open an issue in the BPX repo and link it here.
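For example, a sketch of marking one of the affected tests; the test name and reason string are placeholders:

import pytest


@pytest.mark.xfail(reason="known upstream Pydantic deprecation in BPX; see linked issue")
def test_bpx_parameters_round_trip():
    # placeholder body; the real tests live in tests/unit/test_parameters/test_bpx.py
    ...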

Comment on lines 70 to 108
def test_symbol_auxiliary_domains(self):
    a = pybamm.Symbol(
        "a",
        domain="test",
        auxiliary_domains={
            "secondary": "sec",
            "tertiary": "tert",
            "quaternary": "quat",
        },
    )
    assert a.domain == ["test"]
    assert a.secondary_domain == ["sec"]
    assert a.tertiary_domain == ["tert"]
    assert a.tertiary_domain == ["tert"]
    assert a.quaternary_domain == ["quat"]
    assert a.domains == {
        "primary": ["test"],
        "secondary": ["sec"],
        "tertiary": ["tert"],
        "quaternary": ["quat"],
    }

    a = pybamm.Symbol("a", domain=["t", "e", "s"])
    assert a.domain == ["t", "e", "s"]
    with pytest.raises(TypeError):
        a = pybamm.Symbol("a", domain=1)
    b = pybamm.Symbol("b", domain="test sec")
    with pytest.raises(pybamm.DomainError, match="All domains must be different"):
        b.domains = {"primary": "test", "secondary": "test"}
    with pytest.raises(pybamm.DomainError, match="All domains must be different"):
        b = pybamm.Symbol(
            "b",
            domain="test",
            auxiliary_domains={"secondary": ["test sec"], "tertiary": ["test sec"]},
        )

    with pytest.raises(NotImplementedError, match="auxiliary_domains"):
        a.auxiliary_domains

Member:

Again, this conversation is still relevant – I'd like to know why we need to remove this test.

@RohitP2005 (Contributor, Author)

@agriyakhetarpal

  1. I used addopts = ["-nauto", "-vra", "-ra", "--showlocals"] to debug the failing tests and forgot to remove the options; thanks for pointing it out.

  2. I will try using the @pytest.mark.xfail decorator to suppress the tests; if the problem persists, I will open an issue in BPX.

  3. Since auxiliary domains are deprecated, I thought the test was no longer required. Maybe I will rewrite it to test for a deprecation warning when auxiliary domains are used (see the sketch after this list).
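A minimal sketch of such a rewritten test; whether it should assert a warning or an error depends on the final implementation, so this follows the NotImplementedError check from the removed test above (the test name is a placeholder):

import pytest

import pybamm


def test_auxiliary_domains_removed():
    a = pybamm.Symbol("a", domain="test")
    # the removed test already expected this error for the deprecated accessor
    with pytest.raises(NotImplementedError, match="auxiliary_domains"):
        a.auxiliary_domains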

@RohitP2005 (Contributor, Author)

@agriyakhetarpal do these points sum up the requested changes?

@RohitP2005 (Contributor, Author)

@agriyakhetarpal I have rewritten the tests for auxiliary domains, and I think that should cover the Codecov test coverage.
