
331 fix and improve tests #333


Merged · 6 commits into main · Feb 14, 2024

Conversation

@d-perl (Contributor) commented Feb 13, 2024

Fixes #331
Improves the test infrastructure for dodal:

  • adds fixtures to mock config file paths so that devices relying on them can be created in CI (see the sketch after this list)
  • runs device instantiation tests on all beamlines and their simulated versions
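
A minimal sketch of what such a config-mocking fixture might look like (the fixture name, patch target, and file contents below are illustrative assumptions, not the exact code added in this PR):

import pytest
from unittest.mock import patch


@pytest.fixture
def mock_config_paths(tmp_path):
    # Write a throwaway stand-in for the real beamline config, which normally
    # lives on the DLS filesystem and is unavailable in CI.
    fake_config = tmp_path / "beamline_config.txt"
    fake_config.write_text("#\n#\nBeamLine BL03S\n")
    # "dodal.some_module.CONFIG_PATH" is a hypothetical patch target; the real
    # fixture would patch whichever constant the devices read their path from.
    with patch("dodal.some_module.CONFIG_PATH", str(fake_config)):
        yield fake_config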

Instructions to reviewer on how to test:

  1. Run new tests

Checks for reviewer

  • Would the PR title make sense to a scientist on a set of release notes?
  • If a new device has been added, does it follow the standards?

@d-perl added the needed_for_release label (Issues that must be complete before the next release) on Feb 13, 2024
@DiamondJoseph (Contributor) left a comment

Leaving as comment to not block anything, but I have some reservations.
We're not making use of configuration files (at least, currently), and the testing rightly makes that optional, but it still runs the tests twice for each non-MX beamline even though we have no conditional devices. If we begin adding optional devices that rely on conditions other than "am I the live beamline or not" (e.g. devices that are sometimes removed from the beamline, or devices used in one profile but not others), we'll be unable to test that without completely rewriting these tests again.

@@ -0,0 +1,299 @@
#
#
BeamLine BL03I
Contributor:

Is all of this required for the tests?
BeamLine BL03S?

@d-perl (Contributor, Author):

I could chop some of it out, good point

Comment on lines 33 to 41
with patch.dict(os.environ, {"BEAMLINE": bl}, clear=True):
    bl_mod = mock_bl(beamline)
    devices = make_all_devices(bl_mod, fake_with_ophyd_sim=True)
    for device_name, device in devices.items():
        assert device_name in beamline_utils.ACTIVE_DEVICES
        assert follows_bluesky_protocols(device)
    assert len(beamline_utils.ACTIVE_DEVICES) == len(devices)
    beamline_utils.clear_devices()
    del bl_mod
Contributor:

Not really extensible for @skip_device using conditions other than the value of beamline.BL.

Maybe testing the configurations where @skip_device is !x when the default is x should be in test_beamline, and this should only test the default case?

@d-perl (Contributor, Author):

I'm not sure I totally understood the second part of this comment, but I agree with the first; instead, I included skipped devices in these tests.
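
For context, a rough sketch of how a skip-style decorator of this kind can work: it attaches a precondition to a device factory so that a device-creation helper can decide whether to construct it. This is illustrative only, not necessarily dodal's actual @skip_device implementation; the attribute name and condition below are assumptions.

import os
from typing import Callable


def skip_device(precondition: Callable[[], bool] = lambda: True):
    """Attach a skip precondition to a device factory (illustrative sketch only)."""
    def decorator(factory):
        # Hypothetical attribute; a helper such as make_all_devices could check
        # it before deciding whether to construct the device.
        factory.__skip_precondition__ = precondition
        return factory
    return decorator


# The kind of condition discussed above: skip unless this is the live beamline.
@skip_device(lambda: os.environ.get("BEAMLINE") != "i03")
def my_device():
    ...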

@d-perl (Contributor, Author) commented Feb 14, 2024

Thanks @DiamondJoseph, I basically agree with your concerns.

I updated the tests to only run once for each beamline, but create skipped devices instead.

Unfortunately we do use data from these config files when instantiating several devices, so it was only because those devices were being skipped that the tests could run at all. Even more annoyingly, because the default files are on the DLS filesystem, the tests might only break in CI.
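
To illustrate why those config files matter at instantiation time, here is a toy sketch (the class, path, and factory names are hypothetical, not real dodal devices): a factory that reads a lookup table from a hard-coded DLS path fails with FileNotFoundError the moment it is constructed anywhere that path does not exist, which is exactly the failure the mocked paths avoid in CI.

from pathlib import Path

# Hypothetical default; the real files live on the DLS filesystem.
LOOKUP_TABLE_PATH = Path("/dls_sw/i03/software/daq_configuration/lookup.txt")


class EnergyLookup:
    """Toy device-like class that needs config data at construction time."""

    def __init__(self, lookup_path: Path = LOOKUP_TABLE_PATH) -> None:
        # This read happens during instantiation, so creating the device in CI
        # fails unless the path is patched to point at a stand-in file.
        self.table = lookup_path.read_text().splitlines()


def energy_lookup() -> EnergyLookup:
    # Hypothetical factory of the kind make_all_devices would call.
    return EnergyLookup()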


codecov bot commented Feb 14, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (7c73805) 89.97% compared to head (0029972) 92.89%.
Report is 13 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #333      +/-   ##
==========================================
+ Coverage   89.97%   92.89%   +2.92%     
==========================================
  Files          83       83              
  Lines        3171     3182      +11     
==========================================
+ Hits         2853     2956     +103     
+ Misses        318      226      -92     


@noemifrisina (Collaborator) left a comment

Thanks

@d-perl merged commit 5b8ef3e into main on Feb 14, 2024
@d-perl deleted the 331_fix_tests branch on February 14, 2024 14:59
@DiamondJoseph (Contributor)
@dperl-dls Still getting failing tests with #332 rebased on top of main:

I've seen it with p45 and i04, so there's still some state not being cleaned up properly.


=========================== short test summary info ============================
FAILED tests/beamlines/unit_tests/test_i24.py::test_device_creation - AssertionError: assert 'BL04I-MO-VGON-01:' == 'BL24I-MO-VGON-01:'
  
  - BL24I-MO-VGON-01:
  ?   ^
  + BL04I-MO-VGON-01:
  ?   ^
==== 1 failed, 378 passed, 5 skipped, 29 deselected, 64 warnings in 14.22s =====
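
One way such leftover state could be contained (a sketch only; it assumes the leak is via registered devices and the BEAMLINE environment variable, and the beamline_utils import path is assumed): an autouse fixture that resets both around every test.

import pytest

from dodal.beamlines import beamline_utils  # import path assumed


@pytest.fixture(autouse=True)
def _reset_beamline_state(monkeypatch):
    # Make sure no BEAMLINE value set by an earlier test is still visible.
    monkeypatch.delenv("BEAMLINE", raising=False)
    yield
    # clear_devices() is the same cleanup call used in the test snippet above.
    beamline_utils.clear_devices()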


Linked issue: Improve device creation tests (#331)