
Revamp testing workflow and use pytest instead of unittest #6889


Conversation

@smokestacklightnin (Member) commented Aug 2, 2024

Here is a summary of what is included in this PR:

  • As part of this workflow, pytest has been added so that we can drop the previous custom test collection system, resolving a long-standing TODO. pytest automatically collects all unittest tests.
  • Moving to pytest also allows tests to take advantage of marks, which makes it simple to run different types of tests (see the sketch after this list).
  • End-to-end tests have been split out into a separate job from unit tests, which should reduce total test time and decrease flakiness.
  • Fixed a bad import statement for the penguin example in kubeflow_gcp_perf_test.py. @roseayeon @gbaned, can you verify that this is the correct import?
  • End-to-end tests appear to require a few extra environment variables that point to a container image in order to actually run. I wasn't able to test these locally. @roseayeon @gbaned, is there a container that you've been using to run these e2e tests? When the containers are not present, the tests requiring them are now skipped (also shown in the sketch after this list).
  • Run all tests by default, leaving users the option to skip e2e tests with the -m "not e2e" flag.
  • Remove the if __name__ == "__main__" section from tests because it is unnecessary, and not even run, under pytest. Nontrivial functionality from those sections was preserved.
  • Remove the deprecated pytest_runner from dependencies.
  • Consolidate the setup-python and cache actions into just the setup-python action, upgraded to v5.
  • Use the setup-bazel workflow instead of using curl to download Bazel.
  • Linting was removed from the test workflow in anticipation of adding it back in a subsequent PR.
  • Update CONTRIBUTING.md to reflect all changes.
  • Use a matrix to test multiple Python versions and both e2e and non-e2e tests, minimizing repeated workflow code.
  • Remove enable_v2_behavior from tests that previously called it, because TensorFlow v2 is what is currently used.
  • Add xfail marks to classes and methods that have failing tests.
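
To illustrate the marking scheme, here is a minimal sketch of how the e2e mark and the container check could fit together. The mark name e2e comes from this PR; the environment variable name TFX_E2E_CONTAINER_IMAGE and the test names are hypothetical placeholders, not the identifiers actually used in the workflow:

    import os

    import pytest

    # Hypothetical variable name, used for illustration only; the real
    # workflow's environment variables may differ.
    _CONTAINER_VAR = "TFX_E2E_CONTAINER_IMAGE"

    # Reusable skip condition: e2e tests are skipped when no container
    # image is configured, instead of failing with a confusing error.
    requires_container = pytest.mark.skipif(
        _CONTAINER_VAR not in os.environ,
        reason=f"e2e container image not configured; set {_CONTAINER_VAR}",
    )

    @pytest.mark.e2e
    @requires_container
    def test_pipeline_end_to_end():
        # Placeholder body for an end-to-end pipeline test.
        assert os.environ[_CONTAINER_VAR]

    def test_component_unit():
        # Ordinary unit test; collected and run by default.
        assert 1 + 1 == 2

With the e2e mark registered in the pytest configuration (to avoid unknown-mark warnings), selection is then just:

    $ pytest                 # run everything (the default)
    $ pytest -m "not e2e"    # skip end-to-end tests
    $ pytest -m e2e          # run only end-to-end tests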


google-cla bot commented Aug 2, 2024

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

@peytondmurray (Contributor) left a comment

Looks nice, thanks for this! Aside from the inline comments, one question: everywhere there is an e2e pytest mark there is also a slow mark. Can we just get rid of the slow mark, since it is redundant here?

@peytondmurray (Contributor) left a comment

Looks good. Thanks for this 🚀

@smokestacklightnin smokestacklightnin changed the title Change testing to use pytest Revamp testing workflow and use pytest instead of unittest Aug 6, 2024
@smokestacklightnin smokestacklightnin marked this pull request as ready for review August 6, 2024 23:15
@peytondmurray (Contributor) left a comment

Thanks for updating the Bazel version. I think we should get rid of tfx_test_installed.sh. What do you think?

@smokestacklightnin (Member, Author) commented

> I think we should get rid of tfx_test_installed.sh. What do you think?

I agree in principle. My only hesitation is that it might break something internal at Google. That said, since we are disconnecting from Google's infrastructure (which we are), removing it won't break anything, and it isn't needed.

@gbaned gbaned self-assigned this Aug 12, 2024
@gbaned gbaned requested a review from roseayeon August 12, 2024 06:30
@smokestacklightnin smokestacklightnin force-pushed the ci/testing/pytest-up-and-running branch from c5eee4b to bcfc527 Compare August 16, 2024 03:23
@smokestacklightnin (Member, Author) commented

Now that I've rebased to include linting, I need to fix linting errors.

@roseayeon roseayeon requested a review from lego0901 August 19, 2024 06:42
@lego0901 (Member) commented

Thanks for doing the work!

> Some of the tests fail, but they also failed before this PR; the failures were hidden because only tests for changed files were run. @roseayeon @gbaned how would you like to handle this?

Could you add a way to skip the failing tests for now, so they can be fixed later once everything is settled? And please tag me so that I can keep track of them. Thanks.

@smokestacklightnin smokestacklightnin force-pushed the ci/testing/pytest-up-and-running branch 2 times, most recently from 629d857 to 66384fb Compare August 20, 2024 04:59
@keerthanakadiri keerthanakadiri requested review from roseayeon and removed request for roseayeon August 22, 2024 05:39
@smokestacklightnin (Member, Author) commented

@lego0901 @roseayeon All checks currently pass and the PR is ready for review and merge. All failing tests have been given xfail marks.
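
For reference, a minimal sketch of the xfail marking, using hypothetical test and class names (the reason strings are placeholders):

    import pytest

    @pytest.mark.xfail(reason="known failure; tracked for follow-up")
    def test_known_broken_behavior():
        ...

    # Applying the mark to a class marks every test method in it.
    @pytest.mark.xfail(reason="all methods currently fail")
    class TestLegacyComponent:
        def test_one(self):
            ...

By default xfail is non-strict: marked tests still run but report as XFAIL rather than failing the suite, and an unexpected pass shows up as XPASS.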

@smokestacklightnin smokestacklightnin force-pushed the ci/testing/pytest-up-and-running branch from 03de455 to 7b504b6 Compare August 28, 2024 23:55
@lego0901 (Member) commented

Hi @smokestacklightnin, sorry for the late reply. Could you point me to the list of skipped tests so that we can handle them later? Thanks!

@smokestacklightnin (Member, Author) commented

> Hi @smokestacklightnin, sorry for the late reply. Could you point me to the list of skipped tests so that we can handle them later? Thanks!

Hi @lego0901, no problem at all. To see which tests have xfail, run the following command in the repo:

$ grep "pytest.mark.xfail" -r tfx/
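
If you'd rather have pytest itself produce the list, collecting by mark should also work (assuming the tests use plain pytest.mark.xfail decorators):

$ pytest tfx/ --collect-only -q -m xfail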

@smokestacklightnin (Member, Author) commented

@lego0901 FYI, per the request in the Google/Quansight meeting today (Wednesday), I have added tests for both nightly and default constraints. As I am writing this comment, I am waiting for the tests to pass, and once they do, I believe this PR is ready for merge, provided it passes your review.

@smokestacklightnin smokestacklightnin force-pushed the ci/testing/pytest-up-and-running branch from 0a15b28 to 29adb58 Compare September 3, 2024 05:37
@peytondmurray peytondmurray merged commit ae2767a into tensorflow:master Sep 3, 2024
14 checks passed