From 58ccdd75eccade82ca7d6927caf4c0a135563a31 Mon Sep 17 00:00:00 2001
From: Jarek Potiuk
Date: Thu, 9 Jan 2025 21:37:35 +0100
Subject: [PATCH] [v2-10-test] Backport pull_request_target removal

This is a bulk change that synchronizes dev/CI scripts for the v2-10-test branch with main #45266 - including follow-ups. Rather than cherry-picking relevant PRs, this one gets the latest version of the scripts from main and updates the branch with some changes to adapt them to v2-10-test (such as bringing back Python 3.8 support, removing some provider checks after the bulk move of providers, and making sure all tests are passing).

This is far easier than cherry-picking the changes, because for v2-10-test we stopped cherry-picking CI changes (which was deemed unnecessary; we used to do it for all previous branches), but that made it far more difficult (if not impossible) to cherry-pick individual changes. Fortunately, the CI scripts are maintained in such a way that their latest version **should** in principle work for a v2-* branch, and hopefully, after just a few adjustments, we can synchronize the changes from main by updating all relevant CI/dev scripts, Dockerfile images, workflows, pre-commits, etc.

Add actions in codeql workflows to scan github workflow actions (#45534)

* add actions in codeql workflows to scan github workflow actions
* add actions in codeql workflows to scan github workflow actions

CodeQL scanning can run always on all code (#45541)

The CodeQL scanning is fast, and having a custom configuration to select which scans should be run makes it unnecessarily complex. We can just run all CodeQL scans always. This has been suggested by the actions CodeQL scan itself.

Add explicit permissions for all workflow-run workflows (#45548)

Those workflows inherit permissions from the calling workflows, but it's good to add explicit permissions to indicate what is needed, and in case we also use the workflows for other purposes in the future - default permissions for older repos might be write, so it's best to be explicit about the permissions. Found by CodeQL scanning.

Remove contents: write permission from generate-constraints (#45558)

The write permission cannot be set for PRs from forks in the call workflow - so we have to go back to implicit permissions and handle passing explicit permissions a bit differently.

(cherry picked from commit ae32ebcc3c637902b8e62d549a02d537be76343c)

Bump trove-classifiers from 2025.1.7.14 to 2025.1.10.15 (#45561)

Bumps [trove-classifiers](https://github.com/pypa/trove-classifiers) from 2025.1.7.14 to 2025.1.10.15.
- [Release notes](https://github.com/pypa/trove-classifiers/releases)
- [Commits](https://github.com/pypa/trove-classifiers/compare/2025.1.7.14...2025.1.10.15)

---
updated-dependencies:
- dependency-name: trove-classifiers
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot]
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
(cherry picked from commit f3fd262de274b4fd1e26d36f9cead0a56775a309)

Add optional --image-file-dir to store loaded files elsewhere (#45564)

While backporting the "pull_request_target" removal to the v2-10-test branch, it turned out that there is not enough disk space on the public runner to load all 5 images and keep the file dumps in the same filesystem at the same time. This change allows choosing where the load/save files are stored; in the GitHub runner environment we store the files in "/mnt", which is a separate filesystem with 40GB free.
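For illustration, the save/load flow on a runner with the new option might look roughly like this (the `load` invocation matches the workflow changes further down in this patch; the `save` flags are assumed to mirror it, so treat this as a sketch rather than the exact commands):

    # dump the CI image into the bigger /mnt filesystem instead of the default tmp directory
    # (save flags assumed symmetrical to load)
    breeze ci-image save --platform "linux/amd64" --python "3.9" --image-file-dir "/mnt"
    # restore the image from the same directory, as done in the prepare_single_ci_image action below
    breeze ci-image load --platform "linux/amd64" --python "3.9" --image-file-dir "/mnt"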
(cherry picked from commit 662804921feb2c9539c882914a4134bf773c0e3e)

Fix --from-pr feature for image load and stabilize help

This is a follow-up after #45564 - it fixes the `--from-pr` and `--from-run` options so that they work (they were failing with "file does not exist"). Also, it turned out that gettempdir might return a different directory depending on what your designated tmp directory is (for example, on macOS this is a longer path in /var/.....) - so we have to force the default during help generation to always return "/tmp", so that the --help images do not change depending on which system you are on and what your tmp directory is.

--- .dockerignore | 7 + .github/actions/breeze/action.yml | 12 +- .../actions/checkout_target_commit/action.yml | 81 -- .github/actions/install-pre-commit/action.yml | 59 +- .github/actions/post_tests_success/action.yml | 4 +- .../actions/prepare_all_ci_images/action.yml | 68 + .../prepare_breeze_and_image/action.yml | 48 +- .../prepare_single_ci_image/action.yml | 56 + .../workflows/additional-ci-image-checks.yml | 33 +- .../workflows/additional-prod-image-tests.yml | 67 +- .github/workflows/automatic-backport.yml | 78 ++ .github/workflows/backport-cli.yml | 125 ++ .github/workflows/basic-tests.yml | 181 +-- .github/workflows/build-images.yml | 259 ---- .github/workflows/ci-image-build.yml | 129 +- ...ecks-mypy-docs.yml => ci-image-checks.yml} | 144 +- .github/workflows/ci.yml | 445 +++--- .github/workflows/codeql-analysis.yml | 53 +- .github/workflows/finalize-tests.yml | 32 +- .github/workflows/generate-constraints.yml | 43 +- .github/workflows/helm-tests.yml | 25 +- .../workflows/integration-system-tests.yml | 209 +++ .github/workflows/integration-tests.yml | 103 -- .github/workflows/k8s-tests.yml | 93 +- .github/workflows/news-fragment.yml | 82 ++ .github/workflows/prod-image-build.yml | 155 +- .github/workflows/prod-image-extra-checks.yml | 17 +- .github/workflows/push-image-cache.yml | 92 +- .github/workflows/recheck-old-bug-report.yml | 1 + .github/workflows/release_dockerhub_image.yml | 67 +- .github/workflows/run-unit-tests.yml | 100 +- .github/workflows/special-tests.yml | 171 +-- ...oviders.yml => test-provider-packages.yml} | 189 ++- .gitignore | 17 +- .pre-commit-config.yaml | 346 ++--- Dockerfile | 116 +- Dockerfile.ci | 228 +-- RELEASE_NOTES.rst | 77 +- airflow/executors/executor_loader.py | 2 +- .../auth_manager/cli_commands/user_command.py | 8 +- airflow/providers/influxdb/hooks/influxdb.py | 3 +- .../providers/microsoft/azure/hooks/adx.py | 4 +- airflow/providers/mongo/hooks/mongo.py | 8 +- airflow/providers_manager.py | 1 - airflow/reproducible_build.yaml | 4 +- .../03_contributors_quick_start.rst | 4 +- contributing-docs/08_static_code_checks.rst | 512 ++++--- contributing-docs/testing/helm_unit_tests.rst | 3 +- .../testing/integration_tests.rst | 68 +- contributing-docs/testing/unit_tests.rst | 33 +- dev/breeze/README.md | 88 +- dev/breeze/doc/01_installation.rst | 72 +- dev/breeze/doc/02_customizing.rst | 34 + dev/breeze/doc/03_developer_tasks.rst | 25 +- dev/breeze/doc/04_troubleshooting.rst | 47 +- dev/breeze/doc/05_test_commands.rst | 188 +-- dev/breeze/doc/06_managing_docker_images.rst | 113 +- .../doc/09_release_management_tasks.rst | 54 +- dev/breeze/doc/10_advanced_breeze_topics.rst | 15 +- ...002-implement-standalone-python-command.md | 2 +-
.../adr/0016-use-uv-tool-to-install-breeze.md | 56 + dev/breeze/doc/ci/01_ci_environment.md | 97 +- dev/breeze/doc/ci/02_images.md | 205 ++- dev/breeze/doc/ci/03_github_variables.md | 2 +- dev/breeze/doc/ci/04_selective_checks.md | 177 +-- dev/breeze/doc/ci/05_workflows.md | 241 ++-- dev/breeze/doc/ci/06_debugging.md | 64 + dev/breeze/doc/ci/06_diagrams.md | 466 ------ dev/breeze/doc/ci/07_debugging.md | 88 -- dev/breeze/doc/ci/07_running_ci_locally.md | 187 +++ dev/breeze/doc/ci/08_running_ci_locally.md | 141 -- dev/breeze/doc/ci/README.md | 5 +- dev/breeze/doc/images/image_artifacts.png | Bin 0 -> 47666 bytes dev/breeze/doc/images/output-commands.svg | 104 +- dev/breeze/doc/images/output_ci-image.svg | 34 +- dev/breeze/doc/images/output_ci-image.txt | 2 +- .../doc/images/output_ci-image_build.svg | 176 ++- .../doc/images/output_ci-image_build.txt | 2 +- .../output_ci-image_export-mount-cache.svg | 118 ++ .../output_ci-image_export-mount-cache.txt | 1 + .../output_ci-image_import-mount-cache.svg | 118 ++ .../output_ci-image_import-mount-cache.txt | 1 + .../doc/images/output_ci-image_load.svg | 202 +++ .../doc/images/output_ci-image_load.txt | 1 + .../doc/images/output_ci-image_pull.svg | 70 +- .../doc/images/output_ci-image_pull.txt | 2 +- .../doc/images/output_ci-image_save.svg | 140 ++ .../doc/images/output_ci-image_save.txt | 1 + .../doc/images/output_ci-image_verify.svg | 56 +- .../doc/images/output_ci-image_verify.txt | 2 +- dev/breeze/doc/images/output_ci.svg | 2 +- ...output_ci_find-backtracking-candidates.svg | 2 +- .../doc/images/output_ci_fix-ownership.svg | 8 +- .../doc/images/output_ci_free-space.svg | 8 +- .../images/output_ci_get-workflow-info.svg | 6 +- .../doc/images/output_ci_resource-check.svg | 6 +- dev/breeze/doc/images/output_cleanup.svg | 10 +- .../doc/images/output_compile-ui-assets.svg | 119 ++ .../doc/images/output_compile-ui-assets.txt | 1 + .../doc/images/output_compile-www-assets.svg | 10 +- dev/breeze/doc/images/output_down.svg | 14 +- dev/breeze/doc/images/output_exec.svg | 6 +- .../images/output_generate-migration-file.svg | 14 +- dev/breeze/doc/images/output_k8s.svg | 14 +- .../doc/images/output_k8s_build-k8s-image.svg | 54 +- .../doc/images/output_k8s_build-k8s-image.txt | 2 +- .../images/output_k8s_configure-cluster.svg | 6 +- .../images/output_k8s_configure-cluster.txt | 2 +- .../doc/images/output_k8s_create-cluster.svg | 6 +- .../doc/images/output_k8s_create-cluster.txt | 2 +- .../doc/images/output_k8s_delete-cluster.svg | 4 +- .../doc/images/output_k8s_delete-cluster.txt | 2 +- .../doc/images/output_k8s_deploy-airflow.svg | 6 +- .../doc/images/output_k8s_deploy-airflow.txt | 2 +- dev/breeze/doc/images/output_k8s_k9s.svg | 4 +- dev/breeze/doc/images/output_k8s_k9s.txt | 2 +- dev/breeze/doc/images/output_k8s_logs.svg | 4 +- dev/breeze/doc/images/output_k8s_logs.txt | 2 +- .../images/output_k8s_run-complete-tests.svg | 84 +- .../images/output_k8s_run-complete-tests.txt | 2 +- .../doc/images/output_k8s_setup-env.svg | 8 +- dev/breeze/doc/images/output_k8s_shell.svg | 4 +- dev/breeze/doc/images/output_k8s_shell.txt | 2 +- dev/breeze/doc/images/output_k8s_status.svg | 4 +- dev/breeze/doc/images/output_k8s_status.txt | 2 +- dev/breeze/doc/images/output_k8s_tests.svg | 6 +- dev/breeze/doc/images/output_k8s_tests.txt | 2 +- .../images/output_k8s_upload-k8s-image.svg | 6 +- .../images/output_k8s_upload-k8s-image.txt | 2 +- dev/breeze/doc/images/output_prod-image.svg | 16 +- dev/breeze/doc/images/output_prod-image.txt | 2 +- 
.../doc/images/output_prod-image_build.svg | 198 ++- .../doc/images/output_prod-image_build.txt | 2 +- .../doc/images/output_prod-image_load.svg | 182 +++ .../doc/images/output_prod-image_load.txt | 1 + .../doc/images/output_prod-image_pull.svg | 70 +- .../doc/images/output_prod-image_pull.txt | 2 +- .../doc/images/output_prod-image_save.svg | 140 ++ .../doc/images/output_prod-image_save.txt | 1 + .../doc/images/output_prod-image_verify.svg | 56 +- .../doc/images/output_prod-image_verify.txt | 2 +- .../doc/images/output_release-management.svg | 2 +- ...anagement_clean-old-provider-artifacts.svg | 8 +- ...release-management_create-minor-branch.svg | 6 +- ...elease-management_generate-constraints.svg | 58 +- ...elease-management_generate-constraints.txt | 2 +- ...management_generate-issue-content-core.svg | 20 +- ...ment_generate-issue-content-helm-chart.svg | 16 +- ...ement_generate-issue-content-providers.svg | 10 +- ...management_generate-providers-metadata.svg | 6 +- ...e-management_install-provider-packages.svg | 88 +- ...e-management_install-provider-packages.txt | 2 +- ...ase-management_prepare-airflow-package.svg | 12 +- ...ase-management_prepare-airflow-tarball.svg | 4 +- ...-management_prepare-helm-chart-package.svg | 8 +- ...-management_prepare-helm-chart-tarball.svg | 20 +- ...e-management_prepare-provider-packages.svg | 52 +- ...e-management_prepare-provider-packages.txt | 2 +- ...lease-management_prepare-python-client.svg | 18 +- ...ut_release-management_start-rc-process.svg | 20 +- ...ut_release-management_start-rc-process.txt | 2 +- ...utput_release-management_start-release.svg | 8 +- ...utput_release-management_tag-providers.svg | 8 +- ..._release-management_update-constraints.svg | 22 +- ...se-management_verify-provider-packages.svg | 86 +- ...se-management_verify-provider-packages.txt | 2 +- dev/breeze/doc/images/output_sbom.svg | 18 +- .../output_sbom_build-all-airflow-images.svg | 20 +- ...put_sbom_export-dependency-information.svg | 86 +- ...put_sbom_export-dependency-information.txt | 2 +- ...utput_setup_check-all-params-in-groups.svg | 76 +- ...utput_setup_check-all-params-in-groups.txt | 2 +- dev/breeze/doc/images/output_setup_config.svg | 26 +- dev/breeze/doc/images/output_setup_config.txt | 2 +- ...output_setup_regenerate-command-images.svg | 82 +- ...output_setup_regenerate-command-images.txt | 2 +- .../doc/images/output_setup_self-upgrade.svg | 4 +- .../output_setup_synchronize-local-mounts.svg | 6 +- .../doc/images/output_setup_version.svg | 6 +- dev/breeze/doc/images/output_shell.svg | 254 ++-- dev/breeze/doc/images/output_shell.txt | 2 +- .../doc/images/output_start-airflow.svg | 210 ++- .../doc/images/output_start-airflow.txt | 2 +- .../doc/images/output_static-checks.svg | 190 +-- .../doc/images/output_static-checks.txt | 2 +- dev/breeze/doc/images/output_testing.svg | 38 +- dev/breeze/doc/images/output_testing.txt | 2 +- .../output_testing_core-integration-tests.svg | 256 ++++ .../output_testing_core-integration-tests.txt | 1 + .../doc/images/output_testing_core-tests.svg | 484 +++++++ .../doc/images/output_testing_core-tests.txt | 1 + .../doc/images/output_testing_db-tests.svg | 504 ------- .../doc/images/output_testing_db-tests.txt | 1 - .../output_testing_docker-compose-tests.svg | 44 +- .../output_testing_docker-compose-tests.txt | 2 +- .../doc/images/output_testing_helm-tests.svg | 44 +- .../doc/images/output_testing_helm-tests.txt | 2 +- .../output_testing_integration-tests.svg | 252 ---- .../output_testing_integration-tests.txt | 1 - 
.../images/output_testing_non-db-tests.svg | 464 ------ .../images/output_testing_non-db-tests.txt | 1 - ...ut_testing_providers-integration-tests.svg | 260 ++++ ...ut_testing_providers-integration-tests.txt | 1 + .../images/output_testing_providers-tests.svg | 524 +++++++ .../images/output_testing_providers-tests.txt | 1 + ...output_testing_python-api-client-tests.svg | 224 +++ ...output_testing_python-api-client-tests.txt | 1 + .../images/output_testing_system-tests.svg | 240 ++++ .../images/output_testing_system-tests.txt | 1 + .../doc/images/output_testing_tests.svg | 576 -------- .../doc/images/output_testing_tests.txt | 1 - dev/breeze/pyproject.toml | 5 +- .../airflow_breeze/commands/ci_commands.py | 15 +- .../commands/ci_image_commands.py | 330 ++++- .../commands/ci_image_commands_config.py | 56 +- .../commands/common_image_options.py | 84 +- .../airflow_breeze/commands/common_options.py | 71 +- .../commands/developer_commands.py | 135 +- .../commands/developer_commands_config.py | 23 +- .../commands/kubernetes_commands.py | 102 +- .../commands/kubernetes_commands_config.py | 2 - .../airflow_breeze/commands/main_command.py | 6 +- .../commands/minor_release_command.py | 2 +- .../commands/production_image_commands.py | 168 ++- .../production_image_commands_config.py | 35 +- .../commands/release_candidate_command.py | 14 +- .../commands/release_management_commands.py | 177 ++- .../release_management_commands_config.py | 6 +- .../airflow_breeze/commands/sbom_commands.py | 318 ++++- .../commands/sbom_commands_config.py | 26 +- .../airflow_breeze/commands/setup_commands.py | 19 +- .../commands/setup_commands_config.py | 1 - .../commands/testing_commands.py | 814 +++++++---- .../commands/testing_commands_config.py | 423 +++--- .../airflow_breeze/configure_rich_click.py | 2 +- .../src/airflow_breeze/global_constants.py | 186 ++- .../airflow_breeze/params/build_ci_params.py | 2 - .../params/build_prod_params.py | 18 +- .../params/common_build_params.py | 30 +- .../src/airflow_breeze/params/shell_params.py | 108 +- .../src/airflow_breeze/pre_commit_ids.py | 24 +- .../provider_documentation.py | 194 ++- .../prepare_providers/provider_packages.py | 36 +- .../provider_issue_TEMPLATE.md.jinja2 | 4 +- .../templates/CHANGELOG_TEMPLATE.rst.jinja2 | 10 + .../PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2 | 3 +- .../PROVIDER_COMMITS_TEMPLATE.rst.jinja2 | 5 +- .../PROVIDER_README_TEMPLATE.rst.jinja2 | 3 +- .../get_provider_info_TEMPLATE.py.jinja2 | 3 +- .../templates/pyproject_TEMPLATE.toml.jinja2 | 7 +- .../src/airflow_breeze/utils/backtracking.py | 2 +- .../src/airflow_breeze/utils/black_utils.py | 4 +- dev/breeze/src/airflow_breeze/utils/cdxgen.py | 124 +- .../src/airflow_breeze/utils/coertions.py | 2 +- .../src/airflow_breeze/utils/console.py | 9 +- .../utils/custom_param_types.py | 3 +- .../utils/docker_command_utils.py | 21 +- .../airflow_breeze/utils/functools_cache.py | 23 +- dev/breeze/src/airflow_breeze/utils/github.py | 126 ++ dev/breeze/src/airflow_breeze/utils/image.py | 59 +- .../airflow_breeze/utils/kubernetes_utils.py | 7 +- .../utils/mark_image_as_refreshed.py | 2 +- .../src/airflow_breeze/utils/packages.py | 207 ++- .../src/airflow_breeze/utils/parallel.py | 3 +- .../src/airflow_breeze/utils/path_utils.py | 25 +- .../src/airflow_breeze/utils/platforms.py | 6 +- .../utils/projects_google_spreadsheet.py | 252 ++++ .../utils/provider_dependencies.py | 9 +- .../utils/publish_docs_helpers.py | 36 +- .../airflow_breeze/utils/python_versions.py | 13 - .../src/airflow_breeze/utils/run_tests.py | 338 
++--- .../src/airflow_breeze/utils/run_utils.py | 160 ++- .../airflow_breeze/utils/selective_checks.py | 481 +++++-- .../airflow_breeze/utils/spelling_checks.py | 4 +- .../src/airflow_breeze/utils/version_utils.py | 54 + .../src/airflow_breeze/utils/versions.py | 2 +- .../airflow_breeze/utils/virtualenv_utils.py | 14 +- dev/breeze/tests/conftest.py | 9 + dev/breeze/tests/test_cache.py | 3 +- dev/breeze/tests/test_docker_command_utils.py | 22 + dev/breeze/tests/test_packages.py | 76 +- .../tests/test_provider_documentation.py | 30 +- .../tests/test_pytest_args_for_test_types.py | 294 ++-- dev/breeze/tests/test_run_test_args.py | 94 ++ dev/breeze/tests/test_selective_checks.py | 1246 +++++++---------- dev/breeze/tests/test_shell_params.py | 20 - dev/breeze/uv.lock | 638 +++++---- dev/stats/explore_pr_candidates.ipynb | 38 +- .../guides/developer.rst | 9 +- docs/apache-airflow/core-concepts/dag-run.rst | 1 + docs/apache-airflow/core-concepts/dags.rst | 1 + docs/apache-airflow/core-concepts/tasks.rst | 6 +- pyproject.toml | 58 +- scripts/ci/cleanup_docker.sh | 5 +- .../ci/constraints/ci_commit_constraints.sh | 3 - scripts/ci/docker-compose/base.yml | 2 +- scripts/ci/docker-compose/devcontainer.env | 4 - .../ci/docker-compose/forward-credentials.yml | 1 + .../docker-compose/integration-keycloak.yml | 62 + .../integration-openlineage.yml | 4 +- .../keycloak/init-keycloak-db.sh | 27 + .../keycloak/keycloak-entrypoint.sh | 45 + .../providers-and-tests-sources.yml | 5 +- ...tart_arm_instance_and_connect_to_docker.sh | 91 -- scripts/ci/install_breeze.sh | 10 +- .../base_operator_partial_arguments.py | 164 --- scripts/ci/pre_commit/boring_cyborg.py | 19 +- ...eck_cncf_k8s_used_for_k8s_executor_only.py | 7 +- .../pre_commit/check_common_sql_dependency.py | 30 +- .../ci/pre_commit/check_deferrable_default.py | 128 -- scripts/ci/pre_commit/check_deprecations.py | 194 --- .../pre_commit/check_imports_in_providers.py | 105 ++ .../ci/pre_commit/check_min_python_version.py | 2 +- .../ci/pre_commit/check_pre_commit_hooks.py | 2 +- .../pre_commit/check_provider_yaml_files.py | 15 +- ...eck_providers_subpackages_all_have_init.py | 44 +- scripts/ci/pre_commit/check_system_tests.py | 10 +- .../check_system_tests_hidden_in_index.py | 4 +- .../ci/pre_commit/check_template_fields.py | 40 + .../check_tests_in_right_folders.py | 1 + .../pre_commit/check_ti_vs_tis_attributes.py | 5 +- .../ci/pre_commit/checkout_no_credentials.py | 7 + .../ci/pre_commit/common_precommit_utils.py | 45 + .../ci/pre_commit/compat_cache_on_methods.py | 69 - scripts/ci/pre_commit/compile_ui_assets.py | 89 ++ .../ci/pre_commit/compile_ui_assets_dev.py | 65 + scripts/ci/pre_commit/compile_www_assets.py | 17 +- ...corator_operator_implements_custom_name.py | 2 +- .../pre_commit/generate_airflow_diagrams.py | 14 +- scripts/ci/pre_commit/helm_lint.py | 2 +- scripts/ci/pre_commit/kubeconform.py | 2 +- scripts/ci/pre_commit/lint_ui.py | 37 + .../pre_commit/{www_lint.py => lint_www.py} | 0 scripts/ci/pre_commit/migration_reference.py | 14 +- scripts/ci/pre_commit/mypy_folder.py | 21 +- scripts/ci/pre_commit/sync_init_decorator.py | 204 --- .../pre_commit/update_build_dependencies.py | 110 -- .../pre_commit/update_common_sql_api_stubs.py | 8 +- scripts/ci/pre_commit/update_er_diagram.py | 13 +- .../pre_commit/update_example_dags_paths.py | 13 +- scripts/ci/pre_commit/update_installers.py | 152 -- .../update_installers_and_pre_commit.py | 189 +++ .../update_providers_build_files.py | 112 ++ .../ci/pre_commit/validate_operators_init.py | 8 +- 
.../ci/pre_commit/vendor_k8s_json_schema.py | 2 +- scripts/ci/pre_commit/version_heads_map.py | 82 +- .../run_breeze_command_with_retries.sh | 43 + .../run_integration_tests_with_retry.sh | 15 +- .../run_system_tests.sh} | 24 +- scripts/ci/testing/run_unit_tests.sh | 140 ++ scripts/tools/free_up_disk_space.sh | 39 + tests/api_connexion/test_auth.py | 6 +- tests/operators/test_bash.py | 1 + tests/plugins/test_plugins_manager.py | 5 +- tests/sensors/test_external_task_sensor.py | 5 +- .../providers/papermill/input_notebook.ipynb | 2 +- 356 files changed, 13857 insertions(+), 11298 deletions(-) delete mode 100644 .github/actions/checkout_target_commit/action.yml create mode 100644 .github/actions/prepare_all_ci_images/action.yml create mode 100644 .github/actions/prepare_single_ci_image/action.yml create mode 100644 .github/workflows/automatic-backport.yml create mode 100644 .github/workflows/backport-cli.yml delete mode 100644 .github/workflows/build-images.yml rename .github/workflows/{static-checks-mypy-docs.yml => ci-image-checks.yml} (72%) create mode 100644 .github/workflows/integration-system-tests.yml delete mode 100644 .github/workflows/integration-tests.yml create mode 100644 .github/workflows/news-fragment.yml rename .github/workflows/{check-providers.yml => test-provider-packages.yml} (57%) create mode 100644 dev/breeze/doc/adr/0016-use-uv-tool-to-install-breeze.md create mode 100644 dev/breeze/doc/ci/06_debugging.md delete mode 100644 dev/breeze/doc/ci/06_diagrams.md delete mode 100644 dev/breeze/doc/ci/07_debugging.md create mode 100644 dev/breeze/doc/ci/07_running_ci_locally.md delete mode 100644 dev/breeze/doc/ci/08_running_ci_locally.md create mode 100644 dev/breeze/doc/images/image_artifacts.png create mode 100644 dev/breeze/doc/images/output_ci-image_export-mount-cache.svg create mode 100644 dev/breeze/doc/images/output_ci-image_export-mount-cache.txt create mode 100644 dev/breeze/doc/images/output_ci-image_import-mount-cache.svg create mode 100644 dev/breeze/doc/images/output_ci-image_import-mount-cache.txt create mode 100644 dev/breeze/doc/images/output_ci-image_load.svg create mode 100644 dev/breeze/doc/images/output_ci-image_load.txt create mode 100644 dev/breeze/doc/images/output_ci-image_save.svg create mode 100644 dev/breeze/doc/images/output_ci-image_save.txt create mode 100644 dev/breeze/doc/images/output_compile-ui-assets.svg create mode 100644 dev/breeze/doc/images/output_compile-ui-assets.txt create mode 100644 dev/breeze/doc/images/output_prod-image_load.svg create mode 100644 dev/breeze/doc/images/output_prod-image_load.txt create mode 100644 dev/breeze/doc/images/output_prod-image_save.svg create mode 100644 dev/breeze/doc/images/output_prod-image_save.txt create mode 100644 dev/breeze/doc/images/output_testing_core-integration-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_core-integration-tests.txt create mode 100644 dev/breeze/doc/images/output_testing_core-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_core-tests.txt delete mode 100644 dev/breeze/doc/images/output_testing_db-tests.svg delete mode 100644 dev/breeze/doc/images/output_testing_db-tests.txt delete mode 100644 dev/breeze/doc/images/output_testing_integration-tests.svg delete mode 100644 dev/breeze/doc/images/output_testing_integration-tests.txt delete mode 100644 dev/breeze/doc/images/output_testing_non-db-tests.svg delete mode 100644 dev/breeze/doc/images/output_testing_non-db-tests.txt create mode 100644 
dev/breeze/doc/images/output_testing_providers-integration-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_providers-integration-tests.txt create mode 100644 dev/breeze/doc/images/output_testing_providers-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_providers-tests.txt create mode 100644 dev/breeze/doc/images/output_testing_python-api-client-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_python-api-client-tests.txt create mode 100644 dev/breeze/doc/images/output_testing_system-tests.svg create mode 100644 dev/breeze/doc/images/output_testing_system-tests.txt delete mode 100644 dev/breeze/doc/images/output_testing_tests.svg delete mode 100644 dev/breeze/doc/images/output_testing_tests.txt rename scripts/ci/pre_commit/check_providers_init.py => dev/breeze/src/airflow_breeze/utils/functools_cache.py (66%) mode change 100755 => 100644 create mode 100644 dev/breeze/src/airflow_breeze/utils/projects_google_spreadsheet.py create mode 100644 dev/breeze/tests/test_run_test_args.py create mode 100644 scripts/ci/docker-compose/integration-keycloak.yml create mode 100755 scripts/ci/docker-compose/keycloak/init-keycloak-db.sh create mode 100755 scripts/ci/docker-compose/keycloak/keycloak-entrypoint.sh delete mode 100755 scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh delete mode 100755 scripts/ci/pre_commit/base_operator_partial_arguments.py delete mode 100755 scripts/ci/pre_commit/check_deferrable_default.py delete mode 100755 scripts/ci/pre_commit/check_deprecations.py create mode 100755 scripts/ci/pre_commit/check_imports_in_providers.py create mode 100755 scripts/ci/pre_commit/check_template_fields.py delete mode 100755 scripts/ci/pre_commit/compat_cache_on_methods.py create mode 100755 scripts/ci/pre_commit/compile_ui_assets.py create mode 100755 scripts/ci/pre_commit/compile_ui_assets_dev.py create mode 100755 scripts/ci/pre_commit/lint_ui.py rename scripts/ci/pre_commit/{www_lint.py => lint_www.py} (100%) delete mode 100755 scripts/ci/pre_commit/sync_init_decorator.py delete mode 100755 scripts/ci/pre_commit/update_build_dependencies.py delete mode 100755 scripts/ci/pre_commit/update_installers.py create mode 100755 scripts/ci/pre_commit/update_installers_and_pre_commit.py create mode 100755 scripts/ci/pre_commit/update_providers_build_files.py create mode 100755 scripts/ci/testing/run_breeze_command_with_retries.sh rename scripts/ci/{images/ci_stop_arm_instance.sh => testing/run_system_tests.sh} (61%) create mode 100755 scripts/ci/testing/run_unit_tests.sh create mode 100644 scripts/tools/free_up_disk_space.sh diff --git a/.dockerignore b/.dockerignore index dba7378a3b778..197f14a03695c 100644 --- a/.dockerignore +++ b/.dockerignore @@ -34,12 +34,15 @@ !chart !docs !licenses +!providers/ +!task_sdk/ # Add those folders to the context so that they are available in the CI container !scripts # Add tests and kubernetes_tests to context. 
!tests +!tests_common !kubernetes_tests !helm_tests !docker_tests @@ -79,6 +82,7 @@ airflow/git_version # Exclude mode_modules pulled by "yarn" for compilation of www files generated by NPM airflow/www/node_modules +airflow/ui/node_modules # Exclude link to docs airflow/www/static/docs @@ -121,6 +125,9 @@ docs/_build/ docs/_api/ docs/_doctrees/ +# Exclude new providers docs generated files +providers/**/docs/_api/ + # files generated by memray *.py.*.html *.py.*.bin diff --git a/.github/actions/breeze/action.yml b/.github/actions/breeze/action.yml index 164914c3d525b..d7eaa1b088bee 100644 --- a/.github/actions/breeze/action.yml +++ b/.github/actions/breeze/action.yml @@ -21,10 +21,10 @@ description: 'Sets up Python and Breeze' inputs: python-version: description: 'Python version to use' - # Version of Python used for reproducibility of the packages built - # Python 3.8 tarfile produces different tarballs than Python 3.9+ tarfile that's why we are forcing - # Python 3.9 for all release preparation commands to make sure that the tarballs are reproducible default: "3.9" + use-uv: + description: 'Whether to use uv tool' + required: true outputs: host-python-version: description: Python version used in host @@ -36,13 +36,11 @@ runs: uses: actions/setup-python@v5 with: python-version: ${{ inputs.python-version }} - cache: 'pip' - cache-dependency-path: ./dev/breeze/pyproject.toml + # NOTE! Installing Breeze without using cache is FASTER than when using cache - uv is so fast and has + # so low overhead, that just running upload cache/restore cache is slower than installing it from scratch - name: "Install Breeze" shell: bash run: ./scripts/ci/install_breeze.sh - env: - PYTHON_VERSION: ${{ inputs.python-version }} - name: "Free space" shell: bash run: breeze ci free-space diff --git a/.github/actions/checkout_target_commit/action.yml b/.github/actions/checkout_target_commit/action.yml deleted file mode 100644 index e95e8b86254a0..0000000000000 --- a/.github/actions/checkout_target_commit/action.yml +++ /dev/null @@ -1,81 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
-# ---- -name: 'Checkout target commit' -description: > - Checks out target commit with the exception of .github scripts directories that come from the target branch -inputs: - target-commit-sha: - description: 'SHA of the target commit to checkout' - required: true - pull-request-target: - description: 'Whether the workflow is a pull request target workflow' - required: true - is-committer-build: - description: 'Whether the build is done by a committer' - required: true -runs: - using: "composite" - steps: - - name: "Checkout target commit" - uses: actions/checkout@v4 - with: - ref: ${{ inputs.target-commit-sha }} - persist-credentials: false - #################################################################################################### - # BE VERY CAREFUL HERE! THIS LINE AND THE END OF THE WARNING. IN PULL REQUEST TARGET WORKFLOW - # WE CHECK OUT THE TARGET COMMIT ABOVE TO BE ABLE TO BUILD THE IMAGE FROM SOURCES FROM THE - # INCOMING PR, RATHER THAN FROM TARGET BRANCH. THIS IS A SECURITY RISK, BECAUSE THE PR - # CAN CONTAIN ANY CODE AND WE EXECUTE IT HERE. THEREFORE, WE NEED TO BE VERY CAREFUL WHAT WE - # DO HERE. WE SHOULD NOT EXECUTE ANY CODE THAT COMES FROM THE PR. WE SHOULD NOT RUN ANY BREEZE - # COMMAND NOR SCRIPTS NOR COMPOSITE ACTIONS. WE SHOULD ONLY RUN CODE THAT IS EMBEDDED DIRECTLY IN - # THIS WORKFLOW - BECAUSE THIS IS THE ONLY CODE THAT WE CAN TRUST. - #################################################################################################### - - name: Checkout target branch to 'target-airflow' folder to use ci/scripts and breeze from there. - uses: actions/checkout@v4 - with: - path: "target-airflow" - ref: ${{ github.base_ref }} - persist-credentials: false - if: inputs.pull-request-target == 'true' && inputs.is-committer-build != 'true' - - name: > - Replace "scripts/ci", "dev", ".github/actions" and ".github/workflows" with the target branch - so that the those directories are not coming from the PR - shell: bash - run: | - echo - echo -e "\033[33m Replace scripts, dev, actions with target branch for non-committer builds!\033[0m" - echo - rm -rfv "scripts/ci" - rm -rfv "dev" - rm -rfv ".github/actions" - rm -rfv ".github/workflows" - rm -v ".dockerignore" || true - mv -v "target-airflow/scripts/ci" "scripts" - mv -v "target-airflow/dev" "." - mv -v "target-airflow/.github/actions" "target-airflow/.github/workflows" ".github" - mv -v "target-airflow/.dockerignore" ".dockerignore" || true - if: inputs.pull-request-target == 'true' && inputs.is-committer-build != 'true' - #################################################################################################### - # AFTER IT'S SAFE. THE `dev`, `scripts/ci` AND `.github/actions` and `.dockerignore` ARE NOW COMING - # FROM THE BASE_REF - WHICH IS THE TARGET BRANCH OF THE PR. WE CAN TRUST THAT THOSE SCRIPTS ARE - # SAFE TO RUN AND CODE AVAILABLE IN THE DOCKER BUILD PHASE IS CONTROLLED BY THE `.dockerignore`. - # ALL THE REST OF THE CODE COMES FROM THE PR, AND FOR EXAMPLE THE CODE IN THE `Dockerfile.ci` CAN - # BE RUN SAFELY AS PART OF DOCKER BUILD. BECAUSE IT RUNS INSIDE THE DOCKER CONTAINER AND IT IS - # ISOLATED FROM THE RUNNER. 
- #################################################################################################### diff --git a/.github/actions/install-pre-commit/action.yml b/.github/actions/install-pre-commit/action.yml index 57624f48e8928..5e9ed3f2a4eff 100644 --- a/.github/actions/install-pre-commit/action.yml +++ b/.github/actions/install-pre-commit/action.yml @@ -21,30 +21,53 @@ description: 'Installs pre-commit and related packages' inputs: python-version: description: 'Python version to use' - default: 3.9 + default: "3.9" uv-version: description: 'uv version to use' - default: 0.5.11 + default: "0.5.17" # Keep this comment to allow automatic replacement of uv version pre-commit-version: description: 'pre-commit version to use' - default: 4.0.1 - pre-commit-uv-version: - description: 'pre-commit-uv version to use' - default: 4.1.4 + default: "3.5.0" # Keep this comment to allow automatic replacement of pre-commit version runs: using: "composite" steps: - - name: Install pre-commit, uv, and pre-commit-uv + - name: Install pre-commit, uv shell: bash - run: > - pip install - pre-commit==${{inputs.pre-commit-version}} - uv==${{inputs.uv-version}} - pre-commit-uv==${{inputs.pre-commit-uv-version}} - - name: Cache pre-commit envs - uses: actions/cache@v4 + env: + UV_VERSION: ${{inputs.uv-version}} + PRE_COMMIT_VERSION: ${{inputs.pre-commit-version}} + run: | + pip install uv==${UV_VERSION} || true + uv tool install pre-commit==${PRE_COMMIT_VERSION} --with uv==${UV_VERSION} + working-directory: ${{ github.workspace }} + # We need to use tar file with archive to restore all the permissions and symlinks + - name: "Delete ~.cache" + run: | + du ~/ --max-depth=2 + echo + echo Deleting ~/.cache + echo + rm -rf ~/.cache + echo + shell: bash + - name: "Restore pre-commit cache" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c with: - path: ~/.cache/pre-commit - key: "pre-commit-${{inputs.python-version}}-${{ hashFiles('.pre-commit-config.yaml') }}" - restore-keys: | - pre-commit-${{inputs.python-version}}- + key: cache-pre-commit-v4-${{ inputs.python-version }}-${{ hashFiles('.pre-commit-config.yaml') }} + path: /tmp/ + id: restore-pre-commit-cache + - name: "Restore .cache from the tar file" + run: tar -C ~ -xzf /tmp/cache-pre-commit.tar.gz + shell: bash + if: steps.restore-pre-commit-cache.outputs.stash-hit == 'true' + - name: "Show restored files" + run: | + echo "Restored files" + du ~/ --max-depth=2 + echo + shell: bash + if: steps.restore-pre-commit-cache.outputs.stash-hit == 'true' + - name: Install pre-commit hooks + shell: bash + run: pre-commit install-hooks || (cat ~/.cache/pre-commit/pre-commit.log && exit 1) + working-directory: ${{ github.workspace }} diff --git a/.github/actions/post_tests_success/action.yml b/.github/actions/post_tests_success/action.yml index 37b51154d3e13..b7b00a6fc0df3 100644 --- a/.github/actions/post_tests_success/action.yml +++ b/.github/actions/post_tests_success/action.yml @@ -33,7 +33,7 @@ runs: - name: "Upload artifact for warnings" uses: actions/upload-artifact@v4 with: - name: test-warnings-${{env.JOB_ID}} + name: test-warnings-${{ env.JOB_ID }} path: ./files/warnings-*.txt retention-days: 7 if-no-files-found: ignore @@ -50,5 +50,5 @@ runs: if: env.ENABLE_COVERAGE == 'true' && env.TEST_TYPES != 'Helm' && inputs.python-version != '3.12' with: name: coverage-${{env.JOB_ID}} - flags: python-${{env.PYTHON_MAJOR_MINOR_VERSION}},${{env.BACKEND}}-${{env.BACKEND_VERSION}} + flags: python-${{ env.PYTHON_MAJOR_MINOR_VERSION 
}},${{ env.BACKEND }}-${{ env.BACKEND_VERSION }} directory: "./files/coverage-reports/" diff --git a/.github/actions/prepare_all_ci_images/action.yml b/.github/actions/prepare_all_ci_images/action.yml new file mode 100644 index 0000000000000..d156818b9b283 --- /dev/null +++ b/.github/actions/prepare_all_ci_images/action.yml @@ -0,0 +1,68 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# +--- +name: 'Prepare all CI images' +description: 'Recreates current python CI images from artifacts for all python versions' +inputs: + python-versions-list-as-string: + description: 'Stringified array of all Python versions to test - separated by spaces.' + required: true + platform: + description: 'Platform for the build - linux/amd64 or linux/arm64' + required: true +runs: + using: "composite" + steps: + - name: "Cleanup docker" + run: ./scripts/ci/cleanup_docker.sh + shell: bash + # TODO: Currently we cannot loop through the list of python versions and have dynamic list of + # tasks. Instead we hardcode all possible python versions and they - but + # this should be implemented in stash action as list of keys to download. 
+ # That includes 3.8 - 3.12 as we are backporting it to v2-10-test branch + # This is captured in https://github.com/apache/airflow/issues/45268 + - name: "Restore CI docker image ${{ inputs.platform }}:3.8" + uses: ./.github/actions/prepare_single_ci_image + with: + platform: ${{ inputs.platform }} + python: "3.8" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} + - name: "Restore CI docker image ${{ inputs.platform }}:3.9" + uses: ./.github/actions/prepare_single_ci_image + with: + platform: ${{ inputs.platform }} + python: "3.9" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} + - name: "Restore CI docker image ${{ inputs.platform }}:3.10" + uses: ./.github/actions/prepare_single_ci_image + with: + platform: ${{ inputs.platform }} + python: "3.10" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} + - name: "Restore CI docker image ${{ inputs.platform }}:3.11" + uses: ./.github/actions/prepare_single_ci_image + with: + platform: ${{ inputs.platform }} + python: "3.11" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} + - name: "Restore CI docker image ${{ inputs.platform }}:3.12" + uses: ./.github/actions/prepare_single_ci_image + with: + platform: ${{ inputs.platform }} + python: "3.12" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} diff --git a/.github/actions/prepare_breeze_and_image/action.yml b/.github/actions/prepare_breeze_and_image/action.yml index 41aa17092d589..26be0b76315ff 100644 --- a/.github/actions/prepare_breeze_and_image/action.yml +++ b/.github/actions/prepare_breeze_and_image/action.yml @@ -16,12 +16,21 @@ # under the License. # --- -name: 'Prepare breeze && current python image' -description: 'Installs breeze and pulls current python image' +name: 'Prepare breeze && current image (CI or PROD)' +description: 'Installs breeze and recreates current python image from artifact' inputs: - pull-image-type: - description: 'Which image to pull' - default: CI + python: + description: 'Python version for image to prepare' + required: true + image-type: + description: 'Which image type to prepare (ci/prod)' + default: "ci" + platform: + description: 'Platform for the build - linux/amd64 or linux/arm64' + required: true + use-uv: + description: 'Whether to use uv' + required: true outputs: host-python-version: description: Python version used in host @@ -29,17 +38,30 @@ outputs: runs: using: "composite" steps: + - name: "Cleanup docker" + run: ./scripts/ci/cleanup_docker.sh + shell: bash - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} id: breeze - - name: Login to ghcr.io + - name: Check free space + run: df -H shell: bash - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: Pull CI image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}:${{ env.IMAGE_TAG }} + - name: Make /mnt/ directory writeable + run: sudo chown -R ${USER} /mnt shell: bash - run: breeze ci-image pull --tag-as-latest - if: inputs.pull-image-type == 'CI' - - name: Pull PROD image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}:${{ env.IMAGE_TAG }} + - name: "Restore ${{ inputs.image-type }} docker image ${{ inputs.platform }}:${{ inputs.python }}" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: ${{ inputs.image-type }}-image-save-${{ inputs.platform }}-${{ inputs.python }} + path: "/mnt/" + - name: "Load ${{ 
inputs.image-type }} image ${{ inputs.platform }}:${{ inputs.python }}" + env: + PLATFORM: ${{ inputs.platform }} + PYTHON: ${{ inputs.python }} + IMAGE_TYPE: ${{ inputs.image-type }} + run: > + breeze ${IMAGE_TYPE}-image load --platform "${PLATFORM}" --python "${PYTHON}" --image-file-dir "/mnt" shell: bash - run: breeze prod-image pull --tag-as-latest - if: inputs.pull-image-type == 'PROD' diff --git a/.github/actions/prepare_single_ci_image/action.yml b/.github/actions/prepare_single_ci_image/action.yml new file mode 100644 index 0000000000000..ecae9f802c966 --- /dev/null +++ b/.github/actions/prepare_single_ci_image/action.yml @@ -0,0 +1,56 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# +--- +name: 'Prepare single CI image' +description: > + Recreates current python image from artifacts (needed for the hard-coded actions calling all + possible Python versions in "prepare_all_ci_images" action. Hopefully we can get rid of it when + the https://github.com/apache/airflow/issues/45268 is resolved and we contribute capability of + downloading multiple keys to the stash action. +inputs: + python: + description: 'Python version for image to prepare' + required: true + python-versions-list-as-string: + description: 'Stringified array of all Python versions to prepare - separated by spaces.' + required: true + platform: + description: 'Platform for the build - linux/amd64 or linux/arm64' + required: true +runs: + using: "composite" + steps: + - name: Check free space + run: df -H + shell: bash + - name: Make /mnt/ directory writeable + run: sudo chown -R ${USER} /mnt + shell: bash + - name: "Restore CI docker images ${{ inputs.platform }}:${{ inputs.python }}" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: ci-image-save-${{ inputs.platform }}-${{ inputs.python }} + path: "/mnt/" + if: contains(inputs.python-versions-list-as-string, inputs.python) + - name: "Load CI image ${{ inputs.platform }}:${{ inputs.python }}" + env: + PLATFORM: ${{ inputs.platform }} + PYTHON: ${{ inputs.python }} + run: breeze ci-image load --platform "${PLATFORM}" --python "${PYTHON}" --image-file-dir "/mnt/" + shell: bash + if: contains(inputs.python-versions-list-as-string, inputs.python) diff --git a/.github/workflows/additional-ci-image-checks.yml b/.github/workflows/additional-ci-image-checks.yml index 82b143d2f03e4..a6b7bdafcb5af 100644 --- a/.github/workflows/additional-ci-image-checks.yml +++ b/.github/workflows/additional-ci-image-checks.yml @@ -32,10 +32,6 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining self-hosted runners." 
required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string python-versions: description: "The list of python versions (stringified JSON array) to run the tests on." required: true @@ -64,6 +60,10 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." + required: true + type: string canary-run: description: "Whether this is a canary run (true/false)" required: true @@ -84,6 +84,8 @@ on: # yamllint disable-line rule:truthy description: "Whether to use uv to build the image (true/false)" required: true type: string +permissions: + contents: read jobs: # Push early BuildX cache to GitHub Registry in Apache repository, This cache does not wait for all the # tests to complete - it is run very early in the build process for "main" merges in order to refresh @@ -99,8 +101,6 @@ jobs: contents: read # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. - # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # "in-workflow-build" condition packages: write secrets: inherit with: @@ -113,9 +113,10 @@ jobs: python-versions: ${{ inputs.python-versions }} branch: ${{ inputs.branch }} constraints-branch: ${{ inputs.constraints-branch }} - use-uv: ${{ inputs.use-uv}} + use-uv: ${{ inputs.use-uv }} include-success-outputs: ${{ inputs.include-success-outputs }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} if: inputs.branch == 'main' # Check that after earlier cache push, breeze command will build quickly @@ -144,8 +145,13 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} - name: "Login to ghcr.io" - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin + env: + actor: ${{ github.actor }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + run: echo "$GITHUB_TOKEN" | docker login ghcr.io -u "$actor" --password-stdin - name: "Check that image builds quickly" run: breeze shell --max-time 600 --platform "linux/amd64" @@ -154,21 +160,24 @@ jobs: # # There is no point in running this one in "canary" run, because the above step is doing the # # same build anyway. 
# build-ci-arm-images: -# name: Build CI ARM images (in-workflow) +# name: Build CI ARM images # uses: ./.github/workflows/ci-image-build.yml # permissions: # contents: read # packages: write # secrets: inherit # with: +# platform: "linux/arm64" # push-image: "false" +# upload-image-artifact: "true" +# upload-mount-cache-artifact: ${{ inputs.canary-run }} # runs-on-as-json-public: ${{ inputs.runs-on-as-json-public }} # runs-on-as-json-self-hosted: ${{ inputs.runs-on-as-json-self-hosted }} -# image-tag: ${{ inputs.image-tag }} # python-versions: ${{ inputs.python-versions }} -# platform: "linux/arm64" # branch: ${{ inputs.branch }} # constraints-branch: ${{ inputs.constraints-branch }} -# use-uv: ${{ inputs.use-uv}} +# use-uv: ${{ inputs.use-uv }} # upgrade-to-newer-dependencies: ${{ inputs.upgrade-to-newer-dependencies }} # docker-cache: ${{ inputs.docker-cache }} +# disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} +# diff --git a/.github/workflows/additional-prod-image-tests.yml b/.github/workflows/additional-prod-image-tests.yml index 4c9606e1343e6..7b55121571471 100644 --- a/.github/workflows/additional-prod-image-tests.yml +++ b/.github/workflows/additional-prod-image-tests.yml @@ -32,10 +32,6 @@ on: # yamllint disable-line rule:truthy description: "Branch used to construct constraints URL from." required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string upgrade-to-newer-dependencies: description: "Whether to upgrade to newer dependencies (true/false)" required: true @@ -48,6 +44,10 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." 
+ required: true + type: string canary-run: description: "Whether to run the canary run (true/false)" required: true @@ -56,6 +56,12 @@ on: # yamllint disable-line rule:truthy description: "Which version of python should be used by default" required: true type: string + use-uv: + description: "Whether to use uv" + required: true + type: string +permissions: + contents: read jobs: prod-image-extra-checks-main: name: PROD image extra checks (main) @@ -66,12 +72,12 @@ jobs: default-python-version: ${{ inputs.default-python-version }} branch: ${{ inputs.default-branch }} use-uv: "false" - image-tag: ${{ inputs.image-tag }} build-provider-packages: ${{ inputs.default-branch == 'main' }} upgrade-to-newer-dependencies: ${{ inputs.upgrade-to-newer-dependencies }} chicken-egg-providers: ${{ inputs.chicken-egg-providers }} constraints-branch: ${{ inputs.constraints-branch }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} if: inputs.default-branch == 'main' && inputs.canary-run == 'true' prod-image-extra-checks-release-branch: @@ -83,12 +89,12 @@ jobs: default-python-version: ${{ inputs.default-python-version }} branch: ${{ inputs.default-branch }} use-uv: "false" - image-tag: ${{ inputs.image-tag }} build-provider-packages: ${{ inputs.default-branch == 'main' }} upgrade-to-newer-dependencies: ${{ inputs.upgrade-to-newer-dependencies }} chicken-egg-providers: ${{ inputs.chicken-egg-providers }} constraints-branch: ${{ inputs.constraints-branch }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} if: inputs.default-branch != 'main' && inputs.canary-run == 'true' test-examples-of-prod-image-building: @@ -111,36 +117,30 @@ jobs: persist-credentials: false - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - - name: "Install Breeze" - uses: ./.github/actions/breeze - - name: Login to ghcr.io - shell: bash - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: Pull PROD image ${{ inputs.default-python-version}}:${{ inputs.image-tag }} - run: breeze prod-image pull --tag-as-latest - env: - PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" - IMAGE_TAG: "${{ inputs.image-tag }}" - - name: "Setup python" - uses: actions/setup-python@v5 + - name: "Prepare breeze & PROD image: ${{ inputs.default-python-version }}" + uses: ./.github/actions/prepare_breeze_and_image with: - python-version: ${{ inputs.default-python-version }} - cache: 'pip' - cache-dependency-path: ./dev/requirements.txt + platform: "linux/amd64" + image-type: "prod" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} - name: "Test examples of PROD image building" + env: + GITHUB_REPOSITORY: ${{ github.repository }} + DEFAULT_BRANCH: ${{ inputs.default-branch }} + DEFAULT_PYTHON_VERSION: ${{ inputs.default-python-version }} run: " cd ./docker_tests && \ python -m pip install -r requirements.txt && \ - TEST_IMAGE=\"ghcr.io/${{ github.repository }}/${{ inputs.default-branch }}\ - /prod/python${{ inputs.default-python-version }}:${{ inputs.image-tag }}\" \ + TEST_IMAGE=\"ghcr.io/$GITHUB_REPOSITORY/$DEFAULT_BRANCH\ + /prod/python$DEFAULT_PYTHON_VERSION\" \ python -m pytest test_examples_of_prod_image_building.py -n auto --color=yes" test-docker-compose-quick-start: timeout-minutes: 60 - name: "Docker-compose quick start with PROD image verifying" + name: "Docker Compose quick start with PROD image verifying" runs-on: 
${{ fromJSON(inputs.runs-on-as-json-public) }} env: - IMAGE_TAG: "${{ inputs.image-tag }}" PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} @@ -155,14 +155,13 @@ jobs: with: fetch-depth: 2 persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Install Breeze" - uses: ./.github/actions/breeze - - name: Login to ghcr.io - shell: bash - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: "Pull image ${{ inputs.default-python-version}}:${{ inputs.image-tag }}" - run: breeze prod-image pull --tag-as-latest + - name: "Prepare breeze & PROD image: ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + image-type: "prod" + python: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} + use-uv: ${{ inputs.use-uv }} + id: breeze - name: "Test docker-compose quick start" run: breeze testing docker-compose-tests diff --git a/.github/workflows/automatic-backport.yml b/.github/workflows/automatic-backport.yml new file mode 100644 index 0000000000000..4c72401a5d317 --- /dev/null +++ b/.github/workflows/automatic-backport.yml @@ -0,0 +1,78 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+# +--- +name: Automatic Backport +on: # yamllint disable-line rule:truthy + push: + branches: + - main +permissions: + contents: read +jobs: + get-pr-info: + name: "Get PR information" + runs-on: ubuntu-latest + outputs: + branches: ${{ steps.pr-info.outputs.branches }} + commit-sha: ${{ github.sha }} + steps: + - name: Get commit SHA + id: get-sha + run: echo "COMMIT_SHA=${GITHUB_SHA}" >> $GITHUB_ENV + + - name: Find PR information + id: pr-info + uses: actions/github-script@v7 + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + with: + script: | + const { data: pullRequest } = await github.rest.repos.listPullRequestsAssociatedWithCommit({ + owner: context.repo.owner, + repo: context.repo.repo, + commit_sha: process.env.GITHUB_SHA + }); + if (pullRequest.length > 0) { + const pr = pullRequest[0]; + const backportBranches = pr.labels + .filter(label => label.name.startsWith('backport-to-')) + .map(label => label.name.replace('backport-to-', '')); + + console.log(`Commit ${process.env.GITHUB_SHA} is associated with PR ${pr.number}`); + console.log(`Backport branches: ${backportBranches}`); + core.setOutput('branches', JSON.stringify(backportBranches)); + } else { + console.log('No pull request found for this commit.'); + core.setOutput('branches', '[]'); + } + + trigger-backport: + name: "Trigger Backport" + uses: ./.github/workflows/backport-cli.yml + needs: get-pr-info + if: ${{ needs.get-pr-info.outputs.branches != '[]' }} + strategy: + matrix: + branch: ${{ fromJSON(needs.get-pr-info.outputs.branches) }} + fail-fast: false + permissions: + contents: write + pull-requests: write + with: + target-branch: ${{ matrix.branch }} + commit-sha: ${{ needs.get-pr-info.outputs.commit-sha }} diff --git a/.github/workflows/backport-cli.yml b/.github/workflows/backport-cli.yml new file mode 100644 index 0000000000000..53243006137a6 --- /dev/null +++ b/.github/workflows/backport-cli.yml @@ -0,0 +1,125 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# +--- +name: Backport Commit +on: # yamllint disable-line rule:truthy + workflow_dispatch: + inputs: + commit-sha: + description: "Commit sha to backport." + required: true + type: string + target-branch: + description: "Target branch to backport." + required: true + type: string + + workflow_call: + inputs: + commit-sha: + description: "Commit sha to backport." + required: true + type: string + target-branch: + description: "Target branch to backport." 
+ required: true + type: string + +permissions: + # Those permissions are only active for workflow dispatch (only committers can trigger it) and workflow call + # Which is triggered automatically by "automatic-backport" push workflow (only when merging by committer) + # Branch protection prevents from pushing to the "code" branches + contents: write + pull-requests: write +jobs: + backport: + runs-on: ubuntu-latest + + steps: + - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" + id: checkout-for-backport + uses: actions/checkout@v4 + with: + persist-credentials: true + fetch-depth: 0 + + - name: Install Python dependencies + run: | + python -m pip install --upgrade pip + python -m pip install cherry-picker==2.4.0 requests==2.32.3 + + - name: Run backport script + id: execute-backport + env: + GH_AUTH: ${{ secrets.GITHUB_TOKEN }} + TARGET_BRANCH: ${{ inputs.target-branch }} + COMMIT_SHA: ${{ inputs.commit-sha }} + run: | + git config --global user.email "name@example.com" + git config --global user.name "Your Name" + set +e + { + echo 'cherry_picker_output<<EOF' + cherry_picker "${COMMIT_SHA}" "${TARGET_BRANCH}" + echo 'EOF' + } >> "${GITHUB_OUTPUT}" + continue-on-error: true + + - name: Parse backport output + id: parse-backport-output + env: + CHERRY_PICKER_OUTPUT: ${{ steps.execute-backport.outputs.cherry_picker_output }} + run: | + set +e + echo "${CHERRY_PICKER_OUTPUT}" + + url=$(echo "${CHERRY_PICKER_OUTPUT}" | \ + grep -o 'Backport PR created at https://[^ ]*' | \ + awk '{print $5}') + + url=${url:-"EMPTY"} + if [ "$url" == "EMPTY" ]; then + # If the backport failed, abort the workflow + cherry_picker --abort + fi + echo "backport-url=$url" >> "${GITHUB_OUTPUT}" + continue-on-error: true + + - name: Update Status + id: backport-status + env: + GH_TOKEN: ${{ github.token }} + REPOSITORY: ${{ github.repository }} + RUN_ID: ${{ github.run_id }} + COMMIT_SHA: ${{ inputs.commit-sha }} + TARGET_BRANCH: ${{ inputs.target-branch }} + BACKPORT_URL: ${{ steps.parse-backport-output.outputs.backport-url }} + run: | + COMMIT_INFO_URL="https://api.github.com/repos/$REPOSITORY/commits/" + COMMIT_INFO_URL="${COMMIT_INFO_URL}$COMMIT_SHA/pulls" + + PR_NUMBER=$(gh api \ + -H "Accept: application/vnd.github+json" \ + -H "X-GitHub-Api-Version: 2022-11-28" \ + /repos/$REPOSITORY/commits/$COMMIT_SHA/pulls \ + --jq '.[0].number') + + python ./dev/backport/update_backport_status.py \ + $BACKPORT_URL \ + $COMMIT_SHA $TARGET_BRANCH \ + "$PR_NUMBER" diff --git a/.github/workflows/basic-tests.yml b/.github/workflows/basic-tests.yml index 5141feae22380..356257a314a6e 100644 --- a/.github/workflows/basic-tests.yml +++ b/.github/workflows/basic-tests.yml @@ -24,6 +24,10 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining public runners."
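The "Parse backport output" step above extracts the backport PR URL with a grep/awk pipeline. A Python restatement of that extraction, purely for illustration (the workflow itself stays in bash):

# Sketch of the "Parse backport output" logic; mirrors the grep pattern used above.
import re

def extract_backport_url(cherry_picker_output: str) -> str:
    # The workflow greps for a line of the form:
    #   Backport PR created at https://github.com/<org>/<repo>/pull/<number>
    match = re.search(r"Backport PR created at (https://\S+)", cherry_picker_output)
    return match.group(1) if match else "EMPTY"

assert extract_backport_url("Backport PR created at https://github.com/apache/airflow/pull/1") != "EMPTY"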
required: true type: string + run-ui-tests: + description: "Whether to run UI tests (true/false)" + required: true + type: string run-www-tests: description: "Whether to run WWW tests (true/false)" required: true @@ -52,12 +56,12 @@ on: # yamllint disable-line rule:truthy description: "Whether to run only latest version checks (true/false)" required: true type: string - enable-aip-44: - description: "Whether to enable AIP-44 (true/false)" + use-uv: + description: "Whether to use uv in the image" required: true type: string -env: - AIRFLOW_ENABLE_AIP_44: "${{ inputs.enable-aip-44 }}" +permissions: + contents: read jobs: run-breeze-tests: timeout-minutes: 10 @@ -74,13 +78,11 @@ jobs: persist-credentials: false - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - - uses: actions/setup-python@v5 + - name: "Install Breeze" + uses: ./.github/actions/breeze with: - python-version: "${{ inputs.default-python-version }}" - cache: 'pip' - cache-dependency-path: ./dev/breeze/pyproject.toml - - run: pip install --editable ./dev/breeze/ - - run: python -m pytest -n auto --color=yes + use-uv: ${{ inputs.use-uv }} + - run: uv tool run --from apache-airflow-breeze pytest -n auto --color=yes working-directory: ./dev/breeze/ tests-www: @@ -102,21 +104,32 @@ jobs: uses: actions/setup-node@v4 with: node-version: 21 - - name: "Cache eslint" - uses: actions/cache@v4 + - name: "Restore eslint cache (www)" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c with: - path: 'airflow/www/node_modules' - key: ${{ runner.os }}-www-node-modules-${{ hashFiles('airflow/www/**/yarn.lock') }} + path: airflow/www/node_modules/ + key: cache-www-node-modules-v1-${{ runner.os }}-${{ hashFiles('airflow/www/**/yarn.lock') }} + id: restore-eslint-cache - run: yarn --cwd airflow/www/ install --frozen-lockfile --non-interactive - run: yarn --cwd airflow/www/ run test env: FORCE_COLOR: 2 + - name: "Save eslint cache (www)" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + path: airflow/www/node_modules/ + key: cache-www-node-modules-v1-${{ runner.os }}-${{ hashFiles('airflow/www/**/yarn.lock') }} + if-no-files-found: 'error' + retention-days: '2' + if: steps.restore-eslint-cache.outputs.stash-hit != 'true' - test-openapi-client: - timeout-minutes: 10 - name: "Test OpenAPI client" + install-pre-commit: + timeout-minutes: 5 + name: "Install pre-commit for cache" runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} - if: inputs.needs-api-codegen == 'true' + env: + PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + if: inputs.basic-checks-only == 'true' steps: - name: "Cleanup repo" shell: bash @@ -124,80 +137,17 @@ jobs: - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" uses: actions/checkout@v4 with: - fetch-depth: 2 - persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 - with: - repository: "apache/airflow-client-python" - fetch-depth: 1 persist-credentials: false - path: ./airflow-client-python - name: "Install Breeze" uses: ./.github/actions/breeze - - name: "Generate client with breeze" - run: > - breeze release-management prepare-python-client --package-format both - --version-suffix-for-pypi dev0 --python-client-repo ./airflow-client-python - - name: "Show diff" - run: git diff --color HEAD - working-directory: ./airflow-client-python - - name: Install hatch - run: | - 
python -m pip install --upgrade pipx - pipx ensurepath - pipx install hatch --force - - name: Run tests - run: hatch run run-coverage - env: - HATCH_ENV: "test" - working-directory: ./clients/python - - name: "Install Airflow in editable mode with fab for webserver tests" - run: pip install -e ".[fab]" - - name: "Install Python client" - run: pip install ./dist/apache_airflow_client-*.whl - - name: "Initialize Airflow DB and start webserver" - run: | - airflow db init - # Let scheduler runs a few loops and get all DAG files from example DAGs serialized to DB - airflow scheduler --num-runs 100 - airflow users create --username admin --password admin --firstname Admin --lastname Admin \ - --role Admin --email admin@example.org - killall python || true # just in case there is a webserver running in the background - nohup airflow webserver --port 8080 & - echo "Started webserver" - env: - AIRFLOW__API__AUTH_BACKENDS: airflow.api.auth.backend.session,airflow.api.auth.backend.basic_auth - AIRFLOW__WEBSERVER__EXPOSE_CONFIG: "True" - AIRFLOW__CORE__LOAD_EXAMPLES: "True" - AIRFLOW_HOME: "${{ github.workspace }}/airflow_home" - - name: "Waiting for the webserver to be available" - run: | - timeout 30 bash -c 'until nc -z $0 $1; do echo "sleeping"; sleep 1; done' localhost 8080 - sleep 5 - - name: "Run test python client" - run: python ./clients/python/test_python_client.py - env: - FORCE_COLOR: "standard" - - name: "Stop running webserver" - run: killall python || true # just in case there is a webserver running in the background - if: always() - - name: "Upload python client packages" - uses: actions/upload-artifact@v4 with: - name: python-client-packages - path: ./dist/apache_airflow_client-* - retention-days: 7 - if-no-files-found: error - - name: "Upload logs from failed tests" - uses: actions/upload-artifact@v4 - if: failure() + use-uv: ${{ inputs.use-uv }} + id: breeze + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit with: - name: python-client-failed-logs - path: "${{ github.workspace }}/airflow_home/logs" - retention-days: 7 + python-version: ${{steps.breeze.outputs.host-python-version}} # Those checks are run if no image needs to be built for checks. 
This is for simple changes that # Do not touch any of the python code or any of the important files that might require building @@ -206,6 +156,7 @@ jobs: timeout-minutes: 30 name: "Static checks: basic checks only" runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + needs: install-pre-commit if: inputs.basic-checks-only == 'true' steps: - name: "Cleanup repo" @@ -217,20 +168,10 @@ jobs: persist-credentials: false - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - - name: "Setup python" - uses: actions/setup-python@v5 - with: - python-version: ${{ inputs.default-python-version }} - cache: 'pip' - cache-dependency-path: ./dev/breeze/pyproject.toml - - name: "Setup python" - uses: actions/setup-python@v5 - with: - python-version: "${{ inputs.default-python-version }}" - cache: 'pip' - cache-dependency-path: ./dev/breeze/pyproject.toml - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} id: breeze - name: "Install pre-commit" uses: ./.github/actions/install-pre-commit @@ -268,6 +209,7 @@ jobs: timeout-minutes: 45 name: "Upgrade checks" runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + needs: install-pre-commit env: PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" if: inputs.canary-run == 'true' && inputs.latest-versions-only != 'true' @@ -281,12 +223,16 @@ jobs: persist-credentials: false - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - # Install python from scratch. No cache used. We always want to have fresh version of everything - - uses: actions/setup-python@v5 + - name: "Install Breeze" + uses: ./.github/actions/breeze with: - python-version: "${{ inputs.default-python-version }}" - - name: "Install latest pre-commit" - run: pip install pre-commit + use-uv: ${{ inputs.use-uv }} + id: breeze + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit + with: + python-version: ${{steps.breeze.outputs.host-python-version}} - name: "Autoupdate all pre-commits" run: pre-commit autoupdate - name: "Run automated upgrade for black" @@ -319,11 +265,12 @@ jobs: run: > pre-commit run --all-files --show-diff-on-failure --color always --verbose - --hook-stage manual update-installers || true + --hook-stage manual update-installers-and-pre-commit || true if: always() env: UPGRADE_UV: "true" UPGRADE_PIP: "false" + UPGRADE_PRE_COMMIT: "true" - name: "Run automated upgrade for pip" run: > pre-commit run @@ -340,11 +287,11 @@ jobs: runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} env: PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} VERBOSE: "true" + if: inputs.canary-run == 'true' steps: - name: "Cleanup repo" shell: bash @@ -357,6 +304,8 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} - name: "Cleanup dist files" run: rm -fv ./dist/* - name: Setup git for tagging @@ -366,18 +315,20 @@ jobs: - name: Install twine run: pip install twine - name: "Check Airflow create minor branch command" - run: breeze release-management create-minor-branch --version-branch 2-8 --answer yes + run: | + ./scripts/ci/testing/run_breeze_command_with_retries.sh \ + release-management create-minor-branch --version-branch 2-8 --answer yes - name: "Check Airflow RC process command" - run: > - breeze release-management 
start-rc-process --version 2.8.3rc1 --previous-version 2.8.0 - --sync-branch sync_v2_8_test --answer yes + run: | + ./scripts/ci/testing/run_breeze_command_with_retries.sh \ + release-management start-rc-process --version 2.8.3rc1 --previous-version 2.8.0 --answer yes - name: "Check Airflow release process command" - run: > - breeze release-management start-release --release-candidate 2.8.3rc1 --previous-release 2.8.0 - --answer yes + run: | + ./scripts/ci/testing/run_breeze_command_with_retries.sh \ + release-management start-release --release-candidate 2.8.3rc1 --previous-release 2.8.0 --answer yes - name: "Fetch all git tags" run: git fetch --tags >/dev/null 2>&1 || true - name: "Test airflow core issue generation automatically" run: | - breeze release-management generate-issue-content-core \ - --limit-pr-count 25 --latest --verbose + ./scripts/ci/testing/run_breeze_command_with_retries.sh \ + release-management generate-issue-content-core --limit-pr-count 25 --latest --verbose diff --git a/.github/workflows/build-images.yml b/.github/workflows/build-images.yml deleted file mode 100644 index 55e6c5d2018b9..0000000000000 --- a/.github/workflows/build-images.yml +++ /dev/null @@ -1,259 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
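The release-management smoke tests above now go through scripts/ci/testing/run_breeze_command_with_retries.sh, whose contents are not part of this hunk. A minimal sketch of the retry pattern such a wrapper presumably implements, in Python for illustration (attempt count and delay are assumptions, not values from the script):

# Illustrative retry wrapper around a breeze command; not the actual bash helper.
import subprocess
import sys
import time

def run_with_retries(args: list[str], attempts: int = 3, delay: float = 10.0) -> None:
    for attempt in range(1, attempts + 1):
        result = subprocess.run(["breeze", *args])
        if result.returncode == 0:
            return
        if attempt < attempts:
            print(f"Attempt {attempt}/{attempts} failed, retrying in {delay}s", file=sys.stderr)
            time.sleep(delay)
    sys.exit(result.returncode)

run_with_retries(["release-management", "create-minor-branch", "--version-branch", "2-8", "--answer", "yes"])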
-# ---- -name: Build Images -run-name: > - Build images for ${{ github.event.pull_request.title }} ${{ github.event.pull_request._links.html.href }} -on: # yamllint disable-line rule:truthy - pull_request_target: - branches: - - main - - v2-10-stable - - v2-10-test -permissions: - # all other permissions are set to none - contents: read - pull-requests: read - packages: read -env: - ANSWER: "yes" - # You can override CONSTRAINTS_GITHUB_REPOSITORY by setting secret in your repo but by default the - # Airflow one is going to be used - CONSTRAINTS_GITHUB_REPOSITORY: >- - ${{ secrets.CONSTRAINTS_GITHUB_REPOSITORY != '' && - secrets.CONSTRAINTS_GITHUB_REPOSITORY || 'apache/airflow' }} - # This token is WRITE one - pull_request_target type of events always have the WRITE token - DB_RESET: "true" - GITHUB_REPOSITORY: ${{ github.repository }} - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ github.event.pull_request.head.sha || github.sha }}" - INCLUDE_SUCCESS_OUTPUTS: "true" - USE_SUDO: "true" - VERBOSE: "true" - -concurrency: - group: build-${{ github.event.pull_request.number || github.ref }} - cancel-in-progress: true - -jobs: - build-info: - timeout-minutes: 10 - name: Build Info - # At build-info stage we do not yet have outputs so we need to hard-code the runs-on to public runners - runs-on: ["ubuntu-22.04"] - env: - TARGET_BRANCH: ${{ github.event.pull_request.base.ref }} - outputs: - image-tag: ${{ github.event.pull_request.head.sha || github.sha }} - python-versions: ${{ steps.selective-checks.outputs.python-versions }} - python-versions-list-as-string: ${{ steps.selective-checks.outputs.python-versions-list-as-string }} - default-python-version: ${{ steps.selective-checks.outputs.default-python-version }} - upgrade-to-newer-dependencies: ${{ steps.selective-checks.outputs.upgrade-to-newer-dependencies }} - run-tests: ${{ steps.selective-checks.outputs.run-tests }} - run-kubernetes-tests: ${{ steps.selective-checks.outputs.run-kubernetes-tests }} - ci-image-build: ${{ steps.selective-checks.outputs.ci-image-build }} - prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }} - docker-cache: ${{ steps.selective-checks.outputs.docker-cache }} - default-branch: ${{ steps.selective-checks.outputs.default-branch }} - force-pip: ${{ steps.selective-checks.outputs.force-pip }} - constraints-branch: ${{ steps.selective-checks.outputs.default-constraints-branch }} - runs-on-as-json-default: ${{ steps.selective-checks.outputs.runs-on-as-json-default }} - runs-on-as-json-public: ${{ steps.selective-checks.outputs.runs-on-as-json-public }} - runs-on-as-json-self-hosted: ${{ steps.selective-checks.outputs.runs-on-as-json-self-hosted }} - is-self-hosted-runner: ${{ steps.selective-checks.outputs.is-self-hosted-runner }} - is-committer-build: ${{ steps.selective-checks.outputs.is-committer-build }} - is-airflow-runner: ${{ steps.selective-checks.outputs.is-airflow-runner }} - is-amd-runner: ${{ steps.selective-checks.outputs.is-amd-runner }} - is-arm-runner: ${{ steps.selective-checks.outputs.is-arm-runner }} - is-vm-runner: ${{ steps.selective-checks.outputs.is-vm-runner }} - is-k8s-runner: ${{ steps.selective-checks.outputs.is-k8s-runner }} - chicken-egg-providers: ${{ steps.selective-checks.outputs.chicken-egg-providers }} - target-commit-sha: "${{steps.discover-pr-merge-commit.outputs.target-commit-sha || - github.event.pull_request.head.sha || - github.sha - }}" - if: github.repository == 'apache/airflow' - steps: - - name: Cleanup 
repo - shell: bash - run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - - name: Discover PR merge commit - id: discover-pr-merge-commit - run: | - # Sometimes target-commit-sha cannot be - TARGET_COMMIT_SHA="$(gh api '${{ github.event.pull_request.url }}' --jq .merge_commit_sha)" - if [[ ${TARGET_COMMIT_SHA} == "" ]]; then - # Sometimes retrieving the merge commit SHA from PR fails. We retry it once. Otherwise we - # fall-back to github.event.pull_request.head.sha - echo - echo "Could not retrieve merge commit SHA from PR, waiting for 3 seconds and retrying." - echo - sleep 3 - TARGET_COMMIT_SHA="$(gh api '${{ github.event.pull_request.url }}' --jq .merge_commit_sha)" - if [[ ${TARGET_COMMIT_SHA} == "" ]]; then - echo - echo "Could not retrieve merge commit SHA from PR, falling back to PR head SHA." - echo - TARGET_COMMIT_SHA="${{ github.event.pull_request.head.sha }}" - fi - fi - echo "TARGET_COMMIT_SHA=${TARGET_COMMIT_SHA}" - echo "TARGET_COMMIT_SHA=${TARGET_COMMIT_SHA}" >> ${GITHUB_ENV} - echo "target-commit-sha=${TARGET_COMMIT_SHA}" >> ${GITHUB_OUTPUT} - if: github.event_name == 'pull_request_target' - # The labels in the event aren't updated when re-triggering the job, So lets hit the API to get - # up-to-date values - - name: Get latest PR labels - id: get-latest-pr-labels - run: | - echo -n "pull-request-labels=" >> ${GITHUB_OUTPUT} - gh api graphql --paginate -F node_id=${{github.event.pull_request.node_id}} -f query=' - query($node_id: ID!, $endCursor: String) { - node(id:$node_id) { - ... on PullRequest { - labels(first: 100, after: $endCursor) { - nodes { name } - pageInfo { hasNextPage endCursor } - } - } - } - }' --jq '.data.node.labels.nodes[]' | jq --slurp -c '[.[].name]' >> ${GITHUB_OUTPUT} - if: github.event_name == 'pull_request_target' - - uses: actions/checkout@v4 - with: - ref: ${{ env.TARGET_COMMIT_SHA }} - persist-credentials: false - fetch-depth: 2 - #################################################################################################### - # WE ONLY DO THAT CHECKOUT ABOVE TO RETRIEVE THE TARGET COMMIT AND IT'S PARENT. DO NOT RUN ANY CODE - # RIGHT AFTER THAT AS WE ARE GOING TO RESTORE THE TARGET BRANCH CODE IN THE NEXT STEP. - #################################################################################################### - - name: Checkout target branch to use ci/scripts and breeze from there. - uses: actions/checkout@v4 - with: - ref: ${{ github.base_ref }} - persist-credentials: false - #################################################################################################### - # HERE EVERYTHING IS PERFECTLY SAFE TO RUN. AT THIS POINT WE HAVE THE TARGET BRANCH CHECKED OUT - # AND WE CAN RUN ANY CODE FROM IT. WE CAN RUN BREEZE COMMANDS, WE CAN RUN SCRIPTS, WE CAN RUN - # COMPOSITE ACTIONS. WE CAN RUN ANYTHING THAT IS IN THE TARGET BRANCH AND THERE IS NO RISK THAT - # CODE WILL BE RUN FROM THE PR. - #################################################################################################### - - name: Cleanup docker - run: ./scripts/ci/cleanup_docker.sh - - name: Setup python - uses: actions/setup-python@v5 - with: - python-version: "3.9" - - name: Install Breeze - uses: ./.github/actions/breeze - #################################################################################################### - # WE RUN SELECTIVE CHECKS HERE USING THE TARGET COMMIT AND ITS PARENT TO BE ABLE TO COMPARE THEM - # AND SEE WHAT HAS CHANGED IN THE PR. 
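The removed "Discover PR merge commit" step retries the lookup once and then falls back to the PR head SHA. A compact restatement of that control flow, for illustration only (the deleted workflow shells out to `gh api` directly and reads environment variables instead of function arguments):

# Sketch of the retry-then-fallback logic of the removed step.
import subprocess
import time

def merge_commit_sha(pr_api_url: str, head_sha: str) -> str:
    for _ in range(2):  # first try plus one retry, mirroring the 3-second wait above
        sha = subprocess.run(
            ["gh", "api", pr_api_url, "--jq", ".merge_commit_sha"],
            capture_output=True, text=True, check=False,
        ).stdout.strip()
        if sha:
            return sha
        time.sleep(3)
    return head_sha  # fall back to the PR head SHA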
THE CODE IS STILL RUN FROM THE TARGET BRANCH, SO IT IS SAFE - # TO RUN IT, WE ONLY PASS TARGET_COMMIT_SHA SO THAT SELECTIVE CHECKS CAN SEE WHAT'S COMING IN THE PR - #################################################################################################### - - name: Selective checks - id: selective-checks - env: - PR_LABELS: "${{ steps.get-latest-pr-labels.outputs.pull-request-labels }}" - COMMIT_REF: "${{ env.TARGET_COMMIT_SHA }}" - VERBOSE: "false" - AIRFLOW_SOURCES_ROOT: "${{ github.workspace }}" - run: breeze ci selective-check 2>> ${GITHUB_OUTPUT} - - name: env - run: printenv - env: - PR_LABELS: ${{ steps.get-latest-pr-labels.outputs.pull-request-labels }} - GITHUB_CONTEXT: ${{ toJson(github) }} - - - build-ci-images: - name: Build CI images - permissions: - contents: read - packages: write - secrets: inherit - needs: [build-info] - uses: ./.github/workflows/ci-image-build.yml - # Only run this it if the PR comes from fork, otherwise build will be done "in-PR-workflow" - if: | - needs.build-info.outputs.ci-image-build == 'true' && - github.event.pull_request.head.repo.full_name != 'apache/airflow' - with: - runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} - runs-on-as-json-self-hosted: ${{ needs.build-info.outputs.runs-on-as-json-self-hosted }} - do-build: ${{ needs.build-info.outputs.ci-image-build }} - target-commit-sha: ${{ needs.build-info.outputs.target-commit-sha }} - pull-request-target: "true" - is-committer-build: ${{ needs.build-info.outputs.is-committer-build }} - push-image: "true" - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} - image-tag: ${{ needs.build-info.outputs.image-tag }} - platform: "linux/amd64" - python-versions: ${{ needs.build-info.outputs.python-versions }} - branch: ${{ needs.build-info.outputs.default-branch }} - constraints-branch: ${{ needs.build-info.outputs.constraints-branch }} - upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} - docker-cache: ${{ needs.build-info.outputs.docker-cache }} - - generate-constraints: - name: Generate constraints - needs: [build-info, build-ci-images] - uses: ./.github/workflows/generate-constraints.yml - with: - runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} - python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }} - # For regular PRs we do not need "no providers" constraints - they are only needed in canary builds - generate-no-providers-constraints: "false" - image-tag: ${{ needs.build-info.outputs.image-tag }} - chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} - debug-resources: ${{ needs.build-info.outputs.debug-resources }} - - build-prod-images: - name: Build PROD images - permissions: - contents: read - packages: write - secrets: inherit - needs: [build-info, generate-constraints] - uses: ./.github/workflows/prod-image-build.yml - # Only run this it if the PR comes from fork, otherwise build will be done "in-PR-workflow" - if: | - needs.build-info.outputs.prod-image-build == 'true' && - github.event.pull_request.head.repo.full_name != 'apache/airflow' - with: - runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} - build-type: "Regular" - do-build: ${{ needs.build-info.outputs.ci-image-build }} - upload-package-artifact: "true" - target-commit-sha: ${{ needs.build-info.outputs.target-commit-sha }} - pull-request-target: "true" - is-committer-build: ${{ 
needs.build-info.outputs.is-committer-build }} - push-image: "true" - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} - image-tag: ${{ needs.build-info.outputs.image-tag }} - platform: linux/amd64 - python-versions: ${{ needs.build-info.outputs.python-versions }} - default-python-version: ${{ needs.build-info.outputs.default-python-version }} - branch: ${{ needs.build-info.outputs.default-branch }} - constraints-branch: ${{ needs.build-info.outputs.constraints-branch }} - build-provider-packages: ${{ needs.build-info.outputs.default-branch == 'main' }} - upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} - chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} - docker-cache: ${{ needs.build-info.outputs.docker-cache }} diff --git a/.github/workflows/ci-image-build.yml b/.github/workflows/ci-image-build.yml index 1c4b31b55a604..9283dc06b936f 100644 --- a/.github/workflows/ci-image-build.yml +++ b/.github/workflows/ci-image-build.yml @@ -28,13 +28,6 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining self-hosted runners." required: true type: string - do-build: - description: > - Whether to actually do the build (true/false). If set to false, the build is done - already in pull-request-target workflow, so we skip it here. - required: false - default: "true" - type: string target-commit-sha: description: "The commit SHA to checkout for the build" required: false @@ -59,6 +52,14 @@ on: # yamllint disable-line rule:truthy required: false default: "true" type: string + upload-image-artifact: + description: "Whether to upload docker image artifact" + required: true + type: string + upload-mount-cache-artifact: + description: "Whether to upload mount-cache artifact" + required: true + type: string debian-version: description: "Base Debian distribution to use for the build (bookworm)" type: string @@ -71,10 +72,6 @@ on: # yamllint disable-line rule:truthy description: "Whether to use uv to build the image (true/false)" required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string python-versions: description: "JSON-formatted array of Python versions to build images from" required: true @@ -95,25 +92,20 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." + required: true + type: string +permissions: + contents: read jobs: build-ci-images: strategy: fail-fast: true matrix: - # yamllint disable-line rule:line-length - python-version: ${{ inputs.do-build == 'true' && fromJSON(inputs.python-versions) || fromJSON('[""]') }} + python-version: ${{ fromJSON(inputs.python-versions) || fromJSON('[""]') }} timeout-minutes: 110 - name: "\ -${{ inputs.do-build == 'true' && 'Build' || 'Skip building' }} \ -CI ${{ inputs.platform }} image\ -${{ matrix.python-version }}${{ inputs.do-build == 'true' && ':' || '' }}\ -${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" - # The ARM images need to be built using self-hosted runners as ARM macos public runners - # do not yet allow us to run docker effectively and fast. 
- # https://github.com/actions/runner-images/issues/9254#issuecomment-1917916016 - # https://github.com/abiosoft/colima/issues/970 - # https://github.com/actions/runner/issues/1456 - # See https://github.com/apache/airflow/pull/38640 + name: "Build CI ${{ inputs.platform }} image ${{ matrix.python-version }}" # NOTE!!!!! This has to be put in one line for runs-on to recognize the "fromJSON" properly !!!! # adding space before (with >) apparently turns the `runs-on` processed line into a string "Array" # instead of an array of strings. @@ -121,56 +113,54 @@ ${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" runs-on: ${{ (inputs.platform == 'linux/amd64') && fromJSON(inputs.runs-on-as-json-public) || fromJSON(inputs.runs-on-as-json-self-hosted) }} env: BACKEND: sqlite + PYTHON_MAJOR_MINOR_VERSION: ${{ matrix.python-version }} DEFAULT_BRANCH: ${{ inputs.branch }} DEFAULT_CONSTRAINTS_BRANCH: ${{ inputs.constraints-branch }} VERSION_SUFFIX_FOR_PYPI: "dev0" GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - USE_UV: ${{ inputs.use-uv }} VERBOSE: "true" steps: - name: "Cleanup repo" shell: bash run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - if: inputs.do-build == 'true' - name: "Checkout target branch" uses: actions/checkout@v4 with: persist-credentials: false - - name: "Checkout target commit" - uses: ./.github/actions/checkout_target_commit - if: inputs.do-build == 'true' - with: - target-commit-sha: ${{ inputs.target-commit-sha }} - pull-request-target: ${{ inputs.pull-request-target }} - is-committer-build: ${{ inputs.is-committer-build }} - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - if: inputs.do-build == 'true' - name: "Install Breeze" uses: ./.github/actions/breeze - if: inputs.do-build == 'true' - - name: "Regenerate dependencies in case they were modified manually so that we can build an image" - shell: bash - run: | - pip install rich>=12.4.4 pyyaml - python scripts/ci/pre_commit/update_providers_dependencies.py - if: inputs.do-build == 'true' && inputs.upgrade-to-newer-dependencies != 'false' - - name: "Start ARM instance" - run: ./scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh - if: inputs.do-build == 'true' && inputs.platform == 'linux/arm64' - - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - if: inputs.do-build == 'true' + with: + use-uv: ${{ inputs.use-uv }} + - name: "Restore ci-cache mount image ${{ inputs.platform }}:${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: "ci-cache-mount-save-v2-${{ inputs.platform }}-${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + path: "/tmp/" + id: restore-cache-mount + - name: "Import mount-cache ${{ inputs.platform }}:${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + env: + PYTHON_MAJOR_MINOR_VERSION: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} + run: > + breeze ci-image import-mount-cache + --cache-file /tmp/ci-cache-mount-save-v2-${PYTHON_MAJOR_MINOR_VERSION}.tar.gz + if: steps.restore-cache-mount.outputs.stash-hit == 'true' + - name: "Login to ghcr.io" + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + ACTOR: ${{ github.actor }} + run: echo "${GITHUB_TOKEN}" | docker login ghcr.io -u ${ACTOR} --password-stdin - name: > Build ${{ inputs.push-image == 'true' && ' & push ' || '' }} - ${{ inputs.platform }}:${{ matrix.python-version 
}}:${{ inputs.image-tag }} + ${{ inputs.platform }}:${{ env.PYTHON_MAJOR_MINOR_VERSION }} image run: > - breeze ci-image build --builder airflow_cache --tag-as-latest --image-tag "${{ inputs.image-tag }}" - --python "${{ matrix.python-version }}" --platform "${{ inputs.platform }}" + breeze ci-image build --platform "${PLATFORM}" env: DOCKER_CACHE: ${{ inputs.docker-cache }} + DISABLE_AIRFLOW_REPO_CACHE: ${{ inputs.disable-airflow-repo-cache }} INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} UPGRADE_TO_NEWER_DEPENDENCIES: ${{ inputs.upgrade-to-newer-dependencies }} # You can override CONSTRAINTS_GITHUB_REPOSITORY by setting secret in your repo but by default the @@ -184,7 +174,38 @@ ${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" GITHUB_USERNAME: ${{ github.actor }} PUSH: ${{ inputs.push-image }} VERBOSE: "true" - if: inputs.do-build == 'true' - - name: "Stop ARM instance" - run: ./scripts/ci/images/ci_stop_arm_instance.sh - if: always() && inputs.do-build == 'true' && inputs.platform == 'linux/arm64' + PLATFORM: ${{ inputs.platform }} + - name: Check free space + run: df -H + shell: bash + - name: Make /mnt/ directory writeable + run: sudo chown -R ${USER} /mnt + shell: bash + - name: "Export CI docker image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + env: + PLATFORM: ${{ inputs.platform }} + run: breeze ci-image save --platform "${PLATFORM}" --image-file-dir "/mnt" + if: inputs.upload-image-artifact == 'true' + - name: "Stash CI docker image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: ci-image-save-${{ inputs.platform }}-${{ env.PYTHON_MAJOR_MINOR_VERSION }} + path: "/mnt/ci-image-save-*-${{ env.PYTHON_MAJOR_MINOR_VERSION }}.tar" + if-no-files-found: 'error' + retention-days: '2' + if: inputs.upload-image-artifact == 'true' + - name: "Export mount cache ${{ inputs.platform }}:${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + env: + PYTHON_MAJOR_MINOR_VERSION: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} + run: > + breeze ci-image export-mount-cache + --cache-file /tmp/ci-cache-mount-save-v2-${PYTHON_MAJOR_MINOR_VERSION}.tar.gz + if: inputs.upload-mount-cache-artifact == 'true' + - name: "Stash cache mount ${{ inputs.platform }}:${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: "ci-cache-mount-save-v2-${{ inputs.platform }}-${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + path: "/tmp/ci-cache-mount-save-v2-${{ env.PYTHON_MAJOR_MINOR_VERSION }}.tar.gz" + if-no-files-found: 'error' + retention-days: 2 + if: inputs.upload-mount-cache-artifact == 'true' diff --git a/.github/workflows/static-checks-mypy-docs.yml b/.github/workflows/ci-image-checks.yml similarity index 72% rename from .github/workflows/static-checks-mypy-docs.yml rename to .github/workflows/ci-image-checks.yml index be2c4f8e28645..06edff1101f97 100644 --- a/.github/workflows/static-checks-mypy-docs.yml +++ b/.github/workflows/ci-image-checks.yml @@ -16,7 +16,7 @@ # under the License. # --- -name: Static checks, mypy, docs +name: CI Image Checks on: # yamllint disable-line rule:truthy workflow_call: inputs: @@ -28,10 +28,6 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining the labels used for docs build." 
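The export steps above write multi-gigabyte image tarballs under /mnt rather than /tmp. The workflow itself just runs `df -H` and uses /mnt unconditionally, but the underlying space consideration can be illustrated with a short sketch (candidate paths are illustrative):

# Illustrative only: pick the candidate directory with the most free space
# before dumping large image archives.
import shutil

def best_dump_dir(candidates: tuple[str, ...] = ("/mnt", "/tmp")) -> str:
    return max(candidates, key=lambda path: shutil.disk_usage(path).free)

print(best_dump_dir())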
required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string needs-mypy: description: "Whether to run mypy checks (true/false)" required: true @@ -96,15 +92,77 @@ on: # yamllint disable-line rule:truthy description: "Whether to build docs (true/false)" required: true type: string + needs-api-codegen: + description: "Whether to run API codegen (true/false)" + required: true + type: string + default-postgres-version: + description: "The default version of the postgres to use" + required: true + type: string + run-coverage: + description: "Whether to run coverage or not (true/false)" + required: true + type: string + use-uv: + description: "Whether to use uv to build the image (true/false)" + required: true + type: string +permissions: + contents: read jobs: + install-pre-commit: + timeout-minutes: 5 + name: "Install pre-commit for cache (only canary runs)" + runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} + env: + PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + if: inputs.basic-checks-only == 'false' + steps: + - name: "Cleanup repo" + shell: bash + run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" + if: inputs.canary-run == 'true' + - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" + uses: actions/checkout@v4 + with: + persist-credentials: false + if: inputs.canary-run == 'true' + - name: "Install Breeze" + uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} + id: breeze + if: inputs.canary-run == 'true' + - name: "Install pre-commit" + uses: ./.github/actions/install-pre-commit + id: pre-commit + with: + python-version: ${{steps.breeze.outputs.host-python-version}} + if: inputs.canary-run == 'true' + - name: "Prepare .tar file from pre-commit cache" + run: | + tar -C ~ -czf /tmp/cache-pre-commit.tar.gz .cache/pre-commit .cache/uv + shell: bash + if: inputs.canary-run == 'true' + - name: "Save pre-commit cache" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + # yamllint disable rule:line-length + key: cache-pre-commit-v4-${{ steps.breeze.outputs.host-python-version }}-${{ hashFiles('.pre-commit-config.yaml') }} + path: /tmp/cache-pre-commit.tar.gz + if-no-files-found: 'error' + retention-days: '2' + if: inputs.canary-run == 'true' + static-checks: timeout-minutes: 45 name: "Static checks" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} + needs: install-pre-commit env: PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" UPGRADE_TO_NEWER_DEPENDENCIES: "${{ inputs.upgrade-to-newer-dependencies }}" - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} if: inputs.basic-checks-only == 'false' && inputs.latest-versions-only != 'true' steps: @@ -115,16 +173,12 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Setup python" - uses: actions/setup-python@v5 - with: - python-version: ${{ inputs.default-python-version }} - cache: 'pip' - cache-dependency-path: ./dev/breeze/pyproject.toml - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{ inputs.default-python-version}}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} id: breeze - name: "Install pre-commit" uses: 
./.github/actions/install-pre-commit @@ -145,6 +199,7 @@ jobs: timeout-minutes: 45 name: "MyPy checks" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} + needs: install-pre-commit if: inputs.needs-mypy == 'true' strategy: fail-fast: false @@ -152,7 +207,6 @@ jobs: mypy-check: ${{ fromJSON(inputs.mypy-checks) }} env: PYTHON_MAJOR_MINOR_VERSION: "${{inputs.default-python-version}}" - IMAGE_TAG: "${{ inputs.image-tag }}" GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} steps: - name: "Cleanup repo" @@ -162,10 +216,12 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} id: breeze - name: "Install pre-commit" uses: ./.github/actions/install-pre-commit @@ -173,7 +229,7 @@ jobs: with: python-version: ${{steps.breeze.outputs.host-python-version}} - name: "MyPy checks for ${{ matrix.mypy-check }}" - run: pre-commit run --color always --verbose --hook-stage manual ${{matrix.mypy-check}} --all-files + run: pre-commit run --color always --verbose --hook-stage manual "$MYPY_CHECK" --all-files env: VERBOSE: "false" COLUMNS: "250" @@ -181,6 +237,7 @@ jobs: DEFAULT_BRANCH: ${{ inputs.branch }} RUFF_FORMAT: "github" INCLUDE_MYPY_VOLUME: "false" + MYPY_CHECK: ${{ matrix.mypy-check }} build-docs: timeout-minutes: 150 @@ -195,7 +252,6 @@ jobs: GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" INCLUDE_NOT_READY_PROVIDERS: "true" INCLUDE_SUCCESS_OUTPUTS: "${{ inputs.include-success-outputs }}" PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" @@ -208,28 +264,39 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image - - uses: actions/cache@v4 - id: cache-doc-inventories + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} + - name: "Restore docs inventory cache" + uses: apache/infrastructure-actions/stash/restore@c94b890bbedc2fc61466d28e6bd9966bc6c6643c with: path: ./docs/_inventory_cache/ - key: docs-inventory-${{ hashFiles('pyproject.toml;') }} - restore-keys: | - docs-inventory-${{ hashFiles('pyproject.toml;') }} - docs-inventory- + # TODO(potiuk): do better with determining the key + key: cache-docs-inventory-v1-${{ hashFiles('pyproject.toml') }} + id: restore-docs-inventory-cache - name: "Building docs with ${{ matrix.flag }} flag" + env: + DOCS_LIST_AS_STRING: ${{ inputs.docs-list-as-string }} run: > - breeze build-docs ${{ inputs.docs-list-as-string }} ${{ matrix.flag }} + breeze build-docs ${DOCS_LIST_AS_STRING} ${{ matrix.flag }} + - name: "Save docs inventory cache" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + path: ./docs/_inventory_cache/ + key: cache-docs-inventory-v1-${{ hashFiles('pyproject.toml') }} + if-no-files-found: 'error' + retention-days: '2' 
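The stash keys above embed hashFiles('.pre-commit-config.yaml') and hashFiles('pyproject.toml') so the cached archive is invalidated whenever the hashed file changes. A rough Python equivalent of building such a content-addressed key (this is a sketch of the idea, not the expression actually evaluated by GitHub Actions):

# Build a cache key that changes whenever the hashed file changes.
import hashlib
from pathlib import Path

def cache_key(prefix: str, *files: str) -> str:
    digest = hashlib.sha256()
    for file in files:
        digest.update(Path(file).read_bytes())
    return f"{prefix}-{digest.hexdigest()}"

print(cache_key("cache-pre-commit-v4-3.9", ".pre-commit-config.yaml"))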
+ if: steps.restore-docs-inventory-cache != 'true' - name: "Upload build docs" uses: actions/upload-artifact@v4 with: name: airflow-docs path: './docs/_build' - retention-days: 7 - if-no-files-found: error + retention-days: '7' + if-no-files-found: 'error' if: matrix.flag == '--docs-only' publish-docs: @@ -241,7 +308,6 @@ jobs: GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" INCLUDE_NOT_READY_PROVIDERS: "true" INCLUDE_SUCCESS_OUTPUTS: "${{ inputs.include-success-outputs }}" PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" @@ -270,12 +336,18 @@ jobs: run: > git clone https://github.com/apache/airflow-site.git /mnt/airflow-site/airflow-site && echo "AIRFLOW_SITE_DIRECTORY=/mnt/airflow-site/airflow-site" >> "$GITHUB_ENV" - - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} - name: "Publish docs" + env: + DOCS_LIST_AS_STRING: ${{ inputs.docs-list-as-string }} run: > breeze release-management publish-docs --override-versioned --run-in-parallel - ${{ inputs.docs-list-as-string }} + ${DOCS_LIST_AS_STRING} - name: Check disk space available run: df -h - name: "Generate back references for providers" diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml index 69272f7b901a4..859213e663129 100644 --- a/.github/workflows/ci.yml +++ b/.github/workflows/ci.yml @@ -21,24 +21,24 @@ on: # yamllint disable-line rule:truthy schedule: - cron: '28 1,7,13,19 * * *' push: - branches: ['v[0-9]+-[0-9]+-test'] + branches: + - v[0-9]+-[0-9]+-test + - providers-[a-z]+-?[a-z]*/v[0-9]+-[0-9]+ pull_request: - branches: ['main', 'v[0-9]+-[0-9]+-test', 'v[0-9]+-[0-9]+-stable'] + branches: + - main + - v[0-9]+-[0-9]+-test + - v[0-9]+-[0-9]+-stable + - providers-[a-z]+-?[a-z]*/v[0-9]+-[0-9]+ workflow_dispatch: permissions: - # All other permissions are set to none + # All other permissions are set to none by default contents: read - # Technically read access while waiting for images should be more than enough. However, - # there is a bug in GitHub Actions/Packages and in case private repositories are used, you get a permission - # denied error when attempting to just pull private image, changing the token permission to write solves the - # issue. This is not dangerous, because if it is for "apache/airflow", only maintainers can push ci.yml - # changes. If it is for a fork, then the token is read-only anyway. 
- packages: write env: GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ github.event.pull_request.head.sha || github.sha }}" + SLACK_BOT_TOKEN: ${{ secrets.SLACK_BOT_TOKEN }} VERBOSE: "true" concurrency: @@ -54,81 +54,88 @@ jobs: env: GITHUB_CONTEXT: ${{ toJson(github) }} outputs: - image-tag: ${{ github.event.pull_request.head.sha || github.sha }} - docker-cache: ${{ steps.selective-checks.outputs.docker-cache }} - affected-providers-list-as-string: >- - ${{ steps.selective-checks.outputs.affected-providers-list-as-string }} - upgrade-to-newer-dependencies: ${{ steps.selective-checks.outputs.upgrade-to-newer-dependencies }} - python-versions: ${{ steps.selective-checks.outputs.python-versions }} - python-versions-list-as-string: ${{ steps.selective-checks.outputs.python-versions-list-as-string }} all-python-versions-list-as-string: >- ${{ steps.selective-checks.outputs.all-python-versions-list-as-string }} - default-python-version: ${{ steps.selective-checks.outputs.default-python-version }} - kubernetes-versions-list-as-string: >- - ${{ steps.selective-checks.outputs.kubernetes-versions-list-as-string }} - kubernetes-combos-list-as-string: >- - ${{ steps.selective-checks.outputs.kubernetes-combos-list-as-string }} - default-kubernetes-version: ${{ steps.selective-checks.outputs.default-kubernetes-version }} - postgres-versions: ${{ steps.selective-checks.outputs.postgres-versions }} - default-postgres-version: ${{ steps.selective-checks.outputs.default-postgres-version }} - mysql-versions: ${{ steps.selective-checks.outputs.mysql-versions }} - default-mysql-version: ${{ steps.selective-checks.outputs.default-mysql-version }} - default-helm-version: ${{ steps.selective-checks.outputs.default-helm-version }} - default-kind-version: ${{ steps.selective-checks.outputs.default-kind-version }} - force-pip: ${{ steps.selective-checks.outputs.force-pip }} - full-tests-needed: ${{ steps.selective-checks.outputs.full-tests-needed }} - parallel-test-types-list-as-string: >- - ${{ steps.selective-checks.outputs.parallel-test-types-list-as-string }} - providers-test-types-list-as-string: >- - ${{ steps.selective-checks.outputs.providers-test-types-list-as-string }} - separate-test-types-list-as-string: >- - ${{ steps.selective-checks.outputs.separate-test-types-list-as-string }} - include-success-outputs: ${{ steps.selective-checks.outputs.include-success-outputs }} - postgres-exclude: ${{ steps.selective-checks.outputs.postgres-exclude }} - mysql-exclude: ${{ steps.selective-checks.outputs.mysql-exclude }} - sqlite-exclude: ${{ steps.selective-checks.outputs.sqlite-exclude }} - skip-provider-tests: ${{ steps.selective-checks.outputs.skip-provider-tests }} - run-tests: ${{ steps.selective-checks.outputs.run-tests }} - run-amazon-tests: ${{ steps.selective-checks.outputs.run-amazon-tests }} - run-www-tests: ${{ steps.selective-checks.outputs.run-www-tests }} - run-kubernetes-tests: ${{ steps.selective-checks.outputs.run-kubernetes-tests }} basic-checks-only: ${{ steps.selective-checks.outputs.basic-checks-only }} + canary-run: ${{ steps.source-run-info.outputs.canary-run }} + chicken-egg-providers: ${{ steps.selective-checks.outputs.chicken-egg-providers }} ci-image-build: ${{ steps.selective-checks.outputs.ci-image-build }} - prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }} - docs-build: ${{ steps.selective-checks.outputs.docs-build }} - mypy-checks: ${{ 
steps.selective-checks.outputs.mypy-checks }} - needs-mypy: ${{ steps.selective-checks.outputs.needs-mypy }} - needs-helm-tests: ${{ steps.selective-checks.outputs.needs-helm-tests }} - needs-api-tests: ${{ steps.selective-checks.outputs.needs-api-tests }} - needs-api-codegen: ${{ steps.selective-checks.outputs.needs-api-codegen }} + core-test-types-list-as-string: >- + ${{ steps.selective-checks.outputs.core-test-types-list-as-string }} + debug-resources: ${{ steps.selective-checks.outputs.debug-resources }} default-branch: ${{ steps.selective-checks.outputs.default-branch }} default-constraints-branch: ${{ steps.selective-checks.outputs.default-constraints-branch }} + default-helm-version: ${{ steps.selective-checks.outputs.default-helm-version }} + default-kind-version: ${{ steps.selective-checks.outputs.default-kind-version }} + default-kubernetes-version: ${{ steps.selective-checks.outputs.default-kubernetes-version }} + default-mysql-version: ${{ steps.selective-checks.outputs.default-mysql-version }} + default-postgres-version: ${{ steps.selective-checks.outputs.default-postgres-version }} + default-python-version: ${{ steps.selective-checks.outputs.default-python-version }} + disable-airflow-repo-cache: ${{ steps.selective-checks.outputs.disable-airflow-repo-cache }} + docker-cache: ${{ steps.selective-checks.outputs.docker-cache }} + docs-build: ${{ steps.selective-checks.outputs.docs-build }} docs-list-as-string: ${{ steps.selective-checks.outputs.docs-list-as-string }} - skip-pre-commits: ${{ steps.selective-checks.outputs.skip-pre-commits }} - providers-compatibility-checks: ${{ steps.selective-checks.outputs.providers-compatibility-checks }} + excluded-providers-as-string: ${{ steps.selective-checks.outputs.excluded-providers-as-string }} + force-pip: ${{ steps.selective-checks.outputs.force-pip }} + full-tests-needed: ${{ steps.selective-checks.outputs.full-tests-needed }} + has-migrations: ${{ steps.selective-checks.outputs.has-migrations }} helm-test-packages: ${{ steps.selective-checks.outputs.helm-test-packages }} - debug-resources: ${{ steps.selective-checks.outputs.debug-resources }} - runs-on-as-json-default: ${{ steps.selective-checks.outputs.runs-on-as-json-default }} - runs-on-as-json-docs-build: ${{ steps.selective-checks.outputs.runs-on-as-json-docs-build }} - runs-on-as-json-public: ${{ steps.selective-checks.outputs.runs-on-as-json-public }} - runs-on-as-json-self-hosted: ${{ steps.selective-checks.outputs.runs-on-as-json-self-hosted }} - runs-on-as-json-self-hosted-asf: ${{ steps.selective-checks.outputs.runs-on-as-json-self-hosted-asf }} - is-self-hosted-runner: ${{ steps.selective-checks.outputs.is-self-hosted-runner }} + include-success-outputs: ${{ steps.selective-checks.outputs.include-success-outputs }} + individual-providers-test-types-list-as-string: >- + ${{ steps.selective-checks.outputs.individual-providers-test-types-list-as-string }} is-airflow-runner: ${{ steps.selective-checks.outputs.is-airflow-runner }} is-amd-runner: ${{ steps.selective-checks.outputs.is-amd-runner }} is-arm-runner: ${{ steps.selective-checks.outputs.is-arm-runner }} - is-vm-runner: ${{ steps.selective-checks.outputs.is-vm-runner }} is-k8s-runner: ${{ steps.selective-checks.outputs.is-k8s-runner }} + is-self-hosted-runner: ${{ steps.selective-checks.outputs.is-self-hosted-runner }} + is-vm-runner: ${{ steps.selective-checks.outputs.is-vm-runner }} + kubernetes-combos: ${{ steps.selective-checks.outputs.kubernetes-combos }} + kubernetes-combos-list-as-string: >- + ${{ 
steps.selective-checks.outputs.kubernetes-combos-list-as-string }} + kubernetes-versions-list-as-string: >- + ${{ steps.selective-checks.outputs.kubernetes-versions-list-as-string }} latest-versions-only: ${{ steps.selective-checks.outputs.latest-versions-only }} - chicken-egg-providers: ${{ steps.selective-checks.outputs.chicken-egg-providers }} - has-migrations: ${{ steps.selective-checks.outputs.has-migrations }} - source-head-repo: ${{ steps.source-run-info.outputs.source-head-repo }} + mypy-checks: ${{ steps.selective-checks.outputs.mypy-checks }} + mysql-exclude: ${{ steps.selective-checks.outputs.mysql-exclude }} + mysql-versions: ${{ steps.selective-checks.outputs.mysql-versions }} + needs-api-codegen: ${{ steps.selective-checks.outputs.needs-api-codegen }} + needs-api-tests: ${{ steps.selective-checks.outputs.needs-api-tests }} + needs-helm-tests: ${{ steps.selective-checks.outputs.needs-helm-tests }} + needs-mypy: ${{ steps.selective-checks.outputs.needs-mypy }} + only-new-ui-files: ${{ steps.selective-checks.outputs.only-new-ui-files }} + postgres-exclude: ${{ steps.selective-checks.outputs.postgres-exclude }} + postgres-versions: ${{ steps.selective-checks.outputs.postgres-versions }} + prod-image-build: ${{ steps.selective-checks.outputs.prod-image-build }} + # yamllint disable rule:line-length + providers-compatibility-tests-matrix: ${{ steps.selective-checks.outputs.providers-compatibility-tests-matrix }} + providers-test-types-list-as-string: >- + ${{ steps.selective-checks.outputs.providers-test-types-list-as-string }} pull-request-labels: ${{ steps.source-run-info.outputs.pr-labels }} - in-workflow-build: ${{ steps.source-run-info.outputs.in-workflow-build }} - build-job-description: ${{ steps.source-run-info.outputs.build-job-description }} - testable-integrations: ${{ steps.selective-checks.outputs.testable-integrations }} - canary-run: ${{ steps.source-run-info.outputs.canary-run }} + python-versions-list-as-string: ${{ steps.selective-checks.outputs.python-versions-list-as-string }} + python-versions: ${{ steps.selective-checks.outputs.python-versions }} + run-amazon-tests: ${{ steps.selective-checks.outputs.run-amazon-tests }} run-coverage: ${{ steps.source-run-info.outputs.run-coverage }} + run-kubernetes-tests: ${{ steps.selective-checks.outputs.run-kubernetes-tests }} + run-system-tests: ${{ steps.selective-checks.outputs.run-system-tests }} + run-tests: ${{ steps.selective-checks.outputs.run-tests }} + run-ui-tests: ${{ steps.selective-checks.outputs.run-ui-tests }} + run-www-tests: ${{ steps.selective-checks.outputs.run-www-tests }} + runs-on-as-json-default: ${{ steps.selective-checks.outputs.runs-on-as-json-default }} + runs-on-as-json-docs-build: ${{ steps.selective-checks.outputs.runs-on-as-json-docs-build }} + runs-on-as-json-public: ${{ steps.selective-checks.outputs.runs-on-as-json-public }} + runs-on-as-json-self-hosted-asf: ${{ steps.selective-checks.outputs.runs-on-as-json-self-hosted-asf }} + runs-on-as-json-self-hosted: ${{ steps.selective-checks.outputs.runs-on-as-json-self-hosted }} + selected-providers-list-as-string: >- + ${{ steps.selective-checks.outputs.selected-providers-list-as-string }} + skip-pre-commits: ${{ steps.selective-checks.outputs.skip-pre-commits }} + skip-providers-tests: ${{ steps.selective-checks.outputs.skip-providers-tests }} + source-head-repo: ${{ steps.source-run-info.outputs.source-head-repo }} + sqlite-exclude: ${{ steps.selective-checks.outputs.sqlite-exclude }} + test-groups: ${{ 
steps.selective-checks.outputs.test-groups }} + testable-core-integrations: ${{ steps.selective-checks.outputs.testable-core-integrations }} + testable-providers-integrations: ${{ steps.selective-checks.outputs.testable-providers-integrations }} + use-uv: ${{ steps.selective-checks.outputs.force-pip == 'true' && 'false' || 'true' }} + upgrade-to-newer-dependencies: ${{ steps.selective-checks.outputs.upgrade-to-newer-dependencies }} steps: - name: "Cleanup repo" shell: bash @@ -147,6 +154,9 @@ jobs: persist-credentials: false - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} + id: breeze - name: "Get information about the Workflow" id: source-run-info run: breeze ci get-workflow-info 2>> ${GITHUB_OUTPUT} @@ -158,7 +168,6 @@ jobs: PR_LABELS: "${{ steps.source-run-info.outputs.pr-labels }}" COMMIT_REF: "${{ github.sha }}" VERBOSE: "false" - run: breeze ci selective-check 2>> ${GITHUB_OUTPUT} - name: env run: printenv @@ -172,6 +181,7 @@ jobs: uses: ./.github/workflows/basic-tests.yml with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} + run-ui-tests: ${{needs.build-info.outputs.run-ui-tests}} run-www-tests: ${{needs.build-info.outputs.run-www-tests}} needs-api-codegen: ${{needs.build-info.outputs.needs-api-codegen}} default-python-version: ${{needs.build-info.outputs.default-python-version}} @@ -179,83 +189,46 @@ jobs: skip-pre-commits: ${{needs.build-info.outputs.skip-pre-commits}} canary-run: ${{needs.build-info.outputs.canary-run}} latest-versions-only: ${{needs.build-info.outputs.latest-versions-only}} - enable-aip-44: "false" + use-uv: ${{needs.build-info.outputs.use-uv}} build-ci-images: - name: > - ${{ needs.build-info.outputs.in-workflow-build == 'true' && 'Build' || 'Skip building' }} - CI images in-workflow + name: Build CI images needs: [build-info] uses: ./.github/workflows/ci-image-build.yml permissions: contents: read # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. 
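The long outputs list above is populated by the selective checks step (`breeze ci selective-check`), which prints name=value pairs that end up appended to $GITHUB_OUTPUT (note the `2>> ${GITHUB_OUTPUT}` stderr redirection). A minimal sketch of emitting such pairs; the real decision logic lives in breeze's selective checks, and the values shown here are illustrative:

# Emit GitHub Actions step outputs as name=value lines on stderr, which the
# workflow redirects into GITHUB_OUTPUT.
import sys

def emit_github_outputs(outputs: dict[str, str]) -> None:
    for name, value in outputs.items():
        print(f"{name}={value}", file=sys.stderr)

emit_github_outputs({"basic-checks-only": "false", "canary-run": "true"})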
- # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # "in-workflow-build" condition packages: write secrets: inherit with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} runs-on-as-json-self-hosted: ${{ needs.build-info.outputs.runs-on-as-json-self-hosted }} - do-build: ${{ needs.build-info.outputs.in-workflow-build }} - image-tag: ${{ needs.build-info.outputs.image-tag }} platform: "linux/amd64" + push-image: "false" + upload-image-artifact: "true" + upload-mount-cache-artifact: ${{ needs.build-info.outputs.canary-run }} python-versions: ${{ needs.build-info.outputs.python-versions }} branch: ${{ needs.build-info.outputs.default-branch }} - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} + use-uv: ${{ needs.build-info.outputs.use-uv }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} - - wait-for-ci-images: - timeout-minutes: 120 - name: "Wait for CI images" - runs-on: ${{ fromJSON(needs.build-info.outputs.runs-on-as-json-public) }} - needs: [build-info, build-ci-images] + disable-airflow-repo-cache: ${{ needs.build-info.outputs.disable-airflow-repo-cache }} if: needs.build-info.outputs.ci-image-build == 'true' - env: - BACKEND: sqlite - # Force more parallelism for pull even on public images - PARALLELISM: 6 - INCLUDE_SUCCESS_OUTPUTS: "${{needs.build-info.outputs.include-success-outputs}}" - steps: - - name: "Cleanup repo" - shell: bash - run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 - with: - persist-credentials: false - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Install Breeze" - uses: ./.github/actions/breeze - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: Wait for CI images ${{ env.PYTHON_VERSIONS }}:${{ needs.build-info.outputs.image-tag }} - id: wait-for-images - run: breeze ci-image pull --run-in-parallel --wait-for-image --tag-as-latest - env: - PYTHON_VERSIONS: ${{ needs.build-info.outputs.python-versions-list-as-string }} - DEBUG_RESOURCES: ${{needs.build-info.outputs.debug-resources}} - if: needs.build-info.outputs.in-workflow-build == 'false' additional-ci-image-checks: name: "Additional CI image checks" - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] uses: ./.github/workflows/additional-ci-image-checks.yml + permissions: + contents: read + packages: write if: needs.build-info.outputs.canary-run == 'true' with: runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} runs-on-as-json-self-hosted: ${{ needs.build-info.outputs.runs-on-as-json-self-hosted }} - image-tag: ${{ needs.build-info.outputs.image-tag }} python-versions: ${{ needs.build-info.outputs.python-versions }} branch: ${{ needs.build-info.outputs.default-branch 
}} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} @@ -263,16 +236,16 @@ jobs: upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} skip-pre-commits: ${{ needs.build-info.outputs.skip-pre-commits }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} + disable-airflow-repo-cache: ${{ needs.build-info.outputs.disable-airflow-repo-cache }} canary-run: ${{ needs.build-info.outputs.canary-run }} latest-versions-only: ${{ needs.build-info.outputs.latest-versions-only }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} - + use-uv: ${{ needs.build-info.outputs.use-uv }} generate-constraints: name: "Generate constraints" - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] uses: ./.github/workflows/generate-constraints.yml if: needs.build-info.outputs.ci-image-build == 'true' with: @@ -281,19 +254,18 @@ jobs: # generate no providers constraints only in canary builds - they take quite some time to generate # they are not needed for regular builds, they are only needed to update constraints in canaries generate-no-providers-constraints: ${{ needs.build-info.outputs.canary-run }} - image-tag: ${{ needs.build-info.outputs.image-tag }} chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} - static-checks-mypy-docs: - name: "Static checks, mypy, docs" - needs: [build-info, wait-for-ci-images] - uses: ./.github/workflows/static-checks-mypy-docs.yml + ci-image-checks: + name: "CI image checks" + needs: [build-info, build-ci-images] + uses: ./.github/workflows/ci-image-checks.yml secrets: inherit with: runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} runs-on-as-json-docs-build: ${{ needs.build-info.outputs.runs-on-as-json-docs-build }} - image-tag: ${{ needs.build-info.outputs.image-tag }} needs-mypy: ${{ needs.build-info.outputs.needs-mypy }} mypy-checks: ${{ needs.build-info.outputs.mypy-checks }} python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }} @@ -310,34 +282,39 @@ jobs: include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} docs-build: ${{ needs.build-info.outputs.docs-build }} + needs-api-codegen: ${{ needs.build-info.outputs.needs-api-codegen }} + default-postgres-version: ${{ needs.build-info.outputs.default-postgres-version }} + run-coverage: ${{ needs.build-info.outputs.run-coverage }} + use-uv: ${{ needs.build-info.outputs.use-uv }} providers: - name: "Provider checks" - uses: ./.github/workflows/check-providers.yml - needs: [build-info, wait-for-ci-images] + name: "Provider packages tests" + uses: ./.github/workflows/test-provider-packages.yml + needs: [build-info, build-ci-images] permissions: contents: read packages: read secrets: inherit if: > - needs.build-info.outputs.skip-provider-tests != 'true' && + needs.build-info.outputs.skip-providers-tests != 'true' && needs.build-info.outputs.latest-versions-only != 'true' with: runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} - image-tag: ${{ needs.build-info.outputs.image-tag }} canary-run: ${{ needs.build-info.outputs.canary-run }} 
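# Note (editorial, illustrative): callers now consume the single `use-uv` output
# computed once in `build-info` instead of repeating the inline expression
# `${{ needs.build-info.outputs.force-pip && 'false' || 'true' }}`. That
# expression emulates a ternary: in GitHub Actions `a && b || c` yields `b` when
# `a` is truthy and `c` otherwise, so with force-pip set it evaluates to 'false'
# (build with pip) and to 'true' (build with uv) when force-pip is empty.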
default-python-version: ${{ needs.build-info.outputs.default-python-version }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} - affected-providers-list-as-string: ${{ needs.build-info.outputs.affected-providers-list-as-string }} - providers-compatibility-checks: ${{ needs.build-info.outputs.providers-compatibility-checks }} - skip-provider-tests: ${{ needs.build-info.outputs.skip-provider-tests }} + selected-providers-list-as-string: ${{ needs.build-info.outputs.selected-providers-list-as-string }} + # yamllint disable rule:line-length + providers-compatibility-tests-matrix: ${{ needs.build-info.outputs.providers-compatibility-tests-matrix }} + skip-providers-tests: ${{ needs.build-info.outputs.skip-providers-tests }} python-versions: ${{ needs.build-info.outputs.python-versions }} providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} + use-uv: ${{ needs.build-info.outputs.use-uv }} tests-helm: name: "Helm tests" uses: ./.github/workflows/helm-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -346,8 +323,8 @@ jobs: runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} helm-test-packages: ${{ needs.build-info.outputs.helm-test-packages }} - image-tag: ${{ needs.build-info.outputs.image-tag }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: > needs.build-info.outputs.needs-helm-tests == 'true' && needs.build-info.outputs.default-branch == 'main' && @@ -356,7 +333,7 @@ jobs: tests-postgres: name: "Postgres tests" uses: ./.github/workflows/run-unit-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -366,21 +343,24 @@ jobs: backend: "postgres" test-name: "Postgres" test-scope: "DB" - image-tag: ${{ needs.build-info.outputs.image-tag }} + test-groups: ${{ needs.build-info.outputs.test-groups }} python-versions: ${{ needs.build-info.outputs.python-versions }} backend-versions: ${{ needs.build-info.outputs.postgres-versions }} + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} excludes: ${{ needs.build-info.outputs.postgres-exclude }} - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.parallel-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} run-migration-tests: "true" run-coverage: ${{ needs.build-info.outputs.run-coverage }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.run-tests == 'true' tests-mysql: name: "MySQL tests" uses: ./.github/workflows/run-unit-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -390,21 +370,24 @@ jobs: backend: "mysql" test-name: "MySQL" test-scope: "DB" - image-tag: ${{ needs.build-info.outputs.image-tag }} + test-groups: ${{ needs.build-info.outputs.test-groups }} python-versions: ${{ 
needs.build-info.outputs.python-versions }} backend-versions: ${{ needs.build-info.outputs.mysql-versions }} + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} excludes: ${{ needs.build-info.outputs.mysql-exclude }} - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.parallel-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} run-migration-tests: "true" debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.run-tests == 'true' tests-sqlite: name: "Sqlite tests" uses: ./.github/workflows/run-unit-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -415,22 +398,25 @@ jobs: test-name: "Sqlite" test-name-separator: "" test-scope: "DB" - image-tag: ${{ needs.build-info.outputs.image-tag }} + test-groups: ${{ needs.build-info.outputs.test-groups }} python-versions: ${{ needs.build-info.outputs.python-versions }} # No versions for sqlite backend-versions: "['']" + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} excludes: ${{ needs.build-info.outputs.sqlite-exclude }} - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.parallel-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} run-migration-tests: "true" debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.run-tests == 'true' tests-non-db: name: "Non-DB tests" uses: ./.github/workflows/run-unit-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -441,21 +427,24 @@ jobs: test-name: "" test-name-separator: "" test-scope: "Non-DB" - image-tag: ${{ needs.build-info.outputs.image-tag }} + test-groups: ${{ needs.build-info.outputs.test-groups }} python-versions: ${{ needs.build-info.outputs.python-versions }} # No versions for non-db backend-versions: "['']" + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} excludes: ${{ needs.build-info.outputs.sqlite-exclude }} - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.parallel-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.run-tests == 'true' tests-special: name: "Special tests" uses: 
./.github/workflows/special-tests.yml - needs: [build-info, wait-for-ci-images] + needs: [build-info, build-ci-images] permissions: contents: read packages: read @@ -466,41 +455,47 @@ jobs: needs.build-info.outputs.upgrade-to-newer-dependencies != 'false' || needs.build-info.outputs.full-tests-needed == 'true') with: + test-groups: ${{ needs.build-info.outputs.test-groups }} default-branch: ${{ needs.build-info.outputs.default-branch }} runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} - image-tag: ${{ needs.build-info.outputs.image-tag }} - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.parallel-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ needs.build-info.outputs.providers-test-types-list-as-string }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} python-versions: ${{ needs.build-info.outputs.python-versions }} default-postgres-version: ${{ needs.build-info.outputs.default-postgres-version }} + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} canary-run: ${{ needs.build-info.outputs.canary-run }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} + include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} - tests-integration: - name: Integration Tests - needs: [build-info, wait-for-ci-images] - uses: ./.github/workflows/integration-tests.yml + tests-integration-system: + name: Integration and System Tests + needs: [build-info, build-ci-images] + uses: ./.github/workflows/integration-system-tests.yml permissions: contents: read packages: read secrets: inherit with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} - image-tag: ${{ needs.build-info.outputs.image-tag }} - testable-integrations: ${{ needs.build-info.outputs.testable-integrations }} + testable-core-integrations: ${{ needs.build-info.outputs.testable-core-integrations }} + testable-providers-integrations: ${{ needs.build-info.outputs.testable-providers-integrations }} + run-system-tests: ${{ needs.build-info.outputs.run-tests }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} default-postgres-version: ${{ needs.build-info.outputs.default-postgres-version }} default-mysql-version: ${{ needs.build-info.outputs.default-mysql-version }} - skip-provider-tests: ${{ needs.build-info.outputs.skip-provider-tests }} + skip-providers-tests: ${{ needs.build-info.outputs.skip-providers-tests }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.run-tests == 'true' tests-with-lowest-direct-resolution: - name: "Lowest direct dependency resolution tests" - needs: [build-info, wait-for-ci-images] + name: "Lowest direct dependency providers tests" + needs: [build-info, build-ci-images] uses: ./.github/workflows/run-unit-tests.yml permissions: contents: read @@ -513,125 +508,83 @@ jobs: test-name: "LowestDeps-Postgres" force-lowest-dependencies: "true" test-scope: "All" + test-groups: ${{ needs.build-info.outputs.test-groups }} backend: "postgres" - image-tag: 
${{ needs.build-info.outputs.image-tag }} python-versions: ${{ needs.build-info.outputs.python-versions }} backend-versions: "['${{ needs.build-info.outputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ needs.build-info.outputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ needs.build-info.outputs.separate-test-types-list-as-string }} + core-test-types-list-as-string: ${{ needs.build-info.outputs.core-test-types-list-as-string }} + # yamllint disable rule:line-length + providers-test-types-list-as-string: ${{ needs.build-info.outputs.individual-providers-test-types-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} run-coverage: ${{ needs.build-info.outputs.run-coverage }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} monitor-delay-time-in-seconds: 120 + use-uv: ${{ needs.build-info.outputs.use-uv }} build-prod-images: - name: > - ${{ needs.build-info.outputs.in-workflow-build == 'true' && 'Build' || 'Skip building' }} - PROD images in-workflow + name: Build PROD images needs: [build-info, build-ci-images, generate-constraints] uses: ./.github/workflows/prod-image-build.yml permissions: contents: read # This write is only given here for `push` events from "apache/airflow" repo. It is not given for PRs # from forks. This is to prevent malicious PRs from creating images in the "apache/airflow" repo. - # For regular build for PRS this "build-prod-images" workflow will be skipped anyway by the - # "in-workflow-build" condition packages: write secrets: inherit with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} build-type: "Regular" - do-build: ${{ needs.build-info.outputs.in-workflow-build }} - upload-package-artifact: "true" - image-tag: ${{ needs.build-info.outputs.image-tag }} platform: "linux/amd64" + push-image: "false" + upload-image-artifact: "true" + upload-package-artifact: "true" python-versions: ${{ needs.build-info.outputs.python-versions }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} branch: ${{ needs.build-info.outputs.default-branch }} - push-image: "true" - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} + use-uv: ${{ needs.build-info.outputs.use-uv }} build-provider-packages: ${{ needs.build-info.outputs.default-branch == 'main' }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} - - wait-for-prod-images: - timeout-minutes: 80 - name: "Wait for PROD images" - runs-on: ${{ fromJSON(needs.build-info.outputs.runs-on-as-json-public) }} - needs: [build-info, wait-for-ci-images, build-prod-images] - if: needs.build-info.outputs.prod-image-build == 'true' - env: - BACKEND: sqlite - PYTHON_MAJOR_MINOR_VERSION: "${{needs.build-info.outputs.default-python-version}}" - # Force more parallelism for pull on public images - PARALLELISM: 6 - INCLUDE_SUCCESS_OUTPUTS: "${{needs.build-info.outputs.include-success-outputs}}" - IMAGE_TAG: ${{ needs.build-info.outputs.image-tag }} - steps: - - name: "Cleanup repo" - shell: bash - run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Checkout ${{ github.ref }} ( ${{ 
github.sha }} )" - uses: actions/checkout@v4 - with: - persist-credentials: false - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: "Install Breeze" - uses: ./.github/actions/breeze - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - if: needs.build-info.outputs.in-workflow-build == 'false' - - name: Wait for PROD images ${{ env.PYTHON_VERSIONS }}:${{ needs.build-info.outputs.image-tag }} - # We wait for the images to be available either from "build-images.yml' run as pull_request_target - # or from build-prod-images (or build-prod-images-release-branch) above. - # We are utilising single job to wait for all images because this job merely waits - # For the images to be available. - run: breeze prod-image pull --wait-for-image --run-in-parallel - env: - PYTHON_VERSIONS: ${{ needs.build-info.outputs.python-versions-list-as-string }} - DEBUG_RESOURCES: ${{ needs.build-info.outputs.debug-resources }} - if: needs.build-info.outputs.in-workflow-build == 'false' + disable-airflow-repo-cache: ${{ needs.build-info.outputs.disable-airflow-repo-cache }} + prod-image-build: ${{ needs.build-info.outputs.prod-image-build }} additional-prod-image-tests: name: "Additional PROD image tests" - needs: [build-info, wait-for-prod-images, generate-constraints] + needs: [build-info, build-prod-images, generate-constraints] uses: ./.github/workflows/additional-prod-image-tests.yml with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} default-branch: ${{ needs.build-info.outputs.default-branch }} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} - image-tag: ${{ needs.build-info.outputs.image-tag }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} chicken-egg-providers: ${{ needs.build-info.outputs.chicken-egg-providers }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} + disable-airflow-repo-cache: ${{ needs.build-info.outputs.disable-airflow-repo-cache }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} canary-run: ${{ needs.build-info.outputs.canary-run }} + use-uv: ${{ needs.build-info.outputs.use-uv }} if: needs.build-info.outputs.prod-image-build == 'true' tests-kubernetes: name: "Kubernetes tests" uses: ./.github/workflows/k8s-tests.yml - needs: [build-info, wait-for-prod-images] + needs: [build-info, build-prod-images] permissions: contents: read packages: read secrets: inherit with: + platform: "linux/amd64" runs-on-as-json-default: ${{ needs.build-info.outputs.runs-on-as-json-default }} - image-tag: ${{ needs.build-info.outputs.image-tag }} python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }} - kubernetes-versions-list-as-string: ${{ needs.build-info.outputs.kubernetes-versions-list-as-string }} - kubernetes-combos-list-as-string: ${{ needs.build-info.outputs.kubernetes-combos-list-as-string }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} + use-uv: ${{ needs.build-info.outputs.use-uv }} debug-resources: ${{ needs.build-info.outputs.debug-resources }} + kubernetes-combos: ${{ needs.build-info.outputs.kubernetes-combos }} if: > ( 
needs.build-info.outputs.run-kubernetes-tests == 'true' || needs.build-info.outputs.needs-helm-tests == 'true') @@ -645,26 +598,58 @@ jobs: needs: - build-info - generate-constraints - - wait-for-ci-images - - wait-for-prod-images - - static-checks-mypy-docs + - ci-image-checks - tests-sqlite - tests-mysql - tests-postgres - tests-non-db - - tests-integration + - tests-integration-system + - build-prod-images uses: ./.github/workflows/finalize-tests.yml with: runs-on-as-json-public: ${{ needs.build-info.outputs.runs-on-as-json-public }} runs-on-as-json-self-hosted: ${{ needs.build-info.outputs.runs-on-as-json-self-hosted }} - image-tag: ${{ needs.build-info.outputs.image-tag }} python-versions: ${{ needs.build-info.outputs.python-versions }} python-versions-list-as-string: ${{ needs.build-info.outputs.python-versions-list-as-string }} branch: ${{ needs.build-info.outputs.default-branch }} constraints-branch: ${{ needs.build-info.outputs.default-constraints-branch }} default-python-version: ${{ needs.build-info.outputs.default-python-version }} - in-workflow-build: ${{ needs.build-info.outputs.in-workflow-build }} upgrade-to-newer-dependencies: ${{ needs.build-info.outputs.upgrade-to-newer-dependencies }} include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} docker-cache: ${{ needs.build-info.outputs.docker-cache }} + disable-airflow-repo-cache: ${{ needs.build-info.outputs.disable-airflow-repo-cache }} canary-run: ${{ needs.build-info.outputs.canary-run }} + use-uv: ${{ needs.build-info.outputs.use-uv }} + debug-resources: ${{ needs.build-info.outputs.debug-resources }} + + notify-slack-failure: + name: "Notify Slack on Failure" + needs: + - basic-tests + - additional-ci-image-checks + - providers + - tests-helm + - tests-special + - tests-with-lowest-direct-resolution + - additional-prod-image-tests + - tests-kubernetes + - finalize-tests + if: github.event_name == 'schedule' && failure() && github.run_attempt == 1 + runs-on: ["ubuntu-22.04"] + steps: + - name: Notify Slack + id: slack + uses: slackapi/slack-github-action@485a9d42d3a73031f12ec201c457e2162c45d02d # v2.0.0 + with: + method: chat.postMessage + token: ${{ env.SLACK_BOT_TOKEN }} + # yamllint disable rule:line-length + payload: | + channel: "internal-airflow-ci-cd" + text: "🚨🕒 Scheduled CI Failure Alert 🕒🚨\n\n*Details:* " + blocks: + - type: "section" + text: + type: "mrkdwn" + text: "🚨🕒 Scheduled CI Failure Alert 🕒🚨\n\n*Details:* " + # yamllint enable rule:line-length diff --git a/.github/workflows/codeql-analysis.yml b/.github/workflows/codeql-analysis.yml index ec608192a7079..1fcf81a84fd5b 100644 --- a/.github/workflows/codeql-analysis.yml +++ b/.github/workflows/codeql-analysis.yml @@ -19,6 +19,8 @@ name: "CodeQL" on: # yamllint disable-line rule:truthy + pull_request: + branches: ['main', 'v[0-9]+-[0-9]+-test', 'v[0-9]+-[0-9]+-stable'] push: branches: [main] schedule: @@ -31,37 +33,13 @@ concurrency: cancel-in-progress: true jobs: - selective-checks: - name: Selective checks - runs-on: ["ubuntu-22.04"] - outputs: - needs-python-scans: ${{ steps.selective-checks.outputs.needs-python-scans }} - needs-javascript-scans: ${{ steps.selective-checks.outputs.needs-javascript-scans }} - steps: - - name: Checkout repository - uses: actions/checkout@v4 - with: - fetch-depth: 2 - persist-credentials: false - - name: "Install Breeze" - uses: ./.github/actions/breeze - - name: Selective checks - id: selective-checks - env: - COMMIT_REF: "${{ github.sha }}" - VERBOSE: "false" - run: breeze ci selective-check 
2>> ${GITHUB_OUTPUT} - analyze: name: Analyze runs-on: ["ubuntu-22.04"] - needs: [selective-checks] strategy: fail-fast: false matrix: - # Override automatic language detection by changing the below list - # Supported options are ['csharp', 'cpp', 'go', 'java', 'javascript', 'python'] - language: ['python', 'javascript'] + language: ['python', 'javascript', 'actions'] permissions: actions: read contents: read @@ -72,33 +50,14 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - if: | - matrix.language == 'python' && needs.selective-checks.outputs.needs-python-scans == 'true' || - matrix.language == 'javascript' && needs.selective-checks.outputs.needs-javascript-scans == 'true' - # Initializes the CodeQL tools for scanning. - name: Initialize CodeQL - uses: github/codeql-action/init@v2 + uses: github/codeql-action/init@v3 with: languages: ${{ matrix.language }} - # If you wish to specify custom queries, you can do so here or in a config file. - # By default, queries listed here will override any specified in a config file. - # Prefix the list here with "+" to use these queries and those in the config file. - # queries: ./path/to/local/query, your-org/your-repo/queries@main - if: | - matrix.language == 'python' && needs.selective-checks.outputs.needs-python-scans == 'true' || - matrix.language == 'javascript' && needs.selective-checks.outputs.needs-javascript-scans == 'true' - # Autobuild attempts to build any compiled languages (C/C++, C#, or Java). - # If this step fails, then you should remove it and run the build manually (see below) - name: Autobuild - uses: github/codeql-action/autobuild@v2 - if: | - matrix.language == 'python' && needs.selective-checks.outputs.needs-python-scans == 'true' || - matrix.language == 'javascript' && needs.selective-checks.outputs.needs-javascript-scans == 'true' + uses: github/codeql-action/autobuild@v3 - name: Perform CodeQL Analysis - uses: github/codeql-action/analyze@v2 - if: | - matrix.language == 'python' && needs.selective-checks.outputs.needs-python-scans == 'true' || - matrix.language == 'javascript' && needs.selective-checks.outputs.needs-javascript-scans == 'true' + uses: github/codeql-action/analyze@v3 diff --git a/.github/workflows/finalize-tests.yml b/.github/workflows/finalize-tests.yml index a460dbe151a30..ac13089caf656 100644 --- a/.github/workflows/finalize-tests.yml +++ b/.github/workflows/finalize-tests.yml @@ -28,10 +28,6 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining self-hosted runners." required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string python-versions: description: "JSON-formatted array of Python versions to test" required: true @@ -52,10 +48,6 @@ on: # yamllint disable-line rule:truthy description: "Which version of python should be used by default" required: true type: string - in-workflow-build: - description: "Whether the build is executed as part of the workflow (true/false)" - required: true - type: string upgrade-to-newer-dependencies: description: "Whether to upgrade to newer dependencies (true/false)" required: true @@ -64,6 +56,10 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." 
+ required: true + type: string include-success-outputs: description: "Whether to include success outputs (true/false)" required: true @@ -72,6 +68,16 @@ on: # yamllint disable-line rule:truthy description: "Whether this is a canary run (true/false)" required: true type: string + use-uv: + description: "Whether to use uv to build the image (true/false)" + required: true + type: string + debug-resources: + description: "Whether to debug resources or not (true/false)" + required: true + type: string +permissions: + contents: read jobs: update-constraints: runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} @@ -83,7 +89,6 @@ jobs: env: DEBUG_RESOURCES: ${{ inputs.debug-resources}} PYTHON_VERSIONS: ${{ inputs.python-versions-list-as-string }} - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} @@ -145,9 +150,10 @@ jobs: python-versions: ${{ inputs.python-versions }} branch: ${{ inputs.branch }} constraints-branch: ${{ inputs.constraints-branch }} - use-uv: ${{ needs.build-info.outputs.force-pip && 'false' || 'true' }} + use-uv: ${{ inputs.use-uv }} include-success-outputs: ${{ inputs.include-success-outputs }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} if: inputs.canary-run == 'true' # push-buildx-cache-to-github-registry-arm: @@ -187,10 +193,14 @@ jobs: persist-credentials: false - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - - name: "Download all artifacts from the current build" + - name: "Free up disk space" + shell: bash + run: ./scripts/tools/free_up_disk_space.sh + - name: "Download all test warning artifacts from the current build" uses: actions/download-artifact@v4 with: path: ./artifacts + pattern: test-warnings-* - name: "Setup python" uses: actions/setup-python@v5 with: diff --git a/.github/workflows/generate-constraints.yml b/.github/workflows/generate-constraints.yml index d6e536dfd091a..740310e1cc09b 100644 --- a/.github/workflows/generate-constraints.yml +++ b/.github/workflows/generate-constraints.yml @@ -32,10 +32,6 @@ on: # yamllint disable-line rule:truthy description: "Whether to generate constraints without providers (true/false)" required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string chicken-egg-providers: description: "Space-separated list of providers that should be installed from context files" required: true @@ -44,6 +40,10 @@ on: # yamllint disable-line rule:truthy description: "Whether to run in debug mode (true/false)" required: true type: string + use-uv: + description: "Whether to use uvloop (true/false)" + required: true + type: string jobs: generate-constraints: permissions: @@ -57,7 +57,6 @@ jobs: GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} INCLUDE_SUCCESS_OUTPUTS: "true" - IMAGE_TAG: ${{ inputs.image-tag }} PYTHON_VERSIONS: ${{ inputs.python-versions-list-as-string }} VERBOSE: "true" VERSION_SUFFIX_FOR_PYPI: "dev0" @@ -69,21 +68,17 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze - - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: "\ - Pull CI images \ - ${{ inputs.python-versions-list-as-string }}:\ - ${{ inputs.image-tag }}" - run: breeze ci-image 
pull --run-in-parallel --tag-as-latest - - name: " - Verify CI images \ - ${{ inputs.python-versions-list-as-string }}:\ - ${{ inputs.image-tag }}" + with: + use-uv: ${{ inputs.use-uv }} + id: breeze + - name: "Prepare all CI images: ${{ inputs.python-versions-list-as-string}}" + uses: ./.github/actions/prepare_all_ci_images + with: + platform: "linux/amd64" + python-versions-list-as-string: ${{ inputs.python-versions-list-as-string }} + - name: "Verify all CI images ${{ inputs.python-versions-list-as-string }}" run: breeze ci-image verify --run-in-parallel - name: "Source constraints" shell: bash @@ -104,22 +99,28 @@ jobs: # from the source code, not from the PyPI because they have apache-airflow>=X.Y.Z dependency # And when we prepare them from sources they will have apache-airflow>=X.Y.Z.dev0 shell: bash + env: + CHICKEN_EGG_PROVIDERS: ${{ inputs.chicken-egg-providers }} run: > breeze release-management prepare-provider-packages --include-not-ready-providers --package-format wheel --version-suffix-for-pypi dev0 - ${{ inputs.chicken-egg-providers }} + ${CHICKEN_EGG_PROVIDERS} if: inputs.chicken-egg-providers != '' - name: "PyPI constraints" shell: bash timeout-minutes: 25 + env: + CHICKEN_EGG_PROVIDERS: ${{ inputs.chicken-egg-providers }} run: > breeze release-management generate-constraints --run-in-parallel --airflow-constraints-mode constraints --answer yes - --chicken-egg-providers "${{ inputs.chicken-egg-providers }}" --parallelism 3 + --chicken-egg-providers "${CHICKEN_EGG_PROVIDERS}" --parallelism 3 - name: "Dependency upgrade summary" shell: bash + env: + PYTHON_VERSIONS: ${{ env.PYTHON_VERSIONS }} run: | - for PYTHON_VERSION in ${{ env.PYTHON_VERSIONS }}; do + for PYTHON_VERSION in $PYTHON_VERSIONS; do echo "Summarizing Python $PYTHON_VERSION" cat "files/constraints-${PYTHON_VERSION}"/*.md >> $GITHUB_STEP_SUMMARY || true done diff --git a/.github/workflows/helm-tests.yml b/.github/workflows/helm-tests.yml index 8b26769ff4bc7..1b4aa19cbe595 100644 --- a/.github/workflows/helm-tests.yml +++ b/.github/workflows/helm-tests.yml @@ -32,14 +32,16 @@ on: # yamllint disable-line rule:truthy description: "Stringified JSON array of helm test packages to test" required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string default-python-version: description: "Which version of python should be used by default" required: true type: string + use-uv: + description: "Whether to use uvloop (true/false)" + required: true + type: string +permissions: + contents: read jobs: tests-helm: timeout-minutes: 80 @@ -57,7 +59,6 @@ jobs: DB_RESET: "false" JOB_ID: "helm-tests" USE_XDIST: "true" - IMAGE_TAG: "${{ inputs.image-tag }}" GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} @@ -70,12 +71,16 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{inputs.default-python-version}}:${{inputs.image-tag}}" + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} - name: "Helm Unit Tests: ${{ matrix.helm-test-package }}" - run: breeze testing helm-tests --helm-test-package "${{ matrix.helm-test-package }}" + env: + HELM_TEST_PACKAGE: "${{ matrix.helm-test-package }}" + run: breeze testing 
helm-tests --test-type "${HELM_TEST_PACKAGE}" tests-helm-release: timeout-minutes: 80 @@ -95,6 +100,8 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} - name: Setup git for tagging run: | git config --global user.email "name@example.com" diff --git a/.github/workflows/integration-system-tests.yml b/.github/workflows/integration-system-tests.yml new file mode 100644 index 0000000000000..3aabf28075d93 --- /dev/null +++ b/.github/workflows/integration-system-tests.yml @@ -0,0 +1,209 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +# +--- +name: Integration and system tests +on: # yamllint disable-line rule:truthy + workflow_call: + inputs: + runs-on-as-json-public: + description: "The array of labels (in json form) determining public runners." + required: true + type: string + testable-core-integrations: + description: "The list of testable core integrations as JSON array." + required: true + type: string + testable-providers-integrations: + description: "The list of testable providers integrations as JSON array." 
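# Note (editorial, illustrative): the two *-integrations inputs are JSON-encoded
# arrays produced by selective checks. The jobs below expand them into a build
# matrix with `fromJSON()` and are skipped entirely when the value is the
# literal string '[]'. A hypothetical value could look like:
#   testable-core-integrations: '["celery", "kerberos"]'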
+ required: true + type: string + run-system-tests: + description: "Run system tests (true/false)" + required: true + type: string + default-postgres-version: + description: "Default version of Postgres to use" + required: true + type: string + default-mysql-version: + description: "Default version of MySQL to use" + required: true + type: string + skip-providers-tests: + description: "Skip provider tests (true/false)" + required: true + type: string + run-coverage: + description: "Run coverage (true/false)" + required: true + type: string + default-python-version: + description: "Which version of python should be used by default" + required: true + type: string + debug-resources: + description: "Debug resources (true/false)" + required: true + type: string + use-uv: + description: "Whether to use uv" + required: true + type: string +permissions: + contents: read +jobs: + tests-core-integration: + timeout-minutes: 130 + if: inputs.testable-core-integrations != '[]' + name: "Integration core ${{ matrix.integration }}" + runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + strategy: + fail-fast: false + matrix: + integration: ${{ fromJSON(inputs.testable-core-integrations) }} + env: + BACKEND: "postgres" + BACKEND_VERSION: ${{ inputs.default-postgres-version }}" + PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + JOB_ID: "integration-core-${{ matrix.integration }}" + SKIP_PROVIDERS_TESTS: "${{ inputs.skip-providers-tests }}" + ENABLE_COVERAGE: "${{ inputs.run-coverage}}" + DEBUG_RESOURCES: "${{ inputs.debug-resources }}" + GITHUB_REPOSITORY: ${{ github.repository }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + GITHUB_USERNAME: ${{ github.actor }} + VERBOSE: "true" + steps: + - name: "Cleanup repo" + shell: bash + run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" + - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" + uses: actions/checkout@v4 + with: + persist-credentials: false + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" + uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} + - name: "Integration: core ${{ matrix.integration }}" + env: + INTEGRATION: "${{ matrix.integration }}" + # yamllint disable rule:line-length + run: ./scripts/ci/testing/run_integration_tests_with_retry.sh core "${INTEGRATION}" + - name: "Post Tests success" + uses: ./.github/actions/post_tests_success + with: + codecov-token: ${{ secrets.CODECOV_TOKEN }} + python-version: ${{ inputs.default-python-version }} + - name: "Post Tests failure" + uses: ./.github/actions/post_tests_failure + if: failure() + + tests-providers-integration: + timeout-minutes: 130 + if: inputs.testable-providers-integrations != '[]' && inputs.skip-providers-tests != 'true' + name: "Integration: providers ${{ matrix.integration }}" + runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + strategy: + fail-fast: false + matrix: + integration: ${{ fromJSON(inputs.testable-providers-integrations) }} + env: + BACKEND: "postgres" + BACKEND_VERSION: ${{ inputs.default-postgres-version }}" + PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + JOB_ID: "integration-providers-${{ matrix.integration }}" + SKIP_PROVIDERS_TESTS: "${{ inputs.skip-providers-tests }}" + ENABLE_COVERAGE: "${{ inputs.run-coverage}}" + DEBUG_RESOURCES: "${{ inputs.debug-resources }}" + GITHUB_REPOSITORY: ${{ github.repository }} + GITHUB_TOKEN: ${{ 
secrets.GITHUB_TOKEN }} + GITHUB_USERNAME: ${{ github.actor }} + VERBOSE: "true" + steps: + - name: "Cleanup repo" + shell: bash + run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" + - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" + uses: actions/checkout@v4 + with: + persist-credentials: false + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" + uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} + - name: "Integration: providers ${{ matrix.integration }}" + env: + INTEGRATION: "${{ matrix.integration }}" + run: ./scripts/ci/testing/run_integration_tests_with_retry.sh providers "${INTEGRATION}" + - name: "Post Tests success" + uses: ./.github/actions/post_tests_success + with: + codecov-token: ${{ secrets.CODECOV_TOKEN }} + python-version: ${{ inputs.default-python-version }} + - name: "Post Tests failure" + uses: ./.github/actions/post_tests_failure + if: failure() + + tests-system: + timeout-minutes: 130 + if: inputs.run-system-tests == 'true' + name: "System Tests" + runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + env: + BACKEND: "postgres" + BACKEND_VERSION: ${{ inputs.default-postgres-version }}" + PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + JOB_ID: "system" + SKIP_PROVIDERS_TESTS: "${{ inputs.skip-providers-tests }}" + ENABLE_COVERAGE: "${{ inputs.run-coverage}}" + DEBUG_RESOURCES: "${{ inputs.debug-resources }}" + GITHUB_REPOSITORY: ${{ github.repository }} + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + GITHUB_USERNAME: ${{ github.actor }} + VERBOSE: "true" + steps: + - name: "Cleanup repo" + shell: bash + run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" + - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" + uses: actions/checkout@v4 + with: + persist-credentials: false + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" + uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} + - name: "System Tests" + run: > + ./scripts/ci/testing/run_system_tests.sh + tests/system/example_empty.py tests/system/example_empty.py + - name: "Post Tests success" + uses: ./.github/actions/post_tests_success + with: + codecov-token: ${{ secrets.CODECOV_TOKEN }} + python-version: ${{ inputs.default-python-version }} + - name: "Post Tests failure" + uses: ./.github/actions/post_tests_failure + if: failure() diff --git a/.github/workflows/integration-tests.yml b/.github/workflows/integration-tests.yml deleted file mode 100644 index 530d0f9fc5636..0000000000000 --- a/.github/workflows/integration-tests.yml +++ /dev/null @@ -1,103 +0,0 @@ -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. 
See the License for the -# specific language governing permissions and limitations -# under the License. -# ---- -name: Integration tests -on: # yamllint disable-line rule:truthy - workflow_call: - inputs: - runs-on-as-json-public: - description: "The array of labels (in json form) determining public runners." - required: true - type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string - testable-integrations: - description: "The list of testable integrations as JSON array." - required: true - type: string - default-postgres-version: - description: "Default version of Postgres to use" - required: true - type: string - default-mysql-version: - description: "Default version of MySQL to use" - required: true - type: string - skip-provider-tests: - description: "Skip provider tests (true/false)" - required: true - type: string - run-coverage: - description: "Run coverage (true/false)" - required: true - type: string - default-python-version: - description: "Which version of python should be used by default" - required: true - type: string - debug-resources: - description: "Debug resources (true/false)" - required: true - type: string -jobs: - tests-integration: - timeout-minutes: 130 - if: inputs.testable-integrations != '[]' - name: "Integration Tests: ${{ matrix.integration }}" - runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} - strategy: - fail-fast: false - matrix: - integration: ${{ fromJSON(inputs.testable-integrations) }} - env: - IMAGE_TAG: "${{ inputs.image-tag }}" - BACKEND: "postgres" - BACKEND_VERSION: ${{ inputs.default-postgres-version }}" - PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" - JOB_ID: "integration-${{ matrix.integration }}" - SKIP_PROVIDER_TESTS: "${{ inputs.skip-provider-tests }}" - ENABLE_COVERAGE: "${{ inputs.run-coverage}}" - DEBUG_RESOURCES: "${{ inputs.debug-resources }}" - GITHUB_REPOSITORY: ${{ github.repository }} - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - GITHUB_USERNAME: ${{ github.actor }} - VERBOSE: "true" - steps: - - name: "Cleanup repo" - shell: bash - run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 - with: - persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }}" - uses: ./.github/actions/prepare_breeze_and_image - - name: "Integration Tests: ${{ matrix.integration }}" - run: ./scripts/ci/testing/run_integration_tests_with_retry.sh ${{ matrix.integration }} - - name: "Post Tests success: Integration Tests ${{ matrix.integration }}" - uses: ./.github/actions/post_tests_success - with: - codecov-token: ${{ secrets.CODECOV_TOKEN }} - python-version: ${{ inputs.default-python-version }} - - name: "Post Tests failure: Integration Tests ${{ matrix.integration }}" - uses: ./.github/actions/post_tests_failure - if: failure() diff --git a/.github/workflows/k8s-tests.yml b/.github/workflows/k8s-tests.yml index 3b3e067038db9..40f73e3c59c66 100644 --- a/.github/workflows/k8s-tests.yml +++ b/.github/workflows/k8s-tests.yml @@ -20,24 +20,20 @@ name: K8s tests on: # yamllint disable-line rule:truthy workflow_call: inputs: - runs-on-as-json-default: - description: "The array of labels (in json form) determining default runner used for the build." 
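# Note (editorial, illustrative): `kubernetes-combos` is a JSON array of
# "<python>-<kubernetes>" strings. A later step splits each combo with sed; for
# a hypothetical combo "3.9-v1.28.13" the two substitutions yield:
#   echo "PYTHON_MAJOR_MINOR_VERSION=3.9-v1.28.13" | sed 's/-.*//'      -> PYTHON_MAJOR_MINOR_VERSION=3.9
#   echo "KUBERNETES_VERSION=3.9-v1.28.13"         | sed 's/=[^-]*-/=/' -> KUBERNETES_VERSION=v1.28.13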
+ platform: + description: "Platform for the build - 'linux/amd64' or 'linux/arm64'" required: true type: string - image-tag: - description: "Tag to set for the image" + runs-on-as-json-default: + description: "The array of labels (in json form) determining default runner used for the build." required: true type: string python-versions-list-as-string: description: "List of Python versions to test: space separated string" required: true type: string - kubernetes-versions-list-as-string: - description: "List of Kubernetes versions to test" - required: true - type: string - kubernetes-combos-list-as-string: - description: "List of combinations of Kubernetes and Python versions to test: space separated string" + kubernetes-combos: + description: "Array of combinations of Kubernetes and Python versions to test" required: true type: string include-success-outputs: @@ -52,22 +48,22 @@ on: # yamllint disable-line rule:truthy description: "Whether to debug resources" required: true type: string +permissions: + contents: read jobs: tests-kubernetes: - timeout-minutes: 240 - name: "\ - K8S System:${{ matrix.executor }} - ${{ matrix.use-standard-naming }} - \ - ${{ inputs.kubernetes-versions-list-as-string }}" + timeout-minutes: 60 + name: "K8S System:${{ matrix.executor }}-${{ matrix.kubernetes-combo }}-${{ matrix.use-standard-naming }}" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} strategy: matrix: executor: [KubernetesExecutor, CeleryExecutor, LocalExecutor] use-standard-naming: [true, false] + kubernetes-combo: ${{ fromJSON(inputs.kubernetes-combos) }} fail-fast: false env: DEBUG_RESOURCES: ${{ inputs.debug-resources }} INCLUDE_SUCCESS_OUTPUTS: ${{ inputs.include-success-outputs }} - IMAGE_TAG: ${{ inputs.image-tag }} GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} @@ -76,55 +72,58 @@ jobs: - name: "Cleanup repo" shell: bash run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" + - name: "Prepare PYTHON_MAJOR_MINOR_VERSION and KUBERNETES_VERSION" + id: prepare-versions + env: + KUBERNETES_COMBO: ${{ matrix.kubernetes-combo }} + run: | + echo "PYTHON_MAJOR_MINOR_VERSION=${KUBERNETES_COMBO}" | sed 's/-.*//' >> $GITHUB_ENV + echo "KUBERNETES_VERSION=${KUBERNETES_COMBO}" | sed 's/=[^-]*-/=/' >> $GITHUB_ENV - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Install Breeze" - uses: ./.github/actions/breeze - id: breeze - - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: Pull PROD images ${{ inputs.python-versions-list-as-string }}:${{ inputs.image-tag }} - run: breeze prod-image pull --run-in-parallel --tag-as-latest - env: - PYTHON_VERSIONS: ${{ inputs.python-versions-list-as-string }} - # Force more parallelism for pull even on public images - PARALLELISM: 6 - - name: "Cache bin folder with tools for kubernetes testing" - uses: actions/cache@v4 + # env.PYTHON_MAJOR_MINOR_VERSION, env.KUBERNETES_VERSION are set in the previous + # step id: prepare-versions + - name: "Prepare breeze & PROD image: ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: ./.github/actions/prepare_breeze_and_image with: - path: ".build/.k8s-env" - key: "\ - k8s-env-${{ steps.breeze.outputs.host-python-version }}-\ - ${{ 
hashFiles('scripts/ci/kubernetes/k8s_requirements.txt','hatch_build.py') }}" - - name: "Switch breeze to use uv" - run: breeze setup config --use-uv - if: inputs.use-uv == 'true' - - name: Run complete K8S tests ${{ inputs.kubernetes-combos-list-as-string }} - run: breeze k8s run-complete-tests --run-in-parallel --upgrade --no-copy-local-sources + platform: ${{ inputs.platform }} + image-type: "prod" + python: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} + use-uv: ${{ inputs.use-uv }} + id: breeze + # preparing k8s environment with uv takes < 15 seconds with `uv` - there is no point in caching it. + - name: "\ + Run complete K8S tests ${{ matrix.executor }}-${{ env.PYTHON_MAJOR_MINOR_VERSION }}-\ + ${{env.KUBERNETES_VERSION}}-${{ matrix.use-standard-naming }}" + run: breeze k8s run-complete-tests --upgrade --no-copy-local-sources env: - PYTHON_VERSIONS: ${{ inputs.python-versions-list-as-string }} - KUBERNETES_VERSIONS: ${{ inputs.kubernetes-versions-list-as-string }} EXECUTOR: ${{ matrix.executor }} USE_STANDARD_NAMING: ${{ matrix.use-standard-naming }} VERBOSE: "false" - - name: Upload KinD logs on failure ${{ inputs.kubernetes-combos-list-as-string }} + - name: "\ + Upload KinD logs on failure ${{ matrix.executor }}-${{ matrix.kubernetes-combo }}-\ + ${{ matrix.use-standard-naming }}" uses: actions/upload-artifact@v4 if: failure() || cancelled() with: - name: kind-logs-${{ matrix.executor }}-${{ matrix.use-standard-naming }} + name: "\ + kind-logs-${{ matrix.kubernetes-combo }}-${{ matrix.executor }}-\ + ${{ matrix.use-standard-naming }}" path: /tmp/kind_logs_* - retention-days: 7 - - name: Upload test resource logs on failure ${{ inputs.kubernetes-combos-list-as-string }} + retention-days: '7' + - name: "\ + Upload test resource logs on failure ${{ matrix.executor }}-${{ matrix.kubernetes-combo }}-\ + ${{ matrix.use-standard-naming }}" uses: actions/upload-artifact@v4 if: failure() || cancelled() with: - name: k8s-test-resources-${{ matrix.executor }}-${{ matrix.use-standard-naming }} + name: "\ + k8s-test-resources-${{ matrix.kubernetes-combo }}-${{ matrix.executor }}-\ + ${{ matrix.use-standard-naming }}" path: /tmp/k8s_test_resources_* - retention-days: 7 + retention-days: '7' - name: "Delete clusters just in case they are left" run: breeze k8s delete-cluster --all if: always() diff --git a/.github/workflows/news-fragment.yml b/.github/workflows/news-fragment.yml new file mode 100644 index 0000000000000..46cb294d7a5b9 --- /dev/null +++ b/.github/workflows/news-fragment.yml @@ -0,0 +1,82 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
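# Note (editorial, illustrative): the workflow defined below runs only for pull
# requests labelled 'airflow3.0:breaking'. Its first step uses
# `uv tool run towncrier check --compare-with origin/${BASE_REF}`, which fails
# when no news fragment was added relative to the base branch; the second step
# greps the diff of newsfragments/*.significant.rst for the expected change-type
# headings (e.g. 'DAG changes', 'Config changes') and exits 1 if any is missing.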
+# +--- +name: CI + +on: # yamllint disable-line rule:truthy + pull_request: + types: [labeled, unlabeled, opened, reopened, synchronize] +permissions: + contents: read +jobs: + check-news-fragment: + name: Check News Fragment + runs-on: ubuntu-20.04 + if: "contains(github.event.pull_request.labels.*.name, 'airflow3.0:breaking')" + + steps: + - uses: actions/checkout@v4 + with: + persist-credentials: false + # `towncrier check` runs `git diff --name-only origin/main...`, which + # needs a non-shallow clone. + fetch-depth: 0 + + - name: Check news fragment existence + env: + BASE_REF: ${{ github.base_ref }} + run: > + python -m pip install --upgrade uv && + uv tool run towncrier check + --dir . + --config newsfragments/config.toml + --compare-with origin/${BASE_REF} + || + { + printf "\033[1;33mMissing significant newsfragment for PR labeled with + 'airflow3.0:breaking'.\nCheck + https://github.com/apache/airflow/blob/main/contributing-docs/16_contribution_workflow.rst + for guidance.\033[m\n" + && + false + ; } + + - name: Check news fragment contains change types + env: + BASE_REF: ${{ github.base_ref }} + run: > + change_types=( + 'DAG changes' + 'Config changes' + 'API changes' + 'CLI changes' + 'Behaviour changes' + 'Plugin changes' + 'Dependency change' + ) + news_fragment_content=`git diff origin/${BASE_REF} newsfragments/*.significant.rst` + + for type in "${change_types[@]}"; do + if [[ $news_fragment_content != *"$type"* ]]; then + printf "\033[1;33mMissing change type '$type' in significant newsfragment for PR labeled with + 'airflow3.0:breaking'.\nCheck + https://github.com/apache/airflow/blob/main/contributing-docs/16_contribution_workflow.rst + for guidance.\033[m\n" + exit 1 + fi + done diff --git a/.github/workflows/prod-image-build.yml b/.github/workflows/prod-image-build.yml index 75d9d0054ec78..470c149d26e6a 100644 --- a/.github/workflows/prod-image-build.yml +++ b/.github/workflows/prod-image-build.yml @@ -30,13 +30,6 @@ on: # yamllint disable-line rule:truthy variations. required: true type: string - do-build: - description: > - Whether to actually do the build (true/false). If set to false, the build is done - already in pull-request-target workflow, so we skip it here. - required: false - default: "true" - type: string upload-package-artifact: description: > Whether to upload package artifacts (true/false). If false, the job will rely on artifacts prepared @@ -62,6 +55,11 @@ on: # yamllint disable-line rule:truthy description: "Whether to push image to the registry (true/false)" required: true type: string + upload-image-artifact: + description: "Whether to upload docker image artifact" + required: false + default: "false" + type: string debian-version: description: "Base Debian distribution to use for the build (bookworm)" type: string @@ -74,10 +72,6 @@ on: # yamllint disable-line rule:truthy description: "Whether to use uv to build the image (true/false)" required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string python-versions: description: "JSON-formatted array of Python versions to build images from" required: true @@ -114,12 +108,22 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." 
+ required: true + type: string + prod-image-build: + description: "Whether this is a prod-image build (true/false)" + required: true + type: string +permissions: + contents: read jobs: - build-prod-packages: - name: "${{ inputs.do-build == 'true' && 'Build' || 'Skip building' }} Airflow and provider packages" + name: "Build Airflow and provider packages" timeout-minutes: 10 runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} + if: inputs.prod-image-build == 'true' env: PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" VERSION_SUFFIX_FOR_PYPI: ${{ inputs.branch == 'main' && 'dev0' || '' }} @@ -127,32 +131,23 @@ jobs: - name: "Cleanup repo" shell: bash run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + if: inputs.upload-package-artifact == 'true' - name: "Checkout target branch" uses: actions/checkout@v4 with: persist-credentials: false - - name: "Checkout target commit" - uses: ./.github/actions/checkout_target_commit - with: - target-commit-sha: ${{ inputs.target-commit-sha }} - pull-request-target: ${{ inputs.pull-request-target }} - is-committer-build: ${{ inputs.is-committer-build }} - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' - - uses: actions/setup-python@v5 - with: - python-version: "${{ inputs.default-python-version }}" - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + if: inputs.upload-package-artifact == 'true' - name: "Cleanup dist and context file" shell: bash run: rm -fv ./dist/* ./docker-context-files/* - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + if: inputs.upload-package-artifact == 'true' - name: "Install Breeze" uses: ./.github/actions/breeze - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + with: + use-uv: ${{ inputs.use-uv }} + if: inputs.upload-package-artifact == 'true' - name: "Prepare providers packages" shell: bash run: > @@ -160,23 +155,23 @@ jobs: --package-list-file ./prod_image_installed_providers.txt --package-format wheel if: > - inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' && inputs.build-provider-packages == 'true' - name: "Prepare chicken-eggs provider packages" shell: bash + env: + CHICKEN_EGG_PROVIDERS: ${{ inputs.chicken-egg-providers }} run: > breeze release-management prepare-provider-packages - --package-format wheel ${{ inputs.chicken-egg-providers }} + --package-format wheel ${CHICKEN_EGG_PROVIDERS} if: > - inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' && inputs.chicken-egg-providers != '' - name: "Prepare airflow package" shell: bash run: > breeze release-management prepare-airflow-package --package-format wheel - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + if: inputs.upload-package-artifact == 'true' - name: "Upload prepared packages as artifacts" uses: actions/upload-artifact@v4 with: @@ -184,25 +179,21 @@ jobs: path: ./dist retention-days: 7 if-no-files-found: error - if: inputs.do-build == 'true' && inputs.upload-package-artifact == 'true' + if: inputs.upload-package-artifact == 'true' build-prod-images: strategy: fail-fast: false matrix: - # yamllint disable-line rule:line-length - python-version: ${{ inputs.do-build == 'true' && fromJSON(inputs.python-versions) || 
fromJSON('[""]') }} + python-version: ${{ fromJSON(inputs.python-versions) || fromJSON('[""]') }} timeout-minutes: 80 - name: "\ -${{ inputs.do-build == 'true' && 'Build' || 'Skip building' }} \ -PROD ${{ inputs.build-type }} image\ -${{ matrix.python-version }}${{ inputs.do-build == 'true' && ':' || '' }}\ -${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" + name: "Build PROD ${{ inputs.build-type }} image ${{ matrix.python-version }}" runs-on: ${{ fromJSON(inputs.runs-on-as-json-public) }} needs: - build-prod-packages env: BACKEND: sqlite + PYTHON_MAJOR_MINOR_VERSION: "${{ matrix.python-version }}" DEFAULT_BRANCH: ${{ inputs.branch }} DEFAULT_CONSTRAINTS_BRANCH: ${{ inputs.constraints-branch }} VERSION_SUFFIX_FOR_PYPI: ${{ inputs.branch == 'main' && 'dev0' || '' }} @@ -216,88 +207,94 @@ ${{ inputs.do-build == 'true' && inputs.image-tag || '' }}" GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - USE_UV: ${{ inputs.use-uv }} VERBOSE: "true" steps: - name: "Cleanup repo" shell: bash run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - if: inputs.do-build == 'true' - name: "Checkout target branch" uses: actions/checkout@v4 with: persist-credentials: false - - name: "Checkout target commit" - uses: ./.github/actions/checkout_target_commit - with: - target-commit-sha: ${{ inputs.target-commit-sha }} - pull-request-target: ${{ inputs.pull-request-target }} - is-committer-build: ${{ inputs.is-committer-build }} - if: inputs.do-build == 'true' - name: "Cleanup docker" run: ./scripts/ci/cleanup_docker.sh - if: inputs.do-build == 'true' - name: "Install Breeze" uses: ./.github/actions/breeze - if: inputs.do-build == 'true' - - name: "Regenerate dependencies in case they was modified manually so that we can build an image" - shell: bash - run: | - pip install rich>=12.4.4 pyyaml - python scripts/ci/pre_commit/update_providers_dependencies.py - if: inputs.do-build == 'true' && inputs.upgrade-to-newer-dependencies != 'false' + with: + use-uv: ${{ inputs.use-uv }} - name: "Cleanup dist and context file" shell: bash run: rm -fv ./dist/* ./docker-context-files/* - if: inputs.do-build == 'true' - name: "Download packages prepared as artifacts" uses: actions/download-artifact@v4 with: name: prod-packages path: ./docker-context-files - if: inputs.do-build == 'true' - name: "Download constraints" uses: actions/download-artifact@v4 with: name: constraints path: ./docker-context-files - if: inputs.do-build == 'true' - - name: Login to ghcr.io - shell: bash - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - if: inputs.do-build == 'true' - - name: "Build PROD images w/ source providers ${{ matrix.python-version }}:${{ inputs.image-tag }}" + - name: "Login to ghcr.io" + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + ACTOR: ${{ github.actor }} + run: echo "${GITHUB_TOKEN}" | docker login ghcr.io -u ${ACTOR} --password-stdin + - name: "Build PROD images w/ source providers ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" shell: bash run: > - breeze prod-image build --tag-as-latest --image-tag "${{ inputs.image-tag }}" + breeze prod-image build + --builder airflow_cache --commit-sha "${{ github.sha }}" - --install-packages-from-context --airflow-constraints-mode constraints-source-providers - --use-constraints-for-context-packages --python "${{ matrix.python-version }}" + --install-packages-from-context + --airflow-constraints-mode 
constraints-source-providers + --use-constraints-for-context-packages env: PUSH: ${{ inputs.push-image }} DOCKER_CACHE: ${{ inputs.docker-cache }} + DISABLE_AIRFLOW_REPO_CACHE: ${{ inputs.disable-airflow-repo-cache }} DEBIAN_VERSION: ${{ inputs.debian-version }} INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} UPGRADE_TO_NEWER_DEPENDENCIES: ${{ inputs.upgrade-to-newer-dependencies }} INCLUDE_NOT_READY_PROVIDERS: "true" - if: inputs.do-build == 'true' && inputs.build-provider-packages == 'true' - - name: "Build PROD images with PyPi providers ${{ matrix.python-version }}:${{ inputs.image-tag }}" + if: inputs.build-provider-packages == 'true' + - name: "Build PROD images with PyPi providers ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" shell: bash run: > - breeze prod-image build --builder airflow_cache --tag-as-latest - --image-tag "${{ inputs.image-tag }}" --commit-sha "${{ github.sha }}" - --install-packages-from-context --airflow-constraints-mode constraints - --use-constraints-for-context-packages --python "${{ matrix.python-version }}" + breeze prod-image build + --builder airflow_cache + --commit-sha "${{ github.sha }}" + --install-packages-from-context + --airflow-constraints-mode constraints + --use-constraints-for-context-packages env: PUSH: ${{ inputs.push-image }} DOCKER_CACHE: ${{ inputs.docker-cache }} + DISABLE_AIRFLOW_REPO_CACHE: ${{ inputs.disable-airflow-repo-cache }} DEBIAN_VERSION: ${{ inputs.debian-version }} INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} UPGRADE_TO_NEWER_DEPENDENCIES: ${{ inputs.upgrade-to-newer-dependencies }} INCLUDE_NOT_READY_PROVIDERS: "true" - if: inputs.do-build == 'true' && inputs.build-provider-packages != 'true' - - name: Verify PROD image ${{ matrix.python-version }}:${{ inputs.image-tag }} + if: inputs.build-provider-packages != 'true' + - name: "Verify PROD image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + run: breeze prod-image verify + - name: Check free space + run: df -H + shell: bash + - name: Make /mnt/ directory writeable + run: sudo chown -R ${USER} /mnt + shell: bash + - name: "Export PROD docker image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + env: + PLATFORM: ${{ inputs.platform }} run: > - breeze prod-image verify --image-tag "${{ inputs.image-tag }}" - --python "${{ matrix.python-version }}" - if: inputs.do-build == 'true' + breeze prod-image save --platform "${PLATFORM}" --image-file-dir "/mnt" + if: inputs.upload-image-artifact == 'true' + - name: "Stash PROD docker image ${{ env.PYTHON_MAJOR_MINOR_VERSION }}" + uses: apache/infrastructure-actions/stash/save@c94b890bbedc2fc61466d28e6bd9966bc6c6643c + with: + key: prod-image-save-${{ inputs.platform }}-${{ env.PYTHON_MAJOR_MINOR_VERSION }} + path: "/mnt/prod-image-save-*-${{ env.PYTHON_MAJOR_MINOR_VERSION }}.tar" + if-no-files-found: 'error' + retention-days: '2' + if: inputs.upload-image-artifact == 'true' diff --git a/.github/workflows/prod-image-extra-checks.yml b/.github/workflows/prod-image-extra-checks.yml index 82d327ba2f16d..56fa4b2b1a28d 100644 --- a/.github/workflows/prod-image-extra-checks.yml +++ b/.github/workflows/prod-image-extra-checks.yml @@ -40,9 +40,6 @@ on: # yamllint disable-line rule:truthy description: "Whether to use uv to build the image (true/false)" required: true type: string - image-tag: - required: true - type: string build-provider-packages: description: "Whether to build provider packages (true/false). 
If false providers are from PyPI" required: true @@ -63,14 +60,20 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." + required: true + type: string +permissions: + contents: read jobs: myssql-client-image: uses: ./.github/workflows/prod-image-build.yml with: runs-on-as-json-public: ${{ inputs.runs-on-as-json-public }} build-type: "MySQL Client" + upload-image-artifact: "false" upload-package-artifact: "false" - image-tag: mysql-${{ inputs.image-tag }} install-mysql-client-type: "mysql" python-versions: ${{ inputs.python-versions }} default-python-version: ${{ inputs.default-python-version }} @@ -84,6 +87,8 @@ jobs: chicken-egg-providers: ${{ inputs.chicken-egg-providers }} constraints-branch: ${{ inputs.constraints-branch }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} + prod-image-build: "true" pip-image: uses: ./.github/workflows/prod-image-build.yml @@ -92,8 +97,8 @@ jobs: with: runs-on-as-json-public: ${{ inputs.runs-on-as-json-public }} build-type: "pip" + upload-image-artifact: "false" upload-package-artifact: "false" - image-tag: mysql-${{ inputs.image-tag }} install-mysql-client-type: "mysql" python-versions: ${{ inputs.python-versions }} default-python-version: ${{ inputs.default-python-version }} @@ -107,3 +112,5 @@ jobs: chicken-egg-providers: ${{ inputs.chicken-egg-providers }} constraints-branch: ${{ inputs.constraints-branch }} docker-cache: ${{ inputs.docker-cache }} + disable-airflow-repo-cache: ${{ inputs.disable-airflow-repo-cache }} + prod-image-build: "true" diff --git a/.github/workflows/push-image-cache.yml b/.github/workflows/push-image-cache.yml index 0dc83a3fd66ea..86ec3b2a85a86 100644 --- a/.github/workflows/push-image-cache.yml +++ b/.github/workflows/push-image-cache.yml @@ -76,6 +76,12 @@ on: # yamllint disable-line rule:truthy description: "Docker cache specification to build the image (registry, local, disabled)." required: true type: string + disable-airflow-repo-cache: + description: "Disable airflow repo cache read from main." 
+ required: true + type: string +permissions: + contents: read jobs: push-ci-image-cache: name: "Push CI ${{ inputs.cache-type }}:${{ matrix.python }} image cache " @@ -100,12 +106,13 @@ jobs: DEFAULT_BRANCH: ${{ inputs.branch }} DEFAULT_CONSTRAINTS_BRANCH: ${{ inputs.constraints-branch }} DOCKER_CACHE: ${{ inputs.docker-cache }} + DISABLE_AIRFLOW_REPO_CACHE: ${{ inputs.disable-airflow-repo-cache }} GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} INCLUDE_SUCCESS_OUTPUTS: "${{ inputs.include-success-outputs }}" INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} - USE_UV: ${{ inputs.use-uv }} + PYTHON_MAJOR_MINOR_VERSION: "${{ matrix.python }}" UPGRADE_TO_NEWER_DEPENDENCIES: "false" VERBOSE: "true" VERSION_SUFFIX_FOR_PYPI: "dev0" @@ -121,23 +128,33 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze - - name: "Start ARM instance" - run: ./scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh - if: inputs.platform == 'linux/arm64' + with: + use-uv: ${{ inputs.use-uv }} - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: "Push CI ${{ inputs.cache-type }} cache: ${{ matrix.python }} ${{ inputs.platform }}" - run: > - breeze ci-image build --builder airflow_cache --prepare-buildx-cache - --platform "${{ inputs.platform }}" --python ${{ matrix.python }} - - name: "Stop ARM instance" - run: ./scripts/ci/images/ci_stop_arm_instance.sh - if: always() && inputs.platform == 'linux/arm64' - - name: "Push CI latest images: ${{ matrix.python }} (linux/amd64 only)" + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + ACTOR: ${{ github.actor }} + run: echo "${GITHUB_TOKEN}" | docker login ghcr.io -u ${ACTOR} --password-stdin + - name: "Push CI latest images: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} (linux/amd64 only)" + env: + PLATFORM: ${{ inputs.platform }} run: > - breeze ci-image build --builder airflow_cache --push - --python "${{ matrix.python }}" --platform "${{ inputs.platform }}" + breeze + ci-image build + --builder airflow_cache + --platform "${PLATFORM}" + --push if: inputs.push-latest-images == 'true' && inputs.platform == 'linux/amd64' + # yamllint disable-line rule:line-length + - name: "Push CI ${{ inputs.cache-type }} cache:${{ env.PYTHON_MAJOR_MINOR_VERSION }}:${{ inputs.platform }}" + env: + PLATFORM: ${{ inputs.platform }} + run: > + breeze ci-image build + --builder airflow_cache + --prepare-buildx-cache + --platform "${PLATFORM}" + --push push-prod-image-cache: name: "Push PROD ${{ inputs.cache-type }}:${{ matrix.python }} image cache" @@ -162,12 +179,13 @@ jobs: DEFAULT_BRANCH: ${{ inputs.branch }} DEFAULT_CONSTRAINTS_BRANCH: ${{ inputs.constraints-branch }} DOCKER_CACHE: ${{ inputs.docker-cache }} + DISABLE_AIRFLOW_REPO_CACHE: ${{ inputs.disable-airflow-repo-cache }} GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} INSTALL_MYSQL_CLIENT_TYPE: ${{ inputs.install-mysql-client-type }} + PYTHON_MAJOR_MINOR_VERSION: "${{ matrix.python }}" UPGRADE_TO_NEWER_DEPENDENCIES: "false" - USE_UV: ${{ inputs.branch == 'main' && inputs.use-uv || 'false' }} VERBOSE: "true" VERSION_SUFFIX_FOR_PYPI: "dev0" if: inputs.include-prod-images == 'true' @@ -183,6 +201,8 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: ${{ inputs.use-uv }} - 
name: "Cleanup dist and context file" run: rm -fv ./dist/* ./docker-context-files/* - name: "Download packages prepared as artifacts" @@ -190,25 +210,33 @@ jobs: with: name: prod-packages path: ./docker-context-files - - name: "Start ARM instance" - run: ./scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh - if: inputs.platform == 'linux/arm64' - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin - - name: "Push PROD ${{ inputs.cache-type }} cache: ${{ matrix.python-version }} ${{ inputs.platform }}" - run: > - breeze prod-image build --builder airflow_cache - --prepare-buildx-cache --platform "${{ inputs.platform }}" - --install-packages-from-context --airflow-constraints-mode constraints-source-providers - --python ${{ matrix.python }} - - name: "Stop ARM instance" - run: ./scripts/ci/images/ci_stop_arm_instance.sh - if: always() && inputs.platform == 'linux/arm64' + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + ACTOR: ${{ github.actor }} + run: echo "${GITHUB_TOKEN}" | docker login ghcr.io -u ${ACTOR} --password-stdin # We only push "AMD" images as it is really only needed for any kind of automated builds in CI # and currently there is not an easy way to make multi-platform image from two separate builds # and we can do it after we stopped the ARM instance as it is not needed anymore - - name: "Push PROD latest image: ${{ matrix.python }} (linux/amd64 ONLY)" + - name: "Push PROD latest image: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} (linux/amd64 ONLY)" + env: + PLATFORM: ${{ inputs.platform }} run: > - breeze prod-image build --builder airflow_cache --install-packages-from-context - --push --platform "${{ inputs.platform }}" + breeze prod-image build + --builder airflow_cache + --install-packages-from-context + --platform "${PLATFORM}" + --airflow-constraints-mode constraints-source-providers if: inputs.push-latest-images == 'true' && inputs.platform == 'linux/amd64' + # yamllint disable-line rule:line-length + - name: "Push PROD ${{ inputs.cache-type }} cache: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} ${{ inputs.platform }}" + env: + PLATFORM: ${{ inputs.platform }} + run: > + breeze prod-image build + --builder airflow_cache + --prepare-buildx-cache + --install-packages-from-context + --platform "${PLATFORM}" + --airflow-constraints-mode constraints-source-providers + --push diff --git a/.github/workflows/recheck-old-bug-report.yml b/.github/workflows/recheck-old-bug-report.yml index ee14cfde5f757..217092b86f87e 100644 --- a/.github/workflows/recheck-old-bug-report.yml +++ b/.github/workflows/recheck-old-bug-report.yml @@ -45,6 +45,7 @@ jobs: remove-stale-when-updated: false remove-issue-stale-when-updated: true labels-to-add-when-unstale: 'needs-triage' + labels-to-remove-when-unstale: 'Stale Bug Report' stale-issue-message: > This issue has been automatically marked as stale because it has been open for 365 days without any activity. There has been several Airflow releases since last activity on this issue. 
diff --git a/.github/workflows/release_dockerhub_image.yml b/.github/workflows/release_dockerhub_image.yml index 5ce1585131f76..b8758146cc1b1 100644 --- a/.github/workflows/release_dockerhub_image.yml +++ b/.github/workflows/release_dockerhub_image.yml @@ -63,11 +63,14 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: "false" - name: Selective checks id: selective-checks env: VERBOSE: "false" run: breeze ci selective-check 2>> ${GITHUB_OUTPUT} + release-images: timeout-minutes: 120 name: "Release images: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }}" @@ -99,6 +102,8 @@ jobs: run: ./scripts/ci/cleanup_docker.sh - name: "Install Breeze" uses: ./.github/actions/breeze + with: + use-uv: "false" - name: Free space run: breeze ci free-space --answer yes - name: "Cleanup dist and context file" @@ -108,7 +113,10 @@ jobs: echo ${{ secrets.DOCKERHUB_TOKEN }} | docker login --password-stdin --username ${{ secrets.DOCKERHUB_USER }} - name: Login to ghcr.io - run: echo "${{ env.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.actor }} --password-stdin + env: + GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} + ACTOR: ${{ github.actor }} + run: echo "${GITHUB_TOKEN}" | docker login ghcr.io -u ${ACTOR} --password-stdin - name: "Install buildx plugin" # yamllint disable rule:line-length run: | @@ -141,10 +149,12 @@ jobs: # from the source code, not from the PyPI because they have apache-airflow>=X.Y.Z dependency # And when we prepare them from sources they will have apache-airflow>=X.Y.Z.dev0 shell: bash + env: + CHICKEN_EGG_PROVIDERS: ${{ needs.build-info.outputs.chicken-egg-providers }} run: > breeze release-management prepare-provider-packages --package-format wheel - --version-suffix-for-pypi dev0 ${{ needs.build-info.outputs.chicken-egg-providers }} + --version-suffix-for-pypi dev0 ${CHICKEN_EGG_PROVIDERS} if: needs.build-info.outputs.chicken-egg-providers != '' - name: "Copy dist packages to docker-context files" shell: bash @@ -152,42 +162,61 @@ jobs: if: needs.build-info.outputs.chicken-egg-providers != '' - name: > Release regular images: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }} - run: > - breeze release-management release-prod-images - --dockerhub-repo ${{ github.repository }} - --airflow-version ${{ github.event.inputs.airflowVersion }} - ${{ needs.build-info.outputs.skipLatest }} - ${{ needs.build-info.outputs.limitPlatform }} - --limit-python ${{ matrix.python-version }} - --chicken-egg-providers "${{ needs.build-info.outputs.chicken-egg-providers }}" env: COMMIT_SHA: ${{ github.sha }} - - name: > - Release slim images: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }} + REPOSITORY: ${{ github.repository }} + PYTHON_VERSION: ${{ matrix.python-version }} + AIRFLOW_VERSION: ${{ github.event.inputs.airflowVersion }} + SKIP_LATEST: ${{ needs.build-info.outputs.skipLatest }} + LIMIT_PLATFORM: ${{ needs.build-info.outputs.limitPlatform }} + CHICKEN_EGG_PROVIDERS: ${{ needs.build-info.outputs.chicken-egg-providers }} run: > breeze release-management release-prod-images - --dockerhub-repo ${{ github.repository }} - --airflow-version ${{ github.event.inputs.airflowVersion }} - ${{ needs.build-info.outputs.skipLatest }} - ${{ needs.build-info.outputs.limitPlatform }} - --limit-python ${{ matrix.python-version }} --slim-images + --dockerhub-repo "${REPOSITORY}" + --airflow-version "${AIRFLOW_VERSION}" + ${SKIP_LATEST} + ${LIMIT_PLATFORM} + --limit-python 
${PYTHON_VERSION} + --chicken-egg-providers ${CHICKEN_EGG_PROVIDERS} + - name: > + Release slim images: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }} env: COMMIT_SHA: ${{ github.sha }} + REPOSITORY: ${{ github.repository }} + PYTHON_VERSION: ${{ matrix.python-version }} + AIRFLOW_VERSION: ${{ github.event.inputs.airflowVersion }} + SKIP_LATEST: ${{ needs.build-info.outputs.skipLatest }} + LIMIT_PLATFORM: ${{ needs.build-info.outputs.limitPlatform }} + run: > + breeze release-management release-prod-images + --dockerhub-repo "${REPOSITORY}" + --airflow-version "${AIRFLOW_VERSION}" + ${SKIP_LATEST} + ${LIMIT_PLATFORM} + --limit-python ${PYTHON_VERSION} --slim-images - name: > Verify regular AMD64 image: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }} + env: + PYTHON_VERSION: ${{ matrix.python-version }} + AIRFLOW_VERSION: ${{ github.event.inputs.airflowVersion }} + REPOSITORY: ${{ github.repository }} run: > breeze prod-image verify --pull --image-name - ${{github.repository}}:${{github.event.inputs.airflowVersion}}-python${{matrix.python-version}} + ${REPOSITORY}:${AIRFLOW_VERSION}-python${PYTHON_VERSION} - name: > Verify slim AMD64 image: ${{ github.event.inputs.airflowVersion }}, ${{ matrix.python-version }} + env: + PYTHON_VERSION: ${{ matrix.python-version }} + AIRFLOW_VERSION: ${{ github.event.inputs.airflowVersion }} + REPOSITORY: ${{ github.repository }} run: > breeze prod-image verify --pull --slim-image --image-name - ${{github.repository}}:slim-${{github.event.inputs.airflowVersion}}-python${{matrix.python-version}} + ${REPOSITORY}:slim-${AIRFLOW_VERSION}-python${PYTHON_VERSION} - name: "Docker logout" run: docker logout if: always() diff --git a/.github/workflows/run-unit-tests.yml b/.github/workflows/run-unit-tests.yml index 2989a952d9ea2..e67d59ee08d37 100644 --- a/.github/workflows/run-unit-tests.yml +++ b/.github/workflows/run-unit-tests.yml @@ -24,6 +24,10 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining default runner used for the build." required: true type: string + test-groups: + description: "The json representing list of test groups to run" + required: true + type: string backend: description: "The backend to run the tests on" required: true @@ -41,10 +45,6 @@ on: # yamllint disable-line rule:truthy required: false default: ":" type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string python-versions: description: "The list of python versions (stringified JSON array) to run the tests on." required: true @@ -53,12 +53,20 @@ on: # yamllint disable-line rule:truthy description: "The list of backend versions (stringified JSON array) to run the tests on."
required: true type: string + excluded-providers-as-string: + description: "Excluded providers (per Python version) as json string" + required: true + type: string excludes: description: "Excluded combos (stringified JSON array of python-version/backend-version dicts)" required: true type: string - parallel-test-types-list-as-string: - description: "The list of parallel test types to run separated by spaces" + core-test-types-list-as-string: + description: "The list of core test types to run separated by spaces" + required: true + type: string + providers-test-types-list-as-string: + description: "The list of providers test types to run separated by spaces" required: true type: string run-migration-tests: @@ -89,26 +97,11 @@ on: # yamllint disable-line rule:truthy required: false default: "false" type: string - pydantic: - description: "The version of pydantic to use" - required: false - default: "v2" - type: string downgrade-pendulum: description: "Whether to downgrade pendulum or not (true/false)" required: false default: "false" type: string - enable-aip-44: - description: "Whether to enable AIP-44 or not (true/false)" - required: false - default: "true" - type: string - database-isolation: - description: "Whether to enable database isolattion or not (true/false)" - required: false - default: "false" - type: string force-lowest-dependencies: description: "Whether to force lowest dependencies for the tests or not (true/false)" required: false @@ -119,13 +112,19 @@ on: # yamllint disable-line rule:truthy required: false default: 20 type: number + use-uv: + description: "Whether to use uv" + required: true + type: string +permissions: + contents: read jobs: tests: timeout-minutes: 120 name: "\ - ${{ inputs.test-scope }}:\ + ${{ inputs.test-scope }}-${{ matrix.test-group }}:\ ${{ inputs.test-name }}${{ inputs.test-name-separator }}${{ matrix.backend-version }}:\ - ${{matrix.python-version}}: ${{ inputs.parallel-test-types-list-as-string }}" + ${{matrix.python-version}}" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} strategy: fail-fast: false @@ -133,9 +132,8 @@ jobs: python-version: "${{fromJSON(inputs.python-versions)}}" backend-version: "${{fromJSON(inputs.backend-versions)}}" exclude: "${{fromJSON(inputs.excludes)}}" + test-group: "${{fromJSON(inputs.test-groups)}}" env: - # yamllint disable rule:line-length - AIRFLOW_ENABLE_AIP_44: "${{ inputs.enable-aip-44 }}" BACKEND: "${{ inputs.backend }}" BACKEND_VERSION: "${{ matrix.backend-version }}" DB_RESET: "true" @@ -143,21 +141,20 @@ jobs: DOWNGRADE_SQLALCHEMY: "${{ inputs.downgrade-sqlalchemy }}" DOWNGRADE_PENDULUM: "${{ inputs.downgrade-pendulum }}" ENABLE_COVERAGE: "${{ inputs.run-coverage }}" + EXCLUDED_PROVIDERS: "${{ inputs.excluded-providers-as-string }}" FORCE_LOWEST_DEPENDENCIES: "${{ inputs.force-lowest-dependencies }}" GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" INCLUDE_SUCCESS_OUTPUTS: ${{ inputs.include-success-outputs }} # yamllint disable rule:line-length - JOB_ID: "${{ inputs.test-scope }}-${{ inputs.test-name }}-${{inputs.backend}}-${{ matrix.backend-version }}-${{ matrix.python-version }}" + JOB_ID: "${{ matrix.test-group }}-${{ inputs.test-scope }}-${{ inputs.test-name }}-${{inputs.backend}}-${{ matrix.backend-version }}-${{ matrix.python-version }}" MOUNT_SOURCES: "skip" - PARALLEL_TEST_TYPES: "${{ inputs.parallel-test-types-list-as-string }}" - PYDANTIC: "${{ inputs.pydantic }}" + # yamllint 
disable rule:line-length + PARALLEL_TEST_TYPES: ${{ matrix.test-group == 'core' && inputs.core-test-types-list-as-string || inputs.providers-test-types-list-as-string }} PYTHON_MAJOR_MINOR_VERSION: "${{ matrix.python-version }}" UPGRADE_BOTO: "${{ inputs.upgrade-boto }}" AIRFLOW_MONITOR_DELAY_TIME_IN_SECONDS: "${{inputs.monitor-delay-time-in-seconds}}" - DATABASE_ISOLATION: "${{ inputs.database-isolation }}" VERBOSE: "true" steps: - name: "Cleanup repo" @@ -167,38 +164,23 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{matrix.python-version}}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ matrix.python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ matrix.python-version }} + use-uv: ${{ inputs.use-uv }} - name: > - Migration Tests: - ${{ matrix.python-version }}:${{ inputs.parallel-test-types-list-as-string }} + Migration Tests: ${{ matrix.python-version }}:${{ env.PARALLEL_TEST_TYPES }} uses: ./.github/actions/migration_tests - if: inputs.run-migration-tests == 'true' + if: inputs.run-migration-tests == 'true' && matrix.test-group == 'core' - name: > - ${{ inputs.test-scope }} Tests ${{ inputs.test-name }} ${{ matrix.backend-version }} - Py${{ matrix.python-version }}:${{ inputs.parallel-test-types-list-as-string}} - run: | - if [[ "${{ inputs.test-scope }}" == "DB" ]]; then - breeze testing db-tests \ - --parallel-test-types "${{ inputs.parallel-test-types-list-as-string }}" - elif [[ "${{ inputs.test-scope }}" == "Non-DB" ]]; then - breeze testing non-db-tests \ - --parallel-test-types "${{ inputs.parallel-test-types-list-as-string }}" - elif [[ "${{ inputs.test-scope }}" == "All" ]]; then - breeze testing tests --run-in-parallel \ - --parallel-test-types "${{ inputs.parallel-test-types-list-as-string }}" - elif [[ "${{ inputs.test-scope }}" == "Quarantined" ]]; then - breeze testing tests --test-type "All-Quarantined" || true - elif [[ "${{ inputs.test-scope }}" == "ARM collection" ]]; then - breeze testing tests --collect-only --remove-arm-packages - elif [[ "${{ inputs.test-scope }}" == "System" ]]; then - breeze testing tests tests/system/example_empty.py --system core - else - echo "Unknown test scope: ${{ inputs.test-scope }}" - exit 1 - fi + ${{ matrix.test-group}}:${{ inputs.test-scope }} Tests ${{ inputs.test-name }} ${{ matrix.backend-version }} + Py${{ matrix.python-version }}:${{ env.PARALLEL_TEST_TYPES }} + env: + TEST_GROUP: "${{ matrix.test-group }}" + TEST_SCOPE: "${{ inputs.test-scope }}" + run: ./scripts/ci/testing/run_unit_tests.sh "${TEST_GROUP}" "${TEST_SCOPE}" - name: "Post Tests success" uses: ./.github/actions/post_tests_success with: @@ -207,4 +189,4 @@ jobs: if: success() - name: "Post Tests failure" uses: ./.github/actions/post_tests_failure - if: failure() + if: failure() || cancelled() diff --git a/.github/workflows/special-tests.yml b/.github/workflows/special-tests.yml index 9a2a4cad330be..8507294e535c6 100644 --- a/.github/workflows/special-tests.yml +++ b/.github/workflows/special-tests.yml @@ -28,12 +28,16 @@ on: # yamllint disable-line rule:truthy description: "The default branch for the repository" required: true type: string - image-tag: - description: "Tag to set for the image" + test-groups: + description: "The json representing list of test groups to run" required: true type: string - parallel-test-types-list-as-string: - description:
"The list of parallel test types to run separated by spaces" + core-test-types-list-as-string: + description: "The list of core test types to run separated by spaces" + required: true + type: string + providers-test-types-list-as-string: + description: "The list of providers test types to run separated by spaces" required: true type: string run-coverage: @@ -44,6 +48,10 @@ on: # yamllint disable-line rule:truthy description: "Which version of python should be used by default" required: true type: string + excluded-providers-as-string: + description: "Excluded providers (per Python version) as json string" + required: true + type: string python-versions: description: "The list of python versions (stringified JSON array) to run the tests on." required: true @@ -60,10 +68,20 @@ on: # yamllint disable-line rule:truthy description: "Whether to upgrade to newer dependencies or not (true/false)" required: true type: string + include-success-outputs: + description: "Whether to include success outputs or not (true/false)" + required: true + type: string debug-resources: description: "Whether to debug resources or not (true/false)" required: true type: string + use-uv: + description: "Whether to use uv or not (true/false)" + required: true + type: string +permissions: + contents: read jobs: tests-min-sqlalchemy: name: "Min SQLAlchemy test" @@ -77,15 +95,17 @@ jobs: downgrade-sqlalchemy: "true" test-name: "MinSQLAlchemy-Postgres" test-scope: "DB" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} tests-boto: name: "Latest Boto test" @@ -99,59 +119,18 @@ jobs: upgrade-boto: "true" test-name: "LatestBoto-Postgres" test-scope: "All" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} - python-versions: "['${{ inputs.default-python-version }}']" - backend-versions: "['${{ inputs.default-postgres-version }}']" - excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} - run-coverage: ${{ inputs.run-coverage }} - debug-resources: ${{ inputs.debug-resources }} - - tests-pydantic-v1: - name: "Pydantic v1 test" - uses: ./.github/workflows/run-unit-tests.yml - permissions: - contents: read - packages: read - secrets: inherit - with: - runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} - pydantic: "v1" - test-name: "Pydantic-V1-Postgres" - test-scope: "All" - backend: "postgres" - image-tag: ${{ inputs.image-tag }} - python-versions: "['${{ inputs.default-python-version }}']" - backend-versions: "['${{ inputs.default-postgres-version }}']" - excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} - 
run-coverage: ${{ inputs.run-coverage }} - debug-resources: ${{ inputs.debug-resources }} - - tests-pydantic-none: - name: "Pydantic removed test" - uses: ./.github/workflows/run-unit-tests.yml - permissions: - contents: read - packages: read - secrets: inherit - with: - runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} - pydantic: "none" - test-name: "Pydantic-Removed-Postgres" - test-scope: "All" - backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} + include-success-outputs: ${{ inputs.include-success-outputs }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} tests-pendulum-2: name: "Pendulum2 test" @@ -165,60 +144,18 @@ jobs: downgrade-pendulum: "true" test-name: "Pendulum2-Postgres" test-scope: "All" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} - python-versions: "['${{ inputs.default-python-version }}']" - backend-versions: "['${{ inputs.default-postgres-version }}']" - excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} - run-coverage: ${{ inputs.run-coverage }} - debug-resources: ${{ inputs.debug-resources }} - - tests-in-progress-disabled: - name: "In progress disabled test" - uses: ./.github/workflows/run-unit-tests.yml - permissions: - contents: read - packages: read - secrets: inherit - with: - runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} - enable-aip-44: "false" - test-name: "InProgressDisabled-Postgres" - test-scope: "All" - backend: "postgres" - image-tag: ${{ inputs.image-tag }} - python-versions: "['${{ inputs.default-python-version }}']" - backend-versions: "['${{ inputs.default-postgres-version }}']" - excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} - run-coverage: ${{ inputs.run-coverage }} - debug-resources: ${{ inputs.debug-resources }} - - tests-database-isolation: - name: "Database isolation test" - uses: ./.github/workflows/run-unit-tests.yml - permissions: - contents: read - packages: read - secrets: inherit - with: - runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} - enable-aip-44: "true" - database-isolation: "true" - test-name: "DatabaseIsolation-Postgres" - test-scope: "DB" - backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} 
+ providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} + include-success-outputs: ${{ inputs.include-success-outputs }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} tests-quarantined: name: "Quarantined test" @@ -231,15 +168,18 @@ jobs: runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} test-name: "Postgres" test-scope: "Quarantined" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} + include-success-outputs: ${{ inputs.include-success-outputs }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} tests-arm-collection: name: "ARM Collection test" @@ -252,19 +192,23 @@ jobs: runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} test-name: "Postgres" test-scope: "ARM collection" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} + include-success-outputs: ${{ inputs.include-success-outputs }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} if: ${{ inputs.default-branch == 'main' }} + # matrix.test-group comes from run-unit-tests.yml tests-system: - name: "System test" + name: "System test: ${{ matrix.test-group }}" uses: ./.github/workflows/run-unit-tests.yml permissions: contents: read @@ -274,12 +218,15 @@ jobs: runs-on-as-json-default: ${{ inputs.runs-on-as-json-default }} test-name: "SystemTest" test-scope: "System" + test-groups: ${{ inputs.test-groups }} backend: "postgres" - image-tag: ${{ inputs.image-tag }} python-versions: "['${{ inputs.default-python-version }}']" backend-versions: "['${{ inputs.default-postgres-version }}']" + excluded-providers-as-string: ${{ inputs.excluded-providers-as-string }} excludes: "[]" - parallel-test-types-list-as-string: ${{ inputs.parallel-test-types-list-as-string }} - include-success-outputs: ${{ needs.build-info.outputs.include-success-outputs }} + core-test-types-list-as-string: ${{ inputs.core-test-types-list-as-string }} + providers-test-types-list-as-string: ${{ inputs.providers-test-types-list-as-string }} + include-success-outputs: ${{ inputs.include-success-outputs }} run-coverage: ${{ inputs.run-coverage }} debug-resources: ${{ inputs.debug-resources }} + use-uv: ${{ inputs.use-uv }} diff --git 
a/.github/workflows/check-providers.yml b/.github/workflows/test-provider-packages.yml similarity index 57% rename from .github/workflows/check-providers.yml rename to .github/workflows/test-provider-packages.yml index e89d4a81faaca..b0912fa6dfe37 100644 --- a/.github/workflows/check-providers.yml +++ b/.github/workflows/test-provider-packages.yml @@ -24,10 +24,6 @@ on: # yamllint disable-line rule:truthy description: "The array of labels (in json form) determining default runner used for the build." required: true type: string - image-tag: - description: "Tag to set for the image" - required: true - type: string canary-run: description: "Whether this is a canary run" required: true @@ -40,13 +36,13 @@ on: # yamllint disable-line rule:truthy description: "Whether to upgrade to newer dependencies" required: true type: string - affected-providers-list-as-string: + selected-providers-list-as-string: description: "List of affected providers as string" required: false type: string - providers-compatibility-checks: + providers-compatibility-tests-matrix: description: > - JSON-formatted array of providers compatibility checks in the form of array of dicts + JSON-formatted array of providers compatibility tests in the form of array of dicts (airflow-version, python-versions, remove-providers, run-tests) required: true type: string @@ -54,7 +50,7 @@ on: # yamllint disable-line rule:truthy description: "List of parallel provider test types as string" required: true type: string - skip-provider-tests: + skip-providers-tests: description: "Whether to skip provider tests (true/false)" required: true type: string @@ -62,16 +58,25 @@ on: # yamllint disable-line rule:truthy description: "JSON-formatted array of Python versions to build images from" required: true type: string + use-uv: + description: "Whether to use uv" + required: true + type: string +permissions: + contents: read jobs: - prepare-install-verify-provider-packages-wheel: + prepare-install-verify-provider-packages: timeout-minutes: 80 - name: "Provider packages wheel build and verify" + name: "Providers ${{ matrix.package-format }} tests" runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} + strategy: + fail-fast: false + matrix: + package-format: ["wheel", "sdist"] env: GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" INCLUDE_NOT_READY_PROVIDERS: "true" PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" VERBOSE: "true" @@ -83,138 +88,93 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: > - Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }} + - name: "Prepare breeze & CI image: ${{ inputs.default-python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ inputs.default-python-version }} + use-uv: ${{ inputs.use-uv }} - name: "Cleanup dist files" run: rm -fv ./dist/* - name: "Prepare provider documentation" run: > breeze release-management prepare-provider-documentation --include-not-ready-providers --non-interactive - - name: "Prepare provider packages: wheel" + if: matrix.package-format == 'wheel' + - name: "Prepare provider packages: ${{ matrix.package-format }}" run: > breeze release-management prepare-provider-packages --include-not-ready-providers - --version-suffix-for-pypi dev0 --package-format wheel - - name: "Prepare 
airflow package: wheel" - run: breeze release-management prepare-airflow-package --version-suffix-for-pypi dev0 - - name: "Verify wheel packages with twine" + --version-suffix-for-pypi dev0 --package-format ${{ matrix.package-format }} + - name: "Prepare airflow package: ${{ matrix.package-format }}" + run: > + breeze release-management prepare-airflow-package --version-suffix-for-pypi dev0 + --package-format ${{ matrix.package-format }} + - name: "Verify ${{ matrix.package-format }} packages with twine" run: | - pipx uninstall twine || true - pipx install twine && twine check dist/*.whl + uv tool uninstall twine || true + uv tool install twine && twine check dist/* - name: "Test providers issue generation automatically" run: > breeze release-management generate-issue-content-providers --only-available-in-dist --disable-progress + if: matrix.package-format == 'wheel' + - name: Remove Python 3.9-incompatible provider packages + run: | + echo "Removing Python 3.9-incompatible provider: cloudant" + rm -vf dist/*cloudant* - name: "Generate source constraints from CI image" shell: bash run: > breeze release-management generate-constraints --airflow-constraints-mode constraints-source-providers --answer yes - - name: "Install and verify all provider packages and airflow via wheel files" - run: > - breeze release-management verify-provider-packages - --use-packages-from-dist - --package-format wheel - --use-airflow-version wheel - --airflow-constraints-reference default - --providers-constraints-location - /files/constraints-${{env.PYTHON_MAJOR_MINOR_VERSION}}/constraints-source-providers-${{env.PYTHON_MAJOR_MINOR_VERSION}}.txt + - name: "Install and verify wheel provider packages" env: + PACKAGE_FORMAT: ${{ matrix.package-format }} + PYTHON_MAJOR_MINOR_VERSION: ${env.PYTHON_MAJOR_MINOR_VERSION} AIRFLOW_SKIP_CONSTRAINTS: "${{ inputs.upgrade-to-newer-dependencies }}" - - name: "Prepare airflow package: wheel without suffix and skipping the tag check" run: > - breeze release-management prepare-provider-packages --skip-tag-check --package-format wheel - - prepare-install-provider-packages-sdist: - timeout-minutes: 80 - name: "Provider packages sdist build and install" - runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} - env: - GITHUB_REPOSITORY: ${{ github.repository }} - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" - INCLUDE_NOT_READY_PROVIDERS: "true" - PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" - VERBOSE: "true" - steps: - - name: "Cleanup repo" - shell: bash - run: docker run -v "${GITHUB_WORKSPACE}:/workspace" -u 0:0 bash -c "rm -rf /workspace/*" - - name: "Checkout ${{ github.ref }} ( ${{ github.sha }} )" - uses: actions/checkout@v4 - with: - persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: > - Prepare breeze & CI image: ${{ inputs.default-python-version }}:${{ inputs.image-tag }} - uses: ./.github/actions/prepare_breeze_and_image - - name: "Cleanup dist files" - run: rm -fv ./dist/* - - name: "Prepare provider packages: sdist" - run: > - breeze release-management prepare-provider-packages --include-not-ready-providers - --version-suffix-for-pypi dev0 --package-format sdist - ${{ inputs.affected-providers-list-as-string }} - - name: "Prepare airflow package: sdist" - run: > - breeze release-management prepare-airflow-package - --version-suffix-for-pypi dev0 --package-format sdist - - name: "Verify sdist packages with twine" - run: | - 
pipx uninstall twine || true - pipx install twine && twine check dist/*.tar.gz - - name: "Generate source constraints from CI image" - shell: bash - run: > - breeze release-management generate-constraints - --airflow-constraints-mode constraints-source-providers --answer yes - - name: "Install all provider packages and airflow via sdist files" - run: > - breeze release-management install-provider-packages + breeze release-management verify-provider-packages --use-packages-from-dist - --package-format sdist - --use-airflow-version sdist + --package-format "${PACKAGE_FORMAT}" + --use-airflow-version "${PACKAGE_FORMAT}" --airflow-constraints-reference default --providers-constraints-location - /files/constraints-${{env.PYTHON_MAJOR_MINOR_VERSION}}/constraints-source-providers-${{env.PYTHON_MAJOR_MINOR_VERSION}}.txt - --run-in-parallel - if: inputs.affected-providers-list-as-string == '' - - name: "Install affected provider packages and airflow via sdist files" + /files/constraints-${PYTHON_MAJOR_MINOR_VERSION}/constraints-source-providers-${PYTHON_MAJOR_MINOR_VERSION}.txt + if: matrix.package-format == 'wheel' + - name: "Install all sdist provider packages and airflow" + env: + PACKAGE_FORMAT: ${{ matrix.package-format }} + PYTHON_MAJOR_MINOR_VERSION: ${{ env.PYTHON_MAJOR_MINOR_VERSION }} run: > breeze release-management install-provider-packages --use-packages-from-dist - --package-format sdist - --use-airflow-version sdist + --package-format "${PACKAGE_FORMAT}" + --use-airflow-version ${PACKAGE_FORMAT} --airflow-constraints-reference default --providers-constraints-location - /files/constraints-${{env.PYTHON_MAJOR_MINOR_VERSION}}/constraints-source-providers-${{env.PYTHON_MAJOR_MINOR_VERSION}}.txt + /files/constraints-${PYTHON_MAJOR_MINOR_VERSION}/constraints-source-providers-${PYTHON_MAJOR_MINOR_VERSION}.txt --run-in-parallel - if: inputs.affected-providers-list-as-string != '' + if: matrix.package-format == 'sdist' - providers-compatibility-checks: + # All matrix parameters are passed as JSON string in the input variable providers-compatibility-tests-matrix + providers-compatibility-tests-matrix: timeout-minutes: 80 - name: Compat ${{ matrix.airflow-version }}:P${{ matrix.python-version }} provider check + name: Compat ${{ matrix.airflow-version }}:P${{ matrix.python-version }} providers test runs-on: ${{ fromJSON(inputs.runs-on-as-json-default) }} strategy: fail-fast: false matrix: - include: ${{fromJSON(inputs.providers-compatibility-checks)}} + include: ${{fromJSON(inputs.providers-compatibility-tests-matrix)}} env: GITHUB_REPOSITORY: ${{ github.repository }} GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} GITHUB_USERNAME: ${{ github.actor }} - IMAGE_TAG: "${{ inputs.image-tag }}" INCLUDE_NOT_READY_PROVIDERS: "true" - PYTHON_MAJOR_MINOR_VERSION: "${{ inputs.default-python-version }}" + PYTHON_MAJOR_MINOR_VERSION: "${{ matrix.python-version }}" VERSION_SUFFIX_FOR_PYPI: "dev0" VERBOSE: "true" CLEAN_AIRFLOW_INSTALLATION: "${{ inputs.canary-run }}" - if: inputs.skip-provider-tests != 'true' + if: inputs.skip-providers-tests != 'true' steps: - name: "Cleanup repo" shell: bash @@ -223,10 +183,12 @@ jobs: uses: actions/checkout@v4 with: persist-credentials: false - - name: "Cleanup docker" - run: ./scripts/ci/cleanup_docker.sh - - name: "Prepare breeze & CI image: ${{ matrix.python-version }}:${{ inputs.image-tag }}" + - name: "Prepare breeze & CI image: ${{ matrix.python-version }}" uses: ./.github/actions/prepare_breeze_and_image + with: + platform: "linux/amd64" + python: ${{ 
matrix.python-version }} + use-uv: ${{ inputs.use-uv }} - name: "Cleanup dist files" run: rm -fv ./dist/* - name: "Prepare provider packages: wheel" @@ -236,8 +198,10 @@ jobs: - name: > Remove incompatible Airflow ${{ matrix.airflow-version }}:Python ${{ matrix.python-version }} provider packages + env: + REMOVE_PROVIDERS: ${{ matrix.remove-providers }} run: | - for provider in ${{ matrix.remove-providers }}; do + for provider in ${REMOVE_PROVIDERS}; do echo "Removing incompatible provider: ${provider}" rm -vf dist/apache_airflow_providers_${provider/./_}* done @@ -251,25 +215,34 @@ jobs: # We do not need to run import check if we run tests, the tests should cover all the import checks # automatically if: matrix.run-tests != 'true' + env: + AIRFLOW_VERSION: "${{ matrix.airflow-version }}" run: > breeze release-management verify-provider-packages --use-packages-from-dist --package-format wheel --use-airflow-version wheel - --airflow-constraints-reference constraints-${{matrix.airflow-version}} + --airflow-constraints-reference constraints-${AIRFLOW_VERSION} --providers-skip-constraints --install-airflow-with-constraints + - name: Check amount of disk space available + run: df -H + shell: bash - name: > Run provider unit tests on Airflow ${{ matrix.airflow-version }}:Python ${{ matrix.python-version }} if: matrix.run-tests == 'true' + env: + PROVIDERS_TEST_TYPES: "${{ inputs.providers-test-types-list-as-string }}" + AIRFLOW_VERSION: "${{ matrix.airflow-version }}" + REMOVE_PROVIDERS: "${{ matrix.remove-providers }}" run: > - breeze testing tests --run-in-parallel - --parallel-test-types "${{ inputs.providers-test-types-list-as-string }}" + breeze testing providers-tests --run-in-parallel + --parallel-test-types "${PROVIDERS_TEST_TYPES}" --use-packages-from-dist --package-format wheel - --use-airflow-version "${{ matrix.airflow-version }}" - --airflow-constraints-reference constraints-${{matrix.airflow-version}} + --use-airflow-version "${AIRFLOW_VERSION}" + --airflow-constraints-reference constraints-${AIRFLOW_VERSION} --install-airflow-with-constraints --providers-skip-constraints - --skip-providers "${{ matrix.remove-providers }}" + --skip-providers "${REMOVE_PROVIDERS}" diff --git a/.gitignore b/.gitignore index ac6ea1566111f..84afbd474102b 100644 --- a/.gitignore +++ b/.gitignore @@ -12,6 +12,7 @@ airflow.db airflow/git_version airflow/www/static/coverage/ airflow/www/*.log +airflow/ui/coverage/ logs/ airflow-webserver.pid standalone_admin_password.txt @@ -110,6 +111,7 @@ celerybeat-schedule # dotenv .env +.env.local .autoenv*.zsh # virtualenv @@ -127,9 +129,6 @@ ENV/ .idea/ *.iml -# Visual Studio Code -.vscode/ - # vim *.swp @@ -165,6 +164,15 @@ node_modules npm-debug.log* derby.log metastore_db +npm-debug.log* +yarn-debug.log* +yarn-error.log* +pnpm-debug.log* +.vscode/* +!.vscode/extensions.json +/.vite/ +.pnpm-store +*.tsbuildinfo # Airflow log files when airflow is run locally airflow-*.err @@ -249,6 +257,3 @@ airflow-build-dockerfile* # Temporary ignore uv.lock until we integrate it fully in our constraint preparation mechanism /uv.lock - -# Airflow 3 related files -airflow/ui diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 027fbd006b903..982dd51567285 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -20,11 +20,13 @@ default_language_version: python: python3 node: 22.2.0 minimum_pre_commit_version: '3.2.0' +exclude: ^.*/.*_vendor/ repos: - repo: meta hooks: - id: identity - name: Print input to the static check hooks for troubleshooting + 
name: Print checked files + description: Print input to the static check hooks for troubleshooting - id: check-hooks-apply name: Check if all hooks apply to the repository - repo: https://github.com/thlorenz/doctoc.git @@ -34,7 +36,6 @@ repos: name: Add TOC for Markdown and RST files files: ^README\.md$|^UPDATING.*\.md$|^chart/UPDATING.*\.md$|^dev/.*\.md$|^dev/.*\.rst$|^.github/.*\.md|^tests/system/README.md$ - exclude: ^.*/.*_vendor/ args: - "--maxlevel" - "2" @@ -46,8 +47,7 @@ repos: files: \.sql$ exclude: | (?x) - ^\.github/| - ^.*/.*_vendor/ + ^\.github/ args: - --comment-style - "/*||*/" @@ -56,7 +56,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all RST files - exclude: ^\.github/.*$|^.*/.*_vendor/|newsfragments/.*\.rst$ + exclude: ^\.github/.*$|newsfragments/.*\.rst$ args: - --comment-style - "||" @@ -65,9 +65,9 @@ repos: - --fuzzy-match-generates-todo files: \.rst$ - id: insert-license - name: Add license for all CSS/JS/JSX/PUML/TS/TSX files + name: Add license for CSS/JS/JSX/PUML/TS/TSX files: \.(css|jsx?|puml|tsx?)$ - exclude: ^\.github/.*$|^.*/.*_vendor/ + exclude: ^\.github/.*$|^airflow/www/static/js/types/api-generated.ts$|ui/openapi-gen/ args: - --comment-style - "/*!| *| */" @@ -77,7 +77,7 @@ repos: - id: insert-license name: Add license for all JINJA template files files: ^airflow/www/templates/.*\.html$ - exclude: ^\.github/.*$|^.*/.*_vendor/ + exclude: ^\.github/.*$ args: - --comment-style - "{#||#}" @@ -86,7 +86,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all Shell files - exclude: ^\.github/.*$|^.*/.*_vendor/|^dev/breeze/autocomplete/.*$ + exclude: ^\.github/.*$|^dev/breeze/autocomplete/.*$ files: \.bash$|\.sh$ args: - --comment-style @@ -96,7 +96,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all toml files - exclude: ^\.github/.*$|^.*/.*_vendor/|^dev/breeze/autocomplete/.*$ + exclude: ^\.github/.*$|^dev/breeze/autocomplete/.*$ files: \.toml$ args: - --comment-style @@ -106,7 +106,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all Python files - exclude: ^\.github/.*$|^.*/.*_vendor/ + exclude: ^\.github/.*$ files: \.py$|\.pyi$ args: - --comment-style @@ -116,7 +116,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all XML files - exclude: ^\.github/.*$|^.*/.*_vendor/ + exclude: ^\.github/.*$ files: \.xml$ args: - --comment-style @@ -135,7 +135,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all YAML files except Helm templates - exclude: ^\.github/.*$|^.*/.*_vendor/|^chart/templates/.*|.*/reproducible_build.yaml$ + exclude: ^\.github/.*$|^chart/templates/.*|.*/reproducible_build.yaml$|^.*/pnpm-lock.yaml$ types: [yaml] files: \.ya?ml$ args: @@ -147,7 +147,7 @@ repos: - id: insert-license name: Add license for all Markdown files files: \.md$ - exclude: PROVIDER_CHANGES.*\.md$|^.*/.*_vendor/ + exclude: PROVIDER_CHANGES.*\.md$ args: - --comment-style - "" @@ -156,7 +156,7 @@ repos: - --fuzzy-match-generates-todo - id: insert-license name: Add license for all other files - exclude: ^\.github/.*$|^.*/.*_vendor/ + exclude: ^\.github/.*$ args: - --comment-style - "|#|" @@ -173,14 +173,6 @@ repos: language: python additional_dependencies: ['rich>=12.4.4'] require_serial: true - - id: update-common-sql-api-stubs - name: Check and update common.sql API stubs - entry: ./scripts/ci/pre_commit/update_common_sql_api_stubs.py - language: python - files: 
^scripts/ci/pre_commit/update_common_sql_api\.py|^airflow/providers/common/sql/.*\.pyi?$ - additional_dependencies: ['rich>=12.4.4', 'mypy==1.9.0', 'black==23.10.0', 'jinja2'] - pass_filenames: false - require_serial: true - id: update-black-version name: Update black versions everywhere (manual) entry: ./scripts/ci/pre_commit/update_black_version.py @@ -190,21 +182,12 @@ repos: additional_dependencies: ['pyyaml'] pass_filenames: false require_serial: true - - id: update-build-dependencies - name: Update build-dependencies to latest (manual) - entry: ./scripts/ci/pre_commit/update_build_dependencies.py + - id: update-installers-and-pre-commit + name: Update installers and pre-commit to latest (manual) + entry: ./scripts/ci/pre_commit/update_installers_and_pre_commit.py stages: ['manual'] language: python - files: ^.pre-commit-config.yaml$|^scripts/ci/pre_commit/update_build_dependencies.py$ - pass_filenames: false - require_serial: true - additional_dependencies: ['rich>=12.4.4'] - - id: update-installers - name: Update installers to latest (manual) - entry: ./scripts/ci/pre_commit/update_installers.py - stages: ['manual'] - language: python - files: ^.pre-commit-config.yaml$|^scripts/ci/pre_commit/update_installers.py$ + files: ^.pre-commit-config.yaml$|^scripts/ci/pre_commit/update_installers_and_pre_commit.py$ pass_filenames: false require_serial: true additional_dependencies: ['pyyaml', 'rich>=12.4.4', 'requests'] @@ -217,65 +200,45 @@ repos: files: ^.pre-commit-config.yaml$|^scripts/ci/pre_commit/update_build_dependencies.py$ pass_filenames: false require_serial: true - - id: check-taskinstance-tis-attrs - name: Check that TI and TIS have the same attributes - entry: ./scripts/ci/pre_commit/check_ti_vs_tis_attributes.py - language: python - additional_dependencies: ['rich>=12.4.4'] - files: ^airflow/models/taskinstance.py$|^airflow/models/taskinstancehistory.py$ - pass_filenames: false - require_serial: true - - id: check-deferrable-default - name: Check and fix default value of default_deferrable - language: python - entry: ./scripts/ci/pre_commit/check_deferrable_default.py - pass_filenames: false - additional_dependencies: ["libcst>=1.1.0"] - files: ^airflow/.*/sensors/.*\.py$|^airflow/.*/operators/.*\.py$ - repo: https://github.com/asottile/blacken-docs - rev: 1.18.0 + rev: 1.19.1 hooks: - id: blacken-docs - name: Run black on Python code blocks in documentation files + name: Run black on docs args: - --line-length=110 - - --target-version=py37 - - --target-version=py38 - --target-version=py39 - --target-version=py310 + - --target-version=py311 + - --target-version=py312 alias: blacken-docs - additional_dependencies: [black==23.10.0] + additional_dependencies: [black==24.10.0] - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v4.6.0 + rev: v5.0.0 hooks: - id: check-merge-conflict name: Check that merge conflicts are not being committed - id: debug-statements name: Detect accidentally committed debug statements - id: check-builtin-literals - name: Require literal syntax when initializing builtin types - exclude: ^.*/.*_vendor/ + name: Require literal syntax when initializing builtins - id: detect-private-key name: Detect if private key is added to the repository exclude: ^docs/apache-airflow-providers-ssh/connections/ssh.rst$ - id: end-of-file-fixer name: Make sure that there is an empty line at the end - exclude: ^.*/.*_vendor/|^docs/apache-airflow/img/.*\.dot|^docs/apache-airflow/img/.*\.sha256 + exclude: 
^docs/apache-airflow/img/.*\.dot|^docs/apache-airflow/img/.*\.sha256 - id: mixed-line-ending name: Detect if mixed line ending is used (\r vs. \r\n) - exclude: ^.*/.*_vendor/ - id: check-executables-have-shebangs name: Check that executables have shebang - exclude: ^.*/.*_vendor/ - id: check-xml name: Check XML files with xmllint - exclude: ^.*/.*_vendor/ - id: trailing-whitespace name: Remove trailing whitespace at end of line - exclude: ^.*/.*_vendor/|^docs/apache-airflow/img/.*\.dot|^dev/breeze/doc/images/output.*$ + exclude: ^docs/apache-airflow/img/.*\.dot|^dev/breeze/doc/images/output.*$ - id: fix-encoding-pragma name: Remove encoding header from Python files - exclude: ^.*/.*_vendor/ args: - --remove - id: pretty-format-json @@ -292,10 +255,8 @@ repos: hooks: - id: rst-backticks name: Check if RST files use double backticks for code - exclude: ^.*/.*_vendor/ - id: python-no-log-warn name: Check if there are no deprecate log warn - exclude: ^.*/.*_vendor/ - repo: https://github.com/adrienverge/yamllint rev: v1.35.1 hooks: @@ -303,15 +264,12 @@ repos: name: Check YAML files with yamllint entry: yamllint -c yamllint-config.yml --strict types: [yaml] - exclude: ^.*airflow\.template\.yaml$|^.*init_git_sync\.template\.yaml$|^.*/.*_vendor/|^chart/(?:templates|files)/.*\.yaml$|openapi/.*\.yaml$|^\.pre-commit-config\.yaml$|^.*/reproducible_build.yaml$ + exclude: ^.*airflow\.template\.yaml$|^.*init_git_sync\.template\.yaml$|^chart/(?:templates|files)/.*\.yaml$|openapi/.*\.yaml$|^\.pre-commit-config\.yaml$|^.*/reproducible_build.yaml$|^.*pnpm-lock\.yaml$ - repo: https://github.com/ikamensh/flynt rev: '1.0.1' hooks: - id: flynt name: Run flynt string format converter for Python - exclude: | - (?x) - ^.*/.*_vendor/ args: # If flynt detects too long text it ignores it. So we set a very large limit to make it easy # to split the text by hand. Too long lines are detected by flake8 (below), @@ -322,17 +280,27 @@ repos: rev: v2.3.0 hooks: - id: codespell - name: Run codespell to check for common misspellings in files + name: Run codespell + description: Run codespell to check for common misspellings in files entry: bash -c 'echo "If you think that this failure is an error, consider adding the word(s) to the codespell dictionary at docs/spelling_wordlist.txt. The word(s) should be in lowercase." && exec codespell "$@"' -- language: python types: [text] - exclude: ^.*/.*_vendor/|^airflow/www/static/css/material-icons\.css$|^images/.*$|^RELEASE_NOTES\.txt$|^.*package-lock\.json$|^.*/kinglear\.txt$ + exclude: material-icons\.css$|^images/.*$|^RELEASE_NOTES\.txt$|^.*package-lock\.json$|^.*/kinglear\.txt$|^.*pnpm-lock\.yaml$ args: - --ignore-words=docs/spelling_wordlist.txt - --skip=airflow/providers/*/*.rst,airflow/www/*.log,docs/*/commits.rst,docs/apache-airflow/tutorial/pipeline_example.csv,*.min.js,*.lock,INTHEWILD.md - --exclude-file=.codespellignorelines + - repo: https://github.com/woodruffw/zizmor-pre-commit + rev: v1.0.0 + hooks: + - id: zizmor + name: Run zizmor to check for github workflow syntax errors + types: [yaml] + files: \.github/workflows/.*$|\.github/actions/.*$ + require_serial: true + entry: zizmor - repo: local # Note that this is the 2nd "local" repo group in the .pre-commit-config.yaml file. 
This is because # we try to minimise the number of passes that must happen in order to apply some of the changes @@ -342,13 +310,6 @@ repos: # changes quickly - especially when we want the early modifications from the first local group # to be applied before the non-local pre-commits are run hooks: - - id: validate-operators-init - name: Prevent templated field logic checks in operators' __init__ - language: python - entry: ./scripts/ci/pre_commit/validate_operators_init.py - pass_filenames: true - files: ^airflow/providers/.*/(operators|transfers|sensors)/.*\.py$ - additional_dependencies: [ 'rich>=12.4.4' ] - id: ruff name: Run 'ruff' for extremely fast Python linting description: "Run 'ruff' for extremely fast Python linting" @@ -357,24 +318,24 @@ repos: types_or: [python, pyi] args: [--fix] require_serial: true - additional_dependencies: ["ruff==0.5.5"] - exclude: ^.*/.*_vendor/|^tests/dags/test_imports.py + additional_dependencies: ["ruff==0.8.1"] + exclude: ^tests/dags/test_imports.py|^performance/tests/test_.*.py - id: ruff-format - name: Run 'ruff format' for extremely fast Python formatting + name: Run 'ruff format' description: "Run 'ruff format' for extremely fast Python formatting" entry: ./scripts/ci/pre_commit/ruff_format.py language: python types_or: [python, pyi] args: [] require_serial: true - additional_dependencies: ["ruff==0.5.5"] - exclude: ^.*/.*_vendor/|^tests/dags/test_imports.py|^airflow/contrib/ + additional_dependencies: ["ruff==0.8.1"] + exclude: ^tests/dags/test_imports.py$ - id: replace-bad-characters name: Replace bad characters entry: ./scripts/ci/pre_commit/replace_bad_characters.py language: python types: [file, text] - exclude: ^.*/.*_vendor/|^clients/gen/go\.sh$|^\.gitmodules$ + exclude: ^clients/gen/go\.sh$|^\.gitmodules$ additional_dependencies: ['rich>=12.4.4'] - id: lint-openapi name: Lint OpenAPI using spectral @@ -410,36 +371,6 @@ repos: exclude: ^airflow/openlineage/ entry: ./scripts/ci/pre_commit/check_common_compat_used_for_openlineage.py additional_dependencies: ['rich>=12.4.4'] - - id: check-airflow-providers-bug-report-template - name: Check airflow-bug-report provider list is sorted/unique - language: python - files: ^.github/ISSUE_TEMPLATE/airflow_providers_bug_report\.yml$ - require_serial: true - entry: ./scripts/ci/pre_commit/check_airflow_bug_report_template.py - additional_dependencies: ['rich>=12.4.4', 'pyyaml'] - - id: check-cncf-k8s-only-for-executors - name: Check cncf.kubernetes imports used for executors only - language: python - files: ^airflow/.*\.py$ - require_serial: true - exclude: ^airflow/kubernetes/|^airflow/providers/ - entry: ./scripts/ci/pre_commit/check_cncf_k8s_used_for_k8s_executor_only.py - additional_dependencies: ['rich>=12.4.4'] - - id: check-airflow-provider-compatibility - name: Check compatibility of Providers with Airflow - entry: ./scripts/ci/pre_commit/check_provider_airflow_compatibility.py - language: python - pass_filenames: true - files: ^airflow/providers/.*\.py$ - additional_dependencies: ['rich>=12.4.4'] - - id: check-google-re2-as-dependency - name: Check google-re2 is declared as dependency when needed - entry: ./scripts/ci/pre_commit/check_google_re2_imports.py - language: python - pass_filenames: true - require_serial: true - files: ^airflow/providers/.*\.py$ - additional_dependencies: ['rich>=12.4.4'] - id: update-local-yml-file name: Update mounts in the local yml file entry: ./scripts/ci/pre_commit/local_yml_mounts.py @@ -447,12 +378,6 @@ repos: files: 
^dev/breeze/src/airflow_breeze/utils/docker_command_utils\.py$|^scripts/ci/docker_compose/local\.yml$ pass_filenames: false additional_dependencies: ['rich>=12.4.4'] - - id: check-sql-dependency-common-data-structure - name: Check dependency of SQL Providers with common data structure - entry: ./scripts/ci/pre_commit/check_common_sql_dependency.py - language: python - files: ^airflow/providers/.*/hooks/.*\.py$ - additional_dependencies: ['rich>=12.4.4', 'pyyaml', 'packaging'] - id: update-providers-dependencies name: Update dependencies for provider packages entry: ./scripts/ci/pre_commit/update_providers_dependencies.py @@ -475,13 +400,6 @@ repos: pass_filenames: false entry: ./scripts/ci/pre_commit/check_order_hatch_build.py additional_dependencies: ['rich>=12.4.4', 'hatchling==1.27.0'] - - id: update-extras - name: Update extras in documentation - entry: ./scripts/ci/pre_commit/insert_extras.py - language: python - files: ^contributing-docs/12_airflow_dependencies_and_extras.rst$|^INSTALL$|^airflow/providers/.*/provider\.yaml$|^Dockerfile.* - pass_filenames: false - additional_dependencies: ['rich>=12.4.4', 'hatchling==1.27.0'] - id: check-extras-order name: Check order of extras in Dockerfile entry: ./scripts/ci/pre_commit/check_order_dockerfile_extras.py @@ -503,19 +421,8 @@ repos: files: ^docs/apache-airflow/installation/supported-versions\.rst$|^scripts/ci/pre_commit/supported_versions\.py$|^README\.md$ pass_filenames: false additional_dependencies: ['tabulate'] - - id: check-revision-heads-map - name: Check that the REVISION_HEADS_MAP is up-to-date - language: python - entry: ./scripts/ci/pre_commit/version_heads_map.py - pass_filenames: false - files: > - (?x) - ^scripts/ci/pre_commit/version_heads_map\.py$| - ^airflow/migrations/versions/.*$|^airflow/migrations/versions| - ^airflow/utils/db.py$ - additional_dependencies: ['packaging','google-re2'] - id: update-version - name: Update version to the latest version in the documentation + name: Update versions in docs entry: ./scripts/ci/pre_commit/update_versions.py language: python files: ^docs|^airflow/__init__.py$ @@ -528,7 +435,7 @@ repos: pass_filenames: true files: \.py$ - id: check-links-to-example-dags-do-not-use-hardcoded-versions - name: Verify example dags do not use hard-coded version numbers + name: Verify no hard-coded version in example dags description: The links to example dags should use |version| as version specification language: pygrep entry: > @@ -622,13 +529,13 @@ repos: ^airflow/providers/opsgenie/hooks/opsgenie.py$| ^airflow/providers/redis/provider.yaml$| ^airflow/serialization/serialized_objects.py$| + ^airflow/ui/pnpm-lock.yaml$| ^airflow/utils/db.py$| ^airflow/utils/trigger_rule.py$| ^airflow/www/static/css/bootstrap-theme.css$| ^airflow/www/static/js/types/api-generated.ts$| ^airflow/www/templates/appbuilder/flash.html$| ^chart/values.schema.json$| - ^.*/.*_vendor/| ^dev/| ^docs/README.rst$| ^docs/apache-airflow-providers-amazon/secrets-backends/aws-ssm-parameter-store.rst$| @@ -641,28 +548,21 @@ repos: ^docs/apache-airflow-providers-cncf-kubernetes/operators.rst$| ^docs/conf.py$| ^docs/exts/removemarktransform.py$| + ^newsfragments/41761.significant.rst$| ^scripts/ci/pre_commit/vendor_k8s_json_schema.py$| + ^scripts/ci/docker-compose/integration-keycloak.yml$| + ^scripts/ci/docker-compose/keycloak/keycloak-entrypoint.sh$| ^tests/| + ^providers/tests/| ^.pre-commit-config\.yaml$| ^.*CHANGELOG\.(rst|txt)$| ^.*RELEASE_NOTES\.rst$| ^contributing-docs/03_contributors_quick_start.rst$| 
^.*\.(png|gif|jp[e]?g|tgz|lock)$| - git - - id: check-base-operator-partial-arguments - name: Check BaseOperator and partial() arguments - language: python - entry: ./scripts/ci/pre_commit/base_operator_partial_arguments.py - pass_filenames: false - files: ^airflow/models/(?:base|mapped)operator\.py$ - - id: check-init-decorator-arguments - name: Check model __init__ and decorator arguments are in sync - language: python - entry: ./scripts/ci/pre_commit/sync_init_decorator.py - pass_filenames: false - files: ^airflow/models/dag\.py$|^airflow/(?:decorators|utils)/task_group\.py$ + git| + ^newsfragments/43349\.significant\.rst$ - id: check-template-context-variable-in-sync - name: Check all template context variable references are in sync + name: Sync template context variable refs language: python entry: ./scripts/ci/pre_commit/template_context_key_sync.py files: ^airflow/models/taskinstance\.py$|^airflow/utils/context\.pyi?$|^docs/apache-airflow/templates-ref\.rst$ @@ -730,33 +630,28 @@ repos: pass_filenames: true - id: check-provide-create-sessions-imports language: pygrep - name: Check provide_session and create_session imports - description: provide_session and create_session should be imported from airflow.utils.session - to avoid import cycles. - entry: "from airflow\\.utils\\.db import.* (provide_session|create_session)" + name: Check session util imports + description: NEW_SESSION, provide_session, and create_session should be imported from airflow.utils.session to avoid import cycles. + entry: "from airflow\\.utils\\.db import.* (NEW_SESSION|provide_session|create_session)" files: \.py$ - exclude: ^.*/.*_vendor/ pass_filenames: true - id: check-incorrect-use-of-LoggingMixin language: pygrep name: Make sure LoggingMixin is not used alone entry: "LoggingMixin\\(\\)" files: \.py$ - exclude: ^.*/.*_vendor/ pass_filenames: true - id: check-daysago-import-from-utils language: pygrep - name: Make sure days_ago is imported from airflow.utils.dates + name: days_ago imported from airflow.utils.dates entry: "(airflow\\.){0,1}utils\\.dates\\.days_ago" files: \.py$ - exclude: ^.*/.*_vendor/ pass_filenames: true - id: check-start-date-not-used-in-defaults language: pygrep - name: start_date not to be defined in default_args in example_dags + name: start_date not in default_args entry: "default_args\\s*=\\s*{\\s*(\"|')start_date(\"|')|(\"|')start_date(\"|'):" files: \.*example_dags.*\.py$ - exclude: ^.*/.*_vendor/ pass_filenames: true - id: check-apache-license-rat name: Check if licenses are OK for Apache @@ -778,7 +673,7 @@ repos: entry: ./scripts/ci/pre_commit/boring_cyborg.py pass_filenames: false require_serial: true - additional_dependencies: ['pyyaml', 'termcolor==1.1.0', 'wcmatch==8.2'] + additional_dependencies: ['pyyaml', 'termcolor==2.5.0', 'wcmatch==8.2'] - id: update-in-the-wild-to-be-sorted name: Sort INTHEWILD.md alphabetically entry: ./scripts/ci/pre_commit/sort_in_the_wild.py @@ -787,14 +682,14 @@ repos: pass_filenames: false require_serial: true - id: update-installed-providers-to-be-sorted - name: Sort alphabetically and uniquify installed_providers.txt + name: Sort and uniquify installed_providers.txt entry: ./scripts/ci/pre_commit/sort_installed_providers.py language: python files: ^\.pre-commit-config\.yaml$|^.*_installed_providers\.txt$ pass_filenames: false require_serial: true - id: update-spelling-wordlist-to-be-sorted - name: Sort alphabetically and uniquify spelling_wordlist.txt + name: Sort spelling_wordlist.txt entry: ./scripts/ci/pre_commit/sort_spelling_wordlist.py 
language: python files: ^\.pre-commit-config\.yaml$|^docs/spelling_wordlist\.txt$ @@ -855,12 +750,6 @@ repos: entry: ./scripts/ci/pre_commit/compile_www_assets_dev.py pass_filenames: false additional_dependencies: ['yarn@1.22.21'] - - id: check-providers-init-file-missing - name: Provider init file is missing - pass_filenames: false - always_run: true - entry: ./scripts/ci/pre_commit/check_providers_init.py - language: python - id: check-providers-subpackages-init-file-exist name: Provider subpackage init files are there pass_filenames: false @@ -872,10 +761,10 @@ repos: name: Validate hook IDs & names and sync with docs entry: ./scripts/ci/pre_commit/check_pre_commit_hooks.py args: - - --max-length=60 + - --max-length=53 language: python files: ^\.pre-commit-config\.yaml$|^scripts/ci/pre_commit/check_pre_commit_hooks\.py$ - additional_dependencies: ['pyyaml', 'jinja2', 'black==23.10.0', 'tabulate', 'rich>=12.4.4'] + additional_dependencies: ['pyyaml', 'jinja2', 'black==24.10.0', 'tabulate', 'rich>=12.4.4'] require_serial: true pass_filenames: false - id: check-integrations-list-consistent @@ -883,7 +772,7 @@ repos: entry: ./scripts/ci/pre_commit/check_integrations_list.py language: python files: ^scripts/ci/docker-compose/integration-.*\.yml$|^contributing-docs/testing/integration_tests.rst$ - additional_dependencies: ['black==23.10.0', 'tabulate', 'rich>=12.4.4', 'pyyaml'] + additional_dependencies: ['black==24.10.0', 'tabulate', 'rich>=12.4.4', 'pyyaml'] require_serial: true pass_filenames: false - id: update-breeze-readme-config-hash @@ -902,29 +791,14 @@ repos: pass_filenames: false require_serial: true - id: check-breeze-top-dependencies-limited - name: Breeze should have small number of top-level dependencies + name: Check top-level breeze deps + description: Breeze should have small number of top-level dependencies language: python entry: ./scripts/tools/check_if_limited_dependencies.py files: ^dev/breeze/.*$ pass_filenames: false require_serial: true additional_dependencies: ['click', 'rich>=12.4.4', 'pyyaml'] - - id: check-tests-in-the-right-folders - name: Check if tests are in the right folders - entry: ./scripts/ci/pre_commit/check_tests_in_right_folders.py - language: python - files: ^tests/.*\.py - pass_filenames: true - require_serial: true - additional_dependencies: ['rich>=12.4.4'] - - id: check-system-tests-present - name: Check if system tests have required segments of code - entry: ./scripts/ci/pre_commit/check_system_tests.py - language: python - files: ^tests/system/.*/example_[^/]*\.py$ - exclude: ^tests/system/providers/google/cloud/bigquery/example_bigquery_queries\.py$ - pass_filenames: true - additional_dependencies: ['rich>=12.4.4'] - id: generate-pypi-readme name: Generate PyPI README entry: ./scripts/ci/pre_commit/generate_pypi_readme.py @@ -940,7 +814,7 @@ repos: files: \.(md|mdown|markdown)$ additional_dependencies: ['markdownlint-cli@0.38.0'] - id: lint-json-schema - name: Lint JSON Schema files with JSON Schema + name: Lint JSON Schema files entry: ./scripts/ci/pre_commit/json_schema.py args: - --spec-file @@ -948,11 +822,10 @@ repos: language: python pass_filenames: true files: .*\.schema\.json$ - exclude: ^.*/.*_vendor/ require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: lint-json-schema - name: Lint NodePort Service with JSON Schema + name: Lint NodePort Service entry: 
./scripts/ci/pre_commit/json_schema.py args: - --spec-url @@ -961,9 +834,9 @@ repos: pass_filenames: true files: ^scripts/ci/kubernetes/nodeport\.yaml$ require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: lint-json-schema - name: Lint Docker compose files with JSON Schema + name: Lint Docker compose files entry: ./scripts/ci/pre_commit/json_schema.py args: - --spec-url @@ -976,9 +849,9 @@ repos: ^scripts/ci/docker-compose/grafana/.| ^scripts/ci/docker-compose/.+-config\.ya?ml require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: lint-json-schema - name: Lint chart/values.schema.json file with JSON Schema + name: Lint chart/values.schema.json entry: ./scripts/ci/pre_commit/json_schema.py args: - --spec-file @@ -988,15 +861,15 @@ repos: pass_filenames: false files: ^chart/values\.schema\.json$|^chart/values_schema\.schema\.json$ require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: update-vendored-in-k8s-json-schema name: Vendor k8s definitions into values.schema.json entry: ./scripts/ci/pre_commit/vendor_k8s_json_schema.py language: python files: ^chart/values\.schema\.json$ - additional_dependencies: ['requests==2.25.0'] + additional_dependencies: ['requests==2.32.3'] - id: lint-json-schema - name: Lint chart/values.yaml file with JSON Schema + name: Lint chart/values.yaml entry: ./scripts/ci/pre_commit/json_schema.py args: - --enforce-defaults @@ -1007,9 +880,9 @@ repos: pass_filenames: false files: ^chart/values\.yaml$|^chart/values\.schema\.json$ require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: lint-json-schema - name: Lint config_templates/config.yml file with JSON Schema + name: Lint config_templates/config.yml entry: ./scripts/ci/pre_commit/json_schema.py args: - --spec-file @@ -1018,9 +891,10 @@ repos: pass_filenames: true files: ^airflow/config_templates/config\.yml$ require_serial: true - additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==5.3.1', 'requests==2.25.0'] + additional_dependencies: ['jsonschema>=3.2.0,<5.0', 'PyYAML==6.0.2', 'requests==2.32.3'] - id: check-persist-credentials-disabled-in-github-workflows - name: Check that workflow files have persist-credentials disabled + name: Check persistent creds in workflow files + description: Check that workflow files have persist-credentials disabled entry: ./scripts/ci/pre_commit/checkout_no_credentials.py language: python pass_filenames: true @@ -1032,22 +906,7 @@ repos: language: python pass_filenames: true files: \.py$ - exclude: ^.*/.*_vendor/ additional_dependencies: ['rich>=12.4.4'] - - id: check-compat-cache-on-methods - name: Check that compat cache do not use on class methods - entry: ./scripts/ci/pre_commit/compat_cache_on_methods.py - language: python - pass_filenames: true - files: ^airflow/.*\.py$ - exclude: ^.*/.*_vendor/ - - id: check-code-deprecations - name: Check deprecations categories in decorators - entry: ./scripts/ci/pre_commit/check_deprecations.py - language: python - 
pass_filenames: true - files: ^airflow/.*\.py$ - exclude: ^.*/.*_vendor/ - id: lint-chart-schema name: Lint chart/values.schema.json file entry: ./scripts/ci/pre_commit/chart_schema.py @@ -1084,7 +943,8 @@ repos: # This is fast, so not too much downside always_run: true - id: update-breeze-cmd-output - name: Update output of breeze commands in Breeze documentation + name: Update breeze docs + description: Update output of breeze commands in Breeze documentation entry: ./scripts/ci/pre_commit/breeze_cmd_line.py language: python files: > @@ -1096,28 +956,12 @@ repos: require_serial: true pass_filenames: false additional_dependencies: ['rich>=12.4.4'] - - id: check-example-dags-urls - name: Check that example dags url include provider versions - entry: ./scripts/ci/pre_commit/update_example_dags_paths.py - language: python - pass_filenames: true - files: ^docs/.*example-dags\.rst$|^docs/.*index\.rst$ - additional_dependencies: ['rich>=12.4.4', 'pyyaml'] - always_run: true - - id: check-system-tests-tocs - name: Check that system tests is properly added - entry: ./scripts/ci/pre_commit/check_system_tests_hidden_in_index.py - language: python - pass_filenames: true - files: ^docs/apache-airflow-providers-[^/]*/index\.rst$ - additional_dependencies: ['rich>=12.4.4', 'pyyaml'] - id: check-lazy-logging name: Check that all logging methods are lazy entry: ./scripts/ci/pre_commit/check_lazy_logging.py language: python pass_filenames: true files: \.py$ - exclude: ^.*/.*_vendor/ additional_dependencies: ['rich>=12.4.4', 'astor'] - id: create-missing-init-py-files-tests name: Create missing init.py files in tests @@ -1127,15 +971,17 @@ repos: pass_filenames: false files: ^tests/.*\.py$ - id: ts-compile-format-lint-www - name: TS types generation / ESLint / Prettier against UI files + name: Compile / format / lint WWW + description: TS types generation / ESLint / Prettier against UI files language: node 'types_or': [javascript, ts, tsx, yaml, css, json] files: ^airflow/www/static/(js|css)/|^airflow/api_connexion/openapi/v1\.yaml$ - entry: ./scripts/ci/pre_commit/www_lint.py + entry: ./scripts/ci/pre_commit/lint_www.py additional_dependencies: ['yarn@1.22.21', "openapi-typescript@>=6.7.4"] pass_filenames: false - id: check-tests-unittest-testcase - name: Check that unit tests do not inherit from unittest.TestCase + name: Unit tests do not inherit from unittest.TestCase + description: Check that unit tests do not inherit from unittest.TestCase entry: ./scripts/ci/pre_commit/unittest_testcase.py language: python pass_filenames: true @@ -1243,20 +1089,20 @@ repos: - id: mypy-airflow name: Run mypy for airflow language: python - entry: ./scripts/ci/pre_commit/mypy.py --namespace-packages + entry: ./scripts/ci/pre_commit/mypy.py files: \.py$ exclude: ^.*/.*_vendor/|^airflow/migrations|^airflow/providers|^dev|^scripts|^docs|^provider_packages|^tests/providers|^tests/system/providers|^tests/dags/test_imports.py|^clients/python/test_.*\.py require_serial: true additional_dependencies: ['rich>=12.4.4'] - id: mypy-airflow - stages: [ 'manual' ] + stages: ['manual'] name: Run mypy for airflow (manual) language: python entry: ./scripts/ci/pre_commit/mypy_folder.py airflow pass_filenames: false files: ^.*\.py$ require_serial: true - additional_dependencies: [ 'rich>=12.4.4' ] + additional_dependencies: ['rich>=12.4.4'] - id: mypy-providers name: Run mypy for providers language: python @@ -1269,7 +1115,7 @@ repos: stages: ['manual'] name: Run mypy for providers (manual) language: python - entry: 
./scripts/ci/pre_commit/mypy_folder.py airflow/providers + entry: ./scripts/ci/pre_commit/mypy_folder.py providers/src/airflow/providers pass_filenames: false files: ^.*\.py$ require_serial: true diff --git a/Dockerfile b/Dockerfile index 5ac5fd61e1228..a96ddc40db759 100644 --- a/Dockerfile +++ b/Dockerfile @@ -45,12 +45,17 @@ ARG AIRFLOW_UID="50000" ARG AIRFLOW_USER_HOME_DIR=/home/airflow # latest released version here -ARG AIRFLOW_VERSION="2.9.3" +ARG AIRFLOW_VERSION="2.10.4" ARG PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" + +# You can swap comments between those two args to test pip from the main version +# When you attempt to test if the version of `pip` from specified branch works for our builds +# Also use `force pip` label on your PR to swap all places we use `uv` to `pip` ARG AIRFLOW_PIP_VERSION=24.3.1 -ARG AIRFLOW_UV_VERSION=0.5.11 +# ARG AIRFLOW_PIP_VERSION="git+https://github.com/pypa/pip.git@main" +ARG AIRFLOW_UV_VERSION=0.5.17 ARG AIRFLOW_USE_UV="false" ARG UV_HTTP_TIMEOUT="300" ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow" @@ -417,85 +422,6 @@ common::show_packaging_tool_version_and_location common::install_packaging_tools EOF -# The content below is automatically copied from scripts/docker/install_airflow_dependencies_from_branch_tip.sh -COPY <<"EOF" /install_airflow_dependencies_from_branch_tip.sh -#!/usr/bin/env bash - -. "$( dirname "${BASH_SOURCE[0]}" )/common.sh" - -: "${AIRFLOW_REPO:?Should be set}" -: "${AIRFLOW_BRANCH:?Should be set}" -: "${INSTALL_MYSQL_CLIENT:?Should be true or false}" -: "${INSTALL_POSTGRES_CLIENT:?Should be true or false}" - -function install_airflow_dependencies_from_branch_tip() { - echo - echo "${COLOR_BLUE}Installing airflow from ${AIRFLOW_BRANCH}. It is used to cache dependencies${COLOR_RESET}" - echo - if [[ ${INSTALL_MYSQL_CLIENT} != "true" ]]; then - AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/mysql,} - fi - if [[ ${INSTALL_POSTGRES_CLIENT} != "true" ]]; then - AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/postgres,} - fi - local TEMP_AIRFLOW_DIR - TEMP_AIRFLOW_DIR=$(mktemp -d) - # Install latest set of dependencies - without constraints. This is to download a "base" set of - # dependencies that we can cache and reuse when installing airflow using constraints and latest - # pyproject.toml in the next step (when we install regular airflow). - set -x - curl -fsSL "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz" | \ - tar xz -C "${TEMP_AIRFLOW_DIR}" --strip 1 - # Make sure editable dependencies are calculated when devel-ci dependencies are installed - ${PACKAGING_TOOL_CMD} install ${EXTRA_INSTALL_FLAGS} ${ADDITIONAL_PIP_INSTALL_FLAGS} \ - --editable "${TEMP_AIRFLOW_DIR}[${AIRFLOW_EXTRAS}]" - set +x - common::install_packaging_tools - set -x - echo "${COLOR_BLUE}Uninstalling providers. Dependencies remain${COLOR_RESET}" - # Uninstall airflow and providers to keep only the dependencies. In the future when - # planned https://github.com/pypa/pip/issues/11440 is implemented in pip we might be able to use this - # flag and skip the remove step. - pip freeze | grep apache-airflow-providers | xargs ${PACKAGING_TOOL_CMD} uninstall ${EXTRA_UNINSTALL_FLAGS} || true - set +x - echo - echo "${COLOR_BLUE}Uninstalling just airflow. Dependencies remain. 
Now target airflow can be reinstalled using mostly cached dependencies${COLOR_RESET}" - echo - set +x - ${PACKAGING_TOOL_CMD} uninstall ${EXTRA_UNINSTALL_FLAGS} apache-airflow - rm -rf "${TEMP_AIRFLOW_DIR}" - set -x - # If you want to make sure dependency is removed from cache in your PR when you removed it from - # pyproject.toml - please add your dependency here as a list of strings - # for example: - # DEPENDENCIES_TO_REMOVE=("package_a" "package_b") - # Once your PR is merged, you should make a follow-up PR to remove it from this list - # and increase the AIRFLOW_CI_BUILD_EPOCH in Dockerfile.ci to make sure your cache is rebuilt. - local DEPENDENCIES_TO_REMOVE - # IMPORTANT!! Make sure to increase AIRFLOW_CI_BUILD_EPOCH in Dockerfile.ci when you remove a dependency from that list - DEPENDENCIES_TO_REMOVE=() - if [[ "${DEPENDENCIES_TO_REMOVE[*]}" != "" ]]; then - echo - echo "${COLOR_BLUE}Uninstalling just removed dependencies (temporary until cache refreshes)${COLOR_RESET}" - echo "${COLOR_BLUE}Dependencies to uninstall: ${DEPENDENCIES_TO_REMOVE[*]}${COLOR_RESET}" - echo - set +x - ${PACKAGING_TOOL_CMD} uninstall "${DEPENDENCIES_TO_REMOVE[@]}" || true - set -x - # make sure that the dependency is not needed by something else - pip check - fi -} - -common::get_colors -common::get_packaging_tool -common::get_airflow_version_specification -common::get_constraints_location -common::show_packaging_tool_version_and_location - -install_airflow_dependencies_from_branch_tip -EOF - # The content below is automatically copied from scripts/docker/common.sh COPY <<"EOF" /common.sh #!/usr/bin/env bash @@ -1374,7 +1300,8 @@ ARG PYTHON_BASE_IMAGE ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE} \ DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \ LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8 \ - PIP_CACHE_DIR=/tmp/.cache/pip + PIP_CACHE_DIR=/tmp/.cache/pip \ + UV_CACHE_DIR=/tmp/.cache/uv ARG DEV_APT_DEPS="" ARG ADDITIONAL_DEV_APT_DEPS="" @@ -1440,9 +1367,6 @@ ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-main" # By default PIP has progress bar but you can disable it. ARG PIP_PROGRESS_BAR -# By default we do not use pre-cached packages, but in CI/Breeze environment we override this to speed up -# builds in case pyproject.toml changed. This is pure optimisation of CI/Breeze builds. -ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="false" # This is airflow version that is put in the label of the image build ARG AIRFLOW_VERSION # By default latest released version of airflow is installed (when empty) but this value can be overridden @@ -1480,7 +1404,6 @@ ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \ AIRFLOW_UV_VERSION=${AIRFLOW_UV_VERSION} \ UV_HTTP_TIMEOUT=${UV_HTTP_TIMEOUT} \ AIRFLOW_USE_UV=${AIRFLOW_USE_UV} \ - AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES} \ AIRFLOW_VERSION=${AIRFLOW_VERSION} \ AIRFLOW_INSTALLATION_METHOD=${AIRFLOW_INSTALLATION_METHOD} \ AIRFLOW_VERSION_SPECIFICATION=${AIRFLOW_VERSION_SPECIFICATION} \ @@ -1505,8 +1428,7 @@ ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \ # Copy all scripts required for installation - changing any of those should lead to # rebuilding from here -COPY --from=scripts common.sh install_packaging_tools.sh \ - install_airflow_dependencies_from_branch_tip.sh create_prod_venv.sh /scripts/docker/ +COPY --from=scripts common.sh install_packaging_tools.sh create_prod_venv.sh /scripts/docker/ # We can set this value to true in case we want to install .whl/.tar.gz packages placed in the # docker-context-files folder. 
This can be done for both additional packages you want to install @@ -1536,13 +1458,7 @@ ENV AIRFLOW_CI_BUILD_EPOCH=${AIRFLOW_CI_BUILD_EPOCH} # By default PIP installs everything to ~/.local and it's also treated as VIRTUALENV ENV VIRTUAL_ENV="${AIRFLOW_USER_HOME_DIR}/.local" -RUN bash /scripts/docker/install_packaging_tools.sh; \ - bash /scripts/docker/create_prod_venv.sh; \ - if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" && \ - ${INSTALL_PACKAGES_FROM_CONTEXT} == "false" && \ - ${UPGRADE_INVALIDATION_STRING} == "" ]]; then \ - bash /scripts/docker/install_airflow_dependencies_from_branch_tip.sh; \ - fi +RUN bash /scripts/docker/install_packaging_tools.sh; bash /scripts/docker/create_prod_venv.sh COPY --chown=airflow:0 ${AIRFLOW_SOURCES_FROM} ${AIRFLOW_SOURCES_TO} @@ -1566,10 +1482,10 @@ COPY --from=scripts install_from_docker_context_files.sh install_airflow.sh \ # an incorrect architecture. ARG TARGETARCH # Value to be able to easily change cache id and therefore use a bare new cache -ARG PIP_CACHE_EPOCH="9" +ARG DEPENDENCY_CACHE_EPOCH="9" # hadolint ignore=SC2086, SC2010, DL3042 -RUN --mount=type=cache,id=$PYTHON_BASE_IMAGE-$AIRFLOW_PIP_VERSION-$TARGETARCH-$PIP_CACHE_EPOCH,target=/tmp/.cache/pip,uid=${AIRFLOW_UID} \ +RUN --mount=type=cache,id=prod-$TARGETARCH-$DEPENDENCY_CACHE_EPOCH,target=/tmp/.cache/,uid=${AIRFLOW_UID} \ if [[ ${INSTALL_PACKAGES_FROM_CONTEXT} == "true" ]]; then \ bash /scripts/docker/install_from_docker_context_files.sh; \ fi; \ @@ -1589,7 +1505,7 @@ RUN --mount=type=cache,id=$PYTHON_BASE_IMAGE-$AIRFLOW_PIP_VERSION-$TARGETARCH-$P # during the build additionally to whatever has been installed so far. It is recommended that # the requirements.txt contains only dependencies with == version specification # hadolint ignore=DL3042 -RUN --mount=type=cache,id=additional-requirements-$PYTHON_BASE_IMAGE-$AIRFLOW_PIP_VERSION-$TARGETARCH-$PIP_CACHE_EPOCH,target=/tmp/.cache/pip,uid=${AIRFLOW_UID} \ +RUN --mount=type=cache,id=prod-$TARGETARCH-$DEPENDENCY_CACHE_EPOCH,target=/tmp/.cache/,uid=${AIRFLOW_UID} \ if [[ -f /docker-context-files/requirements.txt ]]; then \ pip install -r /docker-context-files/requirements.txt; \ fi @@ -1617,7 +1533,9 @@ ARG PYTHON_BASE_IMAGE ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE} \ # Make sure noninteractive debian install is used and language variables set DEBIAN_FRONTEND=noninteractive LANGUAGE=C.UTF-8 LANG=C.UTF-8 LC_ALL=C.UTF-8 \ - LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8 LD_LIBRARY_PATH=/usr/local/lib + LC_CTYPE=C.UTF-8 LC_MESSAGES=C.UTF-8 LD_LIBRARY_PATH=/usr/local/lib \ + PIP_CACHE_DIR=/tmp/.cache/pip \ + UV_CACHE_DIR=/tmp/.cache/uv ARG RUNTIME_APT_DEPS="" ARG ADDITIONAL_RUNTIME_APT_DEPS="" diff --git a/Dockerfile.ci b/Dockerfile.ci index 16d3b29cf478f..ca526f53f8a87 100644 --- a/Dockerfile.ci +++ b/Dockerfile.ci @@ -363,85 +363,6 @@ common::show_packaging_tool_version_and_location common::install_packaging_tools EOF -# The content below is automatically copied from scripts/docker/install_airflow_dependencies_from_branch_tip.sh -COPY <<"EOF" /install_airflow_dependencies_from_branch_tip.sh -#!/usr/bin/env bash - -. "$( dirname "${BASH_SOURCE[0]}" )/common.sh" - -: "${AIRFLOW_REPO:?Should be set}" -: "${AIRFLOW_BRANCH:?Should be set}" -: "${INSTALL_MYSQL_CLIENT:?Should be true or false}" -: "${INSTALL_POSTGRES_CLIENT:?Should be true or false}" - -function install_airflow_dependencies_from_branch_tip() { - echo - echo "${COLOR_BLUE}Installing airflow from ${AIRFLOW_BRANCH}. 
It is used to cache dependencies${COLOR_RESET}" - echo - if [[ ${INSTALL_MYSQL_CLIENT} != "true" ]]; then - AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/mysql,} - fi - if [[ ${INSTALL_POSTGRES_CLIENT} != "true" ]]; then - AIRFLOW_EXTRAS=${AIRFLOW_EXTRAS/postgres,} - fi - local TEMP_AIRFLOW_DIR - TEMP_AIRFLOW_DIR=$(mktemp -d) - # Install latest set of dependencies - without constraints. This is to download a "base" set of - # dependencies that we can cache and reuse when installing airflow using constraints and latest - # pyproject.toml in the next step (when we install regular airflow). - set -x - curl -fsSL "https://github.com/${AIRFLOW_REPO}/archive/${AIRFLOW_BRANCH}.tar.gz" | \ - tar xz -C "${TEMP_AIRFLOW_DIR}" --strip 1 - # Make sure editable dependencies are calculated when devel-ci dependencies are installed - ${PACKAGING_TOOL_CMD} install ${EXTRA_INSTALL_FLAGS} ${ADDITIONAL_PIP_INSTALL_FLAGS} \ - --editable "${TEMP_AIRFLOW_DIR}[${AIRFLOW_EXTRAS}]" - set +x - common::install_packaging_tools - set -x - echo "${COLOR_BLUE}Uninstalling providers. Dependencies remain${COLOR_RESET}" - # Uninstall airflow and providers to keep only the dependencies. In the future when - # planned https://github.com/pypa/pip/issues/11440 is implemented in pip we might be able to use this - # flag and skip the remove step. - pip freeze | grep apache-airflow-providers | xargs ${PACKAGING_TOOL_CMD} uninstall ${EXTRA_UNINSTALL_FLAGS} || true - set +x - echo - echo "${COLOR_BLUE}Uninstalling just airflow. Dependencies remain. Now target airflow can be reinstalled using mostly cached dependencies${COLOR_RESET}" - echo - set +x - ${PACKAGING_TOOL_CMD} uninstall ${EXTRA_UNINSTALL_FLAGS} apache-airflow - rm -rf "${TEMP_AIRFLOW_DIR}" - set -x - # If you want to make sure dependency is removed from cache in your PR when you removed it from - # pyproject.toml - please add your dependency here as a list of strings - # for example: - # DEPENDENCIES_TO_REMOVE=("package_a" "package_b") - # Once your PR is merged, you should make a follow-up PR to remove it from this list - # and increase the AIRFLOW_CI_BUILD_EPOCH in Dockerfile.ci to make sure your cache is rebuilt. - local DEPENDENCIES_TO_REMOVE - # IMPORTANT!! Make sure to increase AIRFLOW_CI_BUILD_EPOCH in Dockerfile.ci when you remove a dependency from that list - DEPENDENCIES_TO_REMOVE=() - if [[ "${DEPENDENCIES_TO_REMOVE[*]}" != "" ]]; then - echo - echo "${COLOR_BLUE}Uninstalling just removed dependencies (temporary until cache refreshes)${COLOR_RESET}" - echo "${COLOR_BLUE}Dependencies to uninstall: ${DEPENDENCIES_TO_REMOVE[*]}${COLOR_RESET}" - echo - set +x - ${PACKAGING_TOOL_CMD} uninstall "${DEPENDENCIES_TO_REMOVE[@]}" || true - set -x - # make sure that the dependency is not needed by something else - pip check - fi -} - -common::get_colors -common::get_packaging_tool -common::get_airflow_version_specification -common::get_constraints_location -common::show_packaging_tool_version_and_location - -install_airflow_dependencies_from_branch_tip -EOF - # The content below is automatically copied from scripts/docker/common.sh COPY <<"EOF" /common.sh #!/usr/bin/env bash @@ -627,35 +548,6 @@ function common::import_trusted_gpg() { } EOF -# The content below is automatically copied from scripts/docker/install_pipx_tools.sh -COPY <<"EOF" /install_pipx_tools.sh -#!/usr/bin/env bash -. 
"$( dirname "${BASH_SOURCE[0]}" )/common.sh" - -function install_pipx_tools() { - echo - echo "${COLOR_BLUE}Installing pipx tools${COLOR_RESET}" - echo - # Make sure PIPX is installed in latest version - ${PACKAGING_TOOL_CMD} install ${EXTRA_INSTALL_FLAGS} --upgrade "pipx>=1.2.1" - if [[ $(uname -m) != "aarch64" ]]; then - # Do not install mssql-cli for ARM - # Install all the tools we need available in command line but without impacting the current environment - pipx install mssql-cli - - # Unfortunately mssql-cli installed by `pipx` does not work out of the box because it uses - # its own execution bash script which is not compliant with the auto-activation of - # pipx venvs - we need to manually patch Python executable in the script to fix it: ¯\_(ツ)_/¯ - sed "s/python /\/root\/\.local\/pipx\/venvs\/mssql-cli\/bin\/python /" -i /root/.local/bin/mssql-cli - fi -} - -common::get_colors -common::get_packaging_tool - -install_pipx_tools -EOF - # The content below is automatically copied from scripts/docker/install_airflow.sh COPY <<"EOF" /install_airflow.sh #!/usr/bin/env bash @@ -1207,7 +1099,7 @@ ARG AIRFLOW_IMAGE_REPOSITORY="https://github.com/apache/airflow" # NOTE! When you want to make sure dependencies are installed from scratch in your PR after removing # some dependencies, you also need to set "disable image cache" in your PR to make sure the image is # not built using the "main" version of those dependencies. -ARG DEPENDENCIES_EPOCH_NUMBER="13" +ARG DEPENDENCIES_EPOCH_NUMBER="14" # Make sure noninteractive debian install is used and language variables set ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE} \ @@ -1216,7 +1108,10 @@ ENV PYTHON_BASE_IMAGE=${PYTHON_BASE_IMAGE} \ DEPENDENCIES_EPOCH_NUMBER=${DEPENDENCIES_EPOCH_NUMBER} \ INSTALL_MYSQL_CLIENT="true" \ INSTALL_MSSQL_CLIENT="true" \ - INSTALL_POSTGRES_CLIENT="true" + INSTALL_POSTGRES_CLIENT="true" \ + PIP_CACHE_DIR=/root/.cache/pip \ + UV_CACHE_DIR=/root/.cache/uv + RUN echo "Base image version: ${PYTHON_BASE_IMAGE}" @@ -1268,7 +1163,7 @@ RUN bash /scripts/docker/install_mysql.sh prod \ && chmod 0440 /etc/sudoers.d/airflow # Install Helm -ARG HELM_VERSION="v3.15.3" +ARG HELM_VERSION="v3.16.4" RUN SYSTEM=$(uname -s | tr '[:upper:]' '[:lower:]') \ && PLATFORM=$([ "$(uname -m)" = "aarch64" ] && echo "arm64" || echo "amd64" ) \ @@ -1296,15 +1191,7 @@ ARG DEFAULT_CONSTRAINTS_BRANCH="constraints-main" # By changing the epoch we can force reinstalling Airflow and pip all dependencies # It can also be overwritten manually by setting the AIRFLOW_CI_BUILD_EPOCH environment variable. ARG AIRFLOW_CI_BUILD_EPOCH="10" -ARG AIRFLOW_PRE_CACHED_PIP_PACKAGES="true" -ARG AIRFLOW_PIP_VERSION=24.3.1 -ARG AIRFLOW_UV_VERSION=0.5.11 -ARG AIRFLOW_USE_UV="true" # Setup PIP -# By default PIP install run without cache to make image smaller -ARG PIP_NO_CACHE_DIR="true" -# By default UV install run without cache to make image smaller -ARG UV_NO_CACHE="true" ARG UV_HTTP_TIMEOUT="300" # By default PIP has progress bar but you can disable it. 
ARG PIP_PROGRESS_BAR="on" @@ -1321,8 +1208,6 @@ ARG AIRFLOW_VERSION="" # Additional PIP flags passed to all pip install commands except reinstalling pip itself ARG ADDITIONAL_PIP_INSTALL_FLAGS="" -ARG AIRFLOW_PIP_VERSION=24.3.1 -ARG AIRFLOW_UV_VERSION=0.5.11 ARG AIRFLOW_USE_UV="true" ENV AIRFLOW_REPO=${AIRFLOW_REPO}\ @@ -1334,7 +1219,6 @@ ENV AIRFLOW_REPO=${AIRFLOW_REPO}\ AIRFLOW_CONSTRAINTS_LOCATION=${AIRFLOW_CONSTRAINTS_LOCATION} \ DEFAULT_CONSTRAINTS_BRANCH=${DEFAULT_CONSTRAINTS_BRANCH} \ AIRFLOW_CI_BUILD_EPOCH=${AIRFLOW_CI_BUILD_EPOCH} \ - AIRFLOW_PRE_CACHED_PIP_PACKAGES=${AIRFLOW_PRE_CACHED_PIP_PACKAGES} \ AIRFLOW_VERSION=${AIRFLOW_VERSION} \ AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \ AIRFLOW_UV_VERSION=${AIRFLOW_UV_VERSION} \ @@ -1346,9 +1230,7 @@ ENV AIRFLOW_REPO=${AIRFLOW_REPO}\ INSTALL_POSTGRES_CLIENT="true" \ AIRFLOW_INSTALLATION_METHOD="." \ AIRFLOW_VERSION_SPECIFICATION="" \ - PIP_NO_CACHE_DIR=${PIP_NO_CACHE_DIR} \ PIP_PROGRESS_BAR=${PIP_PROGRESS_BAR} \ - UV_NO_CACHE=${UV_NO_CACHE} \ ADDITIONAL_PIP_INSTALL_FLAGS=${ADDITIONAL_PIP_INSTALL_FLAGS} \ CASS_DRIVER_BUILD_CONCURRENCY=${CASS_DRIVER_BUILD_CONCURRENCY} \ CASS_DRIVER_NO_CYTHON=${CASS_DRIVER_NO_CYTHON} @@ -1357,42 +1239,50 @@ RUN echo "Airflow version: ${AIRFLOW_VERSION}" # Copy all scripts required for installation - changing any of those should lead to # rebuilding from here -COPY --from=scripts install_packaging_tools.sh install_airflow_dependencies_from_branch_tip.sh \ - common.sh /scripts/docker/ +COPY --from=scripts common.sh install_packaging_tools.sh install_additional_dependencies.sh /scripts/docker/ # We are first creating a venv where all python packages and .so binaries needed by those are # installed. -# In case of CI builds we want to pre-install main version of airflow dependencies so that -# We do not have to always reinstall it from the scratch. -# And is automatically reinstalled from the scratch every time patch release of python gets released -# The Airflow and providers are uninstalled, only dependencies remain. -# the cache is only used when "upgrade to newer dependencies" is not set to automatically -# account for removed dependencies (we do not install them in the first place) -RUN bash /scripts/docker/install_packaging_tools.sh; \ - if [[ ${AIRFLOW_PRE_CACHED_PIP_PACKAGES} == "true" ]]; then \ - bash /scripts/docker/install_airflow_dependencies_from_branch_tip.sh; \ - fi + +# Here we fix the versions so all subsequent commands will use the versions +# from the sources +# You can swap comments between those two args to test pip from the main version +# When you attempt to test if the version of `pip` from specified branch works for our builds +# Also use `force pip` label on your PR to swap all places we use `uv` to `pip` +ARG AIRFLOW_PIP_VERSION=24.3.1 +# ARG AIRFLOW_PIP_VERSION="git+https://github.com/pypa/pip.git@main" +ARG AIRFLOW_UV_VERSION=0.5.17 +# TODO(potiuk): automate with upgrade check (possibly) +ARG AIRFLOW_PRE_COMMIT_VERSION="3.5.0" + +ENV AIRFLOW_PIP_VERSION=${AIRFLOW_PIP_VERSION} \ + AIRFLOW_UV_VERSION=${AIRFLOW_UV_VERSION} \ + # This is needed since we are using cache mounted from the host + UV_LINK_MODE=copy \ + AIRFLOW_PRE_COMMIT_VERSION=${AIRFLOW_PRE_COMMIT_VERSION} # The PATH is needed for PIPX to find the tools installed ENV PATH="/root/.local/bin:${PATH}" -COPY --from=scripts install_pipx_tools.sh /scripts/docker/ +# Useful for creating a cache id based on the underlying architecture, preventing the use of cached python packages from +# an incorrect architecture. 
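The `TARGETARCH` and `DEPENDENCY_CACHE_EPOCH` args declared here feed the cache ids of the new `RUN --mount=type=cache` steps, so each architecture gets its own dependency cache and bumping the epoch starts from an empty one; `UV_LINK_MODE=copy` is set because the cache mount and the installed packages sit on different filesystems, where uv's default hardlink mode cannot be used. A minimal sketch of forcing a fresh cache by overriding the epoch at build time; the invocation and tag are illustrative assumptions, since the real CI images are built through breeze:

    # rebuild the CI image with a brand-new dependency cache id
    docker buildx build . -f Dockerfile.ci \
        --build-arg DEPENDENCY_CACHE_EPOCH=1 \
        --load -t airflow-ci:fresh-cache
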
+ARG TARGETARCH +# Value to be able to easily change cache id and therefore use a bare new cache +ARG DEPENDENCY_CACHE_EPOCH="0" # Install useful command line tools in their own virtualenv so that they do not clash with -# dependencies installed in Airflow -RUN bash /scripts/docker/install_pipx_tools.sh - -# Airflow sources change frequently but dependency configuration won't change that often -# We copy pyproject.toml and other files needed to perform setup of dependencies -# So in case pyproject.toml changes we can install latest dependencies required. -COPY pyproject.toml ${AIRFLOW_SOURCES}/pyproject.toml -COPY airflow/__init__.py ${AIRFLOW_SOURCES}/airflow/ -COPY generated/* ${AIRFLOW_SOURCES}/generated/ -COPY constraints/* ${AIRFLOW_SOURCES}/constraints/ -COPY LICENSE ${AIRFLOW_SOURCES}/LICENSE -COPY hatch_build.py ${AIRFLOW_SOURCES}/ +# dependencies installed in Airflow also reinstall PIP and UV to make sure they are installed +# in the version specified above +RUN bash /scripts/docker/install_packaging_tools.sh + COPY --from=scripts install_airflow.sh /scripts/docker/ +# We can copy everything here. The Context is filtered by dockerignore. This makes sure we are not +# copying over stuff that is accidentally generated or that we do not need (such as egg-info) +# if you want to add something that is missing and you expect to see it in the image you can +# add it with ! in .dockerignore next to the airflow, test etc. directories there +COPY . ${AIRFLOW_SOURCES}/ + # Those are additional constraints that are needed for some extras but we do not want to # force them on the main Airflow package. Currently we need no extra limits as PIP 23.1+ has much better # dependency resolution and we do not need to limit the versions of the dependencies @@ -1411,36 +1301,30 @@ ENV EAGER_UPGRADE_ADDITIONAL_REQUIREMENTS=${EAGER_UPGRADE_ADDITIONAL_REQUIREMENT # Usually we will install versions based on the dependencies in pyproject.toml and upgraded only if needed. # But in cron job we will install latest versions matching pyproject.toml to see if there is no breaking change # and push the constraints if everything is successful -RUN bash /scripts/docker/install_airflow.sh - -COPY --from=scripts entrypoint_ci.sh /entrypoint -COPY --from=scripts entrypoint_exec.sh /entrypoint-exec -RUN chmod a+x /entrypoint /entrypoint-exec +RUN --mount=type=cache,id=ci-$TARGETARCH-$DEPENDENCY_CACHE_EPOCH,target=/root/.cache/ bash /scripts/docker/install_airflow.sh COPY --from=scripts install_packaging_tools.sh install_additional_dependencies.sh /scripts/docker/ -# Additional python deps to install ARG ADDITIONAL_PYTHON_DEPS="" -RUN bash /scripts/docker/install_packaging_tools.sh; \ +ENV ADDITIONAL_PYTHON_DEPS=${ADDITIONAL_PYTHON_DEPS} + +RUN --mount=type=cache,id=ci-$TARGETARCH-$DEPENDENCY_CACHE_EPOCH,target=/root/.cache/ \ + bash /scripts/docker/install_packaging_tools.sh; \ if [[ -n "${ADDITIONAL_PYTHON_DEPS}" ]]; then \ bash /scripts/docker/install_additional_dependencies.sh; \ fi -# Install autocomplete for airflow -RUN if command -v airflow; then \ - register-python-argcomplete airflow >> ~/.bashrc ; \ - fi - -# Install autocomplete for Kubectl -RUN echo "source /etc/bash_completion" >> ~/.bashrc +COPY --from=scripts entrypoint_ci.sh /entrypoint +COPY --from=scripts entrypoint_exec.sh /entrypoint-exec +RUN chmod a+x /entrypoint /entrypoint-exec -# We can copy everything here. The Context is filtered by dockerignore. 
This makes sure we are not -# copying over stuff that is accidentally generated or that we do not need (such as egg-info) -# if you want to add something that is missing and you expect to see it in the image you can -# add it with ! in .dockerignore next to the airflow, test etc. directories there -COPY . ${AIRFLOW_SOURCES}/ +# Install autocomplete for airflow and kubectl +RUN if command -v airflow; then \ + register-python-argcomplete airflow >> ~/.bashrc ; \ + fi; \ + echo "source /etc/bash_completion" >> ~/.bashrc WORKDIR ${AIRFLOW_SOURCES} @@ -1451,7 +1335,13 @@ ARG AIRFLOW_IMAGE_DATE_CREATED ENV PATH="/files/bin/:/opt/airflow/scripts/in_container/bin/:${PATH}" \ GUNICORN_CMD_ARGS="--worker-tmp-dir /dev/shm/" \ BUILD_ID=${BUILD_ID} \ - COMMIT_SHA=${COMMIT_SHA} + COMMIT_SHA=${COMMIT_SHA} \ + # When we enter the image, the /root/.cache is not mounted from temporary mount cache. + # We do not want to share the cache from host to avoid all kinds of problems where cache + # is different with different platforms / python versions. We want to have a clean cache + # in the image - and in this case /root/.cache is on the same filesystem as the installed packages. + # so we can go back to the default link mode being hardlink. + UV_LINK_MODE=hardlink # Link dumb-init for backwards compatibility (so that older images also work) RUN ln -sf /usr/bin/dumb-init /usr/local/bin/dumb-init diff --git a/RELEASE_NOTES.rst b/RELEASE_NOTES.rst index 146c661c84ffa..cb7572626b434 100644 --- a/RELEASE_NOTES.rst +++ b/RELEASE_NOTES.rst @@ -3367,8 +3367,7 @@ And to mark a task as producing a dataset pass the dataset(s) to the ``outlets`` .. code-block:: python @task(outlets=[dataset]) - def my_task(): - ... + def my_task(): ... # Or for classic operators @@ -3402,8 +3401,7 @@ Previously you had to assign a DAG to a module-level variable in order for Airfl @dag - def dag_maker(): - ... + def dag_maker(): ... dag2 = dag_maker() @@ -3418,8 +3416,7 @@ can become @dag - def dag_maker(): - ... + def dag_maker(): ... dag_maker() @@ -3750,13 +3747,11 @@ For example, in your ``custom_config.py``: # before - class YourCustomFormatter(logging.Formatter): - ... + class YourCustomFormatter(logging.Formatter): ... # after - class YourCustomFormatter(TimezoneAware): - ... + class YourCustomFormatter(TimezoneAware): ... AIRFLOW_FORMATTER = LOGGING_CONFIG["formatters"]["airflow"] @@ -6447,27 +6442,22 @@ The old syntax of passing ``context`` as a dictionary will continue to work with .. code-block:: python - def execution_date_fn(execution_date, ctx): - ... + def execution_date_fn(execution_date, ctx): ... ``execution_date_fn`` can take in any number of keyword arguments available in the task context dictionary. The following forms of ``execution_date_fn`` are all supported: .. code-block:: python - def execution_date_fn(dt): - ... + def execution_date_fn(dt): ... - def execution_date_fn(execution_date): - ... + def execution_date_fn(execution_date): ... - def execution_date_fn(execution_date, ds_nodash): - ... + def execution_date_fn(execution_date, ds_nodash): ... - def execution_date_fn(execution_date, ds_nodash, dag): - ... + def execution_date_fn(execution_date, ds_nodash, dag): ... The default value for ``[webserver] cookie_samesite`` has been changed to ``Lax`` """"""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""""" @@ -7194,8 +7184,7 @@ Previous signature: external_trigger=False, conf=None, session=None, - ): - ... + ): ... 
current: @@ -7211,8 +7200,7 @@ current: conf=None, run_type=None, session=None, - ): - ... + ): ... If user provides ``run_id`` then the ``run_type`` will be derived from it by checking prefix, allowed types : ``manual``\ , ``scheduled``\ , ``backfill`` (defined by ``airflow.utils.types.DagRunType``\ ). @@ -7310,8 +7298,9 @@ can be replaced by the following code: logger = logging.getLogger("custom-logger") - with redirect_stdout(StreamLogWriter(logger, logging.INFO)), redirect_stderr( - StreamLogWriter(logger, logging.WARN) + with ( + redirect_stdout(StreamLogWriter(logger, logging.INFO)), + redirect_stderr(StreamLogWriter(logger, logging.WARN)), ): print("I Love Airflow") @@ -7340,8 +7329,7 @@ are deprecated and will be removed in future versions. include_examples=conf.getboolean("core", "LOAD_EXAMPLES"), safe_mode=conf.getboolean("core", "DAG_DISCOVERY_SAFE_MODE"), store_serialized_dags=False, - ): - ... + ): ... **current**\ : @@ -7352,8 +7340,7 @@ are deprecated and will be removed in future versions. include_examples=conf.getboolean("core", "LOAD_EXAMPLES"), safe_mode=conf.getboolean("core", "DAG_DISCOVERY_SAFE_MODE"), read_dags_from_db=False, - ): - ... + ): ... If you were using positional arguments, it requires no change but if you were using keyword arguments, please change ``store_serialized_dags`` to ``read_dags_from_db``. @@ -8175,8 +8162,7 @@ Before: dataset_id: str, dataset_resource: dict, # ... - ): - ... + ): ... After: @@ -8186,8 +8172,7 @@ After: dataset_resource: dict, dataset_id: Optional[str] = None, # ... - ): - ... + ): ... Changes in ``amazon`` provider package """""""""""""""""""""""""""""""""""""""""" @@ -10267,16 +10252,14 @@ Old signature: .. code-block:: python - def get_task_instances(self, session, start_date=None, end_date=None): - ... + def get_task_instances(self, session, start_date=None, end_date=None): ... New signature: .. code-block:: python @provide_session - def get_task_instances(self, start_date=None, end_date=None, session=None): - ... + def get_task_instances(self, start_date=None, end_date=None, session=None): ... For ``DAG`` ~~~~~~~~~~~~~~~ @@ -10285,16 +10268,14 @@ Old signature: .. code-block:: python - def get_task_instances(self, session, start_date=None, end_date=None, state=None): - ... + def get_task_instances(self, session, start_date=None, end_date=None, state=None): ... New signature: .. code-block:: python @provide_session - def get_task_instances(self, start_date=None, end_date=None, state=None, session=None): - ... + def get_task_instances(self, start_date=None, end_date=None, state=None, session=None): ... In either case, it is necessary to rewrite calls to the ``get_task_instances`` method that currently provide the ``session`` positional argument. New calls to this method look like: @@ -10775,15 +10756,13 @@ Old signature: .. code-block:: python - def create_transfer_job(self, description, schedule, transfer_spec, project_id=None): - ... + def create_transfer_job(self, description, schedule, transfer_spec, project_id=None): ... New signature: .. code-block:: python - def create_transfer_job(self, body): - ... + def create_transfer_job(self, body): ... It is necessary to rewrite calls to method. The new call looks like this: @@ -10808,15 +10787,13 @@ Old signature: .. code-block:: python - def wait_for_transfer_job(self, job): - ... + def wait_for_transfer_job(self, job): ... New signature: .. code-block:: python - def wait_for_transfer_job(self, job, expected_statuses=(GcpTransferOperationStatus.SUCCESS,)): - ... 
+    def wait_for_transfer_job(self, job, expected_statuses=(GcpTransferOperationStatus.SUCCESS,)): ...


 The behavior of ``wait_for_transfer_job`` has changed:
diff --git a/airflow/executors/executor_loader.py b/airflow/executors/executor_loader.py
index 31b9a369bc3fc..3c08887906114 100644
--- a/airflow/executors/executor_loader.py
+++ b/airflow/executors/executor_loader.py
@@ -337,7 +337,7 @@ def validate_database_executor_compatibility(cls, executor: type[BaseExecutor])
         from airflow.settings import engine
 
         # SQLite only works with single threaded executors
-        if engine.dialect.name == "sqlite":
+        if engine and engine.dialect.name == "sqlite":
             raise AirflowConfigException(f"error: cannot use SQLite with the {executor.__name__}")
 
     @classmethod
diff --git a/airflow/providers/fab/auth_manager/cli_commands/user_command.py b/airflow/providers/fab/auth_manager/cli_commands/user_command.py
index 3050a9e250e58..e877397bcf2e0 100644
--- a/airflow/providers/fab/auth_manager/cli_commands/user_command.py
+++ b/airflow/providers/fab/auth_manager/cli_commands/user_command.py
@@ -212,10 +212,10 @@ def users_import(args):
     users_created, users_updated = _import_users(users_list)
 
     if users_created:
-        print("Created the following users:\n\t{}".format("\n\t".join(users_created)))
+        print("Created the following users:\n\t" + "\n\t".join(users_created))
 
     if users_updated:
-        print("Updated the following users:\n\t{}".format("\n\t".join(users_updated)))
+        print("Updated the following users:\n\t" + "\n\t".join(users_updated))
 
 
 def _import_users(users_list: list[dict[str, Any]]):
@@ -231,9 +231,7 @@ def _import_users(users_list: list[dict[str, Any]]):
             msg.append(f"[Item {row_num}]")
             for key, value in failure.items():
                 msg.append(f"\t{key}: {value}")
-        raise SystemExit(
-            "Error: Input file didn't pass validation. See below:\n{}".format("\n".join(msg))
-        )
+        raise SystemExit("Error: Input file didn't pass validation. See below:\n" + "\n".join(msg))
 
     for user in users_list:
         roles = []
diff --git a/airflow/providers/influxdb/hooks/influxdb.py b/airflow/providers/influxdb/hooks/influxdb.py
index b1b001730a47d..a34a99b36dfe5 100644
--- a/airflow/providers/influxdb/hooks/influxdb.py
+++ b/airflow/providers/influxdb/hooks/influxdb.py
@@ -99,7 +99,8 @@ def get_conn(self) -> InfluxDBClient:
             return self.client
 
         self.client = self.get_client(self.uri, self.extras)
-
+        if not self.client:
+            raise ValueError("InfluxDB connection not present")
         return self.client
 
     def query(self, query) -> list[FluxTable]:
diff --git a/airflow/providers/microsoft/azure/hooks/adx.py b/airflow/providers/microsoft/azure/hooks/adx.py
index 8ad4095970146..facdb1fb762bf 100644
--- a/airflow/providers/microsoft/azure/hooks/adx.py
+++ b/airflow/providers/microsoft/azure/hooks/adx.py
@@ -41,7 +41,7 @@
 )
 
 if TYPE_CHECKING:
-    from azure.kusto.data.response import KustoResponseDataSetV2
+    from azure.kusto.data.response import KustoResponseDataSet
 
 
 class AzureDataExplorerHook(BaseHook):
@@ -214,7 +214,7 @@ def get_required_param(name: str) -> str:
 
         return KustoClient(kcsb)
 
-    def run_query(self, query: str, database: str, options: dict | None = None) -> KustoResponseDataSetV2:
+    def run_query(self, query: str, database: str, options: dict | None = None) -> KustoResponseDataSet:
         """
         Run KQL query using provided configuration, and return KustoResponseDataSet instance.
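The ``executor_loader`` and InfluxDB hook hunks above apply the same defensive pattern: a lazily initialized handle may still be ``None`` at the point of use, so the code either guards the attribute access or fails fast with a descriptive error. A minimal runnable sketch of that pattern, using hypothetical stand-in classes rather than the real Airflow or InfluxDB objects:

.. code-block:: python

    from __future__ import annotations

    from dataclasses import dataclass


    @dataclass
    class Dialect:
        name: str


    @dataclass
    class Engine:
        dialect: Dialect


    # A module-level handle like airflow.settings.engine may still be None here,
    # e.g. when the ORM has not been configured yet.
    engine: Engine | None = None


    def validate_sqlite_compatibility(executor_name: str) -> None:
        # Guard the handle before dereferencing it (the executor_loader fix above).
        if engine and engine.dialect.name == "sqlite":
            raise RuntimeError(f"error: cannot use SQLite with the {executor_name}")


    def require_client(client: object | None) -> object:
        # Fail fast with a clear message instead of letting an AttributeError
        # surface far from the root cause (the InfluxDB hook fix above).
        if client is None:
            raise ValueError("connection not present")
        return client

Both hunks trade a late ``AttributeError`` for an explicit, actionable failure at the call site.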
diff --git a/airflow/providers/mongo/hooks/mongo.py b/airflow/providers/mongo/hooks/mongo.py index 06c3df6bd9e35..772c00fe64cba 100644 --- a/airflow/providers/mongo/hooks/mongo.py +++ b/airflow/providers/mongo/hooks/mongo.py @@ -166,9 +166,7 @@ def _create_uri(self) -> str: path = f"/{self.connection.schema}" return urlunsplit((scheme, netloc, path, "", "")) - def get_collection( - self, mongo_collection: str, mongo_db: str | None = None - ) -> pymongo.collection.Collection: + def get_collection(self, mongo_collection: str, mongo_db: str | None = None): """ Fetch a mongo collection object for querying. @@ -179,9 +177,7 @@ def get_collection( return mongo_conn.get_database(mongo_db).get_collection(mongo_collection) - def aggregate( - self, mongo_collection: str, aggregate_query: list, mongo_db: str | None = None, **kwargs - ) -> pymongo.command_cursor.CommandCursor: + def aggregate(self, mongo_collection: str, aggregate_query: list, mongo_db: str | None = None, **kwargs): """ Run an aggregation pipeline and returns the results. diff --git a/airflow/providers_manager.py b/airflow/providers_manager.py index dd3e841fa1662..ab8906941a04a 100644 --- a/airflow/providers_manager.py +++ b/airflow/providers_manager.py @@ -531,7 +531,6 @@ def initialize_providers_dataset_uri_resources(self): self.initialize_providers_list() self._discover_dataset_uri_resources() - @provider_info_cache("hook_lineage_writers") @provider_info_cache("taskflow_decorators") def initialize_providers_taskflow_decorator(self): """Lazy initialization of providers hooks.""" diff --git a/airflow/reproducible_build.yaml b/airflow/reproducible_build.yaml index 253e4e793cff1..9f7c6d5a10069 100644 --- a/airflow/reproducible_build.yaml +++ b/airflow/reproducible_build.yaml @@ -1,2 +1,2 @@ -release-notes-hash: 0867869dba7304e7ead28dd0800c5c4b -source-date-epoch: 1733822937 +release-notes-hash: 7be47e2ddbbe1bfbd0d3f572d2b7800a +source-date-epoch: 1736532824 diff --git a/contributing-docs/03_contributors_quick_start.rst b/contributing-docs/03_contributors_quick_start.rst index e663cf2ed19f1..9c7bdbe985178 100644 --- a/contributing-docs/03_contributors_quick_start.rst +++ b/contributing-docs/03_contributors_quick_start.rst @@ -474,14 +474,12 @@ You can still add uv support for pre-commit if you use pipx using the commands: .. code-block:: bash pipx install pre-commit - pipx inject - pipx inject pre-commit pre-commit-uv Also, if you already use ``uvx`` instead of ``pipx``, use this command: .. code-block:: bash - uv tool install pre-commit --with pre-commit-uv --force-reinstall + uv tool install pre-commit --force-reinstall 1. Installing required packages diff --git a/contributing-docs/08_static_code_checks.rst b/contributing-docs/08_static_code_checks.rst index 211ee7c9c34d4..27768d23060b4 100644 --- a/contributing-docs/08_static_code_checks.rst +++ b/contributing-docs/08_static_code_checks.rst @@ -113,282 +113,242 @@ require Breeze Docker image to be built locally. .. 
BEGIN AUTO-GENERATED STATIC CHECK LIST -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| ID | Description | Image | -+===========================================================+==============================================================+=========+ -| bandit | bandit | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| blacken-docs | Run black on Python code blocks in documentation files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-aiobotocore-optional | Check if aiobotocore is an optional dependency only | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-airflow-k8s-not-used | Check airflow.kubernetes imports are not used | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-airflow-provider-compatibility | Check compatibility of Providers with Airflow | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-airflow-providers-bug-report-template | Check airflow-bug-report provider list is sorted/unique | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-apache-license-rat | Check if licenses are OK for Apache | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-base-operator-partial-arguments | Check BaseOperator and partial() arguments | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-base-operator-usage | * Check BaseOperator core imports | | -| | * Check BaseOperatorLink core imports | | -| | * Check BaseOperator[Link] other imports | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-boring-cyborg-configuration | Checks for Boring Cyborg configuration consistency | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-breeze-top-dependencies-limited | Breeze should have small number of top-level dependencies | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-builtin-literals | Require literal syntax when initializing builtin types | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-changelog-format | Check changelog format | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-changelog-has-no-duplicates | Check changelogs for duplicate entries | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-cncf-k8s-only-for-executors | Check cncf.kubernetes imports used for executors 
only | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-code-deprecations | Check deprecations categories in decorators | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-common-compat-used-for-openlineage | Check common.compat is used for OL deprecated classes | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-compat-cache-on-methods | Check that compat cache do not use on class methods | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-core-deprecation-classes | Verify usage of Airflow deprecation classes in core | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-daysago-import-from-utils | Make sure days_ago is imported from airflow.utils.dates | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-decorated-operator-implements-custom-name | Check @task decorator implements custom_operator_name | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-deferrable-default | Check and fix default value of default_deferrable | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-docstring-param-types | Check that docstrings do not specify param types | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-example-dags-urls | Check that example dags url include provider versions | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-executables-have-shebangs | Check that executables have shebang | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-extra-packages-references | Checks setup extra packages | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-extras-order | Check order of extras in Dockerfile | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-fab-migrations | Check no migration is done on FAB related table | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-for-inclusive-language | Check for language that we do not accept as community | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-get-lineage-collector-providers | Check providers import hook lineage code from compat | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-google-re2-as-dependency | Check google-re2 is 
declared as dependency when needed | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-hatch-build-order | Check order of dependencies in hatch_build.py | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-hooks-apply | Check if all hooks apply to the repository | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-incorrect-use-of-LoggingMixin | Make sure LoggingMixin is not used alone | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-init-decorator-arguments | Check model __init__ and decorator arguments are in sync | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-integrations-list-consistent | Sync integrations list with docs | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-lazy-logging | Check that all logging methods are lazy | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-links-to-example-dags-do-not-use-hardcoded-versions | Verify example dags do not use hard-coded version numbers | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-merge-conflict | Check that merge conflicts are not being committed | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-min-python-version | Check minimum Python version | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-newsfragments-are-valid | Check newsfragments are valid | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-no-airflow-deprecation-in-providers | Do not use DeprecationWarning in providers | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-no-providers-in-core-examples | No providers imports in core example DAGs | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-only-new-session-with-provide-session | Check NEW_SESSION is only used with @provide_session | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-persist-credentials-disabled-in-github-workflows | Check that workflow files have persist-credentials disabled | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-pre-commit-information-consistent | Validate hook IDs & names and sync with docs | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| 
check-provide-create-sessions-imports | Check provide_session and create_session imports | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-provider-docs-valid | Validate provider doc files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-provider-yaml-valid | Validate provider.yaml files | * | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-providers-init-file-missing | Provider init file is missing | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-providers-subpackages-init-file-exist | Provider subpackage init files are there | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-pydevd-left-in-code | Check for pydevd debug statements accidentally left | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-revision-heads-map | Check that the REVISION_HEADS_MAP is up-to-date | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-safe-filter-usage-in-html | Don't use safe in templates | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-sql-dependency-common-data-structure | Check dependency of SQL Providers with common data structure | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-start-date-not-used-in-defaults | start_date not to be defined in default_args in example_dags | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-system-tests-present | Check if system tests have required segments of code | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-system-tests-tocs | Check that system tests is properly added | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-taskinstance-tis-attrs | Check that TI and TIS have the same attributes | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-template-context-variable-in-sync | Check all template context variable references are in sync | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-tests-in-the-right-folders | Check if tests are in the right folders | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-tests-unittest-testcase | Check that unit tests do not inherit from unittest.TestCase | | 
-+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-urlparse-usage-in-code | Don't use urlparse in code | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-usage-of-re2-over-re | Use re2 module instead of re | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| check-xml | Check XML files with xmllint | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| codespell | Run codespell to check for common misspellings in files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| compile-www-assets | Compile www assets (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| compile-www-assets-dev | Compile www assets in dev mode (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| create-missing-init-py-files-tests | Create missing init.py files in tests | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| debug-statements | Detect accidentally committed debug statements | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| detect-private-key | Detect if private key is added to the repository | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| doctoc | Add TOC for Markdown and RST files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| end-of-file-fixer | Make sure that there is an empty line at the end | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| fix-encoding-pragma | Remove encoding header from Python files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| flynt | Run flynt string format converter for Python | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| generate-airflow-diagrams | Generate airflow diagrams | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| generate-pypi-readme | Generate PyPI README | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| identity | Print input to the static check hooks for troubleshooting | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| insert-license | * Add license for all SQL files | | -| | * Add license for all RST files | | -| | * Add license for all CSS/JS/JSX/PUML/TS/TSX files | | -| | * Add license for all 
JINJA template files | | -| | * Add license for all Shell files | | -| | * Add license for all toml files | | -| | * Add license for all Python files | | -| | * Add license for all XML files | | -| | * Add license for all Helm template files | | -| | * Add license for all YAML files except Helm templates | | -| | * Add license for all Markdown files | | -| | * Add license for all other files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| kubeconform | Kubeconform check on our helm chart | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-chart-schema | Lint chart/values.schema.json file | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-css | stylelint | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-dockerfile | Lint Dockerfile | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-helm-chart | Lint Helm Chart | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-json-schema | * Lint JSON Schema files with JSON Schema | | -| | * Lint NodePort Service with JSON Schema | | -| | * Lint Docker compose files with JSON Schema | | -| | * Lint chart/values.schema.json file with JSON Schema | | -| | * Lint chart/values.yaml file with JSON Schema | | -| | * Lint config_templates/config.yml file with JSON Schema | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-markdown | Run markdownlint | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| lint-openapi | * Lint OpenAPI using spectral | | -| | * Lint OpenAPI using openapi-spec-validator | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| mixed-line-ending | Detect if mixed line ending is used (\r vs. 
\r\n) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| mypy-airflow | * Run mypy for airflow | * | -| | * Run mypy for airflow (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| mypy-dev | * Run mypy for dev | * | -| | * Run mypy for dev (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| mypy-docs | * Run mypy for /docs/ folder | * | -| | * Run mypy for /docs/ folder (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| mypy-providers | * Run mypy for providers | * | -| | * Run mypy for providers (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| pretty-format-json | Format JSON files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| pylint | pylint | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| python-no-log-warn | Check if there are no deprecate log warn | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| replace-bad-characters | Replace bad characters | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| rst-backticks | Check if RST files use double backticks for code | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| ruff | Run 'ruff' for extremely fast Python linting | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| ruff-format | Run 'ruff format' for extremely fast Python formatting | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| shellcheck | Check Shell scripts syntax correctness | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| trailing-whitespace | Remove trailing whitespace at end of line | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| ts-compile-format-lint-www | TS types generation / ESLint / Prettier against UI files | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-black-version | Update black versions everywhere (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-breeze-cmd-output | Update output of breeze commands in Breeze documentation | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-breeze-readme-config-hash | Update Breeze README.md with config files hash | | 
-+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-build-dependencies | Update build-dependencies to latest (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-chart-dependencies | Update chart dependencies to latest (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-common-sql-api-stubs | Check and update common.sql API stubs | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-er-diagram | Update ER diagram | * | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-extras | Update extras in documentation | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-in-the-wild-to-be-sorted | Sort INTHEWILD.md alphabetically | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-inlined-dockerfile-scripts | Inline Dockerfile and Dockerfile.ci scripts | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-installed-providers-to-be-sorted | Sort alphabetically and uniquify installed_providers.txt | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-installers | Update installers to latest (manual) | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-local-yml-file | Update mounts in the local yml file | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-migration-references | Update migration ref doc | * | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-openapi-spec-tags-to-be-sorted | Sort alphabetically openapi spec tags | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-providers-dependencies | Update dependencies for provider packages | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-reproducible-source-date-epoch | Update Source Date Epoch for reproducible builds | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-spelling-wordlist-to-be-sorted | Sort alphabetically and uniquify spelling_wordlist.txt | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-supported-versions | Updates supported versions in documentation | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ 
-| update-vendored-in-k8s-json-schema | Vendor k8s definitions into values.schema.json | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| update-version | Update version to the latest version in the documentation | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| validate-operators-init | Prevent templated field logic checks in operators' __init__ | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ -| yamllint | Check YAML files with yamllint | | -+-----------------------------------------------------------+--------------------------------------------------------------+---------+ ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| ID | Description | Image | ++===========================================================+========================================================+=========+ +| bandit | bandit | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| blacken-docs | Run black on docs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-aiobotocore-optional | Check if aiobotocore is an optional dependency only | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-airflow-k8s-not-used | Check airflow.kubernetes imports are not used | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-apache-license-rat | Check if licenses are OK for Apache | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-base-operator-usage | * Check BaseOperator core imports | | +| | * Check BaseOperatorLink core imports | | +| | * Check BaseOperator[Link] other imports | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-boring-cyborg-configuration | Checks for Boring Cyborg configuration consistency | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-breeze-top-dependencies-limited | Check top-level breeze deps | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-builtin-literals | Require literal syntax when initializing builtins | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-changelog-format | Check changelog format | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-changelog-has-no-duplicates | Check changelogs for duplicate entries | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-common-compat-used-for-openlineage | Check common.compat is used for OL deprecated classes | | 
++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-core-deprecation-classes | Verify usage of Airflow deprecation classes in core | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-daysago-import-from-utils | days_ago imported from airflow.utils.dates | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-decorated-operator-implements-custom-name | Check @task decorator implements custom_operator_name | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-docstring-param-types | Check that docstrings do not specify param types | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-executables-have-shebangs | Check that executables have shebang | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-extra-packages-references | Checks setup extra packages | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-extras-order | Check order of extras in Dockerfile | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-fab-migrations | Check no migration is done on FAB related table | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-for-inclusive-language | Check for language that we do not accept as community | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-get-lineage-collector-providers | Check providers import hook lineage code from compat | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-hatch-build-order | Check order of dependencies in hatch_build.py | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-hooks-apply | Check if all hooks apply to the repository | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-incorrect-use-of-LoggingMixin | Make sure LoggingMixin is not used alone | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-integrations-list-consistent | Sync integrations list with docs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-lazy-logging | Check that all logging methods are lazy | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-links-to-example-dags-do-not-use-hardcoded-versions | Verify no hard-coded version in example dags | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ 
+| check-merge-conflict | Check that merge conflicts are not being committed | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-min-python-version | Check minimum Python version | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-newsfragments-are-valid | Check newsfragments are valid | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-no-airflow-deprecation-in-providers | Do not use DeprecationWarning in providers | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-no-providers-in-core-examples | No providers imports in core example DAGs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-only-new-session-with-provide-session | Check NEW_SESSION is only used with @provide_session | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-persist-credentials-disabled-in-github-workflows | Check persistent creds in workflow files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-pre-commit-information-consistent | Validate hook IDs & names and sync with docs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-provide-create-sessions-imports | Check session util imports | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-provider-docs-valid | Validate provider doc files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-provider-yaml-valid | Validate provider.yaml files | * | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-providers-subpackages-init-file-exist | Provider subpackage init files are there | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-pydevd-left-in-code | Check for pydevd debug statements accidentally left | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-safe-filter-usage-in-html | Don't use safe in templates | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-start-date-not-used-in-defaults | start_date not in default_args | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-template-context-variable-in-sync | Sync template context variable refs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-tests-unittest-testcase | Unit tests do not inherit from unittest.TestCase | | 
++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-urlparse-usage-in-code | Don't use urlparse in code | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-usage-of-re2-over-re | Use re2 module instead of re | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| check-xml | Check XML files with xmllint | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| codespell | Run codespell | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| compile-www-assets | Compile www assets (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| compile-www-assets-dev | Compile www assets in dev mode (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| create-missing-init-py-files-tests | Create missing init.py files in tests | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| debug-statements | Detect accidentally committed debug statements | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| detect-private-key | Detect if private key is added to the repository | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| doctoc | Add TOC for Markdown and RST files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| end-of-file-fixer | Make sure that there is an empty line at the end | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| fix-encoding-pragma | Remove encoding header from Python files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| flynt | Run flynt string format converter for Python | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| generate-airflow-diagrams | Generate airflow diagrams | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| generate-pypi-readme | Generate PyPI README | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| identity | Print checked files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| insert-license | * Add license for all SQL files | | +| | * Add license for all RST files | | +| | * Add license for CSS/JS/JSX/PUML/TS/TSX | | +| | * Add license for all JINJA template files | | +| | * Add license for all Shell files | | +| | * Add license for all toml files | | +| | * Add license for all Python files | | +| | * Add license for all XML files | 
| +| | * Add license for all Helm template files | | +| | * Add license for all YAML files except Helm templates | | +| | * Add license for all Markdown files | | +| | * Add license for all other files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| kubeconform | Kubeconform check on our helm chart | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-chart-schema | Lint chart/values.schema.json file | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-css | stylelint | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-dockerfile | Lint Dockerfile | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-helm-chart | Lint Helm Chart | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-json-schema | * Lint JSON Schema files | | +| | * Lint NodePort Service | | +| | * Lint Docker compose files | | +| | * Lint chart/values.schema.json | | +| | * Lint chart/values.yaml | | +| | * Lint config_templates/config.yml | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-markdown | Run markdownlint | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| lint-openapi | * Lint OpenAPI using spectral | | +| | * Lint OpenAPI using openapi-spec-validator | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| mixed-line-ending | Detect if mixed line ending is used (\r vs. 
\r\n) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| mypy-airflow | * Run mypy for airflow | * | +| | * Run mypy for airflow (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| mypy-dev | * Run mypy for dev | * | +| | * Run mypy for dev (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| mypy-docs | * Run mypy for /docs/ folder | * | +| | * Run mypy for /docs/ folder (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| mypy-providers | * Run mypy for providers | * | +| | * Run mypy for providers (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| pretty-format-json | Format JSON files | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| pylint | pylint | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| python-no-log-warn | Check if there are no deprecate log warn | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| replace-bad-characters | Replace bad characters | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| rst-backticks | Check if RST files use double backticks for code | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| ruff | Run 'ruff' for extremely fast Python linting | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| ruff-format | Run 'ruff format' | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| shellcheck | Check Shell scripts syntax correctness | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| trailing-whitespace | Remove trailing whitespace at end of line | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| ts-compile-format-lint-www | Compile / format / lint WWW | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-black-version | Update black versions everywhere (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-breeze-cmd-output | Update breeze docs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-breeze-readme-config-hash | Update Breeze README.md with config files hash | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-chart-dependencies | Update chart dependencies to latest (manual) | 
| ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-er-diagram | Update ER diagram | * | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-in-the-wild-to-be-sorted | Sort INTHEWILD.md alphabetically | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-inlined-dockerfile-scripts | Inline Dockerfile and Dockerfile.ci scripts | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-installed-providers-to-be-sorted | Sort and uniquify installed_providers.txt | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-installers-and-pre-commit | Update installers and pre-commit to latest (manual) | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-local-yml-file | Update mounts in the local yml file | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-migration-references | Update migration ref doc | * | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-openapi-spec-tags-to-be-sorted | Sort alphabetically openapi spec tags | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-providers-dependencies | Update dependencies for provider packages | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-reproducible-source-date-epoch | Update Source Date Epoch for reproducible builds | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-spelling-wordlist-to-be-sorted | Sort spelling_wordlist.txt | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-supported-versions | Updates supported versions in documentation | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-vendored-in-k8s-json-schema | Vendor k8s definitions into values.schema.json | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| update-version | Update versions in docs | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| yamllint | Check YAML files with yamllint | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ +| zizmor | Run zizmor to check for github workflow syntax errors | | ++-----------------------------------------------------------+--------------------------------------------------------+---------+ .. 
END AUTO-GENERATED STATIC CHECK LIST diff --git a/contributing-docs/testing/helm_unit_tests.rst b/contributing-docs/testing/helm_unit_tests.rst index 266be65d81db0..9a733104b609f 100644 --- a/contributing-docs/testing/helm_unit_tests.rst +++ b/contributing-docs/testing/helm_unit_tests.rst @@ -25,8 +25,7 @@ add them in ``helm_tests``. .. code-block:: python - class TestBaseChartTest: - ... + class TestBaseChartTest: ... To render the chart create a YAML string with the nested dictionary of options you wish to test. You can then use our ``render_chart`` function to render the object of interest into a testable Python dictionary. Once the chart diff --git a/contributing-docs/testing/integration_tests.rst b/contributing-docs/testing/integration_tests.rst index 322298d4f00c0..ea9dfb7e9529a 100644 --- a/contributing-docs/testing/integration_tests.rst +++ b/contributing-docs/testing/integration_tests.rst @@ -49,39 +49,41 @@ The following integrations are available: .. BEGIN AUTO-GENERATED INTEGRATION LIST -+--------------+----------------------------------------------------+ -| Identifier | Description | -+==============+====================================================+ -| cassandra | Integration required for Cassandra hooks. | -+--------------+----------------------------------------------------+ -| celery | Integration required for Celery executor tests. | -+--------------+----------------------------------------------------+ -| drill | Integration required for drill operator and hook. | -+--------------+----------------------------------------------------+ -| kafka | Integration required for Kafka hooks. | -+--------------+----------------------------------------------------+ -| kerberos | Integration that provides Kerberos authentication. | -+--------------+----------------------------------------------------+ -| mongo | Integration required for MongoDB hooks. | -+--------------+----------------------------------------------------+ -| mssql | Integration required for mssql hooks. | -+--------------+----------------------------------------------------+ -| openlineage | Integration required for Openlineage hooks. | -+--------------+----------------------------------------------------+ -| otel | Integration required for OTEL/opentelemetry hooks. | -+--------------+----------------------------------------------------+ -| pinot | Integration required for Apache Pinot hooks. | -+--------------+----------------------------------------------------+ -| qdrant | Integration required for Qdrant tests. | -+--------------+----------------------------------------------------+ -| redis | Integration required for Redis tests. | -+--------------+----------------------------------------------------+ -| statsd | Integration required for Statsd hooks. | -+--------------+----------------------------------------------------+ -| trino | Integration required for Trino hooks. | -+--------------+----------------------------------------------------+ -| ydb | Integration required for YDB tests. | -+--------------+----------------------------------------------------+ ++--------------+-------------------------------------------------------+ +| Identifier | Description | ++==============+=======================================================+ +| cassandra | Integration required for Cassandra hooks. | ++--------------+-------------------------------------------------------+ +| celery | Integration required for Celery executor tests. 
| ++--------------+-------------------------------------------------------+ +| drill | Integration required for drill operator and hook. | ++--------------+-------------------------------------------------------+ +| kafka | Integration required for Kafka hooks. | ++--------------+-------------------------------------------------------+ +| kerberos | Integration that provides Kerberos authentication. | ++--------------+-------------------------------------------------------+ +| keycloak | Integration for manual testing of multi-team Airflow. | ++--------------+-------------------------------------------------------+ +| mongo | Integration required for MongoDB hooks. | ++--------------+-------------------------------------------------------+ +| mssql | Integration required for mssql hooks. | ++--------------+-------------------------------------------------------+ +| openlineage | Integration required for Openlineage hooks. | ++--------------+-------------------------------------------------------+ +| otel | Integration required for OTEL/opentelemetry hooks. | ++--------------+-------------------------------------------------------+ +| pinot | Integration required for Apache Pinot hooks. | ++--------------+-------------------------------------------------------+ +| qdrant | Integration required for Qdrant tests. | ++--------------+-------------------------------------------------------+ +| redis | Integration required for Redis tests. | ++--------------+-------------------------------------------------------+ +| statsd | Integration required for Statsd hooks. | ++--------------+-------------------------------------------------------+ +| trino | Integration required for Trino hooks. | ++--------------+-------------------------------------------------------+ +| ydb | Integration required for YDB tests. | ++--------------+-------------------------------------------------------+ .. END AUTO-GENERATED INTEGRATION LIST' diff --git a/contributing-docs/testing/unit_tests.rst b/contributing-docs/testing/unit_tests.rst index ccd38250424df..e40b5c82d3b0b 100644 --- a/contributing-docs/testing/unit_tests.rst +++ b/contributing-docs/testing/unit_tests.rst @@ -320,8 +320,7 @@ Method level: @pytest.mark.db_test - def test_add_tagging(self, sentry, task_instance): - ... + def test_add_tagging(self, sentry, task_instance): ... Class level: @@ -332,8 +331,7 @@ Class level: @pytest.mark.db_test - class TestDatabricksHookAsyncAadTokenSpOutside: - ... + class TestDatabricksHookAsyncAadTokenSpOutside: ... Module level (at the top of the module): @@ -437,8 +435,7 @@ The fix for that is to sort the parameters in ``parametrize``. For example inste .. code-block:: python @pytest.mark.parametrize("status", ALL_STATES) - def test_method(): - ... + def test_method(): ... do that: @@ -447,8 +444,7 @@ do that: .. code-block:: python @pytest.mark.parametrize("status", sorted(ALL_STATES)) - def test_method(): - ... + def test_method(): ... Similarly if your parameters are defined as result of utcnow() or other dynamic method - you should avoid that, or assign unique IDs for those parametrized tests. Instead of this: @@ -470,8 +466,7 @@ avoid that, or assign unique IDs for those parametrized tests. Instead of this: ), ], ) - def test_end_date_gte_lte(url, expected_dag_run_ids): - ... + def test_end_date_gte_lte(url, expected_dag_run_ids): ... Do this: @@ -494,8 +489,7 @@ Do this: ), ], ) - def test_end_date_gte_lte(url, expected_dag_run_ids): - ... + def test_end_date_gte_lte(url, expected_dag_run_ids): ... 
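As a side note, here is a minimal illustrative sketch (the test name and values are made up, not taken from the Airflow test suite) of the "assign unique IDs" advice above - giving each dynamically-computed ``pytest.param`` an explicit, stable ``id`` keeps the collected test ids identical across ``pytest-xdist`` workers:

.. code-block:: python

    # Illustrative only: explicit, stable ids keep parametrized test ids identical
    # across pytest-xdist workers even when parameter values are computed dynamically.
    from datetime import datetime, timezone

    import pytest


    @pytest.mark.parametrize(
        "logical_date",
        [
            pytest.param(datetime.now(timezone.utc), id="now"),
            pytest.param(datetime(2021, 1, 1, tzinfo=timezone.utc), id="fixed-date"),
        ],
    )
    def test_logical_date_is_timezone_aware(logical_date):
        assert logical_date.tzinfo is not None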
@@ -558,8 +552,7 @@ the test is marked as DB test: ), ], ) - def test_from_json(self, input, request_class): - ... + def test_from_json(self, input, request_class): ... Instead - this will not break collection. The TaskInstance is not initialized when the module is parsed, @@ -658,8 +651,7 @@ parametrize specification is being parsed - even if test is marked as DB test. ), ], ) - def test_rendered_task_detail_env_secret(patch_app, admin_client, request, env, expected): - ... + def test_rendered_task_detail_env_secret(patch_app, admin_client, request, env, expected): ... You can make the code conditional and mock out the Variable to avoid hitting the database. @@ -704,8 +696,7 @@ You can make the code conditional and mock out the Variable to avoid hitting the ), ], ) - def test_rendered_task_detail_env_secret(patch_app, admin_client, request, env, expected): - ... + def test_rendered_task_detail_env_secret(patch_app, admin_client, request, env, expected): ... You can also use fixture to create object that needs database just like this. @@ -1056,8 +1047,7 @@ Example of the ``postgres`` only test: .. code-block:: python @pytest.mark.backend("postgres") - def test_copy_expert(self): - ... + def test_copy_expert(self): ... Example of the ``postgres,mysql`` test (they are skipped with the ``sqlite`` backend): @@ -1065,8 +1055,7 @@ Example of the ``postgres,mysql`` test (they are skipped with the ``sqlite`` bac .. code-block:: python @pytest.mark.backend("postgres", "mysql") - def test_celery_executor(self): - ... + def test_celery_executor(self): ... You can use the custom ``--backend`` switch in pytest to only run tests specific for that backend. diff --git a/dev/breeze/README.md b/dev/breeze/README.md index 5e5db1f26e866..b2f2df0e9f9b7 100644 --- a/dev/breeze/README.md +++ b/dev/breeze/README.md @@ -22,6 +22,7 @@ **Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* - [Apache Airflow Breeze](#apache-airflow-breeze) +- [Setting up development env for Breeze](#setting-up-development-env-for-breeze) @@ -34,27 +35,19 @@ for Airflow Development. This package should never be installed in "production" mode. The `breeze` entrypoint will actually fail if you do so. It is supposed to be installed only in [editable/development mode](https://packaging.python.org/en/latest/guides/distributing-packages-using-setuptools/#working-in-development-mode) -directly from Airflow sources using `pipx` - usually with `--force` flag to account for re-installation -that might often be needed if dependencies change during development. +directly from Airflow sources using `uv tool` or `pipx` - usually with `--force` flag to account +for re-installation that might often be needed if dependencies change during development. ```shell -pipx install -e ./dev/breeze --force +uv tool install -e ./dev/breeze --force ``` -NOTE! If you see below warning - it means that you hit [known issue](https://github.com/pypa/pipx/issues/1092) -with `packaging` version 23.2 -⚠️ Ignoring --editable install option. pipx disallows it for anything but a local path, -to avoid having to create a new src/ directory. - -The workaround is to downgrade packaging to 23.1 and re-running the `pipx install` command, for example -by running `pip install "packaging<23.2"`. 
+or ```shell -pip install "packaging<23.2" pipx install -e ./dev/breeze --force +pipx install -e ./dev/breeze --force ``` - You can read more about Breeze in the [documentation](https://github.com/apache/airflow/blob/main/dev/breeze/doc/README.rst) This README file contains automatically generated hash of the `pyproject.toml` files that were @@ -62,10 +55,79 @@ available when the package was installed. Since this file becomes part of the in to detect automatically if any of the files have changed. If they did, the user will be warned to upgrade their installations. +Setting up development env for Breeze +------------------------------------- + +> [!NOTE] +> This section is for developers of Breeze. If you are a user of Breeze, you do not need to read this section. + +Breeze is actively developed by Airflow maintainers and contributors. Airflow is an active project +and we are in the process of developing Airflow 3, so Breeze requires a lot of adjustments to keep +the dev environment in sync with Airflow 3 development - this is also why it is part of the same +repository as Airflow - because it needs to be closely synchronized with Airflow development. + +As of November 2024 Airflow switched to using `uv` as the main development environment for Airflow +and for Breeze. So the instructions below are for setting up the development environment for Breeze +using `uv`. However, we use only standard Python packaging tools, so you can still use `pip` or +`pipenv` or other build frontends to install Breeze, but we recommend using `uv` as it is the most +convenient way to install and manage Python packages and virtual environments. + +Unlike in Airflow, where we manage our own constraints, we use `uv` to manage requirements for Breeze +and we use `uv` to lock the dependencies. This way we can ensure that the dependencies are always +up-to-date and that the development environment is always consistent for different people. This is +why Breeze's `uv.lock` is committed to the repository and is used to install the dependencies by +default by Breeze. Here's how to install Breeze with `uv`: + + +1. Install `uv` - see [uv documentation](https://docs.astral.sh/uv/getting-started/installation/) + +> [!IMPORTANT] +> All the commands below should be executed while you are in `dev/breeze` directory of the Airflow repository. + +2. Create a new virtual environment for Breeze development: + +```shell +uv venv +``` + +3. Synchronize Breeze dependencies with `uv` to the latest dependencies stored in uv.lock file: + +```shell +uv sync +``` + +After syncing, the `.venv` directory will contain the virtual environment with all the dependencies +installed - you can use that environment to develop Breeze - for example with your favourite IDE +or text editor, you can also use `uv run` to run the scripts in the virtual environment. + +For example to run all tests in the virtual environment you can use: + +```shell +uv run pytest +``` + +4. Add/remove dependencies with `uv`: + +```shell +uv add <package> +uv remove <package> +``` + +5. Update and lock the dependencies (after adding them or periodically to keep them up-to-date): + +```shell +uv lock +``` + +Note that when you update or lock the dependencies, you should commit the changes in `pyproject.toml` and `uv.lock`. + +See [uv documentation](https://docs.astral.sh/uv/getting-started/) for more details on using `uv`. + + PLEASE DO NOT MODIFY THE HASH BELOW! IT IS AUTOMATICALLY UPDATED BY PRE-COMMIT. 
--------------------------------------------------------------------------------------------------------- -Package config hash: b2e0a459155b5c685597fdf153589d13ce8730b312bccc5fb8810ced491730649239e07261bb9c7fdbe361a20d3c5497d0ca65dbd2964a2b74cf065d04f710fc +Package config hash: f2fa293aecdc1deadd4f08785b9ea77cd597e50f01ad79c092219b7f06eed68f3b0509d0ee7626a2fe25f23875e42a838899b274b5311690546e34e6c660caeb --------------------------------------------------------------------------------------------------------- diff --git a/dev/breeze/doc/01_installation.rst b/dev/breeze/doc/01_installation.rst index aad8640e7f60c..9c053bf280b66 100644 --- a/dev/breeze/doc/01_installation.rst +++ b/dev/breeze/doc/01_installation.rst @@ -249,27 +249,31 @@ In case of disk space errors on macOS, increase the disk space available for Doc Installation ============ +First, clone the Airflow repository, but make sure not to clone it into your home directory. Cloning it into your home directory will cause the following error: +``Your Airflow sources are checked out in /Users/username/airflow, which is also your AIRFLOW_HOME where Airflow writes logs and database files. This setup is problematic because Airflow might overwrite or clean up your source code and .git repository.`` -Set your working directory to root of (this) cloned repository. -Run this command to install Breeze (make sure to use ``-e`` flag): +.. code-block:: bash + + git clone https://github.com/apache/airflow.git + +Set your working directory to the root of this cloned repository. .. code-block:: bash - pipx install -e ./dev/breeze + cd airflow -.. warning:: +Run this command to install Breeze (make sure to use ``-e`` flag) - you can choose ``uv`` (recommended) or +``pipx``: - If you see below warning - it means that you hit `known issue `_ - with ``packaging`` version 23.2: - ⚠️ Ignoring --editable install option. pipx disallows it for anything but a local path, - to avoid having to create a new src/ directory. - The workaround is to downgrade packaging to 23.1 and re-running the ``pipx install`` command. +.. code-block:: bash - .. code-block:: bash + uv tool install -e ./dev/breeze - pip install "packaging<23.2" - pipx install -e ./dev/breeze --force + +.. code-block:: bash + + pipx install -e ./dev/breeze .. note:: Note for Windows users @@ -278,6 +282,12 @@ Run this command to install Breeze (make sure to use ``-e`` flag): If you are on Windows, you should use Windows way to point to the ``dev/breeze`` sub-folder of Airflow either as absolute or relative path. For example: + .. code-block:: bash + + uv tool install -e dev\breeze + + or + .. code-block:: bash pipx install -e dev\breeze @@ -317,8 +327,14 @@ that Breeze works on .. warning:: Upgrading from earlier Python version - If you used Breeze with Python 3.7 and when running it, it will complain that it needs Python 3.8. In this - case you should force-reinstall Breeze with ``pipx``: + If you used Breeze with Python 3.8 and when running it, it will complain that it needs Python 3.9. In this + case you should force-reinstall Breeze with ``uv`` (or ``pipx``): + + .. code-block:: bash + + uv tool install --force -e ./dev/breeze + + or .. code-block:: bash @@ -330,30 +346,35 @@ that Breeze works on If you are on Windows, you should use Windows way to point to the ``dev/breeze`` sub-folder of Airflow either as absolute or relative path. For example: + .. code-block:: bash + + uv tool install --force -e dev\breeze + + or + .. code-block:: bash pipx install --force -e dev\breeze - .. 
note:: creating pipx virtual env ``apache-airflow-breeze`` with a specific python version - In ``pipx install -e ./dev/breeze`` or ``pipx install -e dev\breeze``, ``pipx`` uses default - system python version to create virtual env for breeze. - We can use a specific version by providing python executable in ``--python`` argument. For example: + .. note:: creating virtual env for ``apache-airflow-breeze`` with a specific python version + The ``uv tool install`` or ``pipx install`` use default system python version to create virtual + env for breeze. You can use a specific version by providing python version in ``uv`` or + python executable in ``pipx`` in ``--python``. If you have breeze installed already with another Python version you can reinstall breeze with reinstall command .. code-block:: bash - pipx reinstall --python /Users/airflow/.pyenv/versions/3.8.16/bin/python apache-airflow-breeze + uv tool install --python 3.9.16 ./dev/breeze --force - Or you can uninstall breeze and install it with a specific python version: + or .. code-block:: bash - pipx uninstall apache-airflow-breeze - pipx install -e ./dev/breeze --python /Users/airflow/.pyenv/versions/3.8.16/bin/python + pipx install -e ./dev/breeze --python /Users/airflow/.pyenv/versions/3.9.16/bin/python --force Running Breeze for the first time @@ -466,19 +487,18 @@ Automating breeze installation ------------------------------ Breeze on POSIX-compliant systems (Linux, MacOS) can be automatically installed by running the -``scripts/tools/setup_breeze`` bash script. This includes checking and installing ``pipx``, setting up +``scripts/tools/setup_breeze`` bash script. This includes checking and installing ``uv``, setting up ``breeze`` with it and setting up autocomplete. Uninstalling Breeze ------------------- -Since Breeze is installed with ``pipx``, with ``pipx list``, you can list the installed packages. -Once you have the name of ``breeze`` package you can proceed to uninstall it. +Since Breeze is installed with ``uv tool`` or ``pipx``, you need to use the appropriate tool to uninstall it. .. code-block:: bash - pipx list + uv tool uninstall apache-airflow-breeze This will also remove breeze from the folder: ``${HOME}.local/bin/`` diff --git a/dev/breeze/doc/02_customizing.rst b/dev/breeze/doc/02_customizing.rst index 291314abfc339..009b5a25149d6 100644 --- a/dev/breeze/doc/02_customizing.rst +++ b/dev/breeze/doc/02_customizing.rst @@ -61,6 +61,40 @@ so you can change it at any place, and run inside container, to enable modified tmux configurations. +Tmux tldr +~~~~~~~~~ + +In case you, like some Airflow core devs, are a tmux dummy, here are some tmux config entries +that you may find helpful. + +.. 
code-block:: + + # if you like vi mode instead of emacs + set-window-option -g mode-keys vi + + # will not clear the selection immediately + bind-key -T copy-mode-vi MouseDragEnd1Pane send-keys -X copy-selection-no-clear + + # make it so ctrl+shift+arrow moves the focused pane + bind -T root C-S-Left select-pane -L + bind -T root C-S-Right select-pane -R + bind -T root C-S-Up select-pane -U + bind -T root C-S-Down select-pane -D + +Some helpful commands: + + - ``ctrl-b + z``: zoom into selected pane + - ``ctrl-b + [``: enter copy mode + +To copy an entire pane: + - select the pane + - enter copy mode: ``ctrl-b + [`` + - go to start: ``g`` + - begin selection: ``space`` + - extend selection to end: ``G`` + - copy and clear selection: ``enter`` + + Additional tools in Breeze container ------------------------------------ diff --git a/dev/breeze/doc/03_developer_tasks.rst b/dev/breeze/doc/03_developer_tasks.rst index 3ad8df4773b55..402e67b1bf8e6 100644 --- a/dev/breeze/doc/03_developer_tasks.rst +++ b/dev/breeze/doc/03_developer_tasks.rst @@ -34,12 +34,12 @@ You can use additional ``breeze`` flags to choose your environment. You can spec version to use, and backend (the meta-data database). Thanks to that, with Breeze, you can recreate the same environments as we have in matrix builds in the CI. See next chapter for backend selection. -For example, you can choose to run Python 3.8 tests with MySQL as backend and with mysql version 8 +For example, you can choose to run Python 3.9 tests with MySQL as backend and with mysql version 8 as follows: .. code-block:: bash - breeze --python 3.8 --backend mysql --mysql-version 8.0 + breeze --python 3.9 --backend mysql --mysql-version 8.0 .. note:: Note for Windows WSL2 users @@ -55,7 +55,7 @@ Try adding ``--builder=default`` to your command. For example: .. code-block:: bash - breeze --builder=default --python 3.8 --backend mysql --mysql-version 8.0 + breeze --builder=default --python 3.9 --backend mysql --mysql-version 8.0 The choices you make are persisted in the ``./.build/`` cache directory so that next time when you use the ``breeze`` script, it could use the values that were used previously. This way you do not have to specify @@ -328,7 +328,7 @@ When you are starting airflow from local sources, www asset compilation is autom .. code-block:: bash - breeze --python 3.8 --backend mysql start-airflow + breeze --python 3.9 --backend mysql start-airflow You can also use it to start different executor. @@ -341,7 +341,7 @@ You can also use it to start any released version of Airflow from ``PyPI`` with .. code-block:: bash - breeze start-airflow --python 3.8 --backend mysql --use-airflow-version 2.7.0 + breeze start-airflow --python 3.9 --backend mysql --use-airflow-version 2.7.0 When you are installing version from PyPI, it's also possible to specify extras that should be used when installing Airflow - you can provide several extras separated by coma - for example to install @@ -397,6 +397,17 @@ command takes care about it. This is needed when you want to run webserver insid :width: 100% :alt: Breeze compile-www-assets +Compiling ui assets +-------------------- + +Airflow webserver needs to prepare ui assets - compiled with node and yarn. The ``compile-ui-assets`` +command takes care of it. This is needed when you want to run the webserver inside Breeze. + +.. 
image:: ./images/output_compile-ui-assets.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_compile-ui-assets.svg + :width: 100% + :alt: Breeze compile-ui-assets + Breeze cleanup -------------- @@ -405,7 +416,7 @@ are several reasons why you might want to do that. Breeze uses docker images heavily and those images are rebuild periodically and might leave dangling, unused images in docker cache. This might cause extra disk usage. Also running various docker compose commands -(for example running tests with ``breeze testing tests``) might create additional docker networks that might +(for example running tests with ``breeze testing core-tests``) might create additional docker networks that might prevent new networks from being created. Those networks are not removed automatically by docker-compose. Also Breeze uses it's own cache to keep information about all images. @@ -433,7 +444,7 @@ Then, next time when you start Breeze, it will have the data pre-populated. These are all available flags of ``down`` command: -.. image:: ./images/output-down.svg +.. image:: ./images/output_down.svg :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_down.svg :width: 100% :alt: Breeze down diff --git a/dev/breeze/doc/04_troubleshooting.rst b/dev/breeze/doc/04_troubleshooting.rst index fd0b1dfa401dc..c9f9aa2b1c35f 100644 --- a/dev/breeze/doc/04_troubleshooting.rst +++ b/dev/breeze/doc/04_troubleshooting.rst @@ -72,8 +72,51 @@ describe your problem. stated in `This comment `_ and allows to run Breeze with no problems. -Bad Interpreter Error ---------------------- +Cannot import name 'cache' or Python >=3.9 required +--------------------------------------------------- + +When you see this error: + +.. code-block:: + + ImportError: cannot import name 'cache' from 'functools' (/Users/jarek/Library/Application Support/hatch/pythons/3.8/python/lib/python3.8/functools.py) + +or + +.. code-block:: + + ERROR: Package 'blacken-docs' requires a different Python: 3.8.18 not in '>=3.9' + + +It means that your pre-commit hook is installed with (already End-Of-Life) Python 3.8 and you should reinstall +it and clean pre-commit cache. + +This can be done with ``uv tool`` to install ``pre-commit``) + +.. code-block:: bash + + uv tool uninstall pre-commit + uv tool install pre-commit --python 3.9 --force + pre-commit clean + pre-commit install + +You can also use ``pipx`` + +.. code-block:: bash + + pipx uninstall pre-commit + pipx install pre-commit --python $(which python3.9) --force + # This one allows pre-commit to use uv for venvs installed by pre-commit + pipx inject pre-commit + pre-commit clean + pre-commit install + +If you installed ``pre-commit`` differently, you should remove and reinstall +it (and clean cache) following the way you installed it. + + +Bad Interpreter Error with ``pipx`` +----------------------------------- If you are experiencing bad interpreter errors ``zsh: /Users/eladkal/.local/bin/breeze: bad interpreter: /Users/eladkal/.local/pipx/venvs/apache-airflow-breeze/bin/python: no such file or directory`` diff --git a/dev/breeze/doc/05_test_commands.rst b/dev/breeze/doc/05_test_commands.rst index 79aa206921d21..a3c973d911ff9 100644 --- a/dev/breeze/doc/05_test_commands.rst +++ b/dev/breeze/doc/05_test_commands.rst @@ -75,34 +75,28 @@ This applies to all kind of tests - all our tests can be run using pytest. Running unit tests with ``breeze testing`` commands ................................................... 
-An option you have is that you can also run tests via built-in ``breeze testing tests`` command - which -is a "swiss-army-knife" of unit testing with Breeze. This command has a lot of parameters and is very -flexible thus might be a bit overwhelming. +An option you have is that you can also run tests via the built-in ``breeze testing *tests*`` commands - which +are a "swiss-army-knife" of unit testing with Breeze. You can run all groups of tests that Airflow +supports with one of the commands below. -In most cases if you want to run tess you want to use dedicated ``breeze testing db-tests`` -or ``breeze testing non-db-tests`` commands that automatically run groups of tests that allow you to choose -subset of tests to run (with ``--parallel-test-types`` flag) +Using ``breeze testing core-tests`` command +........................................... -Using ``breeze testing tests`` command -...................................... +The ``breeze testing core-tests`` command allows you to run all of the Core tests or a selected sub-set +of them. -The ``breeze testing tests`` command is that you can easily specify sub-set of the tests -- including -selecting specific Providers tests to run. - -For example this will only run provider tests for airbyte and http providers: +For example this will run all core tests: .. code-block:: bash - breeze testing tests --test-type "Providers[airbyte,http]" - -You can also exclude tests for some providers from being run when whole "Providers" test type is run. + breeze testing core-tests -For example this will run tests for all providers except amazon and google provider tests: +For example this will only run "Other" tests: .. code-block:: bash - breeze testing tests --test-type "Providers[-amazon,google]" + breeze testing core-tests --test-type "Other" You can also run parallel tests with ``--run-in-parallel`` flag - by default it will run all tests types in parallel, but you can specify the test type that you want to run with space separated list of test @@ -112,124 +106,140 @@ For example this will run API and WWW tests in parallel: .. code-block:: bash - breeze testing tests --parallel-test-types "API WWW" --run-in-parallel + breeze testing core-tests --parallel-test-types "API WWW" --run-in-parallel -There are few special types of tests that you can run: +Here is the detailed set of options for the ``breeze testing core-tests`` command. -* ``All`` - all tests are run in single pytest run. -* ``All-Postgres`` - runs all tests that require Postgres database -* ``All-MySQL`` - runs all tests that require MySQL database -* ``All-Quarantine`` - runs all tests that are in quarantine (marked with ``@pytest.mark.quarantined`` - decorator) +.. image:: ./images/output_testing_core-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_core-tests.svg + :width: 100% + :alt: Breeze testing core-tests -Here is the detailed set of options for the ``breeze testing tests`` command. +Using ``breeze testing providers-tests`` command +................................................ -.. image:: ./images/output_testing_tests.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_tests.svg - :width: 100% - :alt: Breeze testing tests +The ``breeze testing providers-tests`` command allows you to run all of the Providers tests or a selected +sub-set of them. -Using ``breeze testing db-tests`` command -......................................... 
+For example this will run all provider tests: -The ``breeze testing db-tests`` command is simplified version of the ``breeze testing tests`` command -that only allows you to run tests that are not bound to a database - in parallel utilising all your CPUS. -The DB-bound tests are the ones that require a database to be started and configured separately for -each test type run and they are run in parallel containers/parallel docker compose projects to -utilise multiple CPUs your machine has - thus allowing you to quickly run few groups of tests in parallel. -This command is used in CI to run DB tests. +.. code-block:: bash -By default this command will run complete set of test types we have, thus allowing you to see result -of all DB tests we have but you can choose a subset of test types to run by ``--parallel-test-types`` -flag or exclude some test types by specifying ``--excluded-parallel-test-types`` flag. + breeze testing providers-tests -Run all DB tests: +This will only run "amazon" and "google" provider tests: .. code-block:: bash - breeze testing db-tests + breeze testing providers-tests --test-type "Providers[amazon,google]" -Only run DB tests from "API CLI WWW" test types: +You can also run "all but" provider tests - this will run all provider tests except amazon and google: .. code-block:: bash - breeze testing db-tests --parallel-test-types "API CLI WWW" + breeze testing providers-tests --test-type "Providers[-amazon,google]" + +You can also run parallel tests with ``--run-in-parallel`` flag - by default it will run all test types +in parallel, but you can specify the test type that you want to run with space separated list of test +types passed to ``--parallel-test-types`` flag. -Run all DB tests excluding those in CLI and WWW test types: +For example this will run ``amazon`` and ``google`` tests in parallel: .. code-block:: bash - breeze testing db-tests --excluded-parallel-test-types "CLI WWW" + breeze testing providers-tests --parallel-test-types "Providers[amazon] Providers[google]" --run-in-parallel -Here is the detailed set of options for the ``breeze testing db-tests`` command. +Here is the detailed set of options for the ``breeze testing providers-tests`` command. -.. image:: ./images/output_testing_db-tests.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_db-tests.svg :width: 100% - :alt: Breeze testing db-tests + +.. image:: ./images/output_testing_providers-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_providers-tests.svg :width: 100% + :alt: Breeze testing providers-tests +Running integration core tests +............................... -Using ``breeze testing non-db-tests`` command -......................................... +You can also run core integration tests via the built-in ``breeze testing core-integration-tests`` command. +Some of our core tests require additional integrations to be started in docker-compose. +The integration tests command will run the expected integration and tests that need that integration. -The ``breeze testing non-db-tests`` command is simplified version of the ``breeze testing tests`` command -that only allows you to run tests that are not bound to a database - in parallel utilising all your CPUS. 
-The non-DB-bound tests are the ones that do not expect a database to be started and configured and we can -utilise multiple CPUs your machine has via ``pytest-xdist`` plugin - thus allowing you to quickly -run few groups of tests in parallel using single container rather than many of them as it is the case for -DB-bound tests. This command is used in CI to run Non-DB tests. +For example this will only run kerberos tests: -By default this command will run complete set of test types we have, thus allowing you to see result -of all DB tests we have but you can choose a subset of test types to run by ``--parallel-test-types`` -flag or exclude some test types by specifying ``--excluded-parallel-test-types`` flag. +.. code-block:: bash -Run all non-DB tests: + breeze testing core-integration-tests --integration kerberos -.. code-block:: bash +Here is the detailed set of options for the ``breeze testing core-integration-tests`` command. + +.. image:: ./images/output_testing_core-integration-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_core-integration-tests.svg + :width: 100% + :alt: Breeze testing core-integration-tests - breeze testing non-db-tests +Running integration providers tests +................................... -Only run non-DB tests from "API CLI WWW" test types: +You can also run provider integration tests via the built-in ``breeze testing providers-integration-tests`` command. +Some of our provider tests require additional integrations to be started in docker-compose. +The integration tests command will run the expected integration and tests that need that integration. + +For example this will only run kerberos tests: .. code-block:: bash - breeze testing non-db-tests --parallel-test-types "API CLI WWW" + breeze testing providers-integration-tests --integration kerberos + +Here is the detailed set of options for the ``breeze testing providers-integration-tests`` command. + +.. image:: ./images/output_testing_providers-integration-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_providers-integration-tests.svg + :width: 100% + :alt: Breeze testing providers-integration-tests + + +Running Python API client tests ............................... -Run all non-DB tests excluding those in CLI and WWW test types: +To run Python API client tests, you need to have the Airflow Python client packaged in the dist folder. +To package the client, clone the airflow-python-client repository and run the following command: .. code-block:: bash - breeze testing non-db-tests --excluded-parallel-test-types "CLI WWW" + breeze release-management prepare-python-client --package-format both + --version-suffix-for-pypi dev0 --python-client-repo ./airflow-client-python -Here is the detailed set of options for the ``breeze testing non-db-tests`` command. +.. code-block:: bash + + breeze testing python-api-client-tests -.. image:: ./images/output_testing_non-db-tests.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_non-db-tests.svg +Here is the detailed set of options for the ``breeze testing python-api-client-tests`` command. + +.. 
image:: ./images/output_testing_python-api-client-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_python-api-client-tests.svg :width: 100% - :alt: Breeze testing non-db-tests + :alt: Breeze testing python-api-client-tests -Running integration tests -......................... +Running system tests +.................... -You can also run integration tests via built-in ``breeze testing integration-tests`` command. Some of our -tests require additional integrations to be started in docker-compose. The integration tests command will -run the expected integration and tests that need that integration. +You can also run core system tests via the built-in ``breeze testing system-tests`` command. +Some of our core system tests run against external systems and we can run them provided that +credentials are configured to connect to those systems. Usually you should run only one test or +a set of related tests this way. -For example this will only run kerberos tests: +For example this will only run the ``example_empty`` system test: .. code-block:: bash - breeze testing integration-tests --integration kerberos - + breeze testing system-tests tests/system/example_empty.py -Here is the detailed set of options for the ``breeze testing integration-tests`` command. +Here is the detailed set of options for the ``breeze testing system-tests`` command. -.. image:: ./images/output_testing_integration-tests.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_integration_tests.svg +.. image:: ./images/output_testing_system-tests.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_testing_system-tests.svg :width: 100% - :alt: Breeze testing integration-tests - + :alt: Breeze testing system-tests Running Helm unit tests ....................... @@ -307,7 +317,7 @@ Kubernetes environment can be set with the ``breeze k8s setup-env`` command. It will create appropriate virtualenv to run tests and download the right set of tools to run the tests: ``kind``, ``kubectl`` and ``helm`` in the right versions. You can re-run the command when you want to make sure the expected versions of the tools are installed properly in the -virtualenv. The Virtualenv is available in ``.build/.k8s-env/bin`` subdirectory of your Airflow +virtualenv. The Virtualenv is available in ``.build/k8s-env/bin`` subdirectory of your Airflow installation. .. 
image:: ./images/output_k8s_setup-env.svg @@ -530,7 +540,7 @@ be created and airflow deployed to it before running the tests): (kind-airflow-python-3.9-v1.24.0:KubernetesExecutor)> pytest test_kubernetes_executor.py ================================================= test session starts ================================================= - platform linux -- Python 3.10.6, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/jarek/code/airflow/.build/.k8s-env/bin/python + platform linux -- Python 3.10.6, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/jarek/code/airflow/.build/k8s-env/bin/python cachedir: .pytest_cache rootdir: /home/jarek/code/airflow, configfile: pytest.ini plugins: anyio-3.6.1 @@ -540,8 +550,8 @@ be created and airflow deployed to it before running the tests): test_kubernetes_executor.py::TestKubernetesExecutor::test_integration_run_dag_with_scheduler_failure PASSED [100%] ================================================== warnings summary =================================================== - .build/.k8s-env/lib/python3.10/site-packages/_pytest/config/__init__.py:1233 - /home/jarek/code/airflow/.build/.k8s-env/lib/python3.10/site-packages/_pytest/config/__init__.py:1233: PytestConfigWarning: Unknown config option: asyncio_mode + .build/k8s-env/lib/python3.10/site-packages/_pytest/config/__init__.py:1233 + /home/jarek/code/airflow/.build/k8s-env/lib/python3.10/site-packages/_pytest/config/__init__.py:1233: PytestConfigWarning: Unknown config option: asyncio_mode self._warn_or_fail_if_strict(f"Unknown config option: {key}\n") diff --git a/dev/breeze/doc/06_managing_docker_images.rst b/dev/breeze/doc/06_managing_docker_images.rst index 294f1540f3667..ac0b0e6b1f61e 100644 --- a/dev/breeze/doc/06_managing_docker_images.rst +++ b/dev/breeze/doc/06_managing_docker_images.rst @@ -76,7 +76,7 @@ These are all available flags of ``pull`` command: Verifying CI image .................. -Finally, you can verify CI image by running tests - either with the pulled/built images or +You can verify CI image by running tests - either with the pulled/built images or with an arbitrary image. These are all available flags of ``verify`` command: @@ -86,6 +86,86 @@ These are all available flags of ``verify`` command: :width: 100% :alt: Breeze ci-image verify +Loading and saving CI image +........................... + +You can load and save the CI image - for example to transfer it to another machine or to load an image +that has been built in our CI. + +These are all available flags of ``save`` command: + +.. image:: ./images/output_ci-image_save.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_ci-image_save.svg + :width: 100% + :alt: Breeze ci-image save + +These are all available flags of ``load`` command: + +.. image:: ./images/output_ci-image_load.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_ci-image_load.svg + :width: 100% + :alt: Breeze ci-image load + +Images for every build from our CI are uploaded as artifacts to the +GitHub Action run (in summary) and can be downloaded from there for 2 days in order to reproduce the complete +environment used during the tests and loaded to the local Docker registry (note that you have +to use the same platform as the CI run). + +You will find the artifacts for each image in the summary of the CI run. The artifacts are named +``ci-image-docker-export---_merge``. 
Those are compressed zip files that +contain the ".tar" image that should be used with the ``--image-file`` flag of the ``load`` command. Make sure to +use the same ``--python`` version as the image was built with. + +To load the image from a specific PR, you can use the following command: + +.. code-block:: bash + + breeze ci-image load --from-pr 12345 --python 3.9 --github-token + +To load the image from a specific job run (for example 12538475388), you can use the following command (you can find the run id in the GitHub Actions runs of the repository): + +.. code-block:: bash + + breeze ci-image load --from-run 12538475388 --python 3.9 --github-token + +After you load the image, you can reproduce the exact environment that was used in the CI run by +entering the Breeze container without mounting your local sources: + +.. code-block:: bash + + breeze shell --mount-sources skip [OTHER OPTIONS] + +And you should be able to run any tests and commands interactively in the exact environment that +was used in the failing CI run. This is a powerful tool to debug and fix CI issues. + + +.. image:: ./images/image_artifacts.png + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_ci-image_load.svg + :width: 100% + :alt: Breeze image artifacts + +Exporting and importing CI image cache mount +............................................ + +During the build, the cache of ``uv`` and ``pip`` is stored in a separate "cache mount" volume that is mounted +during the build. This cache mount volume is preserved between builds and can be exported and imported +to speed up the build process in CI - where the cache is stored as an artifact and can be imported in the next +build. + +These are all available flags of ``export-mount-cache`` command: + +.. image:: ./images/output_ci-image_export-mount-cache.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_ci-image_export-mount-cache.svg + :width: 100% + :alt: Breeze ci-image export-mount-cache + +These are all available flags of ``import-mount-cache`` command: + +.. image:: ./images/output_ci-image_import-mount-cache.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_ci-image_import-mount-cache.svg + :width: 100% + :alt: Breeze ci-image import-mount-cache + PROD Image tasks ---------------- @@ -140,10 +220,10 @@ suffix and they need to also be paired with corresponding runtime dependency add .. code-block:: bash - breeze prod-image build --python 3.8 --additional-dev-deps "libasound2-dev" \ + breeze prod-image build --python 3.9 --additional-dev-deps "libasound2-dev" \ --additional-runtime-apt-deps "libasound2" -Same as above but uses python 3.8. +Same as above but uses python 3.9. Building PROD image ................... @@ -170,7 +250,7 @@ These are all available flags of ``pull-prod-image`` command: Verifying PROD image .................... -Finally, you can verify PROD image by running tests - either with the pulled/built images or +You can verify PROD image by running tests - either with the pulled/built images or with an arbitrary image. These are all available flags of ``verify-prod-image`` command: @@ -180,6 +260,31 @@ These are all available flags of ``verify-prod-image`` command: :width: 100% :alt: Breeze prod-image verify +Loading and saving PROD image +............................. + +You can load and save PROD image - for example to transfer it to another machine or to load an image +that has been built in our CI. + +These are all available flags of ``save`` command: + +.. 
image:: ./images/output_prod-image_save.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_prod-image_save.svg + :width: 100% + :alt: Breeze prod-image save + +These are all available flags of ``load`` command: + +.. image:: ./images/output-prod-image_load.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_prod-image_load.svg + :width: 100% + :alt: Breeze prod-image load + +Similarly as in case of CI images, Images for every build from our CI are uploaded as artifacts to the +GitHub Action run (in summary) and can be downloaded from there for 2 days in order to reproduce the complete +environment used during the tests and loaded to the local Docker registry (note that you have +to use the same platform as the CI run). + ------ Next step: Follow the `Breeze maintenance tasks <07_breeze_maintenance_tasks.rst>`_ to learn about tasks that diff --git a/dev/breeze/doc/09_release_management_tasks.rst b/dev/breeze/doc/09_release_management_tasks.rst index 930f61159d16e..9feb9a5b195e6 100644 --- a/dev/breeze/doc/09_release_management_tasks.rst +++ b/dev/breeze/doc/09_release_management_tasks.rst @@ -26,7 +26,7 @@ do not need or have no access to run). Those are usually connected with releasin Those are all of the available release management commands: .. image:: ./images/output_release-management.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management.svg :width: 100% :alt: Breeze release management @@ -55,7 +55,7 @@ default is to build ``both`` type of packages ``sdist`` and ``wheel``. breeze release-management prepare-airflow-package --package-format=wheel .. image:: ./images/output_release-management_prepare-airflow-package.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-airflow-package.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-airflow-package.svg :width: 100% :alt: Breeze release-management prepare-airflow-package @@ -79,7 +79,7 @@ tarball for. breeze release-management prepare-airflow-tarball --version 2.8.0rc1 .. image:: ./images/output_release-management_prepare-airflow-tarball.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-airflow-tarball.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-airflow-tarball.svg :width: 100% :alt: Breeze release-management prepare-airflow-tarball @@ -94,7 +94,7 @@ automates it. breeze release-management create-minor-branch .. image:: ./images/output_release-management_create-minor-branch.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_create-minor-branch.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_create-minor-branch.svg :width: 100% :alt: Breeze release-management create-minor-branch @@ -109,7 +109,7 @@ When we prepare release candidate, we automate some of the steps we need to do. breeze release-management start-rc-process .. 
image:: ./images/output_release-management_start-rc-process.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_start-rc-process.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_start-rc-process.svg :width: 100% :alt: Breeze release-management start-rc-process @@ -123,7 +123,7 @@ When we prepare final release, we automate some of the steps we need to do. breeze release-management start-release .. image:: ./images/output_release-management_start-release.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_start-rc-process.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_start-rc-process.svg :width: 100% :alt: Breeze release-management start-rc-process @@ -154,7 +154,7 @@ You can also generate python client with custom security schemes. These are all of the available flags for the command: .. image:: ./images/output_release-management_prepare-python-client.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-python-client.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-python-client.svg :width: 100% :alt: Breeze release management prepare Python client @@ -185,7 +185,7 @@ step can be skipped if you pass the ``--skip-latest`` flag. These are all of the available flags for the ``release-prod-images`` command: .. image:: ./images/output_release-management_release-prod-images.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_release-prod-images.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_release-prod-images.svg :width: 100% :alt: Breeze release management release prod images @@ -208,7 +208,7 @@ However, If you want to disable this behaviour, set the envvar CLEAN_LOCAL_TAGS These are all of the available flags for the ``tag-providers`` command: .. image:: ./images/output_release-management_tag-providers.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_tag-providers.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_tag-providers.svg :width: 100% :alt: Breeze release management tag-providers @@ -234,7 +234,7 @@ which version of Helm Chart you are preparing the tarball for. breeze release-management prepare-helm-chart-tarball --version 1.12.0 --version-suffix rc1 .. image:: ./images/output_release-management_prepare-helm-chart-tarball.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-helm-chart-tarball.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-helm-chart-tarball.svg :width: 100% :alt: Breeze release-management prepare-helm-chart-tarball @@ -256,7 +256,7 @@ This prepares helm chart .tar.gz package in the dist folder. breeze release-management prepare-helm-chart-package --sign myemail@apache.org .. 
image:: ./images/output_release-management_prepare-helm-chart-package.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-helm-chart-package.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-helm-chart-package.svg :width: 100% :alt: Breeze release-management prepare-helm-chart-package @@ -292,7 +292,7 @@ The below example perform documentation preparation for provider packages. You can also add ``--answer yes`` to perform non-interactive build. .. image:: ./images/output_release-management_prepare-provider-documentation.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-provider-documentation.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-provider-documentation.svg :width: 100% :alt: Breeze prepare-provider-documentation @@ -325,7 +325,7 @@ You can see all providers available by running this command: breeze release-management prepare-provider-packages --help .. image:: ./images/output_release-management_prepare-provider-packages.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_prepare-provider-packages.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_prepare-provider-packages.svg :width: 100% :alt: Breeze prepare-provider-packages @@ -349,7 +349,7 @@ You can also run the verification with an earlier airflow version to check for c All the command parameters are here: .. image:: ./images/output_release-management_install-provider-packages.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_install-provider-packages.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_install-provider-packages.svg :width: 100% :alt: Breeze install-provider-packages @@ -373,7 +373,7 @@ You can also run the verification with an earlier airflow version to check for c All the command parameters are here: .. image:: ./images/output_release-management_verify-provider-packages.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_verify-provider-packages.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_verify-provider-packages.svg :width: 100% :alt: Breeze verify-provider-packages @@ -387,7 +387,7 @@ provider has been released) and date of the release of the provider version. These are all of the available flags for the ``generate-providers-metadata`` command: .. image:: ./images/output_release-management_generate-providers-metadata.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_generate-providers-metadata.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_generate-providers-metadata.svg :width: 100% :alt: Breeze release management generate providers metadata @@ -398,7 +398,7 @@ Generating Provider Issue You can use Breeze to generate a provider issue when you release new providers. .. 
image:: ./images/output_release-management_generate-issue-content-providers.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_generate-issue-content-providers.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_generate-issue-content-providers.svg :width: 100% :alt: Breeze generate-issue-content-providers @@ -414,7 +414,7 @@ command. These are all available flags of ``clean-old-provider-artifacts`` command: .. image:: ./images/output_release-management_clean-old-provider-artifacts.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_clean-old-provider-artifacts.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_clean-old-provider-artifacts.svg :width: 100% :alt: Breeze Clean Old Provider Artifacts @@ -462,7 +462,7 @@ Constraints are generated separately for each python version and there are separ These are all available flags of ``generate-constraints`` command: .. image:: ./images/output_release-management_generate-constraints.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_generate-constraints.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_generate-constraints.svg :width: 100% :alt: Breeze generate-constraints @@ -485,7 +485,7 @@ tagged already in the past. This can be done using ``breeze release-management u These are all available flags of ``update-constraints`` command: .. image:: ./images/output_release-management_update-constraints.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_update-constraints.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_update-constraints.svg :width: 100% :alt: Breeze update-constraints @@ -552,7 +552,7 @@ publishing docs for multiple providers. These are all available flags of ``release-management publish-docs`` command: .. image:: ./images/output_release-management_publish-docs.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_publish-docs.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_publish-docs.svg :width: 100% :alt: Breeze Publish documentation @@ -596,7 +596,7 @@ providers - you can mix apache-airflow, helm-chart and provider packages this wa These are all available flags of ``release-management add-back-references`` command: .. image:: ./images/output_release-management_add-back-references.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_release-management_add-back-references.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_release-management_add-back-references.svg :width: 100% :alt: Breeze Add Back References @@ -606,7 +606,7 @@ SBOM generation tasks Maintainers also can use Breeze for SBOM generation: .. 
image:: ./images/output_sbom.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_sbom.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_sbom.svg :width: 100% :alt: Breeze sbom @@ -619,7 +619,7 @@ done by the ``generate-providers-requirements`` command. This command generates selected provider and python version, using the airflow version specified. .. image:: ./images/output_sbom_generate-providers-requirements.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_sbom_generate-providers-requirements.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_sbom_generate-providers-requirements.svg :width: 100% :alt: Breeze generate SBOM provider requirements @@ -634,7 +634,7 @@ information is written directly to ``docs-archive`` in airflow-site repository. These are all of the available flags for the ``update-sbom-information`` command: .. image:: ./images/output_sbom_update-sbom-information.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_sbomt_update-sbom-information.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_sbomt_update-sbom-information.svg :width: 100% :alt: Breeze update sbom information @@ -646,7 +646,7 @@ such images are built with the ``build-all-airflow-images`` command. This command will build one docker image per python version, with all the airflow versions >=2.0.0 compatible. .. image:: ./images/output_sbom_build-all-airflow-images.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_sbom_build-all-airflow-images.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_sbom_build-all-airflow-images.svg :width: 100% :alt: Breeze build all airflow images @@ -658,7 +658,7 @@ The SBOM information published on our website can be converted into a spreadshee properties of the dependencies. This is done by the ``export-dependency-information`` command. .. image:: ./images/output_sbom_export-dependency-information.svg - :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/images/output_sbom_export-dependency-information.svg + :target: https://raw.githubusercontent.com/apache/airflow/main/dev/breeze/doc/images/output_sbom_export-dependency-information.svg :width: 100% :alt: Breeze sbom export dependency information diff --git a/dev/breeze/doc/10_advanced_breeze_topics.rst b/dev/breeze/doc/10_advanced_breeze_topics.rst index ac5421f85aa9a..9bbf113cb7e62 100644 --- a/dev/breeze/doc/10_advanced_breeze_topics.rst +++ b/dev/breeze/doc/10_advanced_breeze_topics.rst @@ -29,17 +29,10 @@ Debugging/developing Breeze Breeze can be quite easily debugged with PyCharm/VSCode or any other IDE - but it might be less discoverable if you never tested modules and if you do not know how to bypass version check of breeze. -For testing, you can create your own virtual environment, or use the one that ``pipx`` created for you if you -already installed breeze following the recommended ``pipx install -e ./dev/breeze`` command. +For testing, you can create your own virtual environment, or use the one that ``uv`` or ``pipx`` created +for you if you already installed breeze following the recommended installation. -For local virtualenv, you can use ``pyenv`` or any other virtualenv wrapper. 
For example with ``pyenv``, -you can use ``pyenv virtualenv 3.8.6 airflow-breeze`` to create virtualenv called ``airflow-breeze`` -with Python 3.8.6. Then you can use ``pyenv activate airflow-breeze`` to activate it and install breeze -in editable mode with ``pip install -e ./dev/breeze``. - -For ``pipx`` virtualenv, you can use the virtualenv that ``pipx`` created for you. You can find the name -where ``pipx`` keeps their venvs via ``pipx list`` command. Usually it is -``${HOME}/.local/pipx/venvs/apache-airflow-breeze`` where ``$HOME`` is your home directory. +Or you can change your directory to The venv can be used for running breeze tests and for debugging breeze. While running tests should be usually "out-of-the-box" for most IDEs, once you configure ``./dev/breeze`` project to use the venv, @@ -56,7 +49,7 @@ make sure to follow these steps: this will bypass the check we run in Breeze to see if there are new requirements to install for it See example configuration for PyCharm which has run/debug configuration for -``breeze sbom generate-providers-requirements --provider-id sqlite --python 3.8`` +``breeze sbom generate-providers-requirements --provider-id sqlite --python 3.9`` .. raw:: html diff --git a/dev/breeze/doc/adr/0002-implement-standalone-python-command.md b/dev/breeze/doc/adr/0002-implement-standalone-python-command.md index 37eebcf3e15d1..ddd005fd92dde 100644 --- a/dev/breeze/doc/adr/0002-implement-standalone-python-command.md +++ b/dev/breeze/doc/adr/0002-implement-standalone-python-command.md @@ -138,7 +138,7 @@ There are a few properties of Breeze/CI scripts that should be maintained though run a command and get everything done with the least number of prerequisites * The prerequisites for Breeze and CI are: - * Python 3.8+ (Python 3.8 end of life is October 2024) + * Python 3.9+ (Python 3.9 end of life is October 2025) * Docker (23.0+) * Docker Compose (2.16.0+) * No other tools and CLI commands should be needed diff --git a/dev/breeze/doc/adr/0016-use-uv-tool-to-install-breeze.md b/dev/breeze/doc/adr/0016-use-uv-tool-to-install-breeze.md new file mode 100644 index 0000000000000..d425b6c40aa34 --- /dev/null +++ b/dev/breeze/doc/adr/0016-use-uv-tool-to-install-breeze.md @@ -0,0 +1,56 @@ + + + + +**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* + +- [16. Use uv tool to install breeze](#16-use-uv-tool-to-install-breeze) + - [Status](#status) + - [Context](#context) + - [Decision](#decision) + - [Consequences](#consequences) + + + +# 16. Use uv tool to install breeze + +Date: 2024-11-11 + +## Status + +Accepted + +Supersedes [10. Use pipx to install breeze](0010-use-pipx-to-install-breeze.md) + +## Context + +The ``uv`` tool is a modern Python development environment management tool +and we have adopted it in ``Airflow`` as the recommended way to manage the local Airflow virtualenv and development +setup. It's much faster to install dependencies with ``uv`` than with ``pip`` and it has many +more features - including managing Python interpreters, workspaces, syncing virtualenvs and more. + +## Decision + +While it is still possible to install breeze using ``pipx``, we are now recommending ``uv`` and specifically +``uv tool`` as the way to install breeze. Contributors should use ``uv tool`` to install breeze. + +## Consequences + +Those who used ``pipx`` should clean up and reinstall their environment with ``uv``.
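For illustration, switching an existing ``pipx``-based installation over to ``uv tool`` could look like the sketch below (the ``apache-airflow-breeze`` tool name and the editable install from the local checkout are assumptions matching the recommended setup):

``` bash
# Remove the previous pipx-managed installation, if you had one
pipx uninstall apache-airflow-breeze

# Install breeze as a uv-managed tool from the local checkout (editable mode)
uv tool install -e ./dev/breeze

# Confirm that uv now manages the breeze entrypoint
uv tool list
```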
diff --git a/dev/breeze/doc/ci/01_ci_environment.md b/dev/breeze/doc/ci/01_ci_environment.md index c9501a13b208a..21044af51412a 100644 --- a/dev/breeze/doc/ci/01_ci_environment.md +++ b/dev/breeze/doc/ci/01_ci_environment.md @@ -23,8 +23,9 @@ - [CI Environment](#ci-environment) - [GitHub Actions workflows](#github-actions-workflows) - - [Container Registry used as cache](#container-registry-used-as-cache) + - [GitHub Registry used as cache](#github-registry-used-as-cache) - [Authentication in GitHub Registry](#authentication-in-github-registry) + - [GitHub Artifacts used to store built images](#github-artifacts-used-to-store-built-images) @@ -32,7 +33,8 @@ Continuous Integration is an important component of making Apache Airflow robust and stable. We run a lot of tests for every pull request, -for main and v2-\*-test branches and regularly as scheduled jobs. +for `canary` runs from `main` and `v*-\*-test` branches +regularly as scheduled jobs. Our execution environment for CI is [GitHub Actions](https://github.com/features/actions). GitHub Actions. @@ -60,57 +62,22 @@ To run the tests, we need to ensure that the images are built using the latest sources and that the build process is efficient. A full rebuild of such an image from scratch might take approximately 15 minutes. Therefore, we've implemented optimization techniques that efficiently -use the cache from the GitHub Docker registry. In most cases, this -reduces the time needed to rebuild the image to about 4 minutes. -However, when dependencies change, it can take around 6-7 minutes, and -if the base image of Python releases a new patch-level, it can take -approximately 12 minutes. - -## Container Registry used as cache - -We are using GitHub Container Registry to store the results of the -`Build Images` workflow which is used in the `Tests` workflow. - -Currently in main version of Airflow we run tests in all versions of -Python supported, which means that we have to build multiple images (one -CI and one PROD for each Python version). Yet we run many jobs (\>15) - -for each of the CI images. That is a lot of time to just build the -environment to run. Therefore we are utilising the `pull_request_target` -feature of GitHub Actions. - -This feature allows us to run a separate, independent workflow, when the -main workflow is run -this separate workflow is different than the main -one, because by default it runs using `main` version of the sources but -also - and most of all - that it has WRITE access to the GitHub -Container Image registry. - -This is especially important in our case where Pull Requests to Airflow -might come from any repository, and it would be a huge security issue if -anyone from outside could utilise the WRITE access to the Container -Image Registry via external Pull Request. - -Thanks to the WRITE access and fact that the `pull_request_target` workflow named -`Build Imaages` which - by default - uses the `main` version of the sources. -There we can safely run some code there as it has been reviewed and merged. -The workflow checks-out the incoming Pull Request, builds -the container image from the sources from the incoming PR (which happens in an -isolated Docker build step for security) and pushes such image to the -GitHub Docker Registry - so that this image can be built only once and -used by all the jobs running tests. The image is tagged with unique -`COMMIT_SHA` of the incoming Pull Request and the tests run in the `pull` workflow -can simply pull such image rather than build it from the scratch. 
-Pulling such image takes ~ 1 minute, thanks to that we are saving a -lot of precious time for jobs. - -We use [GitHub Container Registry](https://docs.github.com/en/packages/guides/about-github-container-registry). -A `GITHUB_TOKEN` is needed to push to the registry. We configured -scopes of the tokens in our jobs to be able to write to the registry, -but only for the jobs that need it. - -The latest cache is kept as `:cache-linux-amd64` and `:cache-linux-arm64` -tagged cache of our CI images (suitable for `--cache-from` directive of -buildx). It contains metadata and cache for all segments in the image, -and cache is kept separately for different platform. +use the cache from GitHub Actions Artifacts. + +## GitHub Registry used as cache + +We are using GitHub Registry to store the last image built in the `canary` run +and use it to build images in CI and in the local docker containers. +This is done to speed up the build process and to ensure that the +first, time-consuming-to-build layers of the image are +reused between the builds. The cache is stored in the GitHub Registry +by the `canary` runs and then used in the subsequent runs. + +The latest GitHub registry cache is kept as `:cache-linux-amd64` and +`:cache-linux-arm64` tagged cache of our CI images (suitable for +`--cache-from` directive of buildx). It contains +metadata and cache for all segments in the image, +and cache is kept separately for different platforms. The `latest` images of CI and PROD are `amd64` only images for CI, because there is no easy way to push multiplatform images without @@ -118,11 +85,25 @@ merging the manifests, and it is not really needed nor used for cache. ## Authentication in GitHub Registry -We are using GitHub Container Registry as cache for our images. -Authentication uses GITHUB_TOKEN mechanism. Authentication is needed for -pushing the images (WRITE) only in `push`, `pull_request_target` -workflows. When you are running the CI jobs in GitHub Actions, -GITHUB_TOKEN is set automatically by the actions. +Authentication to GitHub Registry in CI uses the GITHUB_TOKEN mechanism. +Authentication is needed for pushing the images (WRITE) in the `canary` runs. +When you are running the CI jobs in GitHub Actions, GITHUB_TOKEN is set automatically +by the actions. This is used only in the `canary` runs that have "write" access +to the repository. + +No `write` access is needed (nor possible) by Pull Requests coming from the forks, +since we are only using "GitHub Artifacts" as the cache source in those runs. + +## GitHub Artifacts used to store built images + +We are running most tests in a reproducible CI image for all the jobs and, +instead of building the image multiple times, we build the image for each Python +version only once (one CI and one PROD). Those images are then used by +all jobs that need them in the same build. After building, the images +are exported to a file and stored in the GitHub Artifacts. +The export files are then downloaded from the artifacts and the image is +loaded from the file in all jobs of the same workflow, after the images are +built and uploaded in the build image job. ---- diff --git a/dev/breeze/doc/ci/02_images.md b/dev/breeze/doc/ci/02_images.md index 94d97c6962e91..df8446f5a8606 100644 --- a/dev/breeze/doc/ci/02_images.md +++ b/dev/breeze/doc/ci/02_images.md @@ -43,9 +43,9 @@ Airflow has two main images (build from Dockerfiles): production-ready Airflow installation.
You can read more about building and using the production image in the [Docker stack](https://airflow.apache.org/docs/docker-stack/index.html) - documentation. The image is built using [Dockerfile](Dockerfile). + documentation. The image is built using [Dockerfile](../../../../Dockerfile). - CI image (Dockerfile.ci) - used for running tests and local - development. The image is built using [Dockerfile.ci](Dockerfile.ci). + development. The image is built using [Dockerfile.ci](../../../../Dockerfile.ci). ## PROD image @@ -108,7 +108,7 @@ it uses the latest installed version of airflow and providers. However, you can choose different installation methods as described in [Building PROD docker images from released PIP packages](#building-prod-docker-images-from-released-pip-packages). Detailed reference for building production image from different sources can be -found in: [Build Args reference](docs/docker-stack/build-arg-ref.rst#installing-airflow-using-different-methods) +found in: [Build Args reference](../../../../docs/docker-stack/build-arg-ref.rst#installing-airflow-using-different-methods) You can build the CI image using current sources this command: @@ -126,20 +126,20 @@ By adding `--python ` parameter you can build the image version for the chosen Python version. The images are built with default extras - different extras for CI and -production image and you can change the extras via the `--extras` +production image and you can change the extras via the `--airflow-extras` parameters and add new ones with `--additional-airflow-extras`. -For example if you want to build Python 3.8 version of production image +For example if you want to build Python 3.9 version of production image with "all" extras installed you should run this command: ``` bash -breeze prod-image build --python 3.8 --extras "all" +breeze prod-image build --python 3.9 --airflow-extras "all" ``` If you just want to add new extras you can add them like that: ``` bash -breeze prod-image build --python 3.8 --additional-airflow-extras "all" +breeze prod-image build --python 3.9 --additional-airflow-extras "all" ``` The command that builds the CI image is optimized to minimize the time @@ -160,7 +160,7 @@ You can also build production images from PIP packages via providing `--install-airflow-version` parameter to Breeze: ``` bash -breeze prod-image build --python 3.8 --additional-airflow-extras=trino --install-airflow-version=2.0.0 +breeze prod-image build --python 3.9 --additional-airflow-extras=trino --install-airflow-version=2.0.0 ``` This will build the image using command similar to: @@ -168,7 +168,7 @@ This will build the image using command similar to: ``` bash pip install \ apache-airflow[async,amazon,celery,cncf.kubernetes,docker,elasticsearch,ftp,grpc,hashicorp,http,ldap,google,microsoft.azure,mysql,postgres,redis,sendgrid,sftp,slack,ssh,statsd,virtualenv]==2.0.0 \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.0/constraints-3.8.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.0/constraints-3.9.txt" ``` > [!NOTE] @@ -199,7 +199,7 @@ HEAD of development for constraints): ``` bash pip install "https://github.com/apache/airflow/archive/.tar.gz#egg=apache-airflow" \ - --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.8.txt" + --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-main/constraints-3.9.txt" ``` You can also skip installing airflow and install it from locally @@ -207,7 
+207,7 @@ provided files by using `--install-packages-from-context` parameter to Breeze: ``` bash -breeze prod-image build --python 3.8 --additional-airflow-extras=trino --install-packages-from-context +breeze prod-image build --python 3.9 --additional-airflow-extras=trino --install-packages-from-context ``` In this case you airflow and all packages (.whl files) should be placed @@ -215,10 +215,11 @@ in `docker-context-files` folder. # Using docker cache during builds -Default mechanism used in Breeze for building CI images uses images -pulled from GitHub Container Registry. This is done to speed up local +Default mechanism used in Breeze for building CI images locally uses images +pulled from GitHub Container Registry combined with locally mounted cache +folders where `uv` cache is stored. This is done to speed up local builds and building images for CI runs - instead of \> 12 minutes for -rebuild of CI images, it takes usually about 1 minute when cache is +rebuild of CI images, it takes usually less than a minute when cache is used. For CI images this is usually the best strategy - to use default "pull" cache. This is default strategy when [Breeze](../README.rst) builds are performed. @@ -227,7 +228,8 @@ For Production Image - which is far smaller and faster to build, it's better to use local build cache (the standard mechanism that docker uses. This is the default strategy for production images when [Breeze](../README.rst) builds are -performed. The first time you run it, it will take considerably longer +performed. The local `uv` cache is used from mounted sources. +The first time you run it, it will take considerably longer time than if you use the pull mechanism, but then when you do small, incremental changes to local sources, Dockerfile image and scripts, further rebuilds with local build cache will be considerably faster. @@ -241,20 +243,20 @@ flags: `registry` (default), `local`, or `disabled` flags when you run Breeze commands. For example: ``` bash -breeze ci-image build --python 3.8 --docker-cache local +breeze ci-image build --python 3.9 --docker-cache local ``` Will build the CI image using local build cache (note that it will take quite a long time the first time you run it). ``` bash -breeze prod-image build --python 3.8 --docker-cache registry +breeze prod-image build --python 3.9 --docker-cache registry ``` Will build the production image with cache used from registry. ``` bash -breeze prod-image build --python 3.8 --docker-cache disabled +breeze prod-image build --python 3.9 --docker-cache disabled ``` Will build the production image from the scratch. @@ -293,19 +295,12 @@ See Naming convention for the GitHub packages. -Images with a commit SHA (built for pull requests and pushes). Those are -images that are snapshot of the currently run build. They are built once -per each build and pulled by each test job. - ``` bash -ghcr.io/apache/airflow//ci/python: - for CI images -ghcr.io/apache/airflow//prod/python: - for production images +ghcr.io/apache/airflow//ci/python - for CI images +ghcr.io/apache/airflow//prod/python - for production images ``` -Thoe image contain inlined cache. 
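As a quick, hypothetical example of using these names (branch ``main`` and Python ``3.9`` are placeholders; substitute the branch and Python version you need):

``` bash
# Pull the latest CI image for the main branch and Python 3.9
docker pull ghcr.io/apache/airflow/main/ci/python3.9

# Check when the pulled image was built
docker inspect --format '{{.Created}}' ghcr.io/apache/airflow/main/ci/python3.9
```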
- -You can see all the current GitHub images at - +You can see all the current GitHub images at Note that you need to be committer and have the right to refresh the images in the GitHub Registry with latest sources from main via @@ -314,12 +309,23 @@ need to login with your Personal Access Token with "packages" write scope to be able to push to those repositories or pull from them in case of GitHub Packages. -GitHub Container Registry +You need to login to GitHub Container Registry with your API token +if you want to interact with the GitHub Registry for writing (only +committers). ``` bash docker login ghcr.io ``` +Note that when your token is expired and you are still +logged in, you are not able to interact even with read-only operations +like pulling images. You need to logout and login again to refresh the +token. + +``` bash +docker logout ghcr.io +``` + Since there are different naming conventions used for Airflow images and there are multiple images used, [Breeze](../README.rst) provides easy to use management interface for the images. The CI @@ -329,22 +335,14 @@ new version of base Python is released. However, occasionally, you might need to rebuild images locally and push them directly to the registries to refresh them. -Every developer can also pull and run images being result of a specific +Every contributor can also pull and run images being result of a specific CI run in GitHub Actions. This is a powerful tool that allows to reproduce CI failures locally, enter the images and fix them much -faster. It is enough to pass `--image-tag` and the registry and Breeze -will download and execute commands using the same image that was used -during the CI tests. +faster. It is enough to download and uncompress the artifact that stores the +image and run ``breeze ci-image load -i `` to load the +image and mark the image as refreshed in the local cache. -For example this command will run the same Python 3.8 image as was used -in build identified with 9a621eaa394c0a0a336f8e1b31b35eff4e4ee86e commit -SHA with enabled rabbitmq integration. - -``` bash -breeze --image-tag 9a621eaa394c0a0a336f8e1b31b35eff4e4ee86e --python 3.8 --integration rabbitmq -``` - -You can see more details and examples in[Breeze](../README.rst) +You can see more details and examples in[Breeze](../06_managing_docker_images.rst) # Customizing the CI image @@ -361,7 +359,7 @@ you can build the image in the Here just a few examples are presented which should give you general understanding of what you can customize. -This builds the production image in version 3.8 with additional airflow +This builds the production image in version 3.9 with additional airflow extras from 2.0.0 PyPI package and additional apt dev and runtime dependencies. @@ -373,7 +371,7 @@ plugin installed. ``` bash DOCKER_BUILDKIT=1 docker build . 
-f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg ADDITIONAL_AIRFLOW_EXTRAS="jdbc" \ --build-arg ADDITIONAL_PYTHON_DEPS="pandas" \ --build-arg ADDITIONAL_DEV_APT_DEPS="gcc g++" \ @@ -384,7 +382,7 @@ the same image can be built using `breeze` (it supports auto-completion of the options): ``` bash -breeze ci-image build --python 3.8 --additional-airflow-extras=jdbc --additional-python-deps="pandas" \ +breeze ci-image build --python 3.9 --additional-airflow-extras=jdbc --additional-python-deps="pandas" \ --additional-dev-apt-deps="gcc g++" ``` @@ -398,7 +396,7 @@ comment](https://github.com/apache/airflow/issues/8605#issuecomment-690065621): ``` bash DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg AIRFLOW_INSTALLATION_METHOD="apache-airflow" \ --build-arg ADDITIONAL_AIRFLOW_EXTRAS="slack" \ --build-arg ADDITIONAL_PYTHON_DEPS="apache-airflow-providers-odbc \ @@ -421,96 +419,92 @@ DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ The following build arguments (`--build-arg` in docker build command) can be used for CI images: -| Build argument | Default value | Description | -|-----------------------------------|----------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------| -| `PYTHON_BASE_IMAGE` | `python:3.8-slim-bookworm` | Base Python image | -| `PYTHON_MAJOR_MINOR_VERSION` | `3.8` | major/minor version of Python (should match base image) | -| `DEPENDENCIES_EPOCH_NUMBER` | `2` | increasing this number will reinstall all apt dependencies | -| `ADDITIONAL_PIP_INSTALL_FLAGS` | | additional `pip` flags passed to the installation commands (except when reinstalling `pip` itself) | -| `PIP_NO_CACHE_DIR` | `true` | if true, then no pip cache will be stored | -| `UV_NO_CACHE` | `true` | if true, then no uv cache will be stored | -| `HOME` | `/root` | Home directory of the root user (CI image has root user as default) | -| `AIRFLOW_HOME` | `/root/airflow` | Airflow's HOME (that's where logs and sqlite databases are stored) | -| `AIRFLOW_SOURCES` | `/opt/airflow` | Mounted sources of Airflow | -| `AIRFLOW_REPO` | `apache/airflow` | the repository from which PIP dependencies are pre-installed | -| `AIRFLOW_BRANCH` | `main` | the branch from which PIP dependencies are pre-installed | -| `AIRFLOW_CI_BUILD_EPOCH` | `1` | increasing this value will reinstall PIP dependencies from the repository from scratch | -| `AIRFLOW_CONSTRAINTS_LOCATION` | | If not empty, it will override the source of the constraints with the specified URL or file. | -| `AIRFLOW_CONSTRAINTS_REFERENCE` | | reference (branch or tag) from GitHub repository from which constraints are used. By default it is set to `constraints-main` but can be `constraints-2-X`. | -| `AIRFLOW_EXTRAS` | `all` | extras to install | -| `UPGRADE_INVALIDATION_STRING` | | If set to any random value the dependencies are upgraded to newer versions. In CI it is set to build id. | -| `AIRFLOW_PRE_CACHED_PIP_PACKAGES` | `true` | Allows to pre-cache airflow PIP packages from the GitHub of Apache Airflow This allows to optimize iterations for Image builds and speeds up CI jobs. 
| -| `ADDITIONAL_AIRFLOW_EXTRAS` | | additional extras to install | -| `ADDITIONAL_PYTHON_DEPS` | | additional Python dependencies to install | -| `DEV_APT_COMMAND` | | Dev apt command executed before dev deps are installed in the first part of image | -| `ADDITIONAL_DEV_APT_COMMAND` | | Additional Dev apt command executed before dev dep are installed in the first part of the image | -| `DEV_APT_DEPS` | | Dev APT dependencies installed in the first part of the image | -| `ADDITIONAL_DEV_APT_DEPS` | | Additional apt dev dependencies installed in the first part of the image | -| `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps | -| `AIRFLOW_PIP_VERSION` | `24.3.1` | PIP version used. | -| `AIRFLOW_UV_VERSION` | `0.5.11` | UV version used. | -| `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. | -| `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation | - - -The" +| Build argument | Default value | Description | +|---------------------------------|----------------------------|-------------------------------------------------------------------------------------------------------------------| +| `PYTHON_BASE_IMAGE` | `python:3.9-slim-bookworm` | Base Python image | +| `PYTHON_MAJOR_MINOR_VERSION` | `3.9` | major/minor version of Python (should match base image) | +| `DEPENDENCIES_EPOCH_NUMBER` | `2` | increasing this number will reinstall all apt dependencies | +| `ADDITIONAL_PIP_INSTALL_FLAGS` | | additional `pip` flags passed to the installation commands (except when reinstalling `pip` itself) | +| `HOME` | `/root` | Home directory of the root user (CI image has root user as default) | +| `AIRFLOW_HOME` | `/root/airflow` | Airflow's HOME (that's where logs and sqlite databases are stored) | +| `AIRFLOW_SOURCES` | `/opt/airflow` | Mounted sources of Airflow | +| `AIRFLOW_REPO` | `apache/airflow` | the repository from which PIP dependencies are pre-installed | +| `AIRFLOW_BRANCH` | `main` | the branch from which PIP dependencies are pre-installed | +| `AIRFLOW_CI_BUILD_EPOCH` | `1` | increasing this value will reinstall PIP dependencies from the repository from scratch | +| `AIRFLOW_CONSTRAINTS_LOCATION` | | If not empty, it will override the source of the constraints with the specified URL or file. | +| `AIRFLOW_CONSTRAINTS_REFERENCE` | `constraints-main` | reference (branch or tag) from GitHub repository from which constraints are used. | +| `AIRFLOW_EXTRAS` | `all` | extras to install | +| `UPGRADE_INVALIDATION_STRING` | | If set to any random value the dependencies are upgraded to newer versions. In CI it is set to build id. | +| `ADDITIONAL_AIRFLOW_EXTRAS` | | additional extras to install | +| `ADDITIONAL_PYTHON_DEPS` | | additional Python dependencies to install | +| `DEV_APT_COMMAND` | | Dev apt command executed before dev deps are installed in the first part of image | +| `ADDITIONAL_DEV_APT_COMMAND` | | Additional Dev apt command executed before dev dep are installed in the first part of the image | +| `DEV_APT_DEPS` | | Dev APT dependencies installed in the first part of the image (default empty means default dependencies are used) | +| `ADDITIONAL_DEV_APT_DEPS` | | Additional apt dev dependencies installed in the first part of the image | +| `ADDITIONAL_DEV_APT_ENV` | | Additional env variables defined when installing dev deps | +| `AIRFLOW_PIP_VERSION` | `24.3.1` | `pip` version used. | +| `AIRFLOW_UV_VERSION` | `0.5.17` | `uv` version used. | +| `AIRFLOW_PRE_COMMIT_VERSION` | `3.5.0` | `pre-commit` version used. 
| +| `AIRFLOW_USE_UV` | `true` | Whether to use UV for installation. | +| `PIP_PROGRESS_BAR` | `on` | Progress bar for PIP installation | + Here are some examples of how CI images can built manually. CI is always built from local sources. -This builds the CI image in version 3.8 with default extras ("all"). +This builds the CI image in version 3.9 with default extras ("all"). ``` bash DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" --tag my-image:0.0.1 + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" --tag my-image:0.0.1 ``` -This builds the CI image in version 3.8 with "gcp" extra only. +This builds the CI image in version 3.9 with "gcp" extra only. ``` bash DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg AIRFLOW_EXTRAS=gcp --tag my-image:0.0.1 ``` -This builds the CI image in version 3.8 with "apache-beam" extra added. +This builds the CI image in version 3.9 with "apache-beam" extra added. ``` bash DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg ADDITIONAL_AIRFLOW_EXTRAS="apache-beam" --tag my-image:0.0.1 ``` -This builds the CI image in version 3.8 with "mssql" additional package +This builds the CI image in version 3.9 with "mssql" additional package added. ``` bash DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg ADDITIONAL_PYTHON_DEPS="mssql" --tag my-image:0.0.1 ``` -This builds the CI image in version 3.8 with "gcc" and "g++" additional +This builds the CI image in version 3.9 with "gcc" and "g++" additional apt dev dependencies added. ``` DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg ADDITIONAL_DEV_APT_DEPS="gcc g++" --tag my-image:0.0.1 ``` -This builds the CI image in version 3.8 with "jdbc" extra and +This builds the CI image in version 3.9 with "jdbc" extra and "default-jre-headless" additional apt runtime dependencies added. ``` DOCKER_BUILDKIT=1 docker build . -f Dockerfile.ci \ --pull \ - --build-arg PYTHON_BASE_IMAGE="python:3.8-slim-bookworm" \ + --build-arg PYTHON_BASE_IMAGE="python:3.9-slim-bookworm" \ --build-arg AIRFLOW_EXTRAS=jdbc \ --tag my-image:0.0.1 ``` @@ -545,8 +539,8 @@ The entrypoint performs those operations: sets the right pytest flags - Sets default "tests" target in case the target is not explicitly set as additional argument -- Runs system tests if RUN_SYSTEM_TESTS flag is specified, otherwise - runs regular unit and integration tests +- Runs system tests if TEST_GROUP is "system-core" or "system-providers" + otherwise runs regular unit and integration tests # Naming conventions for stored images @@ -555,10 +549,6 @@ The images produced during the `Build Images` workflow of CI jobs are stored in the [GitHub Container Registry](https://github.com/orgs/apache/packages?repo_name=airflow) -The images are stored with both "latest" tag (for last main push image -that passes all the tests as well with the COMMIT_SHA id for images that -were used in particular build. 
- The image names follow the patterns (except the Python image, all the images are stored in in `apache` organization. @@ -569,22 +559,15 @@ percent-encoded when you access them via UI (/ = %2F) `https://github.com/apache/airflow/pkgs/container/` -| Image | Name:tag (both cases latest version and per-build) | Description | -|--------------------------|----------------------------------------------------|---------------------------------------------------------------| -| Python image (DockerHub) | python:\-slim-bookworm | Base Python image used by both production and CI image. | -| CI image | airflow/\/ci/python\:\ | CI image - this is the image used for most of the tests. | -| PROD image | airflow/\/prod/python\:\ | faster to build or pull. Production image optimized for size. | +| Image | Name | Description | +|--------------------------|----------------------------------------|---------------------------------------------------------------| +| Python image (DockerHub) | python:\-slim-bookworm | Base Python image used by both production and CI image. | +| CI image | airflow/\/ci/python\ | CI image - this is the image used for most of the tests. | +| PROD image | airflow/\/prod/python\ | faster to build or pull. Production image optimized for size. | - \ might be either "main" or "v2-\*-test" -- \ - Python version (Major + Minor).Should be one of \["3.8", - "3.9", "3.10", "3.11", "3.12" \]. -- \ - full-length SHA of commit either from the tip of the - branch (for pushes/schedule) or commit from the tip of the branch used - for the PR. -- \ - tag of the image. It is either "latest" or \ - (full-length SHA of commit either from the tip of the branch (for - pushes/schedule) or commit from the tip of the branch used for the - PR). +- \ - Python version (Major + Minor).Should be one of \["3.9", "3.10", "3.11", "3.12" \]. + ---- diff --git a/dev/breeze/doc/ci/03_github_variables.md b/dev/breeze/doc/ci/03_github_variables.md index 10983369784e1..bf1353df08069 100644 --- a/dev/breeze/doc/ci/03_github_variables.md +++ b/dev/breeze/doc/ci/03_github_variables.md @@ -71,4 +71,4 @@ docker tag ghcr.io/apache/airflow/main/ci/python3.10 your-image-name:tag ----- -Read next about [Static checks](04_static_checks.md) +Read next about [Selective checks](04_selective_checks.md) diff --git a/dev/breeze/doc/ci/04_selective_checks.md b/dev/breeze/doc/ci/04_selective_checks.md index 3f8d8a97fae03..2734fdda113af 100644 --- a/dev/breeze/doc/ci/04_selective_checks.md +++ b/dev/breeze/doc/ci/04_selective_checks.md @@ -27,7 +27,7 @@ - [Skipping pre-commits (Static checks)](#skipping-pre-commits-static-checks) - [Suspended providers](#suspended-providers) - [Selective check outputs](#selective-check-outputs) - - [Committer vs. non-committer PRs](#committer-vs-non-committer-prs) + - [Committer vs. Non-committer PRs](#committer-vs-non-committer-prs) - [Changing behaviours of the CI runs by setting labels](#changing-behaviours-of-the-ci-runs-by-setting-labels) @@ -59,6 +59,7 @@ We have the following Groups of files for CI that determine which tests are run: provider and `hatch_build.py` for all regular dependencies. 
* `DOC files` - change in those files indicate that we should run documentation builds (both airflow sources and airflow documentation) +* `UI files` - those are files for the new full React UI (useful to determine if UI tests should run) * `WWW files` - those are files for the WWW part of our UI (useful to determine if UI tests should run) * `System test files` - those are the files that are part of system tests (system tests are not automatically run in our CI, but Airflow stakeholders are running the tests and expose dashboards for them at @@ -143,6 +144,7 @@ when some files are not changed. Those are the rules implemented: * if no `All Airflow Python files` changed - `mypy-airflow` check is skipped * if no `All Docs Python files` changed - `mypy-docs` check is skipped * if no `All Dev Python files` changed - `mypy-dev` check is skipped + * if no `UI files` changed - `ts-compile-format-lint-ui` check is skipped * if no `WWW files` changed - `ts-compile-format-lint-www` check is skipped * if no `All Python files` changed - `flynt` check is skipped * if no `Helm files` changed - `lint-helm-chart` check is skipped @@ -164,73 +166,89 @@ separated by spaces. This is to accommodate for the wau how outputs of this kind Github Actions to pass the list of parameters to a command to execute -| Output | Meaning of the output | Example value | List as string | -|----------------------------------------|------------------------------------------------------------------------------------------------------|-------------------------------------------|----------------| -| affected-providers-list-as-string | List of providers affected when they are selectively affected. | airbyte http | * | -| all-python-versions | List of all python versions there are available in the form of JSON array | ['3.8', '3.9', '3.10'] | | -| all-python-versions-list-as-string | List of all python versions there are available in the form of space separated string | 3.8 3.9 3.10 | * | -| all-versions | If set to true, then all python, k8s, DB versions are used for tests. | false | | -| basic-checks-only | Whether to run all static checks ("false") or only basic set of static checks ("true") | false | | -| build_system_changed_in_pyproject_toml | When builds system dependencies changed in pyproject.toml changed in the PR. | false | | -| chicken-egg-providers | List of providers that should be considered as "chicken-egg" - expecting development Airflow version | | | -| ci-image-build | Whether CI image build is needed | true | | -| debug-resources | Whether resources usage should be printed during parallel job execution ("true"/ "false") | false | | -| default-branch | Which branch is default for the build ("main" for main branch, "v2-4-test" for 2.4 line etc.) | main | | -| default-constraints-branch | Which branch is default for the build ("constraints-main" for main branch, "constraints-2-4" etc.) 
| constraints-main | | -| default-helm-version | Which Helm version to use as default | v3.9.4 | | -| default-kind-version | Which Kind version to use as default | v0.16.0 | | -| default-kubernetes-version | Which Kubernetes version to use as default | v1.25.2 | | -| default-mysql-version | Which MySQL version to use as default | 5.7 | | -| default-postgres-version | Which Postgres version to use as default | 10 | | -| default-python-version | Which Python version to use as default | 3.8 | | -| docker-cache | Which cache should be used for images ("registry", "local" , "disabled") | registry | | -| docs-build | Whether to build documentation ("true"/"false") | true | | -| docs-list-as-string | What filter to apply to docs building - based on which documentation packages should be built | apache-airflow helm-chart google | | -| full-tests-needed | Whether this build runs complete set of tests or only subset (for faster PR builds) [1] | false | | -| generated-dependencies-changed | Whether generated dependencies have changed ("true"/"false") | false | | -| hatch-build-changed | When hatch build.py changed in the PR. | false | | -| helm-version | Which Helm version to use for tests | v3.9.4 | | -| is-airflow-runner | Whether runner used is an airflow or infrastructure runner (true if airflow/false if infrastructure) | false | | -| is-amd-runner | Whether runner used is an AMD one | true | | -| is-arm-runner | Whether runner used is an ARM one | false | | -| is-committer-build | Whether the build is triggered by a committer | false | | -| is-k8s-runner | Whether the build runs on our k8s infrastructure | false | | -| is-self-hosted-runner | Whether the runner is self-hosted | false | | -| is-vm-runner | Whether the runner uses VM to run | true | | -| kind-version | Which Kind version to use for tests | v0.16.0 | | -| kubernetes-combos-list-as-string | All combinations of Python version and Kubernetes version to use for tests as space-separated string | 3.8-v1.25.2 3.9-v1.26.4 | * | -| kubernetes-versions | All Kubernetes versions to use for tests as JSON array | ['v1.25.2'] | | -| kubernetes-versions-list-as-string | All Kubernetes versions to use for tests as space-separated string | v1.25.2 | * | -| mypy-checks | List of folders to be considered for mypy | [] | | -| mysql-exclude | Which versions of MySQL to exclude for tests as JSON array | [] | | -| mysql-versions | Which versions of MySQL to use for tests as JSON array | ['5.7'] | | -| needs-api-codegen | Whether "api-codegen" are needed to run ("true"/"false") | true | | -| needs-api-tests | Whether "api-tests" are needed to run ("true"/"false") | true | | -| needs-helm-tests | Whether Helm tests are needed to run ("true"/"false") | true | | -| needs-javascript-scans | Whether javascript CodeQL scans should be run ("true"/"false") | true | | -| needs-mypy | Whether mypy check is supposed to run in this build | true | | -| needs-python-scans | Whether Python CodeQL scans should be run ("true"/"false") | true | | -| parallel-test-types-list-as-string | Which test types should be run for unit tests | API Always Providers Providers\[-google\] | * | -| postgres-exclude | Which versions of Postgres to exclude for tests as JSON array | [] | | -| postgres-versions | Which versions of Postgres to use for tests as JSON array | ['10'] | | -| prod-image-build | Whether PROD image build is needed | true | | -| prod-image-build | Whether PROD image build is needed | true | | -| providers-compatibility-checks | List of dicts: (python_version, 
airflow_version, removed_providers) for compatibility checks | [] | | -| pyproject-toml-changed | When pyproject.toml changed in the PR. | false | | -| python-versions | List of python versions to use for that build | ['3.8'] | * | -| python-versions-list-as-string | Which versions of MySQL to use for tests as space-separated string | 3.8 | * | -| run-amazon-tests | Whether Amazon tests should be run ("true"/"false") | true | | -| run-kubernetes-tests | Whether Kubernetes tests should be run ("true"/"false") | true | | -| run-tests | Whether unit tests should be run ("true"/"false") | true | | -| run-www-tests | Whether WWW tests should be run ("true"/"false") | true | | -| runs-on-as-json-default | List of labels assigned for runners for that build for default runs for that build (as string) | ["ubuntu-22.04"] | | -| runs-on-as-json-self-hosted | List of labels assigned for runners for that build for self hosted runners | ["self-hosted", "Linux", "X64"] | | -| runs-on-as-json-public | List of labels assigned for runners for that build for public runners | ["ubuntu-22.04"] | | -| skip-pre-commits | Which pre-commits should be skipped during the static-checks run | check-provider-yaml-valid,flynt,identity | | -| skip-provider-tests | When provider tests should be skipped (on non-main branch or when no provider changes detected) | true | | -| sqlite-exclude | Which versions of Sqlite to exclude for tests as JSON array | [] | | -| testable-integrations | List of integrations that are testable in the build as JSON array | ['mongo', 'kafka', 'mssql'] | | -| upgrade-to-newer-dependencies | Whether the image build should attempt to upgrade all dependencies (true/false or commit hash) | false | | +| Output | Meaning of the output | Example value | List | +|------------------------------------------------|--------------------------------------------------------------------------------------------------------|-----------------------------------------|------| +| all-python-versions | List of all python versions there are available in the form of JSON array | \['3.9', '3.10'\] | | +| all-python-versions-list-as-string | List of all python versions there are available in the form of space separated string | 3.9 3.10 | * | +| all-versions | If set to true, then all python, k8s, DB versions are used for tests. | false | | +| basic-checks-only | Whether to run all static checks ("false") or only basic set of static checks ("true") | false | | +| build_system_changed_in_pyproject_toml | When builds system dependencies changed in pyproject.toml changed in the PR. | false | | +| chicken-egg-providers | List of providers that should be considered as "chicken-egg" - expecting development Airflow version | | | +| ci-image-build | Whether CI image build is needed | true | | +| core-test-types-list-as-string | Which test types should be run for unit tests for core | API Always Providers | * | +| debug-resources | Whether resources usage should be printed during parallel job execution ("true"/ "false") | false | | +| default-branch | Which branch is default for the build ("main" for main branch, "v2-4-test" for 2.4 line etc.) | main | | +| default-constraints-branch | Which branch is default for the build ("constraints-main" for main branch, "constraints-2-4" etc.) 
| constraints-main | | +| default-helm-version | Which Helm version to use as default | v3.9.4 | | +| default-kind-version | Which Kind version to use as default | v0.16.0 | | +| default-kubernetes-version | Which Kubernetes version to use as default | v1.25.2 | | +| default-mysql-version | Which MySQL version to use as default | 5.7 | | +| default-postgres-version | Which Postgres version to use as default | 10 | | +| default-python-version | Which Python version to use as default | 3.9 | | +| disable-airflow-repo-cache | Disables cache of the repo main cache in CI - aiflow will be installed without main installation cache | true | | +| docker-cache | Which cache should be used for images ("registry", "local" , "disabled") | registry | | +| docs-build | Whether to build documentation ("true"/"false") | true | | +| docs-list-as-string | What filter to apply to docs building - based on which documentation packages should be built | apache-airflow helm-chart google | * | +| excluded-providers-as-string c | List of providers that should be excluded from the build as space-separated string | amazon google | * | +| force-pip | Whether pip should be forced in the image build instead of uv ("true"/"false") | false | | +| full-tests-needed | Whether this build runs complete set of tests or only subset (for faster PR builds) \[1\] | false | | +| generated-dependencies-changed | Whether generated dependencies have changed ("true"/"false") | false | | +| has-migrations | Whether the PR has migrations ("true"/"false") | false | | +| hatch-build-changed | When hatch build.py changed in the PR. | false | | +| helm-test-packages-list-as-string | List of helm packages to test as JSON array | \["airflow_aux", "airflow_core"\] | * | +| helm-version | Which Helm version to use for tests | v3.15.3 | | +| include-success-outputs | Whether to include outputs of successful parallel tests ("true"/"false") | false | | +| individual-providers-test-types-list-as-string | Which test types should be run for unit tests for providers (individually listed) | Providers[\amazon\] Providers\[google\] | * | +| is-airflow-runner | Whether runner used is an airflow or infrastructure runner (true if airflow/false if infrastructure) | false | | +| is-amd-runner | Whether runner used is an AMD one | true | | +| is-arm-runner | Whether runner used is an ARM one | false | | +| is-committer-build | Whether the build is triggered by a committer | false | | +| is-k8s-runner | Whether the build runs on our k8s infrastructure | false | | +| is-legacy-ui-api-labeled | Whether the PR is labeled as legacy UI/API | false | | +| is-self-hosted-runner | Whether the runner is self-hosted | false | | +| is-vm-runner | Whether the runner uses VM to run | true | | +| kind-version | Which Kind version to use for tests | v0.24.0 | | +| kubernetes-combos-list-as-string | All combinations of Python version and Kubernetes version to use for tests as space-separated string | 3.9-v1.25.2 3.10-v1.28.13 | * | +| kubernetes-versions | All Kubernetes versions to use for tests as JSON array | \['v1.25.2'\] | | +| kubernetes-versions-list-as-string | All Kubernetes versions to use for tests as space-separated string | v1.25.2 | * | +| latest-versions-only | If set, the number of Python, Kubernetes, DB versions will be limited to the latest ones. 
| false | | +| mypy-checks | List of folders to be considered for mypy checks | \["airflow_aux", "airflow_core"\] | | +| mysql-exclude | Which versions of MySQL to exclude for tests as JSON array | [] | | +| mysql-versions | Which versions of MySQL to use for tests as JSON array | \['8.0'\] | | +| needs-api-codegen | Whether "api-codegen" are needed to run ("true"/"false") | true | | +| needs-api-tests | Whether "api-tests" are needed to run ("true"/"false") | true | | +| needs-helm-tests | Whether Helm tests are needed to run ("true"/"false") | true | | +| needs-javascript-scans | Whether javascript CodeQL scans should be run ("true"/"false") | true | | +| needs-mypy | Whether mypy check is supposed to run in this build | true | | +| needs-python-scans | Whether Python CodeQL scans should be run ("true"/"false") | true | | +| only-new-ui-files | Whether only new UI files are present in the PR ("true"/"false") | false | | +| postgres-exclude | Which versions of Postgres to exclude for tests as JSON array | [] | | +| postgres-versions | Which versions of Postgres to use for tests as JSON array | \['12'\] | | +| prod-image-build | Whether PROD image build is needed | true | | +| providers-compatibility-tests-matrix | Matrix of providers compatibility tests: (python_version, airflow_version, removed_providers) | \[{}\] | | +| providers-test-types-list-as-string | Which test types should be run for unit tests for providers | Providers Providers\[-google\] | * | +| pyproject-toml-changed | When pyproject.toml changed in the PR. | false | | +| python-versions | List of python versions to use for that build | \['3.9'\] | | +| python-versions-list-as-string | Which versions of MySQL to use for tests as space-separated string | 3.9 | * | +| run-amazon-tests | Whether Amazon tests should be run ("true"/"false") | true | | +| run-kubernetes-tests | Whether Kubernetes tests should be run ("true"/"false") | true | | +| run-system-tests | Whether system tests should be run ("true"/"false") | true | | +| run-tests | Whether unit tests should be run ("true"/"false") | true | | +| run-ui-tests | Whether UI tests should be run ("true"/"false") | true | | +| run-www-tests | Whether Legacy WWW tests should be run ("true"/"false") | true | | +| runs-on-as-json-default | List of labels assigned for runners for that build for default runs for that build (as string) | \["ubuntu-22.04"\] | | +| runs-on-as-json-docs-build | List of labels assigned for runners for that build for ddcs build (as string) | \["ubuntu-22.04"\] | | +| runs-on-as-json-self-hosted | List of labels assigned for runners for that build for self hosted runners | \["self-hosted", "Linux", "X64"\] | | +| runs-on-as-json-self-hosted-asf | List of labels assigned for runners for that build for ASF self hosted runners | \["self-hosted", "Linux", "X64"\] | | +| runs-on-as-json-public | List of labels assigned for runners for that build for public runners | \["ubuntu-22.04"\] | | +| selected-providers-list-as-string | List of providers affected when they are selectively affected. 
| airbyte http | * | +| skip-pre-commits | Which pre-commits should be skipped during the static-checks run | flynt,identity | | +| skip-providers-tests | When provider tests should be skipped (on non-main branch or when no provider changes detected) | true | | +| sqlite-exclude | Which versions of Sqlite to exclude for tests as JSON array | [] | | +| test-groups | List of test groups that are valid for this run | \['core', 'providers'\] | | +| testable-core-integrations | List of core integrations that are testable in the build as JSON array | \['celery', 'kerberos'\] | | +| testable-providers-integrations | List of provider integrations that are testable in the build as JSON array | \['mongo', 'kafka'\] | | +| upgrade-to-newer-dependencies | Whether the image build should attempt to upgrade all dependencies (true/false or commit hash) | false | | [1] Note for deciding if `full tests needed` mode is enabled and provider.yaml files. @@ -250,23 +268,15 @@ That's why we do not base our `full tests needed` decision on changes in depende from the `provider.yaml` files, but on `generated/provider_dependencies.json` and `pyproject.toml` files being modified. This can be overridden by setting `full tests needed` label in the PR. -## Committer vs. non-committer PRs +## Committer vs. Non-committer PRs There is a difference in how the CI jobs are run for committer and non-committer PRs from forks. -Main reason is security - we do not want to run untrusted code on our infrastructure for self-hosted runners, -but also we do not want to run unverified code during the `Build imaage` workflow, because that workflow has -access to GITHUB_TOKEN that has access to write to the Github Registry of ours (which is used to cache -images between runs). Also those images are build on self-hosted runners and we have to make sure that -those runners are not used to (fore example) mine cryptocurrencies on behalf of the person who opened the -pull request from their newly opened fork of airflow. +The main reason is security; we do not want to run untrusted code on our infrastructure for self-hosted runners. -This is why the `Build Images` workflow checks if the actor of the PR (GITHUB_ACTOR) is one of the committers, -and if not, then workflows and scripts used to run image building are coming only from the ``target`` branch -of the repository, where such scripts were reviewed and approved by the committers before being merged. - -This is controlled by `Selective checks <04_selective_checks.md>`__ that set appropriate output in -the build-info job of the workflow (see`is-committer-build` to `true`) if the actor is in the committer's -list and can be overridden by `non committer build` label in the PR. +Currently there is no difference in practice, because we are not using `self-hosted` runners (until we implement the `Action +Runner Controller`). Once we do, for most of the jobs, committer builds will use "Self-hosted" runners by default, +while non-committer builds will use "Public" runners. For committers, this can be overridden by setting the +`use public runners` label in the PR. ## Changing behaviours of the CI runs by setting labels @@ -316,12 +326,13 @@ This table summarizes the labels you can use on PRs to control the selective che | debug ci resources | debug-ci-resources | If set, then debugging resources is enabled during parallel tests and you can see them. | | default versions only | all-versions, *-versions-* | If set, the number of Python and Kubernetes, DB versions are limited to the default ones.
| | disable image cache | docker-cache | If set, the image cache is disables when building the image. | +| force pip | force-pip | If set, the image build uses pip instead of uv. | | full tests needed | full-tests-needed | If set, complete set of tests are run | | include success outputs | include-success-outputs | If set, outputs of successful parallel tests are shown not only failed outputs. | | latest versions only | *-versions-*, *-versions-* | If set, the number of Python, Kubernetes, DB versions will be limited to the latest ones. | | non committer build | is-committer-build | If set, the scripts used for images are used from target branch for committers. | | upgrade to newer dependencies | upgrade-to-newer-dependencies | If set to true (default false) then dependencies in the CI image build are upgraded. | -| use public runners | runs-on-as-json-default | Force using public runners as default runners. | +| use public runners | runs-on-as-json-public | Force using public runners as default runners. | | use self-hosted runners | runs-on-as-json-default | Force using self-hosted runners as default runners. | ----- diff --git a/dev/breeze/doc/ci/05_workflows.md b/dev/breeze/doc/ci/05_workflows.md index b70a81ef64ce2..0c66505508f02 100644 --- a/dev/breeze/doc/ci/05_workflows.md +++ b/dev/breeze/doc/ci/05_workflows.md @@ -24,11 +24,8 @@ - [CI run types](#ci-run-types) - [Pull request run](#pull-request-run) - [Canary run](#canary-run) - - [Scheduled runs](#scheduled-runs) - [Workflows](#workflows) - - [Build Images Workflow](#build-images-workflow) - - [Differences for main and release branches](#differences-for-main-and-release-branches) - - [Committer vs. non-committer PRs](#committer-vs-non-committer-prs) + - [Differences for `main` and `v*-*-test` branches](#differences-for-main-and-v--test-branches) - [Tests Workflow](#tests-workflow) - [CodeQL scan](#codeql-scan) - [Publishing documentation](#publishing-documentation) @@ -86,214 +83,133 @@ run in the context of the "apache/airflow" repository and has WRITE access to the GitHub Container Registry. When the PR changes important files (for example `generated/provider_depdencies.json` or -`pyproject.toml`), the PR is run in "upgrade to newer dependencies" mode - where instead -of using constraints to build images, attempt is made to upgrade all dependencies to latest -versions and build images with them. This way we check how Airflow behaves when the +`pyproject.toml` or `hatch_build.py`), the PR is run in "upgrade to newer dependencies" mode - +where instead of using constraints to build images, attempt is made to upgrade +all dependencies to latest versions and build images with them. This way we check how Airflow behaves when the dependencies are upgraded. This can also be forced by setting the `upgrade to newer dependencies` label in the PR if you are a committer and want to force dependency upgrade. ## Canary run -This workflow is triggered when a pull request is merged into the "main" -branch or pushed to any of the "v2-\*-test" branches. The "Canary" run +This workflow is triggered when a pull request is merged into the `main` +branch or pushed to any of the `v*-*-test` branches. The `canary` run aims to upgrade dependencies to their latest versions and promptly pushes a preview of the CI/PROD image cache to the GitHub Registry. This allows pull requests to quickly utilize the new cache, which is particularly beneficial when the Dockerfile or installation scripts have been modified. 
Even if some tests fail, this cache will already include -the latest Dockerfile and scripts.Upon successful execution, the run +the latest Dockerfile and scripts. Upon successful execution, the run updates the constraint files in the "constraints-main" branch with the latest constraints and pushes both the cache and the latest CI/PROD images to the GitHub Registry. -If the "Canary" build fails, it often indicates that a new version of +If the `canary` build fails, it often indicates that a new version of our dependencies is incompatible with the current tests or Airflow code. Alternatively, it could mean that a breaking change has been merged into -"main". Both scenarios require prompt attention from the maintainers. +`main`. Both scenarios require prompt attention from the maintainers. While a "broken main" due to our code should be fixed quickly, "broken dependencies" may take longer to resolve. Until the tests pass, the constraints will not be updated, meaning that regular PRs will continue using the older version of dependencies that passed one of the previous -"Canary" runs. +`canary` runs. -## Scheduled runs - -The "scheduled" workflow, which is designed to run regularly (typically -overnight), is triggered when a scheduled run occurs. This workflow is -largely identical to the "Canary" run, with one key difference: the -image is always built from scratch, not from a cache. This approach -ensures that we can verify whether any "system" dependencies in the -Debian base image have changed, and confirm that the build process -remains reproducible. Since the process for a scheduled run mirrors that -of a "Canary" run, no separate diagram is necessary to illustrate it. +The `canary` runs are executed 6 times a day on schedule, you can also +trigger the `canary` run manually via `workflow-dispatch` mechanism. # Workflows -A general note about cancelling duplicated workflows: for the -`Build Images`, `Tests` and `CodeQL` workflows we use the `concurrency` -feature of GitHub actions to automatically cancel "old" workflow runs of -each type -- meaning if you push a new commit to a branch or to a pull -request and there is a workflow running, GitHub Actions will cancel the -old workflow run automatically. - -## Build Images Workflow - -This workflow builds images for the CI Workflow for Pull Requests coming -from forks. - -It's a special type of workflow: `pull_request_target` which means that -it is triggered when a pull request is opened. This also means that the -workflow has Write permission to push to the GitHub registry the images -used by CI jobs which means that the images can be built only once and -reused by all the CI jobs (including the matrix jobs). We've implemented -it so that the `Tests` workflow waits until the images are built by the -`Build Images` workflow before running. - -Those "Build Image" steps are skipped in case Pull Requests do not come -from "forks" (i.e. those are internal PRs for Apache Airflow repository. -This is because in case of PRs coming from Apache Airflow (only -committers can create those) the "pull_request" workflows have enough -permission to push images to GitHub Registry. - -This workflow is not triggered on normal pushes to our "main" branches, -i.e. after a pull request is merged and whenever `scheduled` run is -triggered. Again in this case the "CI" workflow has enough permissions -to push the images. In this case we simply do not run this workflow. 
- -The workflow has the following jobs: - -| Job | Description | -|-------------------|---------------------------------------------| -| Build Info | Prints detailed information about the build | -| Build CI images | Builds all configured CI images | -| Build PROD images | Builds all configured PROD images | - -The images are stored in the [GitHub Container -Registry](https://github.com/orgs/apache/packages?repo_name=airflow) and the names of those images follow the patterns -described in [Images](02_images.md#naming-conventions) +A general note about cancelling duplicated workflows: for `Tests` and `CodeQL` workflows, +we use the `concurrency` feature of GitHub actions to automatically cancel "old" workflow runs of +each type. This means that if you push a new commit to a branch or to a pull +request while a workflow is already running, GitHub Actions will automatically cancel the +old workflow run. -Image building is configured in "fail-fast" mode. When any of the images -fails to build, it cancels other builds and the source `Tests` workflow -run that triggered it. - -## Differences for main and release branches +## Differences for `main` and `v*-*-test` branches The type of tests executed varies depending on the version or branch -under test. For the "main" development branch, we run all tests to +being tested. For the "main" development branch, we run all tests to maintain the quality of Airflow. However, when releasing patch-level -updates on older branches, we only run a subset of these tests. This is -because older branches are exclusively used for releasing Airflow and -its corresponding image, not for releasing providers or helm charts. +updates on older branches, we only run a subset of tests. This is +because older branches are used exclusively for releasing Airflow and +its corresponding image, not for releasing providers or Helm charts, +so all those tests are skipped there by default. This behaviour is controlled by `default-branch` output of the -build-info job. Whenever we create a branch for old version we update +build-info job. Whenever we create a branch for an older version, we update the `AIRFLOW_BRANCH` in `airflow_breeze/branch_defaults.py` to point to -the new branch and there are a few places where selection of tests is -based on whether this output is `main`. They are marked as - in the -"Release branches" column of the table below. - -## Committer vs. non-committer PRs - -There is a difference in how the CI jobs are run for committer and non-committer PRs from forks. -Main reason is security - we do not want to run untrusted code on our infrastructure for self-hosted runners, -but also we do not want to run unverified code during the `Build imaage` workflow, because that workflow has -access to GITHUB_TOKEN that has access to write to the Github Registry of ours (which is used to cache -images between runs). Also those images are build on self-hosted runners and we have to make sure that -those runners are not used to (fore example) mine cryptocurrencies on behalf of the person who opened the -pull request from their newly opened fork of airflow. - -This is why the `Build Images` workflow checks if the actor of the PR (GITHUB_ACTOR) is one of the committers, -and if not, then workflows and scripts used to run image building are coming only from the ``target`` branch -of the repository, where such scripts were reviewed and approved by the committers before being merged. 
- -This is controlled by `Selective checks <04_selective_checks.md>`__ that set appropriate output in -the build-info job of the workflow (see`is-committer-build` to `true`) if the actor is in the committer's -list and can be overridden by `non committer build` label in the PR. - -Also, for most of the jobs, committer builds by default use "Self-hosted" runners, while non-committer -builds use "Public" runners. For committers, this can be overridden by setting the -`use public runners` label in the PR. +the new branch. In several places, the selection of tests is +based on whether this output is `main`. They are marked in the "Release branches" column of +the table below. ## Tests Workflow -This workflow is a regular workflow that performs all checks of Airflow -code. - -| Job | Description | PR | Canary | Scheduled | Release branches | -|---------------------------------|----------------------------------------------------------|----------|----------|------------|------------------| -| Build info | Prints detailed information about the build | Yes | Yes | Yes | Yes | -| Push early cache & images | Pushes early cache/images to GitHub Registry | | Yes | | | -| Check that image builds quickly | Checks that image builds quickly | | Yes | | Yes | -| Build CI images | Builds images in-workflow (not in the build images) | | Yes | Yes (1) | Yes (4) | -| Generate constraints/CI verify | Generate constraints for the build and verify CI image | Yes (2) | Yes (2) | Yes (2) | Yes (2) | -| Build PROD images | Builds images in-workflow (not in the build images) | | Yes | Yes (1) | Yes (4) | -| Run breeze tests | Run unit tests for Breeze | Yes | Yes | Yes | Yes | -| Test OpenAPI client gen | Tests if OpenAPIClient continues to generate | Yes | Yes | Yes | Yes | -| React WWW tests | React UI tests for new Airflow UI | Yes | Yes | Yes | Yes | -| Test examples image building | Tests if PROD image build examples work | Yes | Yes | Yes | Yes | -| Test git clone on Windows | Tests if Git clone for for Windows | Yes (5) | Yes (5) | Yes (5) | Yes (5) | -| Waits for CI Images | Waits for and verify CI Images | Yes (2) | Yes (2) | Yes (2) | Yes (2) | -| Upgrade checks | Performs checks if there are some pending upgrades | | Yes | Yes | Yes | -| Static checks | Performs full static checks | Yes (6) | Yes | Yes | Yes (7) | -| Basic static checks | Performs basic static checks (no image) | Yes (6) | | | | -| Build docs | Builds and tests publishing of the documentation | Yes | Yes (11) | Yes | Yes | -| Spellcheck docs | Spellcheck docs | Yes | Yes | Yes | Yes | -| Tests wheel provider packages | Tests if provider packages can be built and released | Yes | Yes | Yes | | -| Tests Airflow compatibility | Compatibility of provider packages with older Airflow | Yes | Yes | Yes | | -| Tests dist provider packages | Tests if dist provider packages can be built | | Yes | Yes | | -| Tests airflow release commands | Tests if airflow release command works | | Yes | Yes | | -| Tests (Backend/Python matrix) | Run the Pytest unit DB tests (Backend/Python matrix) | Yes | Yes | Yes | Yes (8) | -| No DB tests | Run the Pytest unit Non-DB tests (with pytest-xdist) | Yes | Yes | Yes | Yes (8) | -| Integration tests | Runs integration tests (Postgres/Mysql) | Yes | Yes | Yes | Yes (9) | -| Quarantined tests | Runs quarantined tests (with flakiness and side-effects) | Yes | Yes | Yes | Yes (8) | -| Test airflow packages | Tests that Airflow package can be built and released | Yes | Yes | Yes | Yes | -| Helm tests | Run the Helm 
integration tests | Yes | Yes | Yes | | -| Helm release tests | Run the tests for Helm releasing | Yes | Yes | Yes | | -| Summarize warnings | Summarizes warnings from all other tests | Yes | Yes | Yes | Yes | -| Wait for PROD Images | Waits for and verify PROD Images | Yes (2) | Yes (2) | Yes (2) | Yes (2) | -| Docker Compose test/PROD verify | Tests quick-start Docker Compose and verify PROD image | Yes | Yes | Yes | Yes | -| Tests Kubernetes | Run Kubernetes test | Yes | Yes | Yes | | -| Update constraints | Upgrade constraints to latest ones | Yes (3) | Yes (3) | Yes (3) | Yes (3) | -| Push cache & images | Pushes cache/images to GitHub Registry (3) | | Yes (3) | | Yes | -| Build CI ARM images | Builds CI images for ARM | Yes (10) | | Yes | | +This workflow is a regular workflow that performs all checks of Airflow code. The `main` and `v*-*-test` +pushes are `canary` runs. + +| Job | Description | PR | main | v*-*-test | +|---------------------------------|----------------------------------------------------------|---------|---------|-----------| +| Build info | Prints detailed information about the build | Yes | Yes | Yes | +| Push early cache & images | Pushes early cache/images to GitHub Registry | | Yes (2) | Yes (2) | +| Check that image builds quickly | Checks that image builds quickly | | Yes | Yes | +| Build CI images | Builds images | Yes | Yes | Yes | +| Generate constraints/CI verify | Generate constraints for the build and verify CI image | Yes | Yes | Yes | +| Build PROD images | Builds images | Yes | Yes | Yes (3) | +| Run breeze tests | Run unit tests for Breeze | Yes | Yes | Yes | +| Test OpenAPI client gen | Tests if OpenAPIClient continues to generate | Yes | Yes | Yes | +| React WWW tests | React UI tests for new Airflow UI | Yes | Yes | Yes | +| Test examples image building | Tests if PROD image build examples work | Yes | Yes | Yes | +| Test git clone on Windows | Tests if Git clone for for Windows | Yes (4) | Yes (4) | Yes (4) | +| Upgrade checks | Performs checks if there are some pending upgrades | | Yes | Yes | +| Static checks | Performs full static checks | Yes (5) | Yes | Yes (6) | +| Basic static checks | Performs basic static checks (no image) | Yes (5) | | | +| Build and publish docs | Builds and tests publishing of the documentation | Yes (8) | Yes (8) | Yes (8) | +| Spellcheck docs | Spellcheck docs | Yes | Yes | Yes (7) | +| Tests wheel provider packages | Tests if provider packages can be built and released | Yes | Yes | | +| Tests Airflow compatibility | Compatibility of provider packages with older Airflow | Yes | Yes | | +| Tests dist provider packages | Tests if dist provider packages can be built | | Yes | | +| Tests airflow release commands | Tests if airflow release command works | | Yes | Yes | +| DB tests matrix | Run the Pytest unit DB tests | Yes | Yes | Yes (7) | +| No DB tests | Run the Pytest unit Non-DB tests (with pytest-xdist) | Yes | Yes | Yes (7) | +| Integration tests | Runs integration tests (Postgres/Mysql) | Yes | Yes | Yes (7) | +| Quarantined tests | Runs quarantined tests (with flakiness and side-effects) | Yes | Yes | Yes (7) | +| Test airflow packages | Tests that Airflow package can be built and released | Yes | Yes | Yes | +| Helm tests | Run the Helm integration tests | Yes | Yes | | +| Helm release tests | Run the tests for Helm releasing | Yes | Yes | | +| Summarize warnings | Summarizes warnings from all other tests | Yes | Yes | Yes | +| Docker Compose test/PROD verify | Tests quick-start Docker Compose and verify 
PROD image | Yes | Yes | Yes | +| Tests Kubernetes | Run Kubernetes test | Yes | Yes | | +| Update constraints | Upgrade constraints to latest ones | Yes | Yes (2) | Yes (2) | +| Push cache & images | Pushes cache/images to GitHub Registry (3) | | Yes (3) | | +| Build CI ARM images | Builds CI images for ARM | Yes (9) | | | `(1)` Scheduled jobs builds images from scratch - to test if everything works properly for clean builds -`(2)` The jobs wait for CI images to be available. It only actually runs when build image is needed (in -case of simpler PRs that do not change dependencies or source code, -images are not build) - -`(3)` PROD and CI cache & images are pushed as "cache" (both AMD and -ARM) and "latest" (only AMD) to GitHub Container registry and +`(2)` PROD and CI cache & images are pushed as "cache" (both AMD and +ARM) and "latest" (only AMD) to GitHub Container Registry and constraints are upgraded only if all tests are successful. The images are rebuilt in this step using constraints pushed in the previous step. -Constraints are only actually pushed in the `canary/scheduled` runs. +Constraints are only actually pushed in the `canary` runs. -`(4)` In main, PROD image uses locally build providers using "latest" +`(3)` In main, PROD image uses locally build providers using "latest" version of the provider code. In the non-main version of the build, the latest released providers from PyPI are used. -`(5)` Always run with public runners to test if Git clone works on +`(4)` Always run with public runners to test if Git clone works on Windows. -`(6)` Run full set of static checks when selective-checks determine that +`(5)` Run full set of static checks when selective-checks determine that they are needed (basically, when Python code has been modified). -`(7)` On non-main builds some of the static checks that are related to +`(6)` On non-main builds some of the static checks that are related to Providers are skipped via selective checks (`skip-pre-commits` check). -`(8)` On non-main builds the unit tests for providers are skipped via -selective checks removing the "Providers" test type. - -`(9)` On non-main builds the integration tests for providers are skipped -via `skip-provider-tests` selective check output. +`(7)` On non-main builds the unit tests, docs and integration tests +for providers are skipped via selective checks. -`(10)` Only run the builds in case PR is run by a committer from -"apache" repository and in scheduled build. +`(8)` Docs publishing is only done in Canary run. -`(11)` Docs publishing is only done in Canary run, to handle the case where -cloning whole airflow site on Public Runner cannot complete due to the size of the repository. +`(9)` ARM images are not currently built - until we have ARM runners available. ## CodeQL scan @@ -303,8 +219,7 @@ violations. It is run for JavaScript and Python code. ## Publishing documentation -Documentation from the `main` branch is automatically published on -Amazon S3. +Documentation from the `main` branch is automatically published on Amazon S3. 
To make this possible, GitHub Action has secrets set up with credentials for an Amazon Web Service account - `DOCS_AWS_ACCESS_KEY_ID` and @@ -321,4 +236,4 @@ Website endpoint: ----- -Read next about [Diagrams](06_diagrams.md) +Read next about [Debugging CI builds](06_debugging.md) diff --git a/dev/breeze/doc/ci/06_debugging.md b/dev/breeze/doc/ci/06_debugging.md new file mode 100644 index 0000000000000..8d030034728c7 --- /dev/null +++ b/dev/breeze/doc/ci/06_debugging.md @@ -0,0 +1,64 @@ + + + + +**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* + +- [Debugging CI Jobs in Github Actions and changing their behaviour](#debugging-ci-jobs-in-github-actions-and-changing-their-behaviour) + + + +# Debugging CI Jobs in Github Actions and changing their behaviour + +The CI jobs are notoriously difficult to test, because you can only +really see their results when you run them in the CI environment, and the +environment in which they run depends on who runs them (they might run +either on our Self-Hosted runners (with 64 GB RAM, 8 CPUs) or on the +GitHub Public runners (6 GB of RAM, 2 CPUs)), and the results will vastly +differ depending on which environment is used. We are utilizing +parallelism to make use of all the available CPU/Memory, but sometimes +you need to enable debugging and force certain environments. + +There are several ways you can debug the CI jobs and modify their +behaviour when you are a maintainer. + +When you create the PR you can set one of the labels below. In some cases, +you also need to run the PR as coming from the "apache" +repository rather than from your fork. + +You can also apply the label later and rebase the PR, or close and reopen +the PR, for the label to take effect. + +| Action to perform | Label to set | PR from "apache" repo | |------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------|:---------------------:| +| Run the build with all combinations of all<br>python, backends, kubernetes etc on PR,<br>and run all types of tests for all test<br>groups. | full tests needed | | +| Force to use public runners for the build | use public runners | | +| Debug resources used during the build for<br>parallel jobs | debug ci resources | | +| Force running PR on latest versions of<br>python, backends, kubernetes etc. when you<br>want to save resources and test only latest<br>versions | latest versions only | | +| Force running PR on minimal (default)<br>versions of python, backends, kubernetes etc.<br>in order to save resources and run tests only<br>for minimum versions | default versions only | | +| Make sure to clean dependency cache<br>usually when removing dependencies<br>You also need to increase<br>`DEPENDENCIES_EPOCH_NUMBER` in `Dockerfile.ci` | disable image cache | | +| Change build images workflows, breeze code or<br>scripts that are used during image build<br>so that the scripts can be modified by PR<br> | | Yes | +| Treat your build as "canary" build - including<br>updating constraints and pushing "main"<br>documentation. | | Yes | +| Remove any behaviour specific for the committers
such as using different runners by default. | non committer build | | + + +----- + +Read next about [Running CI locally](07_running_ci_locally.md) diff --git a/dev/breeze/doc/ci/06_diagrams.md b/dev/breeze/doc/ci/06_diagrams.md deleted file mode 100644 index afe51a309e8eb..0000000000000 --- a/dev/breeze/doc/ci/06_diagrams.md +++ /dev/null @@ -1,466 +0,0 @@ - - - - -**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* - -- [CI Sequence diagrams](#ci-sequence-diagrams) - - [Pull request flow from fork](#pull-request-flow-from-fork) - - [Pull request flow from "apache/airflow" repo](#pull-request-flow-from-apacheairflow-repo) - - [Merge "Canary" run](#merge-canary-run) - - [Scheduled run](#scheduled-run) - - - -# CI Sequence diagrams - -You can see here the sequence diagrams of the flow happening during the CI Jobs. - -## Pull request flow from fork - -This is the flow that happens when a pull request is created from a fork - which is the most frequent -pull request flow that happens in Airflow. The "pull_request" workflow does not have write access -to the GitHub Registry, so it cannot push the CI/PROD images there. Instead, we push the images -from the "pull_request_target" workflow, which has write access to the GitHub Registry. Note that -this workflow always uses scripts and workflows from the "target" branch of the "apache/airflow" -repository, so the user submitting such pull request cannot override our build scripts and inject malicious -code into the workflow that has potentially write access to the GitHub Registry (and can override cache). - -Security is the main reason why we have two workflows for pull requests and such complex workflows. - -```mermaid -sequenceDiagram - Note over Airflow Repo: pull request - Note over Tests: pull_request
[Read Token] - Note over Build Images: pull_request_target
[Write Token] - activate Airflow Repo - Airflow Repo -->> Tests: Trigger 'pull_request' - activate Tests - Tests -->> Build Images: Trigger 'pull_request_target' - activate Build Images - Note over Tests: Build info - Note over Tests: Selective checks
Decide what to do - Note over Build Images: Build info - Note over Build Images: Selective checks
Decide what to do - Note over Tests: Skip Build
(Runs in 'Build Images')
CI Images - Note over Tests: Skip Build
(Runs in 'Build Images')
PROD Images - par - GitHub Registry ->> Build Images: Use cache from registry - Airflow Repo ->> Build Images: Use constraints from `constraints-BRANCH` - Note over Build Images: Build CI Images
[COMMIT_SHA]
Upgrade to newer dependencies if deps changed - Build Images ->> GitHub Registry: Push CI Images
[COMMIT_SHA] - Build Images ->> Artifacts: Upload source constraints - and - Note over Tests: OpenAPI client gen - and - Note over Tests: React WWW tests - and - Note over Tests: Test git clone on Windows - and - Note over Tests: Helm release tests - and - opt - Note over Tests: Run basic
static checks - end - end - loop Wait for CI images - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - end - par - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Verify CI Images
[COMMIT_SHA] - Note over Tests: Generate constraints
source,pypi,no-providers - Tests ->> Artifacts: Upload source,pypi,no-providers constraints - and - Artifacts ->> Build Images: Download source constraints - GitHub Registry ->> Build Images: Use cache from registry - Note over Build Images: Build PROD Images
[COMMIT_SHA] - Build Images ->> GitHub Registry: Push PROD Images
[COMMIT_SHA] - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Run static checks - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build docs - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Spellcheck docs - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/DB matrix - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/Non-DB matrix - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Integration Tests - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Quarantined Tests - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build/test provider packages
wheel, sdist, old airflow - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Test airflow
release commands - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Helm tests - end - end - par - Note over Tests: Summarize Warnings - and - opt - Artifacts ->> Tests: Download source,pypi,no-providers constraints - Note over Tests: Display constraints diff - end - and - opt - loop Wait for PROD images - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - end - end - and - opt - Note over Tests: Build ARM CI images - end - end - par - opt - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Test examples
PROD image building - end - and - opt - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Run Kubernetes
tests - end - and - opt - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Verify PROD Images
[COMMIT_SHA] - Note over Tests: Run docker-compose
tests - end - end - Tests -->> Airflow Repo: Status update - deactivate Airflow Repo - deactivate Tests -``` - -## Pull request flow from "apache/airflow" repo - -The difference between this flow and the previous one is that the CI/PROD images are built in the -CI workflow and pushed to the GitHub Registry from there. This cannot be done in case of fork -pull request, because Pull Request from forks cannot have "write" access to GitHub Registry. All the steps -except "Build Info" from the "Build Images" workflows are skipped in this case. - -THis workflow can be used by maintainers in case they have a Pull Request that changes the scripts and -CI workflows used to build images, because in this case the "Build Images" workflow will use them -from the Pull Request. This is safe, because the Pull Request is from the "apache/airflow" repository -and only maintainers can push to that repository and create Pull Requests from it. - -```mermaid -sequenceDiagram - Note over Airflow Repo: pull request - Note over Tests: pull_request
[Write Token] - Note over Build Images: pull_request_target
[Unused Token] - activate Airflow Repo - Airflow Repo -->> Tests: Trigger 'pull_request' - activate Tests - Tests -->> Build Images: Trigger 'pull_request_target' - activate Build Images - Note over Tests: Build info - Note over Tests: Selective checks
Decide what to do - Note over Build Images: Build info - Note over Build Images: Selective checks
Decide what to do - Note over Build Images: Skip Build
(Runs in 'Tests')
CI Images - Note over Build Images: Skip Build
(Runs in 'Tests')
PROD Images - deactivate Build Images - Note over Tests: Build info - Note over Tests: Selective checks
Decide what to do - par - GitHub Registry ->> Tests: Use cache from registry - Airflow Repo ->> Tests: Use constraints from `constraints-BRANCH` - Note over Tests: Build CI Images
[COMMIT_SHA]
Upgrade to newer dependencies if deps changed - Tests ->> GitHub Registry: Push CI Images
[COMMIT_SHA] - Tests ->> Artifacts: Upload source constraints - and - Note over Tests: OpenAPI client gen - and - Note over Tests: React WWW tests - and - Note over Tests: Test examples
PROD image building - and - Note over Tests: Test git clone on Windows - and - Note over Tests: Helm release tests - and - opt - Note over Tests: Run basic
static checks - end - end - Note over Tests: Skip waiting for CI images - par - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Verify CI Images
[COMMIT_SHA] - Note over Tests: Generate constraints
source,pypi,no-providers - Tests ->> Artifacts: Upload source,pypi,no-providers constraints - and - Artifacts ->> Tests: Download source constraints - GitHub Registry ->> Tests: Use cache from registry - Note over Tests: Build PROD Images
[COMMIT_SHA] - Tests ->> GitHub Registry: Push PROD Images
[COMMIT_SHA] - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Run static checks - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build docs - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Spellcheck docs - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/DB matrix - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/Non-DB matrix - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Integration Tests - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Quarantined Tests - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build/test provider packages
wheel, sdist, old airflow - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Test airflow
release commands - end - and - opt - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Helm tests - end - end - Note over Tests: Skip waiting for PROD images - par - Note over Tests: Summarize Warnings - and - opt - Artifacts ->> Tests: Download source,pypi,no-providers constraints - Note over Tests: Display constraints diff - end - and - Note over Tests: Build ARM CI images - and - opt - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Run Kubernetes
tests - end - and - opt - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Verify PROD Images
[COMMIT_SHA] - Note over Tests: Run docker-compose
tests - end - end - Tests -->> Airflow Repo: Status update - deactivate Airflow Repo - deactivate Tests -``` - -## Merge "Canary" run - -This is the flow that happens when a pull request is merged to the "main" branch or pushed to any of -the "v2-*-test" branches. The "Canary" run attempts to upgrade dependencies to the latest versions -and quickly pushes an early cache the CI/PROD images to the GitHub Registry - so that pull requests -can quickly use the new cache - this is useful when Dockerfile or installation scripts change because such -cache will already have the latest Dockerfile and scripts pushed even if some tests will fail. -When successful, the run updates the constraints files in the "constraints-BRANCH" branch with the latest -constraints and pushes both cache and latest CI/PROD images to the GitHub Registry. - -```mermaid -sequenceDiagram - Note over Airflow Repo: push/merge - Note over Tests: push
[Write Token] - activate Airflow Repo - Airflow Repo -->> Tests: Trigger 'push' - activate Tests - Note over Tests: Build info - Note over Tests: Selective checks
Decide what to do - par - GitHub Registry ->> Tests: Use cache from registry
(Not for scheduled run) - Airflow Repo ->> Tests: Use constraints from `constraints-BRANCH` - Note over Tests: Build CI Images
[COMMIT_SHA]
Always upgrade to newer deps - Tests ->> GitHub Registry: Push CI Images
[COMMIT_SHA] - Tests ->> Artifacts: Upload source constraints - and - GitHub Registry ->> Tests: Use cache from registry
(Not for scheduled run) - Note over Tests: Check that image builds quickly - and - GitHub Registry ->> Tests: Use cache from registry
(Not for scheduled run) - Note over Tests: Push early CI Image cache - Tests ->> GitHub Registry: Push CI cache Images - and - Note over Tests: OpenAPI client gen - and - Note over Tests: React WWW tests - and - Note over Tests: Test git clone on Windows - and - Note over Tests: Run upgrade checks - end - Note over Tests: Skip waiting for CI images - par - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Verify CI Images
[COMMIT_SHA] - Note over Tests: Generate constraints
source,pypi,no-providers - Tests ->> Artifacts: Upload source,pypi,no-providers constraints - and - Artifacts ->> Tests: Download source constraints - GitHub Registry ->> Tests: Use cache from registry - Note over Tests: Build PROD Images
[COMMIT_SHA] - Tests ->> GitHub Registry: Push PROD Images
[COMMIT_SHA] - and - Artifacts ->> Tests: Download source constraints - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Run static checks - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build docs - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Spellcheck docs - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/DB matrix - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Unit Tests
Python/Non-DB matrix - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Integration Tests - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Quarantined Tests - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Build/test provider packages
wheel, sdist, old airflow - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Test airflow
release commands - and - GitHub Registry ->> Tests: Pull CI Images
[COMMIT_SHA] - Note over Tests: Helm tests - end - Note over Tests: Skip waiting for PROD images - par - Note over Tests: Summarize Warnings - and - Artifacts ->> Tests: Download source,pypi,no-providers constraints - Note over Tests: Display constraints diff - Tests ->> Airflow Repo: Push constraints if changed to 'constraints-BRANCH' - and - GitHub Registry ->> Tests: Pull PROD Images
[COMMIT_SHA] - Note over Tests: Test examples
PROD image building - and - GitHub Registry ->> Tests: Pull PROD Image
[COMMIT_SHA] - Note over Tests: Run Kubernetes
tests - and - GitHub Registry ->> Tests: Pull PROD Image
[COMMIT_SHA] - Note over Tests: Verify PROD Images
[COMMIT_SHA] - Note over Tests: Run docker-compose
tests - end - par - GitHub Registry ->> Tests: Use cache from registry - Airflow Repo ->> Tests: Get latest constraints from 'constraints-BRANCH' - Note over Tests: Build CI latest images/cache - Tests ->> GitHub Registry: Push CI latest images/cache - GitHub Registry ->> Tests: Use cache from registry - Airflow Repo ->> Tests: Get latest constraints from 'constraints-BRANCH' - Note over Tests: Build PROD latest images/cache - Tests ->> GitHub Registry: Push PROD latest images/cache - and - GitHub Registry ->> Tests: Use cache from registry - Airflow Repo ->> Tests: Get latest constraints from 'constraints-BRANCH' - Note over Tests: Build ARM CI cache - Tests ->> GitHub Registry: Push ARM CI cache - GitHub Registry ->> Tests: Use cache from registry - Airflow Repo ->> Tests: Get latest constraints from 'constraints-BRANCH' - Note over Tests: Build ARM PROD cache - Tests ->> GitHub Registry: Push ARM PROD cache - end - Tests -->> Airflow Repo: Status update - deactivate Airflow Repo - deactivate Tests -``` - -## Scheduled run - -This is the flow that happens when a scheduled run is triggered. The "scheduled" workflow is aimed to -run regularly (overnight) even if no new PRs are merged to "main". Scheduled run is generally the -same as "Canary" run, with the difference that the image used to run the tests is built without using -cache - it's always built from the scratch. This way we can check that no "system" dependencies in debian -base image have changed and that the build is still reproducible. No separate diagram is needed for -scheduled run as it is identical to that of "Canary" run. - ------ - -Read next about [Debugging](07_debugging.md) diff --git a/dev/breeze/doc/ci/07_debugging.md b/dev/breeze/doc/ci/07_debugging.md deleted file mode 100644 index 6e6d46584edfa..0000000000000 --- a/dev/breeze/doc/ci/07_debugging.md +++ /dev/null @@ -1,88 +0,0 @@ - - - - -**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* - -- [Debugging CI Jobs in Github Actions](#debugging-ci-jobs-in-github-actions) - - - -# Debugging CI Jobs in Github Actions - -The CI jobs are notoriously difficult to test, because you can only -really see results of it when you run them in CI environment, and the -environment in which they run depend on who runs them (they might be -either run in our Self-Hosted runners (with 64 GB RAM 8 CPUs) or in the -GitHub Public runners (6 GB of RAM, 2 CPUs) and the results will vastly -differ depending on which environment is used. We are utilizing -parallelism to make use of all the available CPU/Memory but sometimes -you need to enable debugging and force certain environments. Additional -difficulty is that `Build Images` workflow is `pull-request-target` -type, which means that it will always run using the `main` version - no -matter what is in your Pull Request. - -There are several ways how you can debug the CI jobs when you are -maintainer. - -- When you want to tests the build with all combinations of all python, - backends etc on regular PR, add `full tests needed` label to the PR. -- When you want to test maintainer PR using public runners, add - `public runners` label to the PR -- When you want to see resources used by the run, add - `debug ci resources` label to the PR -- When you want to test changes to breeze that include changes to how - images are build you should push your PR to `apache` repository not to - your fork. 
This will run the images as part of the `CI` workflow - rather than using `Build images` workflow and use the same breeze - version for building image and testing -- When you want to test changes to workflows and CI scripts you can set - `all versions` label to the PR or `latest versions only`. - This will make the PR run using "all" versions of - Python, Kubernetes and the DBS. By default - unless you also change - dependencies in `pyproject.toml` or `generated/provider_dependencies.json` - such PRs will only use "default" versions of Python, Kubernetes and - DBs. This is useful when you want to test changes to the CI scripts - are not affected by the versions of Python, Kubernetes and DBs. -- Even if you change dependencies in `pyproject.toml`, or - `generated/provider_dependencies.json`, when you want to test changes to workflows - and CI scripts you can set `default versions only` label to the - This will make the PR run using the default (or latest) versions of - Python and Kubernetes and DBs. This is useful when you want to test - changes to the CI scripts and workflows and you want to use far - less resources than the full tests. -- When you want to test changes to `build-images.yml` workflow you - should push your branch as `main` branch in your local fork. This will - run changed `build-images.yml` workflow as it will be in `main` branch - of your fork -- When you are a committer and you change build images workflow, together - with build scripts, your build might fail because your scripts are used - in `build-images.yml` workflow, but the workflow is run using the `main` - version. Setting `non committer build` label will make your PR run using - the main version of the scripts and the workflow -- When you are a committer want to test how changes in your workflow affect - `canary` run, as maintainer, you should push your PR to `apache` repository - not to your fork and set `canary` label to the PR -- When you are a committer and want to test if the tests are passing if the - image is freshly built without cache, you can set `disable image cache` label. - ------ - -Read next about [Running CI locally](08_running_ci_locally.md) diff --git a/dev/breeze/doc/ci/07_running_ci_locally.md b/dev/breeze/doc/ci/07_running_ci_locally.md new file mode 100644 index 0000000000000..5f414667b0151 --- /dev/null +++ b/dev/breeze/doc/ci/07_running_ci_locally.md @@ -0,0 +1,187 @@ + + + + +**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* + +- [Running the CI Jobs locally](#running-the-ci-jobs-locally) +- [Getting the CI image from failing job](#getting-the-ci-image-from-failing-job) +- [Options and environment variables used](#options-and-environment-variables-used) + - [Basic variables](#basic-variables) + - [Test variables](#test-variables) + - [In-container environment initialization](#in-container-environment-initialization) + - [Host & GIT variables](#host--git-variables) + + + +# Running the CI Jobs locally + +The main goal of the CI philosophy we have that no matter how complex +the test and integration infrastructure, as a developer you should be +able to reproduce and re-run any of the failed checks locally. One part +of it are pre-commit checks, that allow you to run the same static +checks in CI and locally, but another part is the CI environment which +is replicated locally with Breeze. 
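+
+For the pre-commit part, a minimal sketch (assuming you have `pre-commit` installed locally - this is only an illustration, not a required workflow) is to run the same static checks locally that the CI static-checks job runs:
+
+```bash
+# Run all configured static checks against all files, as the CI static checks do
+pre-commit run --all-files
+
+# Or run a single hook only, for example the "flynt" hook
+pre-commit run flynt --all-files
+```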
+ +You can read more about Breeze in +[README.rst](../README.rst) but in essence it is a Python wrapper around +docker commands that allows you (among other things) to re-create the CI environment +in your local development instance and interact with it. +In its basic form, when you do development you can run all the same +tests that will be run in CI - but +locally, before you submit them as a PR. Another use case where Breeze is +useful is when tests fail on CI. + +All our CI jobs are executed via `breeze` commands. You can replicate +exactly what our CI is doing by running the sequence of corresponding +`breeze` commands. Make sure however that you look at both: + +- flags passed to `breeze` commands +- environment variables used when the `breeze` command is run - this is + useful when we want to set a common flag for all `breeze` commands in + the same job or even the whole workflow. For example the `VERBOSE` + variable is set to `true` for all our workflows so that more detailed + information about internal commands executed in CI is printed. + +In the output of the CI jobs, you will find both - the flags passed and +environment variables set. + +# Getting the CI image from failing job + +Every contributor can also pull and run images that are the result of a specific +CI run in GitHub Actions. This is a powerful tool that allows you to +reproduce CI failures locally, enter the images and fix them much +faster. + +Note that this currently only works for AMD machines, not for ARM machines, but +this will change soon. + +To load the image from a specific PR, you can use the following command: + +```bash +breeze ci-image load --from-pr 12345 --python 3.9 --github-token +``` + +To load the image from a specific run (for example 12538475388), +you can use the following command - you can find the run id in the GitHub Actions runs. + +```bash +breeze ci-image load --from-run 12538475388 --python 3.9 --github-token +``` + +After you load the image, you can reproduce the exact environment that was used in the CI run by +entering the breeze container without mounting your local sources: + +```bash +breeze shell --mount-sources skip [OPTIONS] +``` + +And you should be able to run any tests and commands interactively in the exact environment that +was used in the failing CI run, even without checking out the sources of the failing PR. +This is a powerful tool to debug and fix CI issues. + +You can also build the image locally by checking out the branch of the PR that was used and running: + +```bash +breeze ci-image build +``` + +You have to be aware that some of the PRs and canary builds use the `--upgrade-to-newer-dependencies` flag +(`UPGRADE_TO_NEWER_DEPENDENCIES` environment variable set to `true`) and they are not using constraints +to build the image, so if you want to build it locally, you should pass the `--upgrade-to-newer-dependencies` +flag when you are building the image. + +Note however, that if constraints changed for regular builds and if someone released a new package in PyPI +since the build was run (which is very likely - we have many packages released a day), the image you +build locally might be different from the one in CI. That's why loading the image using `breeze ci-image load` +is a more reliable way to reproduce the CI build.
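+
+As a quick illustration of the above (a minimal sketch - the exact Python version and any additional flags depend on the job you are reproducing), such a local rebuild could look like this:
+
+```bash
+# Build the CI image without constraints, upgrading all dependencies to their
+# latest versions - mirrors builds that run with UPGRADE_TO_NEWER_DEPENDENCIES=true
+breeze ci-image build --python 3.9 --upgrade-to-newer-dependencies
+```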
+ +If you check out the branch of the PR that was used, regular ``breeze`` commands will +also reproduce the CI environment without having to rebuild the image - for example when dependencies +changed or when new dependencies were released and used in the CI job - and you will +be able to edit source files locally as usual and use your IDE and the tools you usually use to develop Airflow. + +In order to reproduce the exact job you also need to set the "[OPTIONS]" corresponding to the particular +job you want to reproduce within the run. You can find those in the logs of the CI job. Note that some +of the options can be passed via `--flags` and some via environment variables, for convenience, so you should +take a look at both if you want to be sure to reproduce the exact job configuration. See the next chapter +for a summary of the most important environment variables and options used in the CI jobs. + +You can read more about it in [Breeze](../README.rst) and [Testing](../../../../contributing-docs/09_testing.rst) + +# Options and environment variables used + +Depending on whether the scripts are run locally via [Breeze](../README.rst) or in the +`Build Images` or `Tests` workflows, they can behave differently. + +You can use those variables when you try to reproduce the build locally - alternatively you can pass +those via the corresponding command line flags passed to the `breeze shell` command. + +## Basic variables + +Those variables control the basic configuration and behaviour of the breeze command. + +| Variable | Option | Local dev | CI | Comment | +|----------------------------|--------------------------|-----------|------|------------------------------------------------------------------------------| +| PYTHON_MAJOR_MINOR_VERSION | --python | | | Major/Minor version of Python used. | +| BACKEND | --backend | | | Backend used in the tests. | +| INTEGRATION | --integration | | | Integration used in tests. | +| DB_RESET | --db-reset/--no-db-reset | false | true | Determines whether database should be reset at the container entry. | +| ANSWER | --answer | | yes | This variable determines if the answer to questions should be automatically set. | + +## Test variables + +Those variables are used to control the test execution. + +| Variable | Option | Local dev | CI | Comment | +|-------------------|---------------------|-----------|----------------------|-------------------------------------------| +| RUN_DB_TESTS_ONLY | --run-db-tests-only | | true in db tests | Whether only db tests should be executed. | +| SKIP_DB_TESTS | --skip-db-tests | | true in non-db tests | Whether db tests should be skipped. | + + +## In-container environment initialization + +Those variables are used to control the initialization of the environment in the container. + +| Variable | Option | Local dev | CI | Comment | +|---------------------------------|------------------------------------|-----------|-----------|-----------------------------------------------------------------------------| +| MOUNT_SOURCES | --mount-sources | | skip | Whether to mount the local sources into the container. | +| SKIP_ENVIRONMENT_INITIALIZATION | --skip-environment-initialization | false (*) | false (*) | Skip initialization of test environment (*) set to true in pre-commits. | +| SKIP_IMAGE_UPGRADE_CHECK | --skip-image-upgrade-check | false (*) | false (*) | Skip checking if image should be upgraded (*) set to true in pre-commits. |
+| SKIP_PROVIDERS_TESTS | | false | false | Skip running provider integration tests (in non-main branch). | +| SKIP_SSH_SETUP | | false | false (*) | Skip setting up SSH server for tests. (*) set to true in GitHub CodeSpaces. | +| VERBOSE_COMMANDS | | false | false | Whether every command executed in docker should be printed. | + +## Host & GIT variables + +Those variables are automatically set by Breeze when running the commands locally, but you can override them +if you want to run the commands in a different environment. + +| Variable | Local dev | CI | Comment | +|---------------|-----------|------------|----------------------------------------| +| HOST_USER_ID | Host UID | | User id of the host user. | +| HOST_GROUP_ID | Host GID | | Group id of the host user. | +| HOST_OS | | linux | OS of the Host (darwin/linux/windows). | +| COMMIT_SHA | | GITHUB_SHA | SHA of the commit for which the build is run. | + + +---- + +**Thank you** for reading this far. We hope that you have learned a lot about reproducing Airflow's CI jobs locally and CI in general. diff --git a/dev/breeze/doc/ci/08_running_ci_locally.md b/dev/breeze/doc/ci/08_running_ci_locally.md deleted file mode 100644 index 6e1cbb0917536..0000000000000 --- a/dev/breeze/doc/ci/08_running_ci_locally.md +++ /dev/null @@ -1,141 +0,0 @@ - - - - -**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)* - -- [Running the CI Jobs locally](#running-the-ci-jobs-locally) -- [Upgrade to newer dependencies](#upgrade-to-newer-dependencies) - - - -# Running the CI Jobs locally - -The main goal of the CI philosophy we have that no matter how complex -the test and integration infrastructure, as a developer you should be -able to reproduce and re-run any of the failed checks locally. One part -of it are pre-commit checks, that allow you to run the same static -checks in CI and locally, but another part is the CI environment which -is replicated locally with Breeze. - -You can read more about Breeze in -[README.rst](../README.rst) but in essence it is a script -that allows you to re-create CI environment in your local development -instance and interact with it. In its basic form, when you do -development you can run all the same tests that will be run in CI - but -locally, before you submit them as PR. Another use case where Breeze is -useful is when tests fail on CI. You can take the full `COMMIT_SHA` of -the failed build pass it as `--image-tag` parameter of Breeze and it -will download the very same version of image that was used in CI and run -it locally. This way, you can very easily reproduce any failed test that -happens in CI - even if you do not check out the sources connected with -the run. - -All our CI jobs are executed via `breeze` commands. You can replicate -exactly what our CI is doing by running the sequence of corresponding -`breeze` command. Make sure however that you look at both: - -- flags passed to `breeze` commands -- environment variables used when `breeze` command is run - this is - useful when we want to set a common flag for all `breeze` commands in - the same job or even the whole workflow. For example `VERBOSE` - variable is set to `true` for all our workflows so that more detailed - information about internal commands executed in CI is printed. - -In the output of the CI jobs, you will find both - the flags passed and -environment variables set.
- -You can read more about it in [Breeze](../README.rst) and -[Testing](contributing-docs/09_testing.rst) - -Since we store images from every CI run, you should be able easily -reproduce any of the CI tests problems locally. You can do it by pulling -and using the right image and running it with the right docker command, -For example knowing that the CI job was for commit -`cd27124534b46c9688a1d89e75fcd137ab5137e3`: - -``` bash -docker pull ghcr.io/apache/airflow/main/ci/python3.8:cd27124534b46c9688a1d89e75fcd137ab5137e3 - -docker run -it ghcr.io/apache/airflow/main/ci/python3.8:cd27124534b46c9688a1d89e75fcd137ab5137e3 -``` - -But you usually need to pass more variables and complex setup if you -want to connect to a database or enable some integrations. Therefore it -is easiest to use [Breeze](../README.rst) for that. For -example if you need to reproduce a MySQL environment in python 3.8 -environment you can run: - -``` bash -breeze --image-tag cd27124534b46c9688a1d89e75fcd137ab5137e3 --python 3.8 --backend mysql -``` - -You will be dropped into a shell with the exact version that was used -during the CI run and you will be able to run pytest tests manually, -easily reproducing the environment that was used in CI. Note that in -this case, you do not need to checkout the sources that were used for -that run - they are already part of the image - but remember that any -changes you make in those sources are lost when you leave the image as -the sources are not mapped from your host machine. - -Depending whether the scripts are run locally via -[Breeze](../README.rst) or whether they are run in -`Build Images` or `Tests` workflows they can take different values. - -You can use those variables when you try to reproduce the build locally -(alternatively you can pass those via corresponding command line flags -passed to `breeze shell` command. - -| Variable | Local development | Build Images workflow | CI Workflow | Comment | -|-----------------------------------------|--------------------|------------------------|--------------|--------------------------------------------------------------------------------| -| Basic variables | | | | | -| PYTHON_MAJOR_MINOR_VERSION | | | | Major/Minor version of Python used. | -| DB_RESET | false | true | true | Determines whether database should be reset at the container entry. | -| Forcing answer | | | | | -| ANSWER | | yes | yes | This variable determines if answer to questions should be automatically given. | -| Host variables | | | | | -| HOST_USER_ID | | | | User id of the host user. | -| HOST_GROUP_ID | | | | Group id of the host user. | -| HOST_OS | | linux | linux | OS of the Host (darwin/linux/windows). | -| Git variables | | | | | -| COMMIT_SHA | | GITHUB_SHA | GITHUB_SHA | SHA of the commit of the build is run | -| In container environment initialization | | | | | -| SKIP_ENVIRONMENT_INITIALIZATION | false* | false* | false* | Skip initialization of test environment * set to true in pre-commits | -| SKIP_IMAGE_UPGRADE_CHECK | false* | false* | false* | Skip checking if image should be upgraded * set to true in pre-commits | -| SKIP_PROVIDER_TESTS | false* | false* | false* | Skip running provider integration tests | -| SKIP_SSH_SETUP | false* | false* | false* | Skip setting up SSH server for tests. * set to true in GitHub CodeSpaces | -| VERBOSE_COMMANDS | false | false | false | Determines whether every command executed in docker should be printed. 
| -| Image build variables | | | | | -| UPGRADE_TO_NEWER_DEPENDENCIES | false | false | false* | Determines whether the build should attempt to upgrade dependencies. | - -# Upgrade to newer dependencies - -By default, we are using a tested set of dependency constraints stored in separate "orphan" branches of the airflow repository -("constraints-main", "constraints-2-0") but when this flag is set to anything but false (for example a random value), -they are not used and an "eager" upgrade strategy is used when installing dependencies. We set it to true in case of direct -pushes (merges) to main and scheduled builds so that the constraints are tested. In those builds, in case we determine -that the tests pass we automatically push the latest set of "tested" constraints to the repository. Setting the value to a random -value is the best way to ensure that constraints are upgraded even if there is no change to pyproject.toml. -This way our constraints are automatically tested and updated whenever new versions of libraries are released. -(*) true in case of direct pushes and scheduled builds - ---- - -**Thank you** for reading this far. We hope that you have learned a lot about Airflow's CI. diff --git a/dev/breeze/doc/ci/README.md b/dev/breeze/doc/ci/README.md index f52376e18b125..bf20a3a700923 100644 --- a/dev/breeze/doc/ci/README.md +++ b/dev/breeze/doc/ci/README.md @@ -24,6 +24,5 @@ This directory contains detailed design of the Airflow CI setup. * [GitHub Variables](03_github_variables.md) - contains description of the GitHub variables used in CI * [Selective checks](04_selective_checks.md) - contains description of the selective checks performed in CI * [Workflows](05_workflows.md) - contains description of the workflows used in CI -* [Diagrams](06_diagrams.md) - contains diagrams of the CI workflows -* [Debugging](07_debugging.md) - contains description of debugging CI issues -* [Running CI Locally](08_running_ci_locally.md) - contains description of running CI locally +* [Debugging](06_debugging.md) - contains description of debugging CI issues +* [Running CI Locally](07_running_ci_locally.md) - contains description of running CI locally diff --git a/dev/breeze/doc/images/image_artifacts.png b/dev/breeze/doc/images/image_artifacts.png new file mode 100644 index 0000000000000000000000000000000000000000..485a6a2c9cf10ebdc0241b317c089d5be40f9079 GIT binary patch literal 47666
zCNP)uBh6d9 zun{d*q+JLzlYi>{a2?#)J+L;H*2CVkcDCIrxXsGpqVP zM1I7Czh2 z_7cN!AWZ_})+vhL+kl>2sS#_##@&+vMe8D^+1J-sPzq)`!DrGAlvZ=A1O|(oC+h4k zs!i6LizSR|*$vr-83*v}Uean6&=+xw?zg33=M`qVA5@b67T=%dZ1H1P*e3z{E0uS*0g*xbYRn(90|P610PrE_jc8A@XTT^J^|Oid;*hcnU@a` z&};R>g(HG@MzSJ^;@|_Bs(zkv{clZK&MtkI)yWS;Tru|2Vx{KWy7!b*7ddg~FY@_u z<#A0jSvS~}S&zAaD3S;wMQ8VgC`vG`OAn!h;YpiX=0+1rmsv?1HAW6IuZgBbC&foG z>XhR-;~vj`)X6IO-0|9?$Jj*gwR~zI&pftE0%?B&i!s^}9L(6!Xilzzecd7FIL@&< z?#7nljO={6xNy0Dx$tHU1N1MejLw@yU3&WbU%_oBO%foy`&b>%h*EUIl6{ca?Z>Ar zbt_@0%_m`t3>Cs>TIh4LGc%_d5>U^2IGh@r0*Dn z9_6j^l7pAh$+Odumqb6YUubkrW&TOn`#z8T35@%Uh*Z}-6TivE&ghRw_BVs~Q-9># zEt%OX?tigxQ>oVy4+Z<8`E7GlVV!!2&SJ(iS^`=Dg`8E1JWCeYjByh+aDIlc+?xpp z@2=E?vBPuyGtBv-o%aN|z-FsiglyN|(dn6IZ@lOC^2WOzA1}Q(9zw4RvtBQNyLP}{ zCDNmtF{*oOxe1pL5Ws1_#Qe57LoD^v7p0|yA{;p*9VWKL-$p#_BHoWuc&fhHq*EB) z>^_&9A0J*s?Slb!;we*)i=O$D`tY6dW;!-x9b}g% zAT;RiUrMj3qAzlM&_750rCs>)d;|n65baq1R2~Dk#f@duH5QP#6h!=K#8+LddR1>o ztQj+k%~cXbvuI*7l!wL5NV zHUQC{cBMsgE|P*vr$E1-$M3+Xr%lu-5K;WaWIkIRJgz2>T?cQ8BB#j>!9A3c3y&h# zc~zF}(Yo2&XGRDi`&u{C8K8)zbqikKa3qSz?>smsk8`AwiTLPu9)8J#>x-S7j6;jq z?{n%eVz1njJ3%<8cnyymy3g}AqKcYpakV$sFBPjSqno)1Yh8XSCC5Z3&H}`~?ul{% z16GF}@cR7y*is4u*$UpVMuYNE_jwHc1~?-Eb=*YNqtz4L9MrC}igig3(?Mo+bP#J( z?KDeMIh`)jIg_)8tqcr;MX3Hns(estRi?}ej+f>OL3jqk0}$3xeMoCxlcsNX`RYW1@&(#CpUVC-dcs4f&gH8X?{qOd3UG;S+X2JH`2$~UV-OTCF3vyLeR z*3A{milK%jZWgNV^;5 z^wtX?(pIn0w$WCE+ghysvT&|{J&kpqj=DVkJzJo_P1kgx*uwhRzHE!fVf9AEk(Y6P+cKeeh#RmAONd{JW9SfTwftwhZI^D7O>*-x zt8QzDtxb%l_ZYV;mf?J3_)}5<*Z=yOsK$a{Ao#GIZ|L)6Eiq6>OtY#%l}l4-d9rrV zsF3@w1csJ6t@>wGM!-4o_m>Ip^hZFCgl$KJ64GE^?+rj#@gd`%*VqFtUMu2-<)@X5 z%_)P%TkO9VG0l>?zsyLf>$1eQ*l=*NBmU{1t;MwwYYU}31(yM;n~2HREWVvW4Jd;t82JV#a{tdD>;V1wA literal 0 HcmV?d00001 diff --git a/dev/breeze/doc/images/output-commands.svg b/dev/breeze/doc/images/output-commands.svg index 5888d1fc862eb..08b135479de56 100644 --- a/dev/breeze/doc/images/output-commands.svg +++ b/dev/breeze/doc/images/output-commands.svg @@ -301,59 +301,59 @@ --python-pPython major/minor version used in Airflow image for images. (>3.8< | 3.9 | 3.10 | 3.11 | 3.12)                           [default: 3.8]                                               ---integrationIntegration(s) to enable when running (can be more than one).                        -(all | all-testable | cassandra | celery | drill | kafka | kerberos | mongo | mssql  -| openlineage | otel | pinot | qdrant | redis | statsd | trino | ydb)                +--integrationCore Integrations to enable when running (can be more than one).                     +(all | all-testable | cassandra | celery | drill | kafka | kerberos | keycloak |     +mongo | mssql | openlineage | otel | pinot | qdrant | redis | statsd | trino | ydb)  --standalone-dag-processorRun standalone dag processor for start-airflow. ---database-isolationRun airflow in database isolation mode. -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Docker Compose selection and cleanup ───────────────────────────────────────────────────────────────────────────────╮ ---project-nameName of the docker-compose project to bring down. The `docker-compose` is for legacy breeze        -project name and you can use `breeze down --project-name docker-compose` to stop all containers    -belonging to it.                                                                                   
-(breeze | pre-commit | docker-compose)                                                             -[default: breeze]                                                                                  ---docker-hostOptional - docker host to use when running docker commands. When set, the `--builder` option is    -ignored when building images.                                                                      -(TEXT)                                                                                             -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Database ───────────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---backend-bDatabase backend to use. If 'none' is chosen, Breeze will start with an invalid database     -configuration, meaning there will be no database available, and any attempts to connect to   -the Airflow database will fail.                                                              -(>sqlite< | mysql | postgres | none)                                                         -[default: sqlite]                                                                            ---postgres-version-PVersion of Postgres used.(>12< | 13 | 14 | 15 | 16)[default: 12] ---mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] ---db-reset-dReset DB when entering the container. -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Build CI image (before entering shell) ─────────────────────────────────────────────────────────────────────────────╮ ---github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] ---builderBuildx builder used to perform `docker buildx build` commands.(TEXT) -[default: autodetect]                                          ---use-uv/--no-use-uvUse uv instead of pip as packaging tool to build the image.[default: use-uv] ---uv-http-timeoutTimeout for requests that UV makes (only used in case of UV builds).(INTEGER RANGE) -[default: 300; x>=1]                                                 -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Other options ──────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---forward-credentials-fForward local credentials to container when running. ---max-timeMaximum time that the command should take - if it takes longer, the command will fail. -(INTEGER RANGE)                                                                        -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---answer-aForce answer to questions.(y | n | q | yes | no | quit) ---dry-run-DIf dry-run is set, commands are only printed, not executed. ---verbose-vPrint verbose information about performed steps. ---help-hShow this message and exit. -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ -╭─ Developer commands ─────────────────────────────────────────────────────────────────────────────────────────────────╮ -start-airflow          Enter breeze environment and starts all Airflow components in the tmux session. 
Compile     -assets if contents of www directory changed.                                                -static-checks          Run static checks.                                                                          -build-docs             Build documents.                                                                            -down                   Stop running breeze environment.                                                            -shell                  Enter breeze environment. this is the default command use when no other is selected.        -exec                   Joins the interactive shell of running airflow container.                                   +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Docker Compose selection and cleanup ───────────────────────────────────────────────────────────────────────────────╮ +--project-nameName of the docker-compose project to bring down. The `docker-compose` is for legacy breeze        +project name and you can use `breeze down --project-name docker-compose` to stop all containers    +belonging to it.                                                                                   +(breeze | pre-commit | docker-compose)                                                             +[default: breeze]                                                                                  +--docker-hostOptional - docker host to use when running docker commands. When set, the `--builder` option is    +ignored when building images.                                                                      +(TEXT)                                                                                             +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Database ───────────────────────────────────────────────────────────────────────────────────────────────────────────╮ +--backend-bDatabase backend to use. If 'none' is chosen, Breeze will start with an        +invalid database configuration, meaning there will be no database available,   +and any attempts to connect to the Airflow database will fail.                 
+(>sqlite< | mysql | postgres | none)                                           +[default: sqlite]                                                              +--postgres-version-PVersion of Postgres used.(>13< | 14 | 15 | 16 | 17)[default: 13] +--mysql-version-MVersion of MySQL used.(>8.0< | 8.4)[default: 8.0] +--db-reset-d/--no-db-resetReset DB when entering the container.[default: no-db-reset] +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Build CI image (before entering shell) ─────────────────────────────────────────────────────────────────────────────╮ +--github-repository-gGitHub repository used to pull, push run images.(TEXT)[default: apache/airflow] +--builderBuildx builder used to perform `docker buildx build` commands.(TEXT) +[default: autodetect]                                          +--use-uv/--no-use-uvUse uv instead of pip as packaging tool to build the image.[default: use-uv] +--uv-http-timeoutTimeout for requests that UV makes (only used in case of UV builds).(INTEGER RANGE) +[default: 300; x>=1]                                                 +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Other options ──────────────────────────────────────────────────────────────────────────────────────────────────────╮ +--forward-credentials-fForward local credentials to container when running. +--max-timeMaximum time that the command should take - if it takes longer, the command will fail. +(INTEGER RANGE)                                                                        +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ +--answer-aForce answer to questions.(y | n | q | yes | no | quit) +--dry-run-DIf dry-run is set, commands are only printed, not executed. +--verbose-vPrint verbose information about performed steps. +--help-hShow this message and exit. +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Developer commands ─────────────────────────────────────────────────────────────────────────────────────────────────╮ +start-airflow          Enter breeze environment and starts all Airflow components in the tmux session. Compile     +assets if contents of www directory changed.                                                +static-checks          Run static checks.                                                                          +build-docs             Build documents.                                                                            +down                   Stop running breeze environment.                                                            +shell                  Enter breeze environment. this is the default command use when no other is selected.        +exec                   Joins the interactive shell of running airflow container.                                   +compile-ui-assets      Compiles ui assets.                                                                         compile-www-assets     Compiles www assets.                                                                        cleanup                Cleans the cache of parameters, docker cache and optionally built CI/PROD images.           
generate-migration-fileAutogenerate the alembic migration file for the ORM changes.                                diff --git a/dev/breeze/doc/images/output_ci-image.svg b/dev/breeze/doc/images/output_ci-image.svg index fa3cef20ce068..2b8c1414c8105 100644 --- a/dev/breeze/doc/images/output_ci-image.svg +++ b/dev/breeze/doc/images/output_ci-image.svg @@ -1,4 +1,4 @@ - + Tools that developers can use to manually manage CI images ╭─ Common options ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ ---help-hShow this message and exit. +--help-hShow this message and exit. ╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ ╭─ CI Image tools ─────────────────────────────────────────────────────────────────────────────────────────────────────╮ build   Build CI image. Include building multiple images for all python versions.                                  pull    Pull and optionally verify CI images - possibly in parallel for all Python versions.                       verify  Verify CI image.                                                                                           -╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +save    Save CI image to a file.                                                                                   +load    Load CI image from a file.                                                                                 +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ +╭─ Commands ───────────────────────────────────────────────────────────────────────────────────────────────────────────╮ +export-mount-cache            Export content of the the mount cache to a directory.                                +import-mount-cache            Export content of the the mount cache to a directory.                                +╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯ diff --git a/dev/breeze/doc/images/output_ci-image.txt b/dev/breeze/doc/images/output_ci-image.txt index 291f992d9aa16..25b4a0db49abc 100644 --- a/dev/breeze/doc/images/output_ci-image.txt +++ b/dev/breeze/doc/images/output_ci-image.txt @@ -1 +1 @@ -eb0a44354e231d527eb61abbfa6a410c +de6e1ee2eb602569a8cd1a3c6c4cbafe diff --git a/dev/breeze/doc/images/output_ci-image_build.svg b/dev/breeze/doc/images/output_ci-image_build.svg index 131b618e403ce..beb8c95c2d598 100644 --- a/dev/breeze/doc/images/output_ci-image_build.svg +++ b/dev/breeze/doc/images/output_ci-image_build.svg @@ -1,4 +1,4 @@ - + CycloneDX SBOMs for Apache Airflow{{project_name}}{{ version }} +CycloneDX SBOMs for Apache Airflow{{project_name}}{{ version }}ń

CycloneDX SBOMs for Apache Airflow{{project_name}}{{ version }}

    @@ -640,27 +663,87 @@ def generate_providers_requirements( @option_airflow_version @option_python @click.option( - "-f", - "--csv-file", - type=click.Path(file_okay=True, dir_okay=False, path_type=Path, writable=True), - help="CSV file to produce.", - envvar="CSV_FILE", + "-g", + "--google-spreadsheet-id", + type=str, + help="Google Spreadsheet Id to fill with SBOM data.", + envvar="GOOGLE_SPREADSHEET_ID", required=True, ) +@option_github_token +@click.option( + "--json-credentials-file", + type=click.Path(file_okay=True, dir_okay=False, path_type=Path, writable=False, exists=False), + help="Gsheet JSON credentials file (defaults to ~/.config/gsheet/credentials.json", + envvar="JSON_CREDENTIALS_FILE", + default=Path.home() / ".config" / "gsheet" / "credentials.json" + if not generating_command_images() + else "credentials.json", +) @click.option( "-s", "--include-open-psf-scorecard", + help="Include statistics from the Open PSF Scorecard", is_flag=True, default=False, ) +@click.option( + "-G", + "--include-github-stats", + help="Include statistics from GitHub", + is_flag=True, + default=False, +) +@click.option( + "--include-actions", + help="Include Actions recommended for the project", + is_flag=True, + default=False, +) +@click.option( + "-l", + "--limit-output", + help="Limit the output to the first N dependencies. Default is to output all dependencies. " + "If you want to output all dependencies, do not specify this option.", + type=int, + required=False, +) +@click.option( + "--project-name", + help="Only used for debugging purposes. The name of the project to generate the sbom for.", + type=str, + required=False, +) @option_dry_run @option_answer def export_dependency_information( python: str, airflow_version: str, - csv_file: Path, + google_spreadsheet_id: str | None, + github_token: str | None, + json_credentials_file: Path, include_open_psf_scorecard: bool, + include_github_stats: bool, + include_actions: bool, + limit_output: int | None, + project_name: str | None, ): + if google_spreadsheet_id and not json_credentials_file.exists(): + get_console().print( + f"[error]The JSON credentials file {json_credentials_file} does not exist. " + "Please specify a valid path to the JSON credentials file.[/]\n" + "You can download credentials file from your google developer console:" + "https://console.cloud.google.com/apis/credentials after creating a Desktop Client ID." 
+ ) + sys.exit(1) + if include_actions and not include_open_psf_scorecard: + get_console().print( + "[error]You cannot specify --include-actions without --include-open-psf-scorecard" + ) + sys.exit(1) + + read_metadata_from_google_spreadsheet(get_sheets(json_credentials_file)) + import requests base_url = f"https://airflow.apache.org/docs/apache-airflow/{airflow_version}/sbom" @@ -673,43 +756,220 @@ def export_dependency_information( full_sbom_r = requests.get(sbom_full_url) full_sbom_r.raise_for_status() - core_dependencies = set() - core_sbom = core_sbom_r.json() - full_sbom = full_sbom_r.json() + all_dependency_value_dicts = convert_all_sbom_to_value_dictionaries( + core_sbom=core_sbom, + full_sbom=full_sbom, + include_open_psf_scorecard=include_open_psf_scorecard, + include_github_stats=include_github_stats, + include_actions=include_actions, + limit_output=limit_output, + github_token=github_token, + project_name=project_name, + ) + all_dependency_value_dicts = sorted(all_dependency_value_dicts, key=sort_deps_key) + + fieldnames = get_field_names( + include_open_psf_scorecard=include_open_psf_scorecard, + include_github_stats=include_github_stats, + include_actions=include_actions, + ) + get_console().print( + f"[info]Writing {len(all_dependency_value_dicts)} dependencies to Google Spreadsheet." + ) + + write_sbom_information_to_google_spreadsheet( + sheets=get_sheets(json_credentials_file), + docs=CHECK_DOCS, + google_spreadsheet_id=google_spreadsheet_id, + all_dependencies=all_dependency_value_dicts, + fieldnames=fieldnames, + include_opsf_scorecard=include_open_psf_scorecard, + ) + + +def sort_deps_key(dependency: dict[str, Any]) -> str: + if dependency.get("Vcs"): + return "0:" + dependency["Name"] + else: + return "1:" + dependency["Name"] + + +def convert_all_sbom_to_value_dictionaries( + core_sbom: dict[str, Any], + full_sbom: dict[str, Any], + include_open_psf_scorecard: bool, + include_github_stats: bool, + include_actions: bool, + limit_output: int | None, + github_token: str | None = None, + project_name: str | None = None, +) -> list[dict[str, Any]]: + core_dependencies = set() dev_deps = set(normalize_package_name(name) for name in DEVEL_DEPS_PATH.read_text().splitlines()) num_deps = 0 - with csv_file.open("w") as csvfile: - fieldnames = get_field_names(include_open_psf_scorecard) - writer = csv.DictWriter(csvfile, fieldnames=fieldnames) - writer.writeheader() + all_dependency_value_dicts = [] + dependency_depth: dict[str, int] = json.loads( + (AIRFLOW_SOURCES_ROOT / "generated" / "dependency_depth.json").read_text() + ) + from rich.progress import Progress + + with Progress() as progress: + progress.console.use_theme(get_theme()) + core_dependencies_progress = progress.add_task( + "Core dependencies", total=len(core_sbom["components"]) + ) + other_dependencies_progress = progress.add_task( + "Other dependencies", total=len(full_sbom["components"]) - len(core_sbom["components"]) + ) + for key, value in dependency_depth.items(): + dependency_depth[normalize_package_name(key)] = value for dependency in core_sbom["components"]: - name = dependency["name"] - normalized_name = normalize_package_name(name) + normalized_name = normalize_package_name(dependency["name"]) + if project_name and normalized_name != project_name: + continue core_dependencies.add(normalized_name) is_devel = normalized_name in dev_deps - convert_sbom_to_csv( - writer, + value_dict = convert_sbom_entry_to_dict( dependency, + dependency_depth=dependency_depth, is_core=True, is_devel=is_devel, 
include_open_psf_scorecard=include_open_psf_scorecard, + include_github_stats=include_github_stats, + include_actions=include_actions, + github_token=github_token, + console=progress.console, ) + if value_dict: + all_dependency_value_dicts.append(value_dict) num_deps += 1 + progress.advance(task_id=core_dependencies_progress, advance=1) + if limit_output and num_deps >= limit_output: + get_console().print(f"[info]Processed limited {num_deps} dependencies and stopping.") + return all_dependency_value_dicts for dependency in full_sbom["components"]: - name = dependency["name"] - normalized_name = normalize_package_name(name) + normalized_name = normalize_package_name(dependency["name"]) + if project_name and normalized_name != project_name: + continue if normalized_name not in core_dependencies: is_devel = normalized_name in dev_deps - convert_sbom_to_csv( - writer, + value_dict = convert_sbom_entry_to_dict( dependency, + dependency_depth=dependency_depth, is_core=False, is_devel=is_devel, include_open_psf_scorecard=include_open_psf_scorecard, + include_github_stats=include_github_stats, + include_actions=include_actions, + github_token=github_token, + console=progress.console, ) + if value_dict: + all_dependency_value_dicts.append(value_dict) num_deps += 1 - - get_console().print(f"[info]Exported {num_deps} dependencies to {csv_file}") + progress.advance(task_id=other_dependencies_progress, advance=1) + if limit_output and num_deps >= limit_output: + get_console().print(f"[info]Processed limited {num_deps} dependencies and stopping.") + return all_dependency_value_dicts + get_console().print(f"[info]Processed {num_deps} dependencies") + return all_dependency_value_dicts + + +def convert_sbom_entry_to_dict( + dependency: dict[str, Any], + dependency_depth: dict[str, int], + is_core: bool, + is_devel: bool, + include_open_psf_scorecard: bool, + include_github_stats: bool, + include_actions: bool, + github_token: str | None, + console: Console, +) -> dict[str, Any] | None: + """ + Convert SBOM to Row for CSV or spreadsheet output + :param dependency: Dependency to convert + :param is_core: Whether the dependency is core or not + :param is_devel: Whether the dependency is devel or not + :param include_open_psf_scorecard: Whether to include Open PSF Scorecard + """ + console.print(f"[bright_blue]Calculating {dependency['name']} information.") + vcs = get_vcs(dependency) + name = dependency.get("name", "") + if name.startswith("apache-airflow"): + return None + normalized_name = normalize_package_name(dependency.get("name", "")) + row = { + "Name": normalized_name, + "Author": dependency.get("author", ""), + "Version": dependency.get("version", ""), + "Description": dependency.get("description"), + "Core": is_core, + "Devel": is_devel, + "Depth": dependency_depth.get(normalized_name, "Extra"), + "Licenses": convert_licenses(dependency.get("licenses", [])), + "Purl": dependency.get("purl"), + "Pypi": get_pypi_link(dependency), + "Vcs": vcs, + "Governance": get_governance(vcs), + } + if vcs and include_open_psf_scorecard: + open_psf_scorecard = get_open_psf_scorecard(vcs, name, console) + row.update(open_psf_scorecard) + if vcs and include_github_stats: + github_stats = get_github_stats( + vcs=vcs, project_name=name, github_token=github_token, console=console + ) + row.update(github_stats) + if name in get_project_metadata(MetadataFromSpreadsheet.RELATIONSHIP_PROJECTS): + row["Relationship"] = "Yes" + if include_actions: + if name in get_project_metadata(MetadataFromSpreadsheet.CONTACTED_PROJECTS): 
+ row["Contacted"] = "Yes" + num_actions = 0 + for action, (threshold, action_text) in ACTIONS.items(): + opsf_action = "OPSF-" + action + if opsf_action in row and int(row[opsf_action]) < threshold: + row[action_text] = "Yes" + num_actions += 1 + row["Num Actions"] = num_actions + console.print(f"[green]Calculated {dependency['name']} information.") + return row + + +def get_field_names( + include_open_psf_scorecard: bool, include_github_stats: bool, include_actions: bool +) -> list[str]: + names = [ + "Name", + "Author", + "Version", + "Description", + "Core", + "Devel", + "Depth", + "Licenses", + "Purl", + "Pypi", + "Vcs", + ] + if include_open_psf_scorecard: + names.append("OPSF-Score") + for check in OPEN_PSF_CHECKS: + names.append("OPSF-" + check) + names.append("OPSF-Details-" + check) + names.append("Governance") + if include_open_psf_scorecard: + names.extend(["Lifecycle status", "Unpatched Vulns"]) + if include_github_stats: + names.append("Industry importance") + if include_actions: + names.append("Relationship") + names.append("Contacted") + for action in ACTIONS.values(): + names.append(action[1]) + names.append("Num Actions") + return names diff --git a/dev/breeze/src/airflow_breeze/commands/sbom_commands_config.py b/dev/breeze/src/airflow_breeze/commands/sbom_commands_config.py index d5b26a6a29d94..96cc5cad8852b 100644 --- a/dev/breeze/src/airflow_breeze/commands/sbom_commands_config.py +++ b/dev/breeze/src/airflow_breeze/commands/sbom_commands_config.py @@ -22,6 +22,7 @@ "update-sbom-information", "build-all-airflow-images", "generate-providers-requirements", + "export-dependency-information", ], } @@ -95,11 +96,32 @@ { "name": "Export dependency information flags", "options": [ - "--csv-file", "--airflow-version", "--python", "--include-open-psf-scorecard", + "--include-github-stats", + "--include-actions", ], - } + }, + { + "name": "Github auth flags", + "options": [ + "--github-token", + ], + }, + { + "name": "Google spreadsheet flags", + "options": [ + "--json-credentials-file", + "--google-spreadsheet-id", + ], + }, + { + "name": "Debugging flags", + "options": [ + "--limit-output", + "--project-name", + ], + }, ], } diff --git a/dev/breeze/src/airflow_breeze/commands/setup_commands.py b/dev/breeze/src/airflow_breeze/commands/setup_commands.py index 76b2cef24928f..801bae9eaf0a8 100644 --- a/dev/breeze/src/airflow_breeze/commands/setup_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/setup_commands.py @@ -193,12 +193,6 @@ def version(): @option_mysql_version @click.option("-C/-c", "--cheatsheet/--no-cheatsheet", help="Enable/disable cheatsheet.", default=None) @click.option("-A/-a", "--asciiart/--no-asciiart", help="Enable/disable ASCIIart.", default=None) -@click.option( - "-U/-u", - "--use-uv/--no-use-uv", - help="Enable/disable using uv for creating venvs by breeze.", - default=None, -) @click.option( "--colour/--no-colour", help="Enable/disable Colour mode (useful for colour blind-friendly communication).", @@ -207,7 +201,6 @@ def version(): def change_config( python: str, backend: str, - use_uv: bool, postgres_version: str, mysql_version: str, cheatsheet: bool, @@ -220,15 +213,6 @@ def change_config( asciiart_file = "suppress_asciiart" cheatsheet_file = "suppress_cheatsheet" colour_file = "suppress_colour" - use_uv_file = "use_uv" - - if use_uv is not None: - if use_uv: - touch_cache_file(use_uv_file) - get_console().print("[info]Enable using uv[/]") - else: - delete_cache(use_uv_file) - get_console().print("[info]Disable using uv[/]") if asciiart is not 
None: if asciiart: delete_cache(asciiart_file) @@ -262,8 +246,6 @@ def get_status(file: str): get_console().print() get_console().print(f"[info]* Python: {python}[/]") get_console().print(f"[info]* Backend: {backend}[/]") - get_console().print(f"[info]* Use uv: {get_status(use_uv_file)}[/]") - get_console().print() get_console().print(f"[info]* Postgres version: {postgres_version}[/]") get_console().print(f"[info]* MySQL version: {mysql_version}[/]") get_console().print() @@ -589,6 +571,7 @@ def regenerate_help_images_for_all_commands(commands: tuple[str, ...], check_onl "exec", "shell", "compile-www-assets", + "compile-ui-assets", "cleanup", "generate-migration-file", ] diff --git a/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py b/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py index 802a41fc273dd..61460f004ec9f 100644 --- a/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py +++ b/dev/breeze/src/airflow_breeze/commands/setup_commands_config.py @@ -63,7 +63,6 @@ "--backend", "--postgres-version", "--mysql-version", - "--use-uv", "--cheatsheet", "--asciiart", "--colour", diff --git a/dev/breeze/src/airflow_breeze/commands/testing_commands.py b/dev/breeze/src/airflow_breeze/commands/testing_commands.py index cef51d975219e..b7099e950b2f6 100644 --- a/dev/breeze/src/airflow_breeze/commands/testing_commands.py +++ b/dev/breeze/src/airflow_breeze/commands/testing_commands.py @@ -16,7 +16,9 @@ # under the License. from __future__ import annotations +import contextlib import os +import signal import sys from datetime import datetime from time import sleep @@ -27,26 +29,26 @@ from airflow_breeze.commands.ci_image_commands import rebuild_or_pull_ci_image_if_needed from airflow_breeze.commands.common_options import ( option_backend, - option_database_isolation, + option_clean_airflow_installation, + option_core_integration, option_db_reset, option_debug_resources, option_downgrade_pendulum, option_downgrade_sqlalchemy, option_dry_run, + option_excluded_providers, option_force_lowest_dependencies, option_forward_credentials, option_github_repository, option_image_name, - option_image_tag_for_running, option_include_success_outputs, - option_integration, option_keep_env_variables, option_mount_sources, option_mysql_version, option_no_db_cleanup, option_parallelism, option_postgres_version, - option_pydantic, + option_providers_integration, option_python, option_run_db_tests_only, option_run_in_parallel, @@ -65,9 +67,11 @@ ) from airflow_breeze.commands.release_management_commands import option_package_format from airflow_breeze.global_constants import ( - ALLOWED_HELM_TEST_PACKAGES, - ALLOWED_PARALLEL_TEST_TYPE_CHOICES, + ALL_TEST_TYPE, ALLOWED_TEST_TYPE_CHOICES, + GroupOfTests, + all_selective_core_test_types, + providers_test_type, ) from airflow_breeze.params.build_prod_params import BuildProdParams from airflow_breeze.params.shell_params import ShellParams @@ -92,10 +96,11 @@ generate_args_for_pytest, run_docker_compose_tests, ) -from airflow_breeze.utils.run_utils import get_filesystem_type, run_command +from airflow_breeze.utils.run_utils import run_command from airflow_breeze.utils.selective_checks import ALL_CI_SELECTIVE_TEST_TYPES LOW_MEMORY_CONDITION = 8 * 1024 * 1024 * 1024 +DEFAULT_TOTAL_TEST_TIMEOUT = 6500 # 6500 seconds = 1h 48 minutes @click.group(cls=BreezeGroup, name="testing", help="Tools that developers can use to run tests") @@ -111,7 +116,6 @@ def group_for_testing(): ), ) @option_python -@option_image_tag_for_running @option_image_name 
@click.option( "--skip-docker-compose-deletion", @@ -122,11 +126,10 @@ def group_for_testing(): @option_github_repository @option_verbose @option_dry_run -@click.argument("extra_pytest_args", nargs=-1, type=click.UNPROCESSED) +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) def docker_compose_tests( python: str, image_name: str, - image_tag: str | None, skip_docker_compose_deletion: bool, github_repository: str, extra_pytest_args: tuple, @@ -134,10 +137,8 @@ def docker_compose_tests( """Run docker-compose tests.""" perform_environment_checks() if image_name is None: - build_params = BuildProdParams( - python=python, image_tag=image_tag, github_repository=github_repository - ) - image_name = build_params.airflow_image_name_with_tag + build_params = BuildProdParams(python=python, github_repository=github_repository) + image_name = build_params.airflow_image_name get_console().print(f"[info]Running docker-compose with PROD image: {image_name}[/]") return_code, info = run_docker_compose_tests( image_name=image_name, @@ -189,30 +190,31 @@ def _run_test( "--rm", "airflow", ] - run_cmd.extend( - generate_args_for_pytest( - test_type=shell_params.test_type, - test_timeout=test_timeout, - skip_provider_tests=shell_params.skip_provider_tests, - skip_db_tests=shell_params.skip_db_tests, - run_db_tests_only=shell_params.run_db_tests_only, - backend=shell_params.backend, - use_xdist=shell_params.use_xdist, - enable_coverage=shell_params.enable_coverage, - collect_only=shell_params.collect_only, - parallelism=shell_params.parallelism, - python_version=python_version, - parallel_test_types_list=shell_params.parallel_test_types_list, - helm_test_package=None, - keep_env_variables=shell_params.keep_env_variables, - no_db_cleanup=shell_params.no_db_cleanup, - database_isolation=shell_params.database_isolation, - ) + pytest_args = generate_args_for_pytest( + test_group=shell_params.test_group, + test_type=shell_params.test_type, + test_timeout=test_timeout, + skip_db_tests=shell_params.skip_db_tests, + run_db_tests_only=shell_params.run_db_tests_only, + backend=shell_params.backend, + use_xdist=shell_params.use_xdist, + enable_coverage=shell_params.enable_coverage, + collect_only=shell_params.collect_only, + parallelism=shell_params.parallelism, + python_version=python_version, + parallel_test_types_list=shell_params.parallel_test_types_list, + keep_env_variables=shell_params.keep_env_variables, + no_db_cleanup=shell_params.no_db_cleanup, ) - run_cmd.extend(list(extra_pytest_args)) + pytest_args.extend(extra_pytest_args) # Skip "FOLDER" in case "--ignore=FOLDER" is passed as an argument # Which might be the case if we are ignoring some providers during compatibility checks - run_cmd = [arg for arg in run_cmd if f"--ignore={arg}" not in run_cmd] + pytest_args_before_skip = pytest_args + pytest_args = [arg for arg in pytest_args if f"--ignore={arg}" not in pytest_args] + # Double check: If no test is leftover we can skip running the test + if pytest_args_before_skip != pytest_args and pytest_args[0].startswith("--"): + return 0, f"Skipped test, no tests needed: {shell_params.test_type}" + run_cmd.extend(pytest_args) try: remove_docker_networks(networks=[f"{compose_project_name}_default"]) result = run_command( @@ -270,16 +272,15 @@ def _run_test( def _run_tests_in_pool( - tests_to_run: list[str], - parallelism: int, - shell_params: ShellParams, + debug_resources: bool, extra_pytest_args: tuple, - test_timeout: int, - db_reset: bool, include_success_outputs: bool, - 
debug_resources: bool, + parallelism: int, + shell_params: ShellParams, skip_cleanup: bool, skip_docker_compose_down: bool, + test_timeout: int, + tests_to_run: list[str], ): if not tests_to_run: return @@ -289,12 +290,13 @@ def _run_tests_in_pool( # tests are still running. We are only adding here test types that take more than 2 minutes to run # on a fast machine in parallel sorting_order = [ - "Providers", - "Providers[-amazon,google]", + "Providers[standard]", + "Providers[amazon]", + "Providers[google]", + "API", "Other", - "Core", - "PythonVenv", "WWW", + "Core", "CLI", "Serialization", "Always", @@ -355,7 +357,6 @@ def pull_images_for_docker_compose(shell_params: ShellParams): def run_tests_in_parallel( shell_params: ShellParams, extra_pytest_args: tuple, - db_reset: bool, test_timeout: int, include_success_outputs: bool, debug_resources: bool, @@ -366,7 +367,6 @@ def run_tests_in_parallel( get_console().print("\n[info]Summary of the tests to run\n") get_console().print(f"[info]Running tests in parallel with parallelism={parallelism}") get_console().print(f"[info]Extra pytest args: {extra_pytest_args}") - get_console().print(f"[info]DB reset: {db_reset}") get_console().print(f"[info]Test timeout: {test_timeout}") get_console().print(f"[info]Include success outputs: {include_success_outputs}") get_console().print(f"[info]Debug resources: {debug_resources}") @@ -381,7 +381,6 @@ def run_tests_in_parallel( shell_params=shell_params, extra_pytest_args=extra_pytest_args, test_timeout=test_timeout, - db_reset=db_reset, include_success_outputs=include_success_outputs, debug_resources=debug_resources, skip_cleanup=skip_cleanup, @@ -418,21 +417,37 @@ def _verify_parallelism_parameters( is_flag=True, envvar="ENABLE_COVERAGE", ) -option_excluded_parallel_test_types = click.option( +option_excluded_parallel_core_test_types = click.option( "--excluded-parallel-test-types", - help="Space separated list of test types that will be excluded from parallel tes runs.", + help="Space separated list of core test types that will be excluded from parallel test runs.", default="", show_default=True, envvar="EXCLUDED_PARALLEL_TEST_TYPES", - type=NotVerifiedBetterChoice(ALLOWED_PARALLEL_TEST_TYPE_CHOICES), + type=NotVerifiedBetterChoice(all_selective_core_test_types()), ) -option_parallel_test_types = click.option( +option_parallel_core_test_types = click.option( "--parallel-test-types", - help="Space separated list of test types used for testing in parallel", + help="Space separated list of core test types used for testing in parallel.", default=ALL_CI_SELECTIVE_TEST_TYPES, show_default=True, envvar="PARALLEL_TEST_TYPES", - type=NotVerifiedBetterChoice(ALLOWED_PARALLEL_TEST_TYPE_CHOICES), + type=NotVerifiedBetterChoice(all_selective_core_test_types()), +) +option_excluded_parallel_providers_test_types = click.option( + "--excluded-parallel-test-types", + help="Space separated list of provider test types that will be excluded from parallel test runs. You can " + "for example specify `Providers[airbyte,http]`.", + default="", + envvar="EXCLUDED_PARALLEL_TEST_TYPES", + type=str, +) +option_parallel_providers_test_types = click.option( + "--parallel-test-types", + help="Space separated list of provider test types used for testing in parallel. 
You can also optionally " + "specify tests of which providers should be run: `Providers[airbyte,http]`.", + default=providers_test_type()[0], + envvar="PARALLEL_TEST_TYPES", + type=str, ) option_skip_docker_compose_down = click.option( "--skip-docker-compose-down", @@ -440,12 +455,6 @@ def _verify_parallelism_parameters( is_flag=True, envvar="SKIP_DOCKER_COMPOSE_DOWN", ) -option_skip_provider_tests = click.option( - "--skip-provider-tests", - help="Skip provider tests", - is_flag=True, - envvar="SKIP_PROVIDER_TESTS", -) option_skip_providers = click.option( "--skip-providers", help="Space-separated list of provider ids to skip when running tests", @@ -461,15 +470,31 @@ def _verify_parallelism_parameters( type=IntRange(min=0), show_default=True, ) -option_test_type = click.option( +option_test_type_core_group = click.option( "--test-type", - help="Type of test to run. With Providers, you can specify tests of which providers " + help="Type of tests to run for core test group", + default=ALL_TEST_TYPE, + envvar="TEST_TYPE", + show_default=True, + type=BetterChoice(ALLOWED_TEST_TYPE_CHOICES[GroupOfTests.CORE]), +) +option_test_type_providers_group = click.option( + "--test-type", + help="Type of test to run. You can also optionally specify tests of which providers " "should be run: `Providers[airbyte,http]` or " "excluded from the full test suite: `Providers[-amazon,google]`", - default="Default", + default=ALL_TEST_TYPE, + envvar="TEST_TYPE", + show_default=True, + type=NotVerifiedBetterChoice(ALLOWED_TEST_TYPE_CHOICES[GroupOfTests.PROVIDERS]), +) +option_test_type_helm = click.option( + "--test-type", + help="Type of helm tests to run", + default=ALL_TEST_TYPE, envvar="TEST_TYPE", show_default=True, - type=NotVerifiedBetterChoice(ALLOWED_TEST_TYPE_CHOICES), + type=BetterChoice(ALLOWED_TEST_TYPE_CHOICES[GroupOfTests.HELM]), ) option_use_xdist = click.option( "--use-xdist", @@ -491,13 +516,20 @@ def _verify_parallelism_parameters( show_default=True, envvar="SQLALCHEMY_WARN_20", ) +option_total_test_timeout = click.option( + "--total-test-timeout", + help="Total test timeout in seconds. This is the maximum time parallel tests will run. If there is " + "an underlying pytest command that hangs, the process will be stopped with a system exit after " + "that time. This should give a chance to upload logs as artifacts on CI.", + default=DEFAULT_TOTAL_TEST_TIMEOUT, + type=int, + envvar="TOTAL_TEST_TIMEOUT", +) @group_for_testing.command( - name="tests", - help="Run the specified unit tests. This is a low level testing command that allows you to run " - "various kind of tests subset with a number of options. 
You can also use dedicated commands such" - "us db_tests, non_db_tests, integration_tests for more opinionated test suite execution.", + name="core-tests", + help="Run all (default) or specified core unit tests.", context_settings=dict( ignore_unknown_options=True, allow_extra_args=True, @@ -506,33 +538,28 @@ def _verify_parallelism_parameters( @option_airflow_constraints_reference @option_backend @option_collect_only -@option_database_isolation +@option_clean_airflow_installation @option_db_reset @option_debug_resources @option_downgrade_pendulum @option_downgrade_sqlalchemy @option_dry_run @option_enable_coverage -@option_excluded_parallel_test_types +@option_excluded_parallel_core_test_types @option_force_sa_warnings @option_force_lowest_dependencies @option_forward_credentials @option_github_repository -@option_image_tag_for_running @option_include_success_outputs -@option_integration @option_install_airflow_with_constraints @option_keep_env_variables @option_mount_sources @option_mysql_version @option_no_db_cleanup @option_package_format -@option_parallel_test_types +@option_parallel_core_test_types @option_parallelism @option_postgres_version -@option_providers_constraints_location -@option_providers_skip_constraints -@option_pydantic @option_python @option_remove_arm_packages @option_run_db_tests_only @@ -540,44 +567,51 @@ def _verify_parallelism_parameters( @option_skip_cleanup @option_skip_db_tests @option_skip_docker_compose_down -@option_skip_provider_tests -@option_skip_providers @option_test_timeout -@option_test_type +@option_test_type_core_group +@option_total_test_timeout @option_upgrade_boto @option_use_airflow_version @option_use_packages_from_dist @option_use_xdist @option_verbose -@click.argument("extra_pytest_args", nargs=-1, type=click.UNPROCESSED) -def command_for_tests(**kwargs): - _run_test_command(**kwargs) +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def core_tests(**kwargs): + _run_test_command( + test_group=GroupOfTests.CORE, + integration=(), + excluded_providers="", + providers_skip_constraints=False, + providers_constraints_location="", + skip_providers="", + **kwargs, + ) @group_for_testing.command( - name="db-tests", - help="Run all (default) or specified DB-bound unit tests. 
This is a dedicated command that only runs " - "DB tests and it runs them in parallel via splitting tests by test types into separate " - "containers with separate database started for each container.", + name="providers-tests", + help="Run all (default) or specified Providers unit tests.", context_settings=dict( - ignore_unknown_options=False, - allow_extra_args=False, + ignore_unknown_options=True, + allow_extra_args=True, ), ) @option_airflow_constraints_reference @option_backend @option_collect_only -@option_database_isolation +@option_clean_airflow_installation +@option_db_reset @option_debug_resources @option_downgrade_pendulum @option_downgrade_sqlalchemy @option_dry_run @option_enable_coverage -@option_excluded_parallel_test_types -@option_forward_credentials +@option_excluded_providers +@option_excluded_parallel_providers_test_types +@option_force_sa_warnings @option_force_lowest_dependencies +@option_forward_credentials @option_github_repository -@option_image_tag_for_running @option_include_success_outputs @option_install_airflow_with_constraints @option_keep_env_variables @@ -585,253 +619,115 @@ def command_for_tests(**kwargs): @option_mysql_version @option_no_db_cleanup @option_package_format -@option_parallel_test_types +@option_parallel_providers_test_types @option_parallelism @option_postgres_version @option_providers_constraints_location @option_providers_skip_constraints -@option_pydantic @option_python @option_remove_arm_packages +@option_run_db_tests_only +@option_run_in_parallel @option_skip_cleanup +@option_skip_db_tests @option_skip_docker_compose_down -@option_skip_provider_tests @option_skip_providers @option_test_timeout +@option_test_type_providers_group +@option_total_test_timeout @option_upgrade_boto @option_use_airflow_version @option_use_packages_from_dist -@option_force_sa_warnings +@option_use_xdist @option_verbose -def command_for_db_tests(**kwargs): - _run_test_command( - integration=(), - run_in_parallel=True, - use_xdist=False, - skip_db_tests=False, - run_db_tests_only=True, - test_type="Default", - db_reset=True, - extra_pytest_args=(), - **kwargs, - ) +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def providers_tests(**kwargs): + _run_test_command(test_group=GroupOfTests.PROVIDERS, integration=(), **kwargs) @group_for_testing.command( - name="non-db-tests", - help="Run all (default) or specified Non-DB unit tests. 
This is a dedicated command that only" - "runs Non-DB tests and it runs them in parallel via pytest-xdist in single container, " - "with `none` backend set.", + name="core-integration-tests", + help="Run the specified integration tests.", context_settings=dict( - ignore_unknown_options=False, - allow_extra_args=False, + ignore_unknown_options=True, + allow_extra_args=True, ), ) -@option_airflow_constraints_reference +@option_backend @option_collect_only -@option_debug_resources -@option_downgrade_sqlalchemy -@option_downgrade_pendulum +@option_db_reset @option_dry_run @option_enable_coverage -@option_excluded_parallel_test_types +@option_force_sa_warnings @option_forward_credentials -@option_force_lowest_dependencies @option_github_repository -@option_image_tag_for_running -@option_include_success_outputs -@option_install_airflow_with_constraints +@option_core_integration @option_keep_env_variables @option_mount_sources +@option_mysql_version @option_no_db_cleanup -@option_package_format -@option_parallel_test_types -@option_parallelism -@option_providers_constraints_location -@option_providers_skip_constraints -@option_pydantic +@option_postgres_version @option_python -@option_remove_arm_packages -@option_skip_cleanup @option_skip_docker_compose_down -@option_skip_provider_tests -@option_skip_providers @option_test_timeout -@option_upgrade_boto -@option_use_airflow_version -@option_use_packages_from_dist -@option_force_sa_warnings @option_verbose -def command_for_non_db_tests(**kwargs): - _run_test_command( - backend="none", - database_isolation=False, - db_reset=False, - extra_pytest_args=(), - integration=(), - run_db_tests_only=False, - run_in_parallel=False, - skip_db_tests=True, - test_type="Default", - use_xdist=True, - **kwargs, - ) - - -def _run_test_command( - *, - airflow_constraints_reference: str, +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def core_integration_tests( backend: str, collect_only: bool, db_reset: bool, - database_isolation: bool, - debug_resources: bool, - downgrade_sqlalchemy: bool, - downgrade_pendulum: bool, enable_coverage: bool, - excluded_parallel_test_types: str, extra_pytest_args: tuple, force_sa_warnings: bool, forward_credentials: bool, - force_lowest_dependencies: bool, github_repository: str, - image_tag: str | None, - include_success_outputs: bool, - install_airflow_with_constraints: bool, - integration: tuple[str, ...], keep_env_variables: bool, + integration: tuple, mount_sources: str, + mysql_version: str, no_db_cleanup: bool, - parallel_test_types: str, - parallelism: int, - package_format: str, - providers_constraints_location: str, - providers_skip_constraints: bool, - pydantic: str, + postgres_version: str, python: str, - remove_arm_packages: bool, - run_db_tests_only: bool, - run_in_parallel: bool, - skip_cleanup: bool, - skip_db_tests: bool, skip_docker_compose_down: bool, - skip_provider_tests: bool, - skip_providers: str, test_timeout: int, - test_type: str, - upgrade_boto: bool, - use_airflow_version: str | None, - use_packages_from_dist: bool, - use_xdist: bool, - mysql_version: str = "", - postgres_version: str = "", ): - docker_filesystem = get_filesystem_type("/var/lib/docker") - get_console().print(f"Docker filesystem: {docker_filesystem}") - _verify_parallelism_parameters( - excluded_parallel_test_types, run_db_tests_only, run_in_parallel, use_xdist - ) - test_list = parallel_test_types.split(" ") - excluded_test_list = excluded_parallel_test_types.split(" ") - if excluded_test_list: - test_list 
= [test for test in test_list if test not in excluded_test_list] - if skip_provider_tests or "Providers" in excluded_test_list: - test_list = [test for test in test_list if not test.startswith("Providers")] shell_params = ShellParams( - airflow_constraints_reference=airflow_constraints_reference, + test_group=GroupOfTests.INTEGRATION_CORE, backend=backend, collect_only=collect_only, - database_isolation=database_isolation, - downgrade_sqlalchemy=downgrade_sqlalchemy, - downgrade_pendulum=downgrade_pendulum, enable_coverage=enable_coverage, - force_sa_warnings=force_sa_warnings, - force_lowest_dependencies=force_lowest_dependencies, forward_credentials=forward_credentials, forward_ports=False, github_repository=github_repository, - image_tag=image_tag, integration=integration, - install_airflow_with_constraints=install_airflow_with_constraints, keep_env_variables=keep_env_variables, mount_sources=mount_sources, mysql_version=mysql_version, no_db_cleanup=no_db_cleanup, - package_format=package_format, - parallel_test_types_list=test_list, - parallelism=parallelism, postgres_version=postgres_version, - providers_constraints_location=providers_constraints_location, - providers_skip_constraints=providers_skip_constraints, - pydantic=pydantic, python=python, - remove_arm_packages=remove_arm_packages, - run_db_tests_only=run_db_tests_only, - skip_db_tests=skip_db_tests, - skip_provider_tests=skip_provider_tests, - test_type=test_type, - upgrade_boto=upgrade_boto, - use_airflow_version=use_airflow_version, - use_packages_from_dist=use_packages_from_dist, - use_xdist=use_xdist, + test_type="All", + force_sa_warnings=force_sa_warnings, run_tests=True, db_reset=db_reset, ) - rebuild_or_pull_ci_image_if_needed(command_params=shell_params) fix_ownership_using_docker() cleanup_python_generated_files() perform_environment_checks() - if pydantic != "v2": - # Avoid edge cases when there are no available tests, e.g. No-Pydantic for Weaviate provider. - # https://docs.pytest.org/en/stable/reference/exit-codes.html - # https://github.com/apache/airflow/pull/38402#issuecomment-2014938950 - extra_pytest_args = (*extra_pytest_args, "--suppress-no-test-exit-code") - if skip_providers: - ignored_path_list = [ - f"--ignore=tests/providers/{provider_id.replace('.','/')}" - for provider_id in skip_providers.split(" ") - ] - extra_pytest_args = (*extra_pytest_args, *ignored_path_list) - if run_in_parallel: - if test_type != "Default": - get_console().print( - "[error]You should not specify --test-type when --run-in-parallel is set[/]. 
" - f"Your test type = {test_type}\n" - ) - sys.exit(1) - run_tests_in_parallel( - shell_params=shell_params, - extra_pytest_args=extra_pytest_args, - db_reset=db_reset, - test_timeout=test_timeout, - include_success_outputs=include_success_outputs, - parallelism=parallelism, - skip_cleanup=skip_cleanup, - debug_resources=debug_resources, - skip_docker_compose_down=skip_docker_compose_down, - ) - else: - if shell_params.test_type == "Default": - if any([arg.startswith("tests") for arg in extra_pytest_args]): - # in case some tests are specified as parameters, do not pass "tests" as default - shell_params.test_type = "None" - shell_params.parallel_test_types_list = [] - else: - shell_params.test_type = "All" - returncode, _ = _run_test( - shell_params=shell_params, - extra_pytest_args=extra_pytest_args, - python_version=python, - output=None, - test_timeout=test_timeout, - output_outside_the_group=True, - skip_docker_compose_down=skip_docker_compose_down, - ) - sys.exit(returncode) + returncode, _ = _run_test( + shell_params=shell_params, + extra_pytest_args=extra_pytest_args, + python_version=python, + output=None, + test_timeout=test_timeout, + output_outside_the_group=True, + skip_docker_compose_down=skip_docker_compose_down, + ) + sys.exit(returncode) @group_for_testing.command( - name="integration-tests", + name="providers-integration-tests", help="Run the specified integration tests.", context_settings=dict( ignore_unknown_options=True, @@ -839,55 +735,59 @@ def _run_test_command( ), ) @option_backend +@option_collect_only @option_db_reset @option_dry_run @option_enable_coverage +@option_force_sa_warnings @option_forward_credentials @option_github_repository -@option_image_tag_for_running -@option_integration +@option_providers_integration +@option_keep_env_variables @option_mount_sources @option_mysql_version +@option_no_db_cleanup @option_postgres_version @option_python -@option_skip_provider_tests +@option_skip_docker_compose_down @option_test_timeout -@option_force_sa_warnings @option_verbose -@click.argument("extra_pytest_args", nargs=-1, type=click.UNPROCESSED) -def integration_tests( +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def integration_providers_tests( backend: str, + collect_only: bool, db_reset: bool, enable_coverage: bool, extra_pytest_args: tuple, + force_sa_warnings: bool, forward_credentials: bool, github_repository: str, - image_tag: str | None, integration: tuple, + keep_env_variables: bool, mount_sources: str, mysql_version: str, + no_db_cleanup: bool, postgres_version: str, python: str, - skip_provider_tests: bool, - force_sa_warnings: bool, + skip_docker_compose_down: bool, test_timeout: int, ): - docker_filesystem = get_filesystem_type("/var/lib/docker") - get_console().print(f"Docker filesystem: {docker_filesystem}") shell_params = ShellParams( + test_group=GroupOfTests.INTEGRATION_PROVIDERS, backend=backend, + collect_only=collect_only, enable_coverage=enable_coverage, forward_credentials=forward_credentials, forward_ports=False, github_repository=github_repository, - image_tag=image_tag, integration=integration, + keep_env_variables=keep_env_variables, mount_sources=mount_sources, mysql_version=mysql_version, + no_db_cleanup=no_db_cleanup, postgres_version=postgres_version, python=python, - skip_provider_tests=skip_provider_tests, - test_type="Integration", + test_type="All", force_sa_warnings=force_sa_warnings, run_tests=True, db_reset=db_reset, @@ -902,6 +802,86 @@ def integration_tests( output=None, 
test_timeout=test_timeout, output_outside_the_group=True, + skip_docker_compose_down=skip_docker_compose_down, + ) + sys.exit(returncode) + + +@group_for_testing.command( + name="system-tests", + help="Run the specified system tests.", + context_settings=dict( + ignore_unknown_options=True, + allow_extra_args=True, + ), +) +@option_backend +@option_collect_only +@option_db_reset +@option_dry_run +@option_enable_coverage +@option_force_sa_warnings +@option_forward_credentials +@option_github_repository +@option_keep_env_variables +@option_mount_sources +@option_mysql_version +@option_no_db_cleanup +@option_postgres_version +@option_python +@option_skip_docker_compose_down +@option_test_timeout +@option_verbose +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def system_tests( + backend: str, + collect_only: bool, + db_reset: bool, + enable_coverage: bool, + extra_pytest_args: tuple, + force_sa_warnings: bool, + forward_credentials: bool, + github_repository: str, + keep_env_variables: bool, + mount_sources: str, + mysql_version: str, + no_db_cleanup: bool, + postgres_version: str, + python: str, + skip_docker_compose_down: bool, + test_timeout: int, +): + shell_params = ShellParams( + test_group=GroupOfTests.SYSTEM, + backend=backend, + collect_only=collect_only, + enable_coverage=enable_coverage, + forward_credentials=forward_credentials, + forward_ports=False, + github_repository=github_repository, + integration=(), + keep_env_variables=keep_env_variables, + mount_sources=mount_sources, + mysql_version=mysql_version, + no_db_cleanup=no_db_cleanup, + postgres_version=postgres_version, + python=python, + test_type="None", + force_sa_warnings=force_sa_warnings, + run_tests=True, + db_reset=db_reset, + ) + fix_ownership_using_docker() + cleanup_python_generated_files() + perform_environment_checks() + returncode, _ = _run_test( + shell_params=shell_params, + extra_pytest_args=extra_pytest_args, + python_version=python, + output=None, + test_timeout=test_timeout, + output_outside_the_group=True, + skip_docker_compose_down=skip_docker_compose_down, ) sys.exit(returncode) @@ -914,49 +894,38 @@ def integration_tests( allow_extra_args=True, ), ) -@option_image_tag_for_running @option_mount_sources @option_github_repository @option_test_timeout @option_parallelism +@option_test_type_helm @option_use_xdist @option_verbose @option_dry_run -@click.option( - "--helm-test-package", - help="Package to tests", - default="all", - type=BetterChoice(ALLOWED_HELM_TEST_PACKAGES), -) -@click.argument("extra_pytest_args", nargs=-1, type=click.UNPROCESSED) +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) def helm_tests( extra_pytest_args: tuple, - image_tag: str | None, mount_sources: str, - helm_test_package: str, github_repository: str, test_timeout: int, + test_type: str, parallelism: int, use_xdist: bool, ): - if helm_test_package == "all": - helm_test_package = "" shell_params = ShellParams( - image_tag=image_tag, mount_sources=mount_sources, github_repository=github_repository, run_tests=True, - test_type="Helm", - helm_test_package=helm_test_package, + test_type=test_type, ) env = shell_params.env_variables_for_docker_commands perform_environment_checks() fix_ownership_using_docker() cleanup_python_generated_files() pytest_args = generate_args_for_pytest( - test_type="Helm", + test_group=GroupOfTests.HELM, + test_type=test_type, test_timeout=test_timeout, - skip_provider_tests=True, skip_db_tests=False, run_db_tests_only=False, backend="none", 
@@ -966,12 +935,253 @@ def helm_tests( parallelism=parallelism, parallel_test_types_list=[], python_version=shell_params.python, - helm_test_package=helm_test_package, keep_env_variables=False, no_db_cleanup=False, - database_isolation=False, ) cmd = ["docker", "compose", "run", "--service-ports", "--rm", "airflow", *pytest_args, *extra_pytest_args] result = run_command(cmd, check=False, env=env, output_outside_the_group=True) fix_ownership_using_docker() sys.exit(result.returncode) + + +@group_for_testing.command( + name="python-api-client-tests", + help="Run python api client tests.", + context_settings=dict( + ignore_unknown_options=True, + allow_extra_args=True, + ), +) +@option_backend +@option_collect_only +@option_db_reset +@option_no_db_cleanup +@option_enable_coverage +@option_force_sa_warnings +@option_forward_credentials +@option_github_repository +@option_keep_env_variables +@option_mysql_version +@option_postgres_version +@option_python +@option_skip_docker_compose_down +@option_test_timeout +@option_dry_run +@option_verbose +@click.argument("extra_pytest_args", nargs=-1, type=click.Path(path_type=str)) +def python_api_client_tests( + backend: str, + collect_only: bool, + db_reset: bool, + no_db_cleanup: bool, + enable_coverage: bool, + force_sa_warnings: bool, + forward_credentials: bool, + github_repository: str, + keep_env_variables: bool, + mysql_version: str, + postgres_version: str, + python: str, + skip_docker_compose_down: bool, + test_timeout: int, + extra_pytest_args: tuple, +): + shell_params = ShellParams( + test_group=GroupOfTests.PYTHON_API_CLIENT, + backend=backend, + collect_only=collect_only, + enable_coverage=enable_coverage, + forward_credentials=forward_credentials, + forward_ports=False, + github_repository=github_repository, + integration=(), + keep_env_variables=keep_env_variables, + mysql_version=mysql_version, + postgres_version=postgres_version, + python=python, + test_type="python-api-client", + force_sa_warnings=force_sa_warnings, + run_tests=True, + db_reset=db_reset, + no_db_cleanup=no_db_cleanup, + install_airflow_python_client=True, + start_webserver_with_examples=True, + ) + rebuild_or_pull_ci_image_if_needed(command_params=shell_params) + fix_ownership_using_docker() + cleanup_python_generated_files() + perform_environment_checks() + returncode, _ = _run_test( + shell_params=shell_params, + extra_pytest_args=extra_pytest_args, + python_version=python, + output=None, + test_timeout=test_timeout, + output_outside_the_group=True, + skip_docker_compose_down=skip_docker_compose_down, + ) + sys.exit(returncode) + + +@contextlib.contextmanager +def run_with_timeout(timeout: int): + def timeout_handler(signum, frame): + get_console().print("[error]Timeout reached. 
Killing the container(s)[/]") + list_of_containers = run_command( + ["docker", "ps", "-q"], + check=True, + capture_output=True, + text=True, + ) + run_command( + ["docker", "kill", "--signal", "SIGQUIT"] + list_of_containers.stdout.splitlines(), + check=True, + capture_output=False, + text=True, + ) + + signal.signal(signal.SIGALRM, timeout_handler) + signal.alarm(timeout) + try: + yield + finally: + signal.alarm(0) + + +def _run_test_command( + *, + test_group: GroupOfTests, + airflow_constraints_reference: str, + backend: str, + collect_only: bool, + clean_airflow_installation: bool, + db_reset: bool, + debug_resources: bool, + downgrade_sqlalchemy: bool, + downgrade_pendulum: bool, + enable_coverage: bool, + excluded_parallel_test_types: str, + excluded_providers: str, + extra_pytest_args: tuple, + force_sa_warnings: bool, + forward_credentials: bool, + force_lowest_dependencies: bool, + github_repository: str, + include_success_outputs: bool, + install_airflow_with_constraints: bool, + integration: tuple[str, ...], + keep_env_variables: bool, + mount_sources: str, + no_db_cleanup: bool, + parallel_test_types: str, + parallelism: int, + package_format: str, + providers_constraints_location: str, + providers_skip_constraints: bool, + python: str, + remove_arm_packages: bool, + run_db_tests_only: bool, + run_in_parallel: bool, + skip_cleanup: bool, + skip_db_tests: bool, + skip_docker_compose_down: bool, + skip_providers: str, + test_timeout: int, + test_type: str, + total_test_timeout: int, + upgrade_boto: bool, + use_airflow_version: str | None, + use_packages_from_dist: bool, + use_xdist: bool, + mysql_version: str = "", + postgres_version: str = "", +): + _verify_parallelism_parameters( + excluded_parallel_test_types, run_db_tests_only, run_in_parallel, use_xdist + ) + test_list = parallel_test_types.split(" ") + excluded_test_list = excluded_parallel_test_types.split(" ") + if excluded_test_list: + test_list = [test for test in test_list if test not in excluded_test_list] + shell_params = ShellParams( + airflow_constraints_reference=airflow_constraints_reference, + backend=backend, + collect_only=collect_only, + clean_airflow_installation=clean_airflow_installation, + downgrade_sqlalchemy=downgrade_sqlalchemy, + downgrade_pendulum=downgrade_pendulum, + enable_coverage=enable_coverage, + excluded_providers=excluded_providers, + force_sa_warnings=force_sa_warnings, + force_lowest_dependencies=force_lowest_dependencies, + forward_credentials=forward_credentials, + forward_ports=False, + github_repository=github_repository, + integration=integration, + install_airflow_with_constraints=install_airflow_with_constraints, + keep_env_variables=keep_env_variables, + mount_sources=mount_sources, + mysql_version=mysql_version, + no_db_cleanup=no_db_cleanup, + package_format=package_format, + parallel_test_types_list=test_list, + parallelism=parallelism, + postgres_version=postgres_version, + providers_constraints_location=providers_constraints_location, + providers_skip_constraints=providers_skip_constraints, + python=python, + remove_arm_packages=remove_arm_packages, + run_db_tests_only=run_db_tests_only, + skip_db_tests=skip_db_tests, + test_type=test_type, + test_group=test_group, + upgrade_boto=upgrade_boto, + use_airflow_version=use_airflow_version, + use_packages_from_dist=use_packages_from_dist, + use_xdist=use_xdist, + run_tests=True, + db_reset=db_reset if not skip_db_tests else False, + ) + rebuild_or_pull_ci_image_if_needed(command_params=shell_params) + fix_ownership_using_docker() 
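The run_with_timeout context manager above bounds the total wall-clock time of a parallel test run with a signal.SIGALRM alarm. Below is a minimal, self-contained sketch of the same pattern, not code from this patch: alarm_after and slow_operation are hypothetical names, and the sketch raises TimeoutError instead of sending SIGQUIT to running docker containers, so it can be run standalone on any Unix machine.

import contextlib
import signal
import time


@contextlib.contextmanager
def alarm_after(seconds: int):
    # Illustrative only: raise TimeoutError if the wrapped block outlives `seconds` (Unix only).
    def _handler(signum, frame):
        raise TimeoutError(f"timed out after {seconds} seconds")

    previous = signal.signal(signal.SIGALRM, _handler)
    signal.alarm(seconds)
    try:
        yield
    finally:
        signal.alarm(0)  # cancel any pending alarm
        signal.signal(signal.SIGALRM, previous)  # restore the previous handler


def slow_operation():
    time.sleep(5)  # stands in for a hanging pytest invocation


if __name__ == "__main__":
    try:
        with alarm_after(1):
            slow_operation()
    except TimeoutError as err:
        print(err)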
+ cleanup_python_generated_files() + perform_environment_checks() + if skip_providers: + ignored_path_list = [ + f"--ignore=tests/providers/{provider_id.replace('.','/')}" + for provider_id in skip_providers.split(" ") + ] + extra_pytest_args = (*extra_pytest_args, *ignored_path_list) + if run_in_parallel: + if test_type != ALL_TEST_TYPE: + get_console().print( + "[error]You should not specify --test-type when --run-in-parallel is set[/]. " + f"Your test type = {test_type}\n" + ) + sys.exit(1) + with run_with_timeout(total_test_timeout): + run_tests_in_parallel( + shell_params=shell_params, + extra_pytest_args=extra_pytest_args, + test_timeout=test_timeout, + include_success_outputs=include_success_outputs, + parallelism=parallelism, + skip_cleanup=skip_cleanup, + debug_resources=debug_resources, + skip_docker_compose_down=skip_docker_compose_down, + ) + else: + if shell_params.test_type == ALL_TEST_TYPE: + if any(["tests/" in arg and not arg.startswith("-") for arg in extra_pytest_args]): + shell_params.test_type = "None" + shell_params.parallel_test_types_list = [] + returncode, _ = _run_test( + shell_params=shell_params, + extra_pytest_args=extra_pytest_args, + python_version=python, + output=None, + test_timeout=test_timeout, + output_outside_the_group=True, + skip_docker_compose_down=skip_docker_compose_down, + ) + sys.exit(returncode) diff --git a/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py b/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py index 5f464bfd9b917..895756ec2ab7d 100644 --- a/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py +++ b/dev/breeze/src/airflow_breeze/commands/testing_commands_config.py @@ -16,273 +16,185 @@ # under the License. from __future__ import annotations -TESTING_COMMANDS: dict[str, str | list[str]] = { - "name": "Testing", - "commands": ["tests", "integration-tests", "helm-tests", "docker-compose-tests"], +TEST_OPTIONS_NON_DB: dict[str, str | list[str]] = { + "name": "Test options", + "options": [ + "--test-timeout", + "--enable-coverage", + "--collect-only", + ], +} + +TEST_OPTIONS_DB: dict[str, str | list[str]] = { + "name": "Test options", + "options": [ + "--test-timeout", + "--enable-coverage", + "--collect-only", + "--db-reset", + ], +} + +TEST_ENVIRONMENT_DB: dict[str, str | list[str]] = { + "name": "Test environment", + "options": [ + "--backend", + "--no-db-cleanup", + "--python", + "--postgres-version", + "--mysql-version", + "--forward-credentials", + "--force-sa-warnings", + ], +} + +TEST_PARALLELISM_OPTIONS: dict[str, str | list[str]] = { + "name": "Options for parallel test commands", + "options": [ + "--run-in-parallel", + "--use-xdist", + "--parallelism", + "--skip-cleanup", + "--debug-resources", + "--include-success-outputs", + "--total-test-timeout", + ], +} + +TEST_UPGRADING_PACKAGES: dict[str, str | list[str]] = { + "name": "Upgrading/downgrading/removing selected packages", + "options": [ + "--upgrade-boto", + "--downgrade-sqlalchemy", + "--downgrade-pendulum", + "--remove-arm-packages", + ], +} + +TEST_ADVANCED_FLAGS: dict[str, str | list[str]] = { + "name": "Advanced flag for tests command", + "options": [ + "--github-repository", + "--mount-sources", + "--skip-docker-compose-down", + "--keep-env-variables", + ], +} + +TEST_ADVANCED_FLAGS_FOR_INSTALLATION: dict[str, str | list[str]] = { + "name": "Advanced flag for installing airflow in container", + "options": [ + "--airflow-constraints-reference", + "--clean-airflow-installation", + "--force-lowest-dependencies", + 
"--install-airflow-with-constraints", + "--package-format", + "--use-airflow-version", + "--use-packages-from-dist", + ], +} + +TEST_ADVANCED_FLAGS_FOR_PROVIDERS: dict[str, str | list[str]] = { + "name": "Advanced flag for provider tests command", + "options": [ + "--excluded-providers", + "--providers-constraints-location", + "--providers-skip-constraints", + "--skip-providers", + ], +} + +TEST_PARAMS: list[dict[str, str | list[str]]] = [ + { + "name": "Select test types to run (tests can also be selected by command args individually)", + "options": [ + "--test-type", + "--parallel-test-types", + "--excluded-parallel-test-types", + ], + }, + TEST_OPTIONS_DB, + { + "name": "Selectively run DB or non-DB tests", + "options": [ + "--run-db-tests-only", + "--skip-db-tests", + ], + }, + TEST_ENVIRONMENT_DB, + TEST_PARALLELISM_OPTIONS, + TEST_UPGRADING_PACKAGES, +] + +INTEGRATION_TESTS: dict[str, str | list[str]] = { + "name": "Integration tests", + "options": [ + "--integration", + ], } + +TESTING_COMMANDS: list[dict[str, str | list[str]]] = [ + { + "name": "Core Tests", + "commands": [ + "core-tests", + "core-integration-tests", + ], + }, + { + "name": "Providers Tests", + "commands": ["providers-tests", "providers-integration-tests"], + }, + { + "name": "Other Tests", + "commands": ["system-tests", "helm-tests", "docker-compose-tests", "python-api-client-tests"], + }, +] + TESTING_PARAMETERS: dict[str, list[dict[str, str | list[str]]]] = { - "breeze testing tests": [ - { - "name": "Select test types to run (tests can also be selected by command args individually)", - "options": [ - "--test-type", - "--parallel-test-types", - "--excluded-parallel-test-types", - ], - }, - { - "name": "Test options", - "options": [ - "--test-timeout", - "--enable-coverage", - "--collect-only", - "--db-reset", - "--skip-provider-tests", - ], - }, - { - "name": "Selectively run DB or non-DB tests", - "options": [ - "--run-db-tests-only", - "--skip-db-tests", - ], - }, - { - "name": "Test environment", - "options": [ - "--integration", - "--backend", - "--database-isolation", - "--python", - "--postgres-version", - "--mysql-version", - "--forward-credentials", - "--force-sa-warnings", - ], - }, - { - "name": "Options for parallel test commands", - "options": [ - "--run-in-parallel", - "--use-xdist", - "--parallelism", - "--skip-cleanup", - "--debug-resources", - "--include-success-outputs", - ], - }, - { - "name": "Upgrading/downgrading/removing selected packages", - "options": [ - "--upgrade-boto", - "--downgrade-sqlalchemy", - "--downgrade-pendulum", - "--pydantic", - "--remove-arm-packages", - ], - }, - { - "name": "Advanced flag for tests command", - "options": [ - "--airflow-constraints-reference", - "--force-lowest-dependencies", - "--github-repository", - "--image-tag", - "--install-airflow-with-constraints", - "--package-format", - "--providers-constraints-location", - "--providers-skip-constraints", - "--use-airflow-version", - "--use-packages-from-dist", - "--mount-sources", - "--skip-docker-compose-down", - "--skip-providers", - "--keep-env-variables", - "--no-db-cleanup", - ], - }, + "breeze testing core-tests": [ + *TEST_PARAMS, + TEST_ADVANCED_FLAGS, + TEST_ADVANCED_FLAGS_FOR_INSTALLATION, ], - "breeze testing non-db-tests": [ - { - "name": "Select test types to run", - "options": [ - "--parallel-test-types", - "--excluded-parallel-test-types", - ], - }, - { - "name": "Test options", - "options": [ - "--test-timeout", - "--enable-coverage", - "--collect-only", - "--skip-provider-tests", - ], - }, - { 
- "name": "Test environment", - "options": [ - "--python", - "--forward-credentials", - "--force-sa-warnings", - ], - }, - { - "name": "Options for parallel test commands", - "options": [ - "--parallelism", - "--skip-cleanup", - "--debug-resources", - "--include-success-outputs", - ], - }, - { - "name": "Upgrading/downgrading/removing selected packages", - "options": [ - "--upgrade-boto", - "--downgrade-sqlalchemy", - "--downgrade-pendulum", - "--pydantic", - "--remove-arm-packages", - ], - }, - { - "name": "Advanced flag for tests command", - "options": [ - "--airflow-constraints-reference", - "--force-lowest-dependencies", - "--github-repository", - "--image-tag", - "--install-airflow-with-constraints", - "--package-format", - "--providers-constraints-location", - "--providers-skip-constraints", - "--use-airflow-version", - "--use-packages-from-dist", - "--mount-sources", - "--skip-docker-compose-down", - "--skip-providers", - "--keep-env-variables", - "--no-db-cleanup", - ], - }, + "breeze testing providers-tests": [ + *TEST_PARAMS, + TEST_ADVANCED_FLAGS, + TEST_ADVANCED_FLAGS_FOR_INSTALLATION, + TEST_ADVANCED_FLAGS_FOR_PROVIDERS, ], - "breeze testing db-tests": [ - { - "name": "Select tests to run", - "options": [ - "--parallel-test-types", - "--database-isolation", - "--excluded-parallel-test-types", - ], - }, - { - "name": "Test options", - "options": [ - "--test-timeout", - "--enable-coverage", - "--collect-only", - "--skip-provider-tests", - ], - }, - { - "name": "Test environment", - "options": [ - "--backend", - "--python", - "--postgres-version", - "--mysql-version", - "--forward-credentials", - "--force-sa-warnings", - ], - }, - { - "name": "Options for parallel test commands", - "options": [ - "--parallelism", - "--skip-cleanup", - "--debug-resources", - "--include-success-outputs", - ], - }, - { - "name": "Upgrading/downgrading/removing selected packages", - "options": [ - "--upgrade-boto", - "--downgrade-sqlalchemy", - "--downgrade-pendulum", - "--pydantic", - "--remove-arm-packages", - ], - }, - { - "name": "Advanced flag for tests command", - "options": [ - "--airflow-constraints-reference", - "--force-lowest-dependencies", - "--github-repository", - "--image-tag", - "--install-airflow-with-constraints", - "--package-format", - "--providers-constraints-location", - "--providers-skip-constraints", - "--use-airflow-version", - "--use-packages-from-dist", - "--mount-sources", - "--skip-docker-compose-down", - "--skip-providers", - "--keep-env-variables", - "--no-db-cleanup", - ], - }, + "breeze testing core-integration-tests": [ + TEST_OPTIONS_DB, + TEST_ENVIRONMENT_DB, + INTEGRATION_TESTS, + TEST_ADVANCED_FLAGS, ], - "breeze testing integration-tests": [ - { - "name": "Test options", - "options": [ - "--test-timeout", - "--enable-coverage", - "--db-reset", - "--skip-provider-tests", - ], - }, - { - "name": "Test environment", - "options": [ - "--integration", - "--backend", - "--python", - "--postgres-version", - "--mysql-version", - "--forward-credentials", - "--force-sa-warnings", - ], - }, - { - "name": "Advanced flag for integration tests command", - "options": [ - "--image-tag", - "--mount-sources", - "--github-repository", - ], - }, + "breeze testing providers-integration-tests": [ + TEST_OPTIONS_DB, + TEST_ENVIRONMENT_DB, + INTEGRATION_TESTS, + TEST_ADVANCED_FLAGS, + ], + "breeze testing system-tests": [ + TEST_OPTIONS_DB, + TEST_ENVIRONMENT_DB, + TEST_ADVANCED_FLAGS, ], "breeze testing helm-tests": [ { "name": "Flags for helms-tests command", "options": [ - 
"--helm-test-package", + "--test-type", "--test-timeout", "--use-xdist", "--parallelism", ], }, { - "name": "Advanced flags for helms-tests command", + "name": "Advanced flag for helm-test command", "options": [ - "--image-tag", - "--mount-sources", "--github-repository", + "--mount-sources", ], }, ], @@ -291,11 +203,22 @@ "name": "Docker-compose tests flag", "options": [ "--image-name", - "--image-tag", "--python", "--skip-docker-compose-deletion", "--github-repository", ], } ], + "breeze testing python-api-client-tests": [ + { + "name": "Advanced flag for tests command", + "options": [ + "--github-repository", + "--skip-docker-compose-down", + "--keep-env-variables", + ], + }, + TEST_OPTIONS_DB, + TEST_ENVIRONMENT_DB, + ], } diff --git a/dev/breeze/src/airflow_breeze/configure_rich_click.py b/dev/breeze/src/airflow_breeze/configure_rich_click.py index fc6ccb9d59084..a7a147a04e5d0 100644 --- a/dev/breeze/src/airflow_breeze/configure_rich_click.py +++ b/dev/breeze/src/airflow_breeze/configure_rich_click.py @@ -97,7 +97,7 @@ "commands": ["setup", "ci"], }, ], - "breeze testing": [TESTING_COMMANDS], + "breeze testing": TESTING_COMMANDS, "breeze k8s": [ KUBERNETES_CLUSTER_COMMANDS, KUBERNETES_INSPECTION_COMMANDS, diff --git a/dev/breeze/src/airflow_breeze/global_constants.py b/dev/breeze/src/airflow_breeze/global_constants.py index 2e1b76dd0318f..d374a30ededb9 100644 --- a/dev/breeze/src/airflow_breeze/global_constants.py +++ b/dev/breeze/src/airflow_breeze/global_constants.py @@ -23,9 +23,9 @@ import json import platform from enum import Enum -from functools import lru_cache from pathlib import Path +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.host_info_utils import Architecture from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT @@ -48,17 +48,24 @@ ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS = ["3.8", "3.9", "3.10", "3.11", "3.12"] DEFAULT_PYTHON_MAJOR_MINOR_VERSION = ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS[0] ALLOWED_ARCHITECTURES = [Architecture.X86_64, Architecture.ARM] -# Database Backends used when starting Breeze. The "none" value means that invalid configuration -# Is set and no database started - access to a database will fail. -ALLOWED_BACKENDS = ["sqlite", "mysql", "postgres", "none"] -ALLOWED_PROD_BACKENDS = ["mysql", "postgres"] +# Database Backends used when starting Breeze. The "none" value means that the configuration is invalid. +# No database will be started - access to a database will fail. 
+SQLITE_BACKEND = "sqlite" +MYSQL_BACKEND = "mysql" +POSTGRES_BACKEND = "postgres" +NONE_BACKEND = "none" +ALLOWED_BACKENDS = [SQLITE_BACKEND, MYSQL_BACKEND, POSTGRES_BACKEND, NONE_BACKEND] +ALLOWED_PROD_BACKENDS = [MYSQL_BACKEND, POSTGRES_BACKEND] DEFAULT_BACKEND = ALLOWED_BACKENDS[0] -TESTABLE_INTEGRATIONS = [ +CELERY_INTEGRATION = "celery" +TESTABLE_CORE_INTEGRATIONS = [ + CELERY_INTEGRATION, + "kerberos", +] +TESTABLE_PROVIDERS_INTEGRATIONS = [ "cassandra", - "celery", "drill", "kafka", - "kerberos", "mongo", "mssql", "pinot", @@ -70,19 +77,45 @@ DISABLE_TESTABLE_INTEGRATIONS_FROM_CI = [ "mssql", ] -OTHER_INTEGRATIONS = ["statsd", "otel", "openlineage"] +KEYCLOAK_INTEGRATION = "keycloak" +STATSD_INTEGRATION = "statsd" +OTEL_INTEGRATION = "otel" +OPENLINEAGE_INTEGRATION = "openlineage" +OTHER_CORE_INTEGRATIONS = [STATSD_INTEGRATION, OTEL_INTEGRATION, KEYCLOAK_INTEGRATION] +OTHER_PROVIDERS_INTEGRATIONS = [OPENLINEAGE_INTEGRATION] ALLOWED_DEBIAN_VERSIONS = ["bookworm"] -ALL_INTEGRATIONS = sorted( +ALL_CORE_INTEGRATIONS = sorted( [ - *TESTABLE_INTEGRATIONS, - *OTHER_INTEGRATIONS, + *TESTABLE_CORE_INTEGRATIONS, + *OTHER_CORE_INTEGRATIONS, + ] +) +ALL_PROVIDERS_INTEGRATIONS = sorted( + [ + *TESTABLE_PROVIDERS_INTEGRATIONS, + *OTHER_PROVIDERS_INTEGRATIONS, + ] +) +AUTOCOMPLETE_CORE_INTEGRATIONS = sorted( + [ + "all-testable", + "all", + *ALL_CORE_INTEGRATIONS, + ] +) +AUTOCOMPLETE_PROVIDERS_INTEGRATIONS = sorted( + [ + "all-testable", + "all", + *ALL_PROVIDERS_INTEGRATIONS, ] ) -AUTOCOMPLETE_INTEGRATIONS = sorted( +AUTOCOMPLETE_ALL_INTEGRATIONS = sorted( [ "all-testable", "all", - *ALL_INTEGRATIONS, + *ALL_CORE_INTEGRATIONS, + *ALL_PROVIDERS_INTEGRATIONS, ] ) ALLOWED_TTY = ["auto", "enabled", "disabled"] @@ -93,7 +126,7 @@ # - https://endoflife.date/amazon-eks # - https://endoflife.date/azure-kubernetes-service # - https://endoflife.date/google-kubernetes-engine -ALLOWED_KUBERNETES_VERSIONS = ["v1.27.13", "v1.28.9", "v1.29.4", "v1.30.0"] +ALLOWED_KUBERNETES_VERSIONS = ["v1.28.15", "v1.29.12", "v1.30.8", "v1.31.4", "v1.32.0"] LOCAL_EXECUTOR = "LocalExecutor" KUBERNETES_EXECUTOR = "KubernetesExecutor" @@ -139,7 +172,7 @@ ] USE_AIRFLOW_MOUNT_SOURCES = [MOUNT_REMOVE, MOUNT_TESTS, MOUNT_PROVIDERS_AND_TESTS] -ALLOWED_POSTGRES_VERSIONS = ["12", "13", "14", "15", "16"] +ALLOWED_POSTGRES_VERSIONS = ["13", "14", "15", "16", "17"] # Oracle introduced new release model for MySQL # - LTS: Long Time Support releases, new release approx every 2 year, # with 5 year premier and 3 year extended support, no new features/removals during current LTS release. 
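Note on the caching change in the global_constants.py hunks above (and in several files later in this patch): functools.lru_cache is replaced by a clearable_cache decorator imported from airflow_breeze.utils.functools_cache, whose implementation is not part of this diff. A minimal sketch of what such a helper could look like, assuming it is simply an unbounded lru_cache that registers every wrapped function so all caches can be cleared at once (the real utility and its helper names may differ):

from __future__ import annotations

import functools
from typing import Callable, TypeVar

# Registry of all functions wrapped by clearable_cache (assumption for this sketch).
_CACHED_FUNCTIONS: list[Callable] = []

F = TypeVar("F", bound=Callable)


def clearable_cache(func: F) -> F:
    # Unbounded memoization, same behaviour as functools.lru_cache(maxsize=None).
    cached = functools.lru_cache(maxsize=None)(func)
    # Remember the wrapped function so its cache can be cleared later.
    _CACHED_FUNCTIONS.append(cached)
    return cached  # type: ignore[return-value]


def clear_all_cached_functions() -> None:
    # Reset every cache created through clearable_cache (useful between test runs).
    for cached in _CACHED_FUNCTIONS:
        cached.cache_clear()

Keeping the wrappers in a registry is what makes the caches clearable in one call, which plain lru_cache does not offer without tracking each decorated function by hand.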
@@ -156,7 +189,7 @@ ALLOWED_INSTALL_MYSQL_CLIENT_TYPES = ["mariadb", "mysql"] PIP_VERSION = "24.3.1" -UV_VERSION = "0.5.11" +UV_VERSION = "0.5.17" DEFAULT_UV_HTTP_TIMEOUT = 300 DEFAULT_WSL2_HTTP_TIMEOUT = 900 @@ -170,48 +203,58 @@ ] -@lru_cache(maxsize=None) -def all_selective_test_types() -> tuple[str, ...]: - return tuple(sorted(e.value for e in SelectiveUnitTestTypes)) +@clearable_cache +def all_selective_core_test_types() -> tuple[str, ...]: + return tuple(sorted(e.value for e in SelectiveCoreTestType)) -@lru_cache(maxsize=None) -def all_selective_test_types_except_providers() -> tuple[str, ...]: - return tuple(sorted(e.value for e in SelectiveUnitTestTypes if e != SelectiveUnitTestTypes.PROVIDERS)) +@clearable_cache +def providers_test_type() -> tuple[str, ...]: + return tuple(sorted(e.value for e in SelectiveProvidersTestType)) -class SelectiveUnitTestTypes(Enum): +class SelectiveTestType(Enum): + pass + + +class SelectiveCoreTestType(SelectiveTestType): ALWAYS = "Always" API = "API" - BRANCH_PYTHON_VENV = "BranchPythonVenv" - EXTERNAL_PYTHON = "ExternalPython" - EXTERNAL_BRANCH_PYTHON = "BranchExternalPython" CLI = "CLI" CORE = "Core" SERIALIZATION = "Serialization" OTHER = "Other" OPERATORS = "Operators" - PLAIN_ASSERTS = "PlainAsserts" - PROVIDERS = "Providers" - PYTHON_VENV = "PythonVenv" WWW = "WWW" -ALLOWED_TEST_TYPE_CHOICES = [ - "All", - "Default", - *all_selective_test_types(), - "All-Postgres", - "All-MySQL", - "All-Quarantined", -] +class SelectiveProvidersTestType(SelectiveTestType): + PROVIDERS = "Providers" -ALLOWED_PARALLEL_TEST_TYPE_CHOICES = [ - *all_selective_test_types(), -] + +class GroupOfTests(Enum): + CORE = "core" + PROVIDERS = "providers" + HELM = "helm" + INTEGRATION_CORE = "integration-core" + INTEGRATION_PROVIDERS = "integration-providers" + SYSTEM = "system" + PYTHON_API_CLIENT = "python-api-client" -@lru_cache(maxsize=None) +ALL_TEST_TYPE = "All" +NONE_TEST_TYPE = "None" + +ALL_TEST_SUITES: dict[str, tuple[str, ...]] = { + "All": (), + "All-Long": ("-m", "long_running", "--include-long-running"), + "All-Quarantined": ("-m", "quarantined", "--include-quarantined"), + "All-Postgres": ("--backend", "postgres"), + "All-MySQL": ("--backend", "mysql"), +} + + +@clearable_cache def all_helm_test_packages() -> list[str]: return sorted( [ @@ -222,10 +265,11 @@ def all_helm_test_packages() -> list[str]: ) -ALLOWED_HELM_TEST_PACKAGES = [ - "all", - *all_helm_test_packages(), -] +ALLOWED_TEST_TYPE_CHOICES: dict[GroupOfTests, list[str]] = { + GroupOfTests.CORE: [*ALL_TEST_SUITES.keys(), *all_selective_core_test_types()], + GroupOfTests.PROVIDERS: [*ALL_TEST_SUITES.keys()], + GroupOfTests.HELM: [ALL_TEST_TYPE, *all_helm_test_packages()], +} ALLOWED_PACKAGE_FORMATS = ["wheel", "sdist", "both"] ALLOWED_INSTALLATION_PACKAGE_FORMATS = ["wheel", "sdist"] @@ -237,7 +281,6 @@ def all_helm_test_packages() -> list[str]: ALLOWED_PLATFORMS = [*SINGLE_PLATFORMS, MULTI_PLATFORM] ALLOWED_USE_AIRFLOW_VERSIONS = ["none", "wheel", "sdist"] -ALLOWED_PYDANTIC_VERSIONS = ["v2", "v1", "none"] ALL_HISTORICAL_PYTHON_VERSIONS = ["3.6", "3.7", "3.8", "3.9", "3.10", "3.11", "3.12"] @@ -262,6 +305,7 @@ def get_default_platform_machine() -> str: REDIS_HOST_PORT = "26379" SSH_PORT = "12322" WEBSERVER_HOST_PORT = "28080" +VITE_DEV_PORT = "5173" CELERY_BROKER_URLS_MAP = {"rabbitmq": "amqp://guest:guest@rabbitmq:5672", "redis": "redis://redis:6379/0"} SQLITE_URL = "sqlite:////root/airflow/sqlite/airflow.db" @@ -273,7 +317,7 @@ def get_default_platform_machine() -> str: # changes from main to the 
previous branch. ALL_PYTHON_MAJOR_MINOR_VERSIONS = ["3.8", "3.9", "3.10", "3.11", "3.12"] CURRENT_PYTHON_MAJOR_MINOR_VERSIONS = ALL_PYTHON_MAJOR_MINOR_VERSIONS -CURRENT_POSTGRES_VERSIONS = ["12", "13", "14", "15", "16"] +CURRENT_POSTGRES_VERSIONS = ["13", "14", "15", "16", "17"] DEFAULT_POSTGRES_VERSION = CURRENT_POSTGRES_VERSIONS[0] USE_MYSQL_INNOVATION_RELEASE = True if USE_MYSQL_INNOVATION_RELEASE: @@ -324,6 +368,14 @@ def get_default_platform_machine() -> str: "2.8.2": ["3.8", "3.9", "3.10", "3.11"], "2.8.3": ["3.8", "3.9", "3.10", "3.11"], "2.9.0": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.9.1": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.9.2": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.9.3": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.10.0": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.10.1": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.10.2": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.10.3": ["3.8", "3.9", "3.10", "3.11", "3.12"], + "2.10.4": ["3.8", "3.9", "3.10", "3.11", "3.12"], } DB_RESET = False @@ -355,12 +407,14 @@ def get_default_platform_machine() -> str: "bolkedebruin", "criccomini", "dimberman", + "dirrao", "dstandish", "eladkal", "ephraimbuddy", "feluelle", "feng-tao", "ferruzzi", + "gopidesupavan", "houqp", "hussein-awala", "jedcunningham", @@ -392,6 +446,7 @@ def get_default_platform_machine() -> str: "saguziel", "sekikn", "shahar1", + "tirkarthi", "turbaszek", "uranusjr", "utkarsharma2", @@ -416,7 +471,7 @@ def get_airflow_version(): return airflow_version -@lru_cache(maxsize=None) +@clearable_cache def get_airflow_extras(): airflow_dockerfile = AIRFLOW_SOURCES_ROOT / "Dockerfile" with open(airflow_dockerfile) as dockerfile: @@ -427,7 +482,7 @@ def get_airflow_extras(): # Initialize integrations -ALL_PROVIDER_YAML_FILES = Path(AIRFLOW_SOURCES_ROOT, "airflow", "providers").rglob("provider.yaml") +ALL_PROVIDER_YAML_FILES = Path(AIRFLOW_SOURCES_ROOT, "providers").rglob("provider.yaml") PROVIDER_RUNTIME_DATA_SCHEMA_PATH = AIRFLOW_SOURCES_ROOT / "airflow" / "provider_info.schema.json" with Path(AIRFLOW_SOURCES_ROOT, "generated", "provider_dependencies.json").open() as f: @@ -444,21 +499,18 @@ def get_airflow_extras(): "scripts/docker/common.sh", "scripts/docker/install_additional_dependencies.sh", "scripts/docker/install_airflow.sh", - "scripts/docker/install_airflow_dependencies_from_branch_tip.sh", "scripts/docker/install_from_docker_context_files.sh", "scripts/docker/install_mysql.sh", ] -ENABLED_SYSTEMS = "" - CURRENT_KUBERNETES_VERSIONS = ALLOWED_KUBERNETES_VERSIONS CURRENT_EXECUTORS = [KUBERNETES_EXECUTOR] DEFAULT_KUBERNETES_VERSION = CURRENT_KUBERNETES_VERSIONS[0] DEFAULT_EXECUTOR = CURRENT_EXECUTORS[0] -KIND_VERSION = "v0.23.0" -HELM_VERSION = "v3.15.3" +KIND_VERSION = "v0.26.0" +HELM_VERSION = "v3.16.4" # Initialize image build variables - Have to check if this has to go to ci dataclass USE_AIRFLOW_VERSION = None @@ -510,26 +562,20 @@ def get_airflow_extras(): # END OF EXTRAS LIST UPDATED BY PRE COMMIT ] -CHICKEN_EGG_PROVIDERS = " ".join([]) +CHICKEN_EGG_PROVIDERS = "" -BASE_PROVIDERS_COMPATIBILITY_CHECKS: list[dict[str, str | list[str]]] = [ - { - "python-version": "3.8", - "airflow-version": "2.7.3", - "remove-providers": "common.io fab", - "run-tests": "true", - }, +PROVIDERS_COMPATIBILITY_TESTS_MATRIX: list[dict[str, str | list[str]]] = [ { - "python-version": "3.8", - "airflow-version": "2.8.4", - "remove-providers": "fab", + "python-version": "3.9", + "airflow-version": "2.9.3", + "remove-providers": "cloudant fab edge", "run-tests": "true", }, { - 
"python-version": "3.8", - "airflow-version": "2.9.1", - "remove-providers": "", + "python-version": "3.9", + "airflow-version": "2.10.4", + "remove-providers": "cloudant fab", "run-tests": "true", }, ] @@ -546,6 +592,6 @@ class GithubEvents(Enum): WORKFLOW_RUN = "workflow_run" -@lru_cache(maxsize=None) +@clearable_cache def github_events() -> list[str]: return [e.value for e in GithubEvents] diff --git a/dev/breeze/src/airflow_breeze/params/build_ci_params.py b/dev/breeze/src/airflow_breeze/params/build_ci_params.py index 05179df07b8c4..e330ec29d567a 100644 --- a/dev/breeze/src/airflow_breeze/params/build_ci_params.py +++ b/dev/breeze/src/airflow_breeze/params/build_ci_params.py @@ -34,7 +34,6 @@ class BuildCiParams(CommonBuildParams): airflow_constraints_mode: str = "constraints-source-providers" airflow_constraints_reference: str = DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH airflow_extras: str = "devel-ci" - airflow_pre_cached_pip_packages: bool = True force_build: bool = False upgrade_to_newer_dependencies: bool = False upgrade_on_failure: bool = False @@ -65,7 +64,6 @@ def prepare_arguments_for_docker_build_command(self) -> list[str]: self._req_arg("AIRFLOW_EXTRAS", self.airflow_extras) self._req_arg("AIRFLOW_IMAGE_DATE_CREATED", self.airflow_image_date_created) self._req_arg("AIRFLOW_IMAGE_REPOSITORY", self.airflow_image_repository) - self._req_arg("AIRFLOW_PRE_CACHED_PIP_PACKAGES", self.airflow_pre_cached_pip_packages) self._req_arg("AIRFLOW_USE_UV", self.use_uv) if self.use_uv: from airflow_breeze.utils.uv_utils import get_uv_timeout diff --git a/dev/breeze/src/airflow_breeze/params/build_prod_params.py b/dev/breeze/src/airflow_breeze/params/build_prod_params.py index 2533c30d6f3ae..01f2bb8fa4a8d 100644 --- a/dev/breeze/src/airflow_breeze/params/build_prod_params.py +++ b/dev/breeze/src/airflow_breeze/params/build_prod_params.py @@ -44,7 +44,6 @@ class BuildProdParams(CommonBuildParams): airflow_constraints_reference: str = DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH cleanup_context: bool = False airflow_extras: str = field(default_factory=get_airflow_extras) - disable_airflow_repo_cache: bool = False disable_mssql_client_installation: bool = False disable_mysql_client_installation: bool = False disable_postgres_client_installation: bool = False @@ -64,6 +63,18 @@ def airflow_version(self) -> str: else: return self._get_version_with_suffix() + @property + def airflow_semver_version(self) -> str: + """The airflow version in SemVer compatible form""" + from packaging.version import Version + + pyVer = Version(self.airflow_version) + + ver = pyVer.base_version + # if (dev := pyVer.dev) is not None: + # ver += f"-dev.{dev}" + return ver + @property def image_type(self) -> str: return "PROD" @@ -186,10 +197,6 @@ def _extra_prod_docker_build_flags(self) -> list[str]: ) return extra_build_flags - @property - def airflow_pre_cached_pip_packages(self) -> str: - return "false" if self.disable_airflow_repo_cache else "true" - @property def install_mssql_client(self) -> str: return "false" if self.disable_mssql_client_installation else "true" @@ -219,7 +226,6 @@ def prepare_arguments_for_docker_build_command(self) -> list[str]: self._req_arg("AIRFLOW_IMAGE_DATE_CREATED", self.airflow_image_date_created) self._req_arg("AIRFLOW_IMAGE_README_URL", self.airflow_image_readme_url) self._req_arg("AIRFLOW_IMAGE_REPOSITORY", self.airflow_image_repository) - self._req_arg("AIRFLOW_PRE_CACHED_PIP_PACKAGES", self.airflow_pre_cached_pip_packages) self._opt_arg("AIRFLOW_USE_UV", self.use_uv) if self.use_uv: from 
airflow_breeze.utils.uv_utils import get_uv_timeout diff --git a/dev/breeze/src/airflow_breeze/params/common_build_params.py b/dev/breeze/src/airflow_breeze/params/common_build_params.py index 6286a28b86d03..ed32eea4f2ca2 100644 --- a/dev/breeze/src/airflow_breeze/params/common_build_params.py +++ b/dev/breeze/src/airflow_breeze/params/common_build_params.py @@ -32,7 +32,7 @@ get_airflow_version, ) from airflow_breeze.utils.console import get_console -from airflow_breeze.utils.platforms import get_real_platform +from airflow_breeze.utils.platforms import get_normalized_platform @dataclass @@ -56,19 +56,18 @@ class CommonBuildParams: commit_sha: str | None = None dev_apt_command: str | None = None dev_apt_deps: str | None = None + disable_airflow_repo_cache: bool = False docker_cache: str = "registry" docker_host: str | None = os.environ.get("DOCKER_HOST") github_actions: str = os.environ.get("GITHUB_ACTIONS", "false") github_repository: str = APACHE_AIRFLOW_GITHUB_REPOSITORY github_token: str = os.environ.get("GITHUB_TOKEN", "") - image_tag: str | None = None install_mysql_client_type: str = ALLOWED_INSTALL_MYSQL_CLIENT_TYPES[0] platform: str = DOCKER_DEFAULT_PLATFORM prepare_buildx_cache: bool = False python_image: str | None = None push: bool = False python: str = "3.8" - tag_as_latest: bool = False uv_http_timeout: int = DEFAULT_UV_HTTP_TIMEOUT dry_run: bool = False version_suffix_for_pypi: str | None = None @@ -88,10 +87,6 @@ def build_id(self) -> str: def image_type(self) -> str: raise NotImplementedError() - @property - def airflow_pre_cached_pip_packages(self): - raise NotImplementedError() - @property def airflow_base_image_name(self): image = f"ghcr.io/{self.github_repository.lower()}" @@ -136,15 +131,6 @@ def airflow_image_date_created(self): def airflow_image_readme_url(self): return "https://raw.githubusercontent.com/apache/airflow/main/docs/docker-stack/README.md" - @property - def airflow_image_name_with_tag(self): - """Construct image link""" - image = ( - f"{self.airflow_base_image_name}/{self.airflow_branch}/" - f"{self.image_type.lower()}/python{self.python}" - ) - return image if self.image_tag is None else image + f":{self.image_tag}" - def get_cache(self, single_platform: str) -> str: if "," in single_platform: get_console().print( @@ -152,21 +138,15 @@ def get_cache(self, single_platform: str) -> str: f"tried for {single_platform}[/]" ) sys.exit(1) - return f"{self.airflow_image_name}:cache-{get_real_platform(single_platform)}" + platform_tag = get_normalized_platform(single_platform).replace("/", "-") + return f"{self.airflow_image_name}:cache-{platform_tag}" def is_multi_platform(self) -> bool: return "," in self.platform - def preparing_latest_image(self) -> bool: - return ( - self.tag_as_latest - or self.airflow_image_name == self.airflow_image_name_with_tag - or self.airflow_image_name_with_tag.endswith("latest") - ) - @property def platforms(self) -> list[str]: - return self.platform.split(",") + return [get_normalized_platform(single_platform) for single_platform in self.platform.split(",")] def _build_arg(self, name: str, value: Any, optional: bool): if value is None or "": diff --git a/dev/breeze/src/airflow_breeze/params/shell_params.py b/dev/breeze/src/airflow_breeze/params/shell_params.py index d0f429aa47464..1bd095c2cbc56 100644 --- a/dev/breeze/src/airflow_breeze/params/shell_params.py +++ b/dev/breeze/src/airflow_breeze/params/shell_params.py @@ -25,14 +25,14 @@ from airflow_breeze.branch_defaults import AIRFLOW_BRANCH, 
DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH from airflow_breeze.global_constants import ( - ALL_INTEGRATIONS, + ALL_CORE_INTEGRATIONS, + ALL_PROVIDERS_INTEGRATIONS, ALLOWED_BACKENDS, ALLOWED_CONSTRAINTS_MODES_CI, ALLOWED_DOCKER_COMPOSE_PROJECTS, ALLOWED_INSTALLATION_PACKAGE_FORMATS, ALLOWED_MYSQL_VERSIONS, ALLOWED_POSTGRES_VERSIONS, - ALLOWED_PYDANTIC_VERSIONS, ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS, APACHE_AIRFLOW_GITHUB_REPOSITORY, CELERY_BROKER_URLS_MAP, @@ -43,6 +43,7 @@ DRILL_HOST_PORT, EDGE_EXECUTOR, FLOWER_HOST_PORT, + KEYCLOAK_INTEGRATION, MOUNT_ALL, MOUNT_PROVIDERS_AND_TESTS, MOUNT_REMOVE, @@ -50,14 +51,18 @@ MOUNT_TESTS, MSSQL_HOST_PORT, MYSQL_HOST_PORT, + POSTGRES_BACKEND, POSTGRES_HOST_PORT, REDIS_HOST_PORT, + SEQUENTIAL_EXECUTOR, SSH_PORT, START_AIRFLOW_DEFAULT_ALLOWED_EXECUTOR, - TESTABLE_INTEGRATIONS, + TESTABLE_CORE_INTEGRATIONS, + TESTABLE_PROVIDERS_INTEGRATIONS, USE_AIRFLOW_MOUNT_SOURCES, WEBSERVER_HOST_PORT, GithubEvents, + GroupOfTests, get_airflow_version, ) from airflow_breeze.utils.console import get_console @@ -138,8 +143,8 @@ class ShellParams: celery_broker: str = DEFAULT_CELERY_BROKER celery_flower: bool = False chicken_egg_providers: str = "" + clean_airflow_installation: bool = False collect_only: bool = False - database_isolation: bool = False db_reset: bool = False default_constraints_branch: str = DEFAULT_AIRFLOW_CONSTRAINTS_BRANCH dev_mode: bool = False @@ -148,6 +153,7 @@ class ShellParams: downgrade_pendulum: bool = False dry_run: bool = False enable_coverage: bool = False + excluded_providers: str = "" executor: str = START_AIRFLOW_DEFAULT_ALLOWED_EXECUTOR extra_args: tuple = () force_build: bool = False @@ -158,10 +164,9 @@ class ShellParams: github_actions: str = os.environ.get("GITHUB_ACTIONS", "false") github_repository: str = APACHE_AIRFLOW_GITHUB_REPOSITORY github_token: str = os.environ.get("GITHUB_TOKEN", "") - helm_test_package: str | None = None - image_tag: str | None = None include_mypy_volume: bool = False install_airflow_version: str = "" + install_airflow_python_client: bool = False install_airflow_with_constraints: bool = False install_selected_providers: str | None = None integration: tuple[str, ...] 
= () @@ -184,24 +189,23 @@ class ShellParams: providers_constraints_mode: str = ALLOWED_CONSTRAINTS_MODES_CI[0] providers_constraints_reference: str = "" providers_skip_constraints: bool = False - pydantic: str = ALLOWED_PYDANTIC_VERSIONS[0] python: str = ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS[0] quiet: bool = False regenerate_missing_docs: bool = False remove_arm_packages: bool = False restart: bool = False run_db_tests_only: bool = False - run_system_tests: bool = os.environ.get("RUN_SYSTEM_TESTS", "false") == "true" run_tests: bool = False skip_db_tests: bool = False skip_environment_initialization: bool = False skip_image_upgrade_check: bool = False skip_provider_dependencies_check: bool = False - skip_provider_tests: bool = False skip_ssh_setup: bool = os.environ.get("SKIP_SSH_SETUP", "false") == "true" standalone_dag_processor: bool = False start_airflow: bool = False test_type: str | None = None + start_webserver_with_examples: bool = False + test_group: GroupOfTests | None = None tty: str = "auto" upgrade_boto: bool = False use_airflow_version: str | None = None @@ -253,11 +257,6 @@ def airflow_image_name(self) -> str: image = f"{self.airflow_base_image_name}/{self.airflow_branch}/ci/python{self.python}" return image - @cached_property - def airflow_image_name_with_tag(self) -> str: - image = self.airflow_image_name - return image if not self.image_tag else image + f":{self.image_tag}" - @cached_property def airflow_image_kubernetes(self) -> str: image = f"{self.airflow_base_image_name}/{self.airflow_branch}/kubernetes/python{self.python}" @@ -293,7 +292,7 @@ def print_badge_info(self): if get_verbose(): get_console().print(f"[info]Use {self.image_type} image[/]") get_console().print(f"[info]Branch Name: {self.airflow_branch}[/]") - get_console().print(f"[info]Docker Image: {self.airflow_image_name_with_tag}[/]") + get_console().print(f"[info]Docker Image: {self.airflow_image_name}[/]") get_console().print(f"[info]Airflow source version:{self.airflow_version}[/]") get_console().print(f"[info]Python Version: {self.python}[/]") get_console().print(f"[info]Backend: {self.backend} {self.backend_version}[/]") @@ -333,7 +332,9 @@ def compose_file(self) -> str: get_console().print( "[warning]Adding `celery` extras as it is implicitly needed by celery executor" ) - self.airflow_extras = ",".join(current_extras.split(",") + ["celery"]) + self.airflow_extras = ( + ",".join(current_extras.split(",") + ["celery"]) if current_extras else "celery" + ) compose_file_list.append(DOCKER_COMPOSE_DIR / "base.yml") self.add_docker_in_docker(compose_file_list) @@ -371,9 +372,26 @@ def compose_file(self) -> str: if self.include_mypy_volume: compose_file_list.append(DOCKER_COMPOSE_DIR / "mypy.yml") if "all-testable" in self.integration: - integrations = TESTABLE_INTEGRATIONS + if self.test_group == GroupOfTests.CORE: + integrations = TESTABLE_CORE_INTEGRATIONS + elif self.test_group == GroupOfTests.PROVIDERS: + integrations = TESTABLE_PROVIDERS_INTEGRATIONS + else: + get_console().print( + "[error]You can only use `core` or `providers` test " + "group with `all-testable` integration." + ) + sys.exit(1) elif "all" in self.integration: - integrations = ALL_INTEGRATIONS + if self.test_group == GroupOfTests.CORE: + integrations = ALL_CORE_INTEGRATIONS + elif self.test_group == GroupOfTests.PROVIDERS: + integrations = ALL_PROVIDERS_INTEGRATIONS + else: + get_console().print( + "[error]You can only use `core` or `providers` test group with `all` integration." 
+ ) + sys.exit(1) else: integrations = self.integration for integration in integrations: @@ -479,21 +497,42 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _env: dict[str, str] = {} _set_var(_env, "AIRFLOW_CI_IMAGE", self.airflow_image_name) - _set_var(_env, "AIRFLOW_CI_IMAGE_WITH_TAG", self.airflow_image_name_with_tag) _set_var(_env, "AIRFLOW_CONSTRAINTS_LOCATION", self.airflow_constraints_location) _set_var(_env, "AIRFLOW_CONSTRAINTS_MODE", self.airflow_constraints_mode) _set_var(_env, "AIRFLOW_CONSTRAINTS_REFERENCE", self.airflow_constraints_reference) - _set_var(_env, "AIRFLOW_ENABLE_AIP_44", None, "true") _set_var(_env, "AIRFLOW_ENV", "development") _set_var(_env, "AIRFLOW_EXTRAS", self.airflow_extras) _set_var(_env, "AIRFLOW_SKIP_CONSTRAINTS", self.airflow_skip_constraints) _set_var(_env, "AIRFLOW_IMAGE_KUBERNETES", self.airflow_image_kubernetes) _set_var(_env, "AIRFLOW_VERSION", self.airflow_version) _set_var(_env, "AIRFLOW__CELERY__BROKER_URL", self.airflow_celery_broker_url) - _set_var(_env, "AIRFLOW__CORE__EXECUTOR", self.executor) + if self.backend == "sqlite": + get_console().print(f"[warning]SQLite backend needs {SEQUENTIAL_EXECUTOR}[/]") + _set_var(_env, "AIRFLOW__CORE__EXECUTOR", SEQUENTIAL_EXECUTOR) + else: + _set_var(_env, "AIRFLOW__CORE__EXECUTOR", self.executor) if self.executor == EDGE_EXECUTOR: + _set_var( + _env, "AIRFLOW__CORE__EXECUTOR", "airflow.providers.edge.executors.edge_executor.EdgeExecutor" + ) _set_var(_env, "AIRFLOW__EDGE__API_ENABLED", "true") - _set_var(_env, "AIRFLOW__EDGE__API_URL", "http://localhost:8080/edge_worker/v1/rpcapi") + + # For testing Edge Worker on Windows... The default Run ID contains a colon (":") from the timestamp, which is + # made into the log path template and then fails on Windows.
So we replace it with a dash + _set_var( + _env, + "AIRFLOW__LOGGING__LOG_FILENAME_TEMPLATE", + "dag_id={{ ti.dag_id }}/run_id={{ ti.run_id|replace(':', '-') }}/task_id={{ ti.task_id }}/" + "{% if ti.map_index >= 0 %}map_index={{ ti.map_index }}/{% endif %}" + "attempt={{ try_number|default(ti.try_number) }}.log", + ) + + # Dev Airflow 3 runs API on FastAPI transitional + port = 9091 + if self.use_airflow_version and self.use_airflow_version.startswith("2."): + # Airflow 2.10 runs it in the webserver atm + port = 8080 + _set_var(_env, "AIRFLOW__EDGE__API_URL", f"http://localhost:{port}/edge_worker/v1/rpcapi") _set_var(_env, "ANSWER", get_forced_answer() or "") _set_var(_env, "BACKEND", self.backend) _set_var(_env, "BASE_BRANCH", self.base_branch, "main") @@ -502,6 +541,7 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _set_var(_env, "CELERY_BROKER_URLS_MAP", CELERY_BROKER_URLS_MAP) _set_var(_env, "CELERY_FLOWER", self.celery_flower) _set_var(_env, "CHICKEN_EGG_PROVIDERS", self.chicken_egg_providers) + _set_var(_env, "CLEAN_AIRFLOW_INSTALLATION", self.clean_airflow_installation) _set_var(_env, "CI", None, "false") _set_var(_env, "CI_BUILD_ID", None, "0") _set_var(_env, "CI_EVENT_TYPE", None, GithubEvents.PULL_REQUEST.value) @@ -511,7 +551,6 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _set_var(_env, "COLLECT_ONLY", self.collect_only) _set_var(_env, "COMMIT_SHA", None, commit_sha()) _set_var(_env, "COMPOSE_FILE", self.compose_file) - _set_var(_env, "DATABASE_ISOLATION", self.database_isolation) _set_var(_env, "DB_RESET", self.db_reset) _set_var(_env, "DEFAULT_BRANCH", self.airflow_branch) _set_var(_env, "DEFAULT_CONSTRAINTS_BRANCH", self.default_constraints_branch) @@ -520,18 +559,18 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _set_var(_env, "DOWNGRADE_SQLALCHEMY", self.downgrade_sqlalchemy) _set_var(_env, "DOWNGRADE_PENDULUM", self.downgrade_pendulum) _set_var(_env, "DRILL_HOST_PORT", None, DRILL_HOST_PORT) - _set_var(_env, "ENABLED_SYSTEMS", None, "") _set_var(_env, "ENABLE_COVERAGE", self.enable_coverage) _set_var(_env, "FLOWER_HOST_PORT", None, FLOWER_HOST_PORT) + _set_var(_env, "EXCLUDED_PROVIDERS", self.excluded_providers) _set_var(_env, "FORCE_LOWEST_DEPENDENCIES", self.force_lowest_dependencies) _set_var(_env, "SQLALCHEMY_WARN_20", self.force_sa_warnings) _set_var(_env, "GITHUB_ACTIONS", self.github_actions) - _set_var(_env, "HELM_TEST_PACKAGE", self.helm_test_package, "") _set_var(_env, "HOST_GROUP_ID", self.host_group_id) _set_var(_env, "HOST_OS", self.host_os) _set_var(_env, "HOST_USER_ID", self.host_user_id) _set_var(_env, "INIT_SCRIPT_FILE", None, "init.sh") _set_var(_env, "INSTALL_AIRFLOW_WITH_CONSTRAINTS", self.install_airflow_with_constraints) + _set_var(_env, "INSTALL_AIRFLOW_PYTHON_CLIENT", self.install_airflow_python_client) _set_var(_env, "INSTALL_AIRFLOW_VERSION", self.install_airflow_version) _set_var(_env, "INSTALL_SELECTED_PROVIDERS", self.install_selected_providers) _set_var(_env, "ISSUE_ID", self.issue_id) @@ -557,7 +596,6 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _set_var(_env, "REDIS_HOST_PORT", None, REDIS_HOST_PORT) _set_var(_env, "REGENERATE_MISSING_DOCS", self.regenerate_missing_docs) _set_var(_env, "REMOVE_ARM_PACKAGES", self.remove_arm_packages) - _set_var(_env, "RUN_SYSTEM_TESTS", self.run_system_tests) _set_var(_env, "RUN_TESTS", self.run_tests) _set_var(_env, "SKIP_ENVIRONMENT_INITIALIZATION", self.skip_environment_initialization) _set_var(_env, "SKIP_SSH_SETUP", 
self.skip_ssh_setup) @@ -566,10 +604,15 @@ def env_variables_for_docker_commands(self) -> dict[str, str]: _set_var(_env, "STANDALONE_DAG_PROCESSOR", self.standalone_dag_processor) _set_var(_env, "START_AIRFLOW", self.start_airflow) _set_var(_env, "SUSPENDED_PROVIDERS_FOLDERS", self.suspended_providers_folders) + _set_var( + _env, + "START_WEBSERVER_WITH_EXAMPLES", + self.start_webserver_with_examples, + ) _set_var(_env, "SYSTEM_TESTS_ENV_ID", None, "") _set_var(_env, "TEST_TYPE", self.test_type, "") + _set_var(_env, "TEST_GROUP", str(self.test_group.value) if self.test_group else "") _set_var(_env, "UPGRADE_BOTO", self.upgrade_boto) - _set_var(_env, "PYDANTIC", self.pydantic) _set_var(_env, "USE_AIRFLOW_VERSION", self.use_airflow_version, "") _set_var(_env, "USE_PACKAGES_FROM_DIST", self.use_packages_from_dist) _set_var(_env, "USE_UV", self.use_uv) @@ -660,3 +703,14 @@ def __post_init__(self): self.airflow_constraints_reference = self.default_constraints_branch if self.providers_constraints_reference == "default": self.providers_constraints_reference = self.default_constraints_branch + + if ( + self.backend + and self.integration + and KEYCLOAK_INTEGRATION in self.integration + and not self.backend == POSTGRES_BACKEND + ): + get_console().print( + "[error]When using the Keycloak integration the backend must be Postgres![/]\n" + ) + sys.exit(2) diff --git a/dev/breeze/src/airflow_breeze/pre_commit_ids.py b/dev/breeze/src/airflow_breeze/pre_commit_ids.py index b0ceca5c4b2b1..0100095013f9e 100644 --- a/dev/breeze/src/airflow_breeze/pre_commit_ids.py +++ b/dev/breeze/src/airflow_breeze/pre_commit_ids.py @@ -28,37 +28,27 @@ "blacken-docs", "check-aiobotocore-optional", "check-airflow-k8s-not-used", - "check-airflow-provider-compatibility", - "check-airflow-providers-bug-report-template", "check-apache-license-rat", - "check-base-operator-partial-arguments", "check-base-operator-usage", "check-boring-cyborg-configuration", "check-breeze-top-dependencies-limited", "check-builtin-literals", "check-changelog-format", "check-changelog-has-no-duplicates", - "check-cncf-k8s-only-for-executors", - "check-code-deprecations", "check-common-compat-used-for-openlineage", - "check-compat-cache-on-methods", "check-core-deprecation-classes", "check-daysago-import-from-utils", "check-decorated-operator-implements-custom-name", - "check-deferrable-default", "check-docstring-param-types", - "check-example-dags-urls", "check-executables-have-shebangs", "check-extra-packages-references", "check-extras-order", "check-fab-migrations", "check-for-inclusive-language", "check-get-lineage-collector-providers", - "check-google-re2-as-dependency", "check-hatch-build-order", "check-hooks-apply", "check-incorrect-use-of-LoggingMixin", - "check-init-decorator-arguments", "check-integrations-list-consistent", "check-lazy-logging", "check-links-to-example-dags-do-not-use-hardcoded-versions", @@ -73,18 +63,11 @@ "check-provide-create-sessions-imports", "check-provider-docs-valid", "check-provider-yaml-valid", - "check-providers-init-file-missing", "check-providers-subpackages-init-file-exist", "check-pydevd-left-in-code", - "check-revision-heads-map", "check-safe-filter-usage-in-html", - "check-sql-dependency-common-data-structure", "check-start-date-not-used-in-defaults", - "check-system-tests-present", - "check-system-tests-tocs", - "check-taskinstance-tis-attrs", "check-template-context-variable-in-sync", - "check-tests-in-the-right-folders", "check-tests-unittest-testcase", "check-urlparse-usage-in-code", 
"check-usage-of-re2-over-re", @@ -129,15 +112,12 @@ "update-black-version", "update-breeze-cmd-output", "update-breeze-readme-config-hash", - "update-build-dependencies", "update-chart-dependencies", - "update-common-sql-api-stubs", "update-er-diagram", - "update-extras", "update-in-the-wild-to-be-sorted", "update-inlined-dockerfile-scripts", "update-installed-providers-to-be-sorted", - "update-installers", + "update-installers-and-pre-commit", "update-local-yml-file", "update-migration-references", "update-openapi-spec-tags-to-be-sorted", @@ -147,6 +127,6 @@ "update-supported-versions", "update-vendored-in-k8s-json-schema", "update-version", - "validate-operators-init", "yamllint", + "zizmor", ] diff --git a/dev/breeze/src/airflow_breeze/prepare_providers/provider_documentation.py b/dev/breeze/src/airflow_breeze/prepare_providers/provider_documentation.py index e7b59becf40df..6e18966416da8 100644 --- a/dev/breeze/src/airflow_breeze/prepare_providers/provider_documentation.py +++ b/dev/breeze/src/airflow_breeze/prepare_providers/provider_documentation.py @@ -21,6 +21,7 @@ import os import random import re +import shutil import subprocess import sys import tempfile @@ -44,11 +45,12 @@ clear_cache_for_provider_metadata, get_provider_details, get_provider_jinja_context, - get_source_package_path, + get_provider_yaml, refresh_provider_metadata_from_yaml_file, refresh_provider_metadata_with_provider_id, render_template, ) +from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT from airflow_breeze.utils.run_utils import run_command from airflow_breeze.utils.shared_options import get_verbose from airflow_breeze.utils.versions import get_version_tag @@ -153,6 +155,7 @@ def __init__(self): self.misc: list[Change] = [] self.features: list[Change] = [] self.breaking_changes: list[Change] = [] + self.docs: list[Change] = [] self.other: list[Change] = [] @@ -186,11 +189,14 @@ class PrepareReleaseDocsUserQuitException(Exception): } -def _get_git_log_command(from_commit: str | None = None, to_commit: str | None = None) -> list[str]: +def _get_git_log_command( + folder_paths: list[Path] | None = None, from_commit: str | None = None, to_commit: str | None = None +) -> list[str]: """Get git command to run for the current repo from the current folder. The current directory should always be the package folder. 
+ :param folder_paths: list of folder paths to check for changes :param from_commit: if present - base commit from which to start the log from :param to_commit: if present - final commit which should be the start of the log :return: git command to run @@ -207,7 +213,8 @@ def _get_git_log_command(from_commit: str | None = None, to_commit: str | None = git_cmd.append(from_commit) elif to_commit: raise ValueError("It makes no sense to specify to_commit without from_commit.") - git_cmd.extend(["--", "."]) + folders = [folder_path.as_posix() for folder_path in folder_paths] if folder_paths else ["."] + git_cmd.extend(["--", *folders]) return git_cmd @@ -307,18 +314,25 @@ def _get_all_changes_for_package( get_console().print(f"[info]Checking if tag '{current_tag_no_suffix}' exist.") result = run_command( ["git", "rev-parse", current_tag_no_suffix], - cwd=provider_details.source_provider_package_path, + cwd=AIRFLOW_SOURCES_ROOT, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=False, ) + providers_folder_paths_for_git_commit_retrieval = [ + provider_details.root_provider_path, + ] if not reapply_templates_only and result.returncode == 0: if get_verbose(): get_console().print(f"[info]The tag {current_tag_no_suffix} exists.") # The tag already exists result = run_command( - _get_git_log_command(f"{HTTPS_REMOTE}/{base_branch}", current_tag_no_suffix), - cwd=provider_details.source_provider_package_path, + _get_git_log_command( + providers_folder_paths_for_git_commit_retrieval, + f"{HTTPS_REMOTE}/{base_branch}", + current_tag_no_suffix, + ), + cwd=AIRFLOW_SOURCES_ROOT, capture_output=True, text=True, check=True, @@ -326,15 +340,24 @@ def _get_all_changes_for_package( changes = result.stdout.strip() if changes: provider_details = get_provider_details(provider_package_id) - doc_only_change_file = ( - provider_details.source_provider_package_path / ".latest-doc-only-change.txt" - ) + if provider_details.is_new_structure: + doc_only_change_file = ( + provider_details.root_provider_path / "docs" / ".latest-doc-only-change.txt" + ) + else: + doc_only_change_file = ( + provider_details.base_provider_package_path / ".latest-doc-only-change.txt" + ) if doc_only_change_file.exists(): last_doc_only_hash = doc_only_change_file.read_text().strip() try: result = run_command( - _get_git_log_command(f"{HTTPS_REMOTE}/{base_branch}", last_doc_only_hash), - cwd=provider_details.source_provider_package_path, + _get_git_log_command( + providers_folder_paths_for_git_commit_retrieval, + f"{HTTPS_REMOTE}/{base_branch}", + last_doc_only_hash, + ), + cwd=AIRFLOW_SOURCES_ROOT, capture_output=True, text=True, check=True, @@ -387,8 +410,10 @@ def _get_all_changes_for_package( for version in provider_details.versions[1:]: version_tag = get_version_tag(version, provider_package_id) result = run_command( - _get_git_log_command(next_version_tag, version_tag), - cwd=provider_details.source_provider_package_path, + _get_git_log_command( + providers_folder_paths_for_git_commit_retrieval, next_version_tag, version_tag + ), + cwd=AIRFLOW_SOURCES_ROOT, capture_output=True, text=True, check=True, @@ -402,8 +427,8 @@ def _get_all_changes_for_package( next_version_tag = version_tag current_version = version result = run_command( - _get_git_log_command(next_version_tag), - cwd=provider_details.source_provider_package_path, + _get_git_log_command(providers_folder_paths_for_git_commit_retrieval, next_version_tag), + cwd=provider_details.root_provider_path, capture_output=True, text=True, check=True, @@ -430,7 +455,7 @@ def 
_ask_the_user_for_the_type_of_changes(non_interactive: bool) -> TypeOfChange display_answers = "/".join(type_of_changes_array) + "/q" while True: get_console().print( - "[warning]Type of change (b)ugfix, (f)eature, (x)breaking " + "[warning]Type of change (d)ocumentation, (b)ugfix, (f)eature, (x)breaking " f"change, (m)misc, (s)kip, (q)uit [{display_answers}]?[/] ", end="", ) @@ -456,23 +481,30 @@ def _mark_latest_changes_as_documentation_only( f"[special]Marking last change: {latest_change.short_hash} and all above " f"changes since the last release as doc-only changes!" ) - (provider_details.source_provider_package_path / ".latest-doc-only-change.txt").write_text( - latest_change.full_hash + "\n" - ) + if provider_details.is_new_structure: + latest_doc_onl_change_file = ( + provider_details.root_provider_path / "docs" / ".latest-doc-only-change.txt" + ) + else: + latest_doc_onl_change_file = ( + provider_details.base_provider_package_path / ".latest-doc-only-change.txt" + ) + + latest_doc_onl_change_file.write_text(latest_change.full_hash + "\n") raise PrepareReleaseDocsChangesOnlyException() def _update_version_in_provider_yaml( - provider_package_id: str, + provider_id: str, type_of_change: TypeOfChange, ) -> tuple[bool, bool, str]: """ Updates provider version based on the type of change selected by the user :param type_of_change: type of change selected - :param provider_package_id: provider package + :param provider_id: provider package :return: tuple of two bools: (with_breaking_change, maybe_with_new_features, original_text) """ - provider_details = get_provider_details(provider_package_id) + provider_details = get_provider_details(provider_id) version = provider_details.versions[0] v = semver.VersionInfo.parse(version) with_breaking_changes = False @@ -487,7 +519,9 @@ def _update_version_in_provider_yaml( maybe_with_new_features = True elif type_of_change == TypeOfChange.BUGFIX: v = v.bump_patch() - provider_yaml_path = get_source_package_path(provider_package_id) / "provider.yaml" + elif type_of_change == TypeOfChange.MISC: + v = v.bump_patch() + provider_yaml_path, is_new_structure = get_provider_yaml(provider_id) original_provider_yaml_content = provider_yaml_path.read_text() new_provider_yaml_content = re.sub( r"^versions:", f"versions:\n - {v}", original_provider_yaml_content, 1, re.MULTILINE @@ -498,27 +532,27 @@ def _update_version_in_provider_yaml( def _update_source_date_epoch_in_provider_yaml( - provider_package_id: str, + provider_id: str, ) -> None: """ Updates source date epoch in provider yaml that then can be used to generate reproducible packages. 
- :param provider_package_id: provider package + :param provider_id: provider package """ - provider_yaml_path = get_source_package_path(provider_package_id) / "provider.yaml" + provider_yaml_path, is_new_structure = get_provider_yaml(provider_id) original_text = provider_yaml_path.read_text() source_date_epoch = int(time()) new_text = re.sub( r"source-date-epoch: [0-9]*", f"source-date-epoch: {source_date_epoch}", original_text, 1 ) provider_yaml_path.write_text(new_text) - refresh_provider_metadata_with_provider_id(provider_package_id) + refresh_provider_metadata_with_provider_id(provider_id) get_console().print(f"[special]Updated source-date-epoch to {source_date_epoch}\n") def _verify_changelog_exists(package: str) -> Path: provider_details = get_provider_details(package) - changelog_path = Path(provider_details.source_provider_package_path) / "CHANGELOG.rst" + changelog_path = Path(provider_details.root_provider_path) / "CHANGELOG.rst" if not os.path.isfile(changelog_path): get_console().print(f"\n[error]ERROR: Missing {changelog_path}[/]\n") get_console().print("[info]Please add the file with initial content:") @@ -772,10 +806,15 @@ def update_release_notes( f"[special]{TYPE_OF_CHANGE_DESCRIPTION[type_of_change]}" ) get_console().print() - if type_of_change in [TypeOfChange.BUGFIX, TypeOfChange.FEATURE, TypeOfChange.BREAKING_CHANGE]: + if type_of_change in [ + TypeOfChange.BUGFIX, + TypeOfChange.FEATURE, + TypeOfChange.BREAKING_CHANGE, + TypeOfChange.MISC, + ]: with_breaking_changes, maybe_with_new_features, original_provider_yaml_content = ( _update_version_in_provider_yaml( - provider_package_id=provider_package_id, type_of_change=type_of_change + provider_id=provider_package_id, type_of_change=type_of_change ) ) _update_source_date_epoch_in_provider_yaml(provider_package_id) @@ -801,9 +840,10 @@ def update_release_notes( if answer == Answer.NO: if original_provider_yaml_content is not None: # Restore original content of the provider.yaml - (get_source_package_path(provider_package_id) / "provider.yaml").write_text( - original_provider_yaml_content - ) + ( + (AIRFLOW_SOURCES_ROOT / "airflow" / "providers").joinpath(*provider_package_id.split(".")) + / "provider.yaml" + ).write_text(original_provider_yaml_content) clear_cache_for_provider_metadata(provider_package_id) type_of_change = _ask_the_user_for_the_type_of_changes(non_interactive=False) @@ -816,9 +856,14 @@ def update_release_notes( get_console().print() if type_of_change == TypeOfChange.DOCUMENTATION: _mark_latest_changes_as_documentation_only(provider_package_id, list_of_list_of_changes) - elif type_of_change in [TypeOfChange.BUGFIX, TypeOfChange.FEATURE, TypeOfChange.BREAKING_CHANGE]: + elif type_of_change in [ + TypeOfChange.BUGFIX, + TypeOfChange.FEATURE, + TypeOfChange.BREAKING_CHANGE, + TypeOfChange.MISC, + ]: with_breaking_changes, maybe_with_new_features, _ = _update_version_in_provider_yaml( - provider_package_id=provider_package_id, + provider_id=provider_package_id, type_of_change=type_of_change, ) _update_source_date_epoch_in_provider_yaml(provider_package_id) @@ -910,6 +955,8 @@ def _get_changes_classified( classified_changes.features.append(change) elif type_of_change == TypeOfChange.BREAKING_CHANGE and with_breaking_changes: classified_changes.breaking_changes.append(change) + elif type_of_change == TypeOfChange.DOCUMENTATION: + classified_changes.docs.append(change) else: classified_changes.other.append(change) return classified_changes @@ -1022,7 +1069,7 @@ def get_provider_documentation_jinja_context( 
jinja_context["MAYBE_WITH_NEW_FEATURES"] = maybe_with_new_features jinja_context["ADDITIONAL_INFO"] = ( - _get_additional_package_info(provider_package_path=provider_details.source_provider_package_path), + _get_additional_package_info(provider_package_path=provider_details.root_provider_path), ) return jinja_context @@ -1042,6 +1089,7 @@ def update_changelog( :param reapply_templates_only: only reapply templates, no changelog generation :param with_breaking_changes: whether there are any breaking changes :param maybe_with_new_features: whether there are any new features + :param only_min_version_update: whether to only update the min version """ provider_details = get_provider_details(package_id) jinja_context = get_provider_documentation_jinja_context( @@ -1076,9 +1124,63 @@ def update_changelog( _update_index_rst(jinja_context, package_id, provider_details.documentation_provider_package_path) -def _generate_init_py_file_for_provider( +def _generate_get_provider_info_py(context, provider_details): + get_provider_info_content = black_format( + render_template( + template_name="get_provider_info", + context=context, + extension=".py", + autoescape=False, + keep_trailing_newline=True, + ) + ) + get_provider_info_path = provider_details.base_provider_package_path / "get_provider_info.py" + get_provider_info_path.write_text(get_provider_info_content) + get_console().print( + f"[info]Generated {get_provider_info_path} for the {provider_details.provider_id} provider\n" + ) + + +def _generate_readme_rst(context, provider_details): + get_provider_readme_content = render_template( + template_name="PROVIDER_README", + context=context, + extension=".rst", + keep_trailing_newline=True, + ) + get_provider_readme_path = provider_details.root_provider_path / "README.rst" + get_provider_readme_path.write_text(get_provider_readme_content) + get_console().print( + f"[info]Generated {get_provider_readme_path} for the {provider_details.provider_id} provider\n" + ) + + +def _generate_pyproject(context, provider_details): + get_pyproject_toml_path = provider_details.root_provider_path / "pyproject.toml" + try: + import tomllib + except ImportError: + import tomli as tomllib + old_toml_content = tomllib.loads(get_pyproject_toml_path.read_text()) + old_dependencies = old_toml_content["project"]["dependencies"] + install_requirements = "".join(f'\n "{ir}",' for ir in old_dependencies) + context["INSTALL_REQUIREMENTS"] = install_requirements + get_pyproject_toml_content = render_template( + template_name="pyproject", + context=context, + extension=".toml", + autoescape=False, + keep_trailing_newline=True, + ) + get_pyproject_toml_path.write_text(get_pyproject_toml_content) + get_console().print( + f"[info]Generated {get_pyproject_toml_path} for the {provider_details.provider_id} provider\n" + ) + + +def _generate_build_files_for_provider( context: dict[str, Any], - target_path: Path, + provider_details: ProviderPackageDetails, ): init_py_content = black_format( render_template( @@ -1088,8 +1190,14 @@ def _generate_init_py_file_for_provider( keep_trailing_newline=True, ) ) - init_py_path = target_path / "__init__.py" + init_py_path = provider_details.base_provider_package_path / "__init__.py" init_py_path.write_text(init_py_content) + # TODO(potiuk) - remove this if when we move all providers to new structure + if provider_details.is_new_structure: + _generate_get_provider_info_py(context, provider_details) + _generate_readme_rst(context, provider_details) + _generate_pyproject(context, provider_details) + 
shutil.copy(AIRFLOW_SOURCES_ROOT / "LICENSE", provider_details.base_provider_package_path / "LICENSE") def _replace_min_airflow_version_in_provider_yaml( @@ -1107,7 +1215,7 @@ def _replace_min_airflow_version_in_provider_yaml( refresh_provider_metadata_from_yaml_file(provider_yaml_path) -def update_min_airflow_version( +def update_min_airflow_version_and_build_files( provider_package_id: str, with_breaking_changes: bool, maybe_with_new_features: bool ): """Updates min airflow version in provider yaml and __init__.py @@ -1125,10 +1233,10 @@ def update_min_airflow_version( with_breaking_changes=with_breaking_changes, maybe_with_new_features=maybe_with_new_features, ) - _generate_init_py_file_for_provider( + _generate_build_files_for_provider( context=jinja_context, - target_path=provider_details.source_provider_package_path, + provider_details=provider_details, ) _replace_min_airflow_version_in_provider_yaml( - context=jinja_context, target_path=provider_details.source_provider_package_path + context=jinja_context, target_path=provider_details.root_provider_path ) diff --git a/dev/breeze/src/airflow_breeze/prepare_providers/provider_packages.py b/dev/breeze/src/airflow_breeze/prepare_providers/provider_packages.py index 88ad3e8c8cf3a..5aba39b98ba8c 100644 --- a/dev/breeze/src/airflow_breeze/prepare_providers/provider_packages.py +++ b/dev/breeze/src/airflow_breeze/prepare_providers/provider_packages.py @@ -32,13 +32,12 @@ get_provider_details, get_provider_jinja_context, get_removed_provider_ids, - get_source_package_path, - get_target_root_for_copied_provider_sources, render_template, tag_exists_for_provider, ) -from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT +from airflow_breeze.utils.path_utils import AIRFLOW_PROVIDERS_DIR, AIRFLOW_SOURCES_ROOT from airflow_breeze.utils.run_utils import run_command +from airflow_breeze.utils.version_utils import is_local_package_version LICENCE_RST = """ .. Licensed to the Apache Software Foundation (ASF) under one @@ -73,7 +72,7 @@ class PrepareReleasePackageErrorBuildingPackageException(Exception): def copy_provider_sources_to_target(provider_id: str) -> Path: - target_provider_root_path = get_target_root_for_copied_provider_sources(provider_id) + target_provider_root_path = Path(AIRFLOW_SOURCES_ROOT / "dist").joinpath(*provider_id.split(".")) if target_provider_root_path.exists() and not target_provider_root_path.is_dir(): get_console().print( @@ -82,8 +81,10 @@ def copy_provider_sources_to_target(provider_id: str) -> Path: ) rmtree(target_provider_root_path, ignore_errors=True) target_provider_root_path.mkdir(parents=True) - source_provider_sources_path = get_source_package_path(provider_id) - relative_provider_path = source_provider_sources_path.relative_to(AIRFLOW_SOURCES_ROOT) + source_provider_sources_path = Path(AIRFLOW_SOURCES_ROOT / "airflow" / "providers").joinpath( + *provider_id.split(".") + ) + relative_provider_path = source_provider_sources_path.relative_to(AIRFLOW_PROVIDERS_DIR) target_providers_sub_folder = target_provider_root_path / relative_provider_path get_console().print( f"[info]Copying provider sources: {source_provider_sources_path} -> {target_providers_sub_folder}" @@ -161,8 +162,11 @@ def should_skip_the_package(provider_id: str, version_suffix: str) -> tuple[bool For RC and official releases we check if the "officially released" version exists and skip the released if it was. This allows to skip packages that have not been marked for release in this wave. For "dev" suffixes, we always build all packages. 
+ A local version of an RC release will always be built. """ - if version_suffix != "" and not version_suffix.startswith("rc"): + if version_suffix != "" and ( + not version_suffix.startswith("rc") or is_local_package_version(version_suffix) + ): return False, version_suffix if version_suffix == "": current_tag = get_latest_provider_tag(provider_id, "") @@ -218,7 +222,10 @@ def build_provider_package(provider_id: str, target_provider_root_sources_path: def move_built_packages_and_cleanup( - target_provider_root_sources_path: Path, dist_folder: Path, skip_cleanup: bool + target_provider_root_sources_path: Path, + dist_folder: Path, + skip_cleanup: bool, + delete_only_build_and_dist_folders: bool = False, ): for file in (target_provider_root_sources_path / "dist").glob("apache*"): get_console().print(f"[info]Moving {file} to {dist_folder}") @@ -236,8 +243,17 @@ def move_built_packages_and_cleanup( f"src/airflow_breeze/templates" ) else: - get_console().print(f"[info]Cleaning up {target_provider_root_sources_path}") - shutil.rmtree(target_provider_root_sources_path, ignore_errors=True) + get_console().print( + f"[info]Cleaning up {target_provider_root_sources_path} with " + f"delete_only_build_and_dist_folders={delete_only_build_and_dist_folders}" + ) + if delete_only_build_and_dist_folders: + shutil.rmtree(target_provider_root_sources_path / "build", ignore_errors=True) + shutil.rmtree(target_provider_root_sources_path / "dist", ignore_errors=True) + for file in target_provider_root_sources_path.glob("*.egg-info"): + shutil.rmtree(file, ignore_errors=True) + else: + shutil.rmtree(target_provider_root_sources_path, ignore_errors=True) get_console().print(f"[info]Cleaned up {target_provider_root_sources_path}") diff --git a/dev/breeze/src/airflow_breeze/provider_issue_TEMPLATE.md.jinja2 b/dev/breeze/src/airflow_breeze/provider_issue_TEMPLATE.md.jinja2 index 895fd47a3cca0..4a5c93f178077 100644 --- a/dev/breeze/src/airflow_breeze/provider_issue_TEMPLATE.md.jinja2 +++ b/dev/breeze/src/airflow_breeze/provider_issue_TEMPLATE.md.jinja2 @@ -5,9 +5,9 @@ The guidelines on how to test providers can be found in [Verify providers by contributors](https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-candidate-by-contributors) -Let us know in the comment, whether the issue is addressed. +Let us know in the comments whether the issue is addressed. -Those are providers that require testing as there were some substantial changes introduced: +These are providers that require testing as there were some substantial changes introduced: {% for provider_id, provider_info in providers.items() %} ## Provider [{{ provider_id }}: {{ provider_info.version }}{{ provider_info.suffix }}](https://pypi.org/project/{{ provider_info.pypi_package_name }}/{{ provider_info.version }}{{ provider_info.suffix }}) diff --git a/dev/breeze/src/airflow_breeze/templates/CHANGELOG_TEMPLATE.rst.jinja2 b/dev/breeze/src/airflow_breeze/templates/CHANGELOG_TEMPLATE.rst.jinja2 index e51939c57571c..bd59c96fa9393 100644 --- a/dev/breeze/src/airflow_breeze/templates/CHANGELOG_TEMPLATE.rst.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/CHANGELOG_TEMPLATE.rst.jinja2 @@ -60,6 +60,16 @@ Misc {%- endif %} +{%- if classified_changes and classified_changes.docs %} + +Doc-only +~~~~~~~~ +{% for doc in classified_changes.docs %} +* ``{{ doc.message_without_backticks | safe }}`` +{%- endfor %} +{%- endif %} + + .. Below changes are excluded from the changelog.
Move them to appropriate section above if needed. Do not delete the lines(!): {%- if classified_changes and classified_changes.other %} diff --git a/dev/breeze/src/airflow_breeze/templates/PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2 b/dev/breeze/src/airflow_breeze/templates/PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2 index 594379827e878..3c9e939e20362 100644 --- a/dev/breeze/src/airflow_breeze/templates/PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2 @@ -37,8 +37,7 @@ specific language governing permissions and limitations under the License. - .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE - OVERWRITTEN WHEN PREPARING PACKAGES. + .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN! .. IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE `PROVIDER_CHANGELOG_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY diff --git a/dev/breeze/src/airflow_breeze/templates/PROVIDER_COMMITS_TEMPLATE.rst.jinja2 b/dev/breeze/src/airflow_breeze/templates/PROVIDER_COMMITS_TEMPLATE.rst.jinja2 index f89b0913bc0e0..12beac1846bc4 100644 --- a/dev/breeze/src/airflow_breeze/templates/PROVIDER_COMMITS_TEMPLATE.rst.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/PROVIDER_COMMITS_TEMPLATE.rst.jinja2 @@ -34,13 +34,12 @@ specific language governing permissions and limitations under the License. - .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE - OVERWRITTEN WHEN PREPARING PACKAGES. + .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN! .. IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE `PROVIDER_COMMITS_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY - .. THE REMAINDER OF THE FILE IS AUTOMATICALLY GENERATED. IT WILL BE OVERWRITTEN AT RELEASE TIME! + .. THE REMAINDER OF THE FILE IS AUTOMATICALLY GENERATED. IT WILL BE OVERWRITTEN! Package {{ PACKAGE_PIP_NAME }} ------------------------------------------------------ diff --git a/dev/breeze/src/airflow_breeze/templates/PROVIDER_README_TEMPLATE.rst.jinja2 b/dev/breeze/src/airflow_breeze/templates/PROVIDER_README_TEMPLATE.rst.jinja2 index 9bcff72fe85ee..615dc65c3278f 100644 --- a/dev/breeze/src/airflow_breeze/templates/PROVIDER_README_TEMPLATE.rst.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/PROVIDER_README_TEMPLATE.rst.jinja2 @@ -34,8 +34,7 @@ specific language governing permissions and limitations under the License. - .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE - OVERWRITTEN WHEN PREPARING PACKAGES. + .. NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN! .. IF YOU WANT TO MODIFY TEMPLATE FOR THIS FILE, YOU SHOULD MODIFY THE TEMPLATE `PROVIDER_README_TEMPLATE.rst.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY diff --git a/dev/breeze/src/airflow_breeze/templates/get_provider_info_TEMPLATE.py.jinja2 b/dev/breeze/src/airflow_breeze/templates/get_provider_info_TEMPLATE.py.jinja2 index 5340dc9b76a14..f1283fdb2297f 100644 --- a/dev/breeze/src/airflow_breeze/templates/get_provider_info_TEMPLATE.py.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/get_provider_info_TEMPLATE.py.jinja2 @@ -34,8 +34,7 @@ # specific language governing permissions and limitations # under the License. -# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE -# OVERWRITTEN WHEN PREPARING PACKAGES. +# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN! 
# # IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE # `get_provider_info_TEMPLATE.py.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY diff --git a/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 b/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 index a375ffedc63ef..4f364a0026e77 100644 --- a/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 +++ b/dev/breeze/src/airflow_breeze/templates/pyproject_TEMPLATE.toml.jinja2 @@ -34,8 +34,7 @@ # specific language governing permissions and limitations # under the License. -# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE -# OVERWRITTEN WHEN PREPARING PACKAGES. +# NOTE! THIS FILE IS AUTOMATICALLY GENERATED AND WILL BE OVERWRITTEN! # IF YOU WANT TO MODIFY THIS FILE, YOU SHOULD MODIFY THE TEMPLATE # `pyproject_TEMPLATE.toml.jinja2` IN the `dev/breeze/src/airflow_breeze/templates` DIRECTORY @@ -69,7 +68,7 @@ classifiers = [ {%- endfor %} "Topic :: System :: Monitoring", ] -requires-python = "~=3.8" +requires-python = "~=3.9" dependencies = [ {{- INSTALL_REQUIREMENTS }} ] @@ -80,7 +79,7 @@ dependencies = [ "Bug Tracker" = "https://github.com/apache/airflow/issues" "Source Code" = "https://github.com/apache/airflow" "Slack Chat" = "https://s.apache.org/airflow-slack" -"Twitter" = "https://twitter.com/ApacheAirflow" +"Twitter" = "https://x.com/ApacheAirflow" "YouTube" = "https://www.youtube.com/channel/UCSXwxpWZQ7XZ1WL3wqevChA/" [project.entry-points."apache_airflow_provider"] diff --git a/dev/breeze/src/airflow_breeze/utils/backtracking.py b/dev/breeze/src/airflow_breeze/utils/backtracking.py index d719be8d0d4bb..418e381b87fc4 100644 --- a/dev/breeze/src/airflow_breeze/utils/backtracking.py +++ b/dev/breeze/src/airflow_breeze/utils/backtracking.py @@ -38,7 +38,7 @@ def print_backtracking_candidates(): all_latest_dependencies_response = requests.get( "https://raw.githubusercontent.com/apache/airflow/" - "constraints-main/constraints-source-providers-3.8.txt" + "constraints-main/constraints-source-providers-3.9.txt" ) all_latest_dependencies_response.raise_for_status() constraints_text = all_latest_dependencies_response.text diff --git a/dev/breeze/src/airflow_breeze/utils/black_utils.py b/dev/breeze/src/airflow_breeze/utils/black_utils.py index 23891b8206c94..678b82ccd3139 100644 --- a/dev/breeze/src/airflow_breeze/utils/black_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/black_utils.py @@ -17,14 +17,14 @@ from __future__ import annotations import os -from functools import lru_cache from black import Mode, TargetVersion, format_str, parse_pyproject_toml +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT -@lru_cache(maxsize=None) +@clearable_cache def _black_mode() -> Mode: config = parse_pyproject_toml(os.path.join(AIRFLOW_SOURCES_ROOT, "pyproject.toml")) target_versions = {TargetVersion[val.upper()] for val in config.get("target_version", ())} diff --git a/dev/breeze/src/airflow_breeze/utils/cdxgen.py b/dev/breeze/src/airflow_breeze/utils/cdxgen.py index 3d129b4674e86..5431daaa0fb90 100644 --- a/dev/breeze/src/airflow_breeze/utils/cdxgen.py +++ b/dev/breeze/src/airflow_breeze/utils/cdxgen.py @@ -24,11 +24,11 @@ import sys import time from abc import abstractmethod -from csv import DictWriter +from collections.abc import Generator from dataclasses import dataclass from multiprocessing.pool import Pool from pathlib import Path -from typing import Any, Generator +from 
typing import TYPE_CHECKING, Any import yaml @@ -42,9 +42,13 @@ download_file_from_github, ) from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT, FILES_SBOM_DIR +from airflow_breeze.utils.projects_google_spreadsheet import MetadataFromSpreadsheet, get_project_metadata from airflow_breeze.utils.run_utils import run_command from airflow_breeze.utils.shared_options import get_dry_run +if TYPE_CHECKING: + from rich.console import Console + def start_cdxgen_server(application_root_path: Path, run_in_parallel: bool, parallelism: int) -> None: """ @@ -538,7 +542,7 @@ def get_vcs(dependency: dict[str, Any]) -> str: if "externalReferences" in dependency: for reference in dependency["externalReferences"]: if reference["type"] == "vcs": - return reference["url"] + return reference["url"].replace("http://", "https://") return "" @@ -570,10 +574,51 @@ def get_pypi_link(dependency: dict[str, Any]) -> str: "SAST", ] +CHECK_DOCS: dict[str, str] = {} + -def get_open_psf_scorecard(vcs): +def get_github_stats( + vcs: str, project_name: str, github_token: str | None, console: Console +) -> dict[str, Any]: import requests + result = {} + if vcs and vcs.startswith("https://github.com/"): + importance = "Low" + api_url = vcs.replace("https://github.com/", "https://api.github.com/repos/") + if api_url.endswith("/"): + api_url = api_url[:-1] + headers = {"Authorization": f"token {github_token}"} if github_token else {} + console.print(f"[bright_blue]Retrieving GitHub Stats from {api_url}") + response = requests.get(api_url, headers=headers) + if response.status_code == 404: + console.print(f"[yellow]Github API returned 404 for {api_url}") + return {} + response.raise_for_status() + github_data = response.json() + stargazer_count = github_data.get("stargazers_count") + forks_count = github_data.get("forks_count") + if project_name in get_project_metadata(MetadataFromSpreadsheet.KNOWN_LOW_IMPORTANCE_PROJECTS): + importance = "Low" + elif project_name in get_project_metadata(MetadataFromSpreadsheet.KNOWN_MEDIUM_IMPORTANCE_PROJECTS): + importance = "Medium" + elif project_name in get_project_metadata(MetadataFromSpreadsheet.KNOWN_HIGH_IMPORTANCE_PROJECTS): + importance = "High" + elif forks_count > 1000 or stargazer_count > 1000: + importance = "High" + elif stargazer_count > 100 or forks_count > 100: + importance = "Medium" + result["Industry importance"] = importance + console.print("[green]Successfully retrieved GitHub Stats.") + else: + console.print(f"[yellow]Not retrieving Github Stats for {vcs}") + return result + + +def get_open_psf_scorecard(vcs: str, project_name: str, console: Console) -> dict[str, Any]: + import requests + + console.print(f"[info]Retrieving Open PSF Scorecard for {project_name}") repo_url = vcs.split("://")[1] open_psf_url = f"https://api.securityscorecards.dev/projects/{repo_url}" scorecard_response = requests.get(open_psf_url) @@ -586,58 +631,41 @@ def get_open_psf_scorecard(vcs): if "checks" in open_psf_scorecard: for check in open_psf_scorecard["checks"]: check_name = check["name"] + score = check["score"] results["OPSF-" + check_name] = check["score"] reason = check.get("reason") or "" if check.get("details"): reason += "\n".join(check["details"]) results["OPSF-Details-" + check_name] = reason + CHECK_DOCS[check_name] = check["documentation"]["short"] + "\n" + check["documentation"]["url"] + if check_name == "Maintained": + if project_name in get_project_metadata(MetadataFromSpreadsheet.KNOWN_STABLE_PROJECTS): + lifecycle_status = "Stable" + else: + if score == 0: + 
lifecycle_status = "Abandoned" + elif score < 6: + lifecycle_status = "Somewhat maintained" + else: + lifecycle_status = "Actively maintained" + results["Lifecycle status"] = lifecycle_status + if check_name == "Vulnerabilities": + results["Unpatched Vulns"] = "Yes" if score != 10 else "" + console.print(f"[success]Retrieved Open PSF Scorecard for {project_name}") return results -def convert_sbom_to_csv( - writer: DictWriter, - dependency: dict[str, Any], - is_core: bool, - is_devel: bool, - include_open_psf_scorecard: bool = False, -) -> None: - """ - Convert SBOM to CSV - :param writer: CSV writer - :param dependency: Dependency to convert - :param is_core: Whether the dependency is core or not - """ - get_console().print(f"[info]Converting {dependency['name']} to CSV") - vcs = get_vcs(dependency) - name = dependency.get("name", "") - if name.startswith("apache-airflow"): - return - row = { - "Name": dependency.get("name", ""), - "Author": dependency.get("author", ""), - "Version": dependency.get("version", ""), - "Description": dependency.get("description"), - "Core": is_core, - "Devel": is_devel, - "Licenses": convert_licenses(dependency.get("licenses", [])), - "Purl": dependency.get("purl"), - "Pypi": get_pypi_link(dependency), - "Vcs": vcs, - } - if vcs and include_open_psf_scorecard: - open_psf_scorecard = get_open_psf_scorecard(vcs) - row.update(open_psf_scorecard) - writer.writerow(row) - - -def get_field_names(include_open_psf_scorecard: bool) -> list[str]: - names = ["Name", "Author", "Version", "Description", "Core", "Devel", "Licenses", "Purl", "Pypi", "Vcs"] - if include_open_psf_scorecard: - names.append("OPSF-Score") - for check in OPEN_PSF_CHECKS: - names.append("OPSF-" + check) - names.append("OPSF-Details-" + check) - return names +def get_governance(vcs: str | None): + if not vcs or not vcs.startswith("https://github.com/"): + return "" + organization = vcs.split("/")[3] + if organization.lower() in get_project_metadata(MetadataFromSpreadsheet.KNOWN_REPUTABLE_FOUNDATIONS): + return "Reputable Foundation" + if organization.lower() in get_project_metadata(MetadataFromSpreadsheet.KNOWN_STRONG_COMMUNITIES): + return "Strong Community" + if organization.lower() in get_project_metadata(MetadataFromSpreadsheet.KNOWN_COMPANIES): + return "Company" + return "Loose community/ Single Person" def normalize_package_name(name): diff --git a/dev/breeze/src/airflow_breeze/utils/coertions.py b/dev/breeze/src/airflow_breeze/utils/coertions.py index 415b9472c23d1..6f8c2c21baac8 100644 --- a/dev/breeze/src/airflow_breeze/utils/coertions.py +++ b/dev/breeze/src/airflow_breeze/utils/coertions.py @@ -17,7 +17,7 @@ from __future__ import annotations -from typing import Iterable +from collections.abc import Iterable def coerce_bool_value(value: str | bool) -> bool: diff --git a/dev/breeze/src/airflow_breeze/utils/console.py b/dev/breeze/src/airflow_breeze/utils/console.py index 1083875c372a4..6e0b3e77dbfe7 100644 --- a/dev/breeze/src/airflow_breeze/utils/console.py +++ b/dev/breeze/src/airflow_breeze/utils/console.py @@ -23,12 +23,13 @@ import os from enum import Enum -from functools import lru_cache from typing import NamedTuple, TextIO from rich.console import Console from rich.theme import Theme +from airflow_breeze.utils.functools_cache import clearable_cache + recording_width = os.environ.get("RECORD_BREEZE_WIDTH") recording_file = os.environ.get("RECORD_BREEZE_OUTPUT_FILE") @@ -83,14 +84,14 @@ class Output(NamedTuple): @property def file(self) -> TextIO: - return open(self.file_name, 
"a+t") + return open(self.file_name, "a+") @property def escaped_title(self) -> str: return self.title.replace("[", "\\[") -@lru_cache(maxsize=None) +@clearable_cache def get_console(output: Output | None = None) -> Console: return Console( force_terminal=True, @@ -102,7 +103,7 @@ def get_console(output: Output | None = None) -> Console: ) -@lru_cache(maxsize=None) +@clearable_cache def get_stderr_console(output: Output | None = None) -> Console: return Console( force_terminal=True, diff --git a/dev/breeze/src/airflow_breeze/utils/custom_param_types.py b/dev/breeze/src/airflow_breeze/utils/custom_param_types.py index 8f0529ffb0e30..a5d99fc846073 100644 --- a/dev/breeze/src/airflow_breeze/utils/custom_param_types.py +++ b/dev/breeze/src/airflow_breeze/utils/custom_param_types.py @@ -18,8 +18,9 @@ import os import re +from collections.abc import Sequence from dataclasses import dataclass -from typing import Any, Sequence +from typing import Any import click from click import Context, Parameter, ParamType diff --git a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py index ef7779afd9749..39641bf7e4e08 100644 --- a/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/docker_command_utils.py @@ -23,6 +23,7 @@ import os import re import sys +from functools import lru_cache from subprocess import DEVNULL, CalledProcessError, CompletedProcess from typing import TYPE_CHECKING @@ -51,7 +52,6 @@ DOCKER_DEFAULT_PLATFORM, MIN_DOCKER_COMPOSE_VERSION, MIN_DOCKER_VERSION, - SEQUENTIAL_EXECUTOR, ) from airflow_breeze.utils.console import Output, get_console from airflow_breeze.utils.run_utils import ( @@ -202,7 +202,8 @@ def check_docker_version(quiet: bool = False): dry_run_override=False, ) if docker_version_result.returncode == 0: - docker_version = docker_version_result.stdout.strip() + regex = re.compile(r"^(" + version.VERSION_PATTERN + r").*$", re.VERBOSE | re.IGNORECASE) + docker_version = re.sub(regex, r"\1", docker_version_result.stdout.strip()) if docker_version == "": get_console().print( f""" @@ -413,7 +414,7 @@ def prepare_docker_build_command( final_command.extend(image_params.common_docker_build_flags) final_command.extend(["--pull"]) final_command.extend(image_params.prepare_arguments_for_docker_build_command()) - final_command.extend(["-t", image_params.airflow_image_name_with_tag, "--target", "main", "."]) + final_command.extend(["-t", image_params.airflow_image_name, "--target", "main", "."]) final_command.extend( ["-f", "Dockerfile" if isinstance(image_params, BuildProdParams) else "Dockerfile.ci"] ) @@ -429,7 +430,7 @@ def construct_docker_push_command( :param image_params: parameters of the image :return: Command to run as list of string """ - return ["docker", "push", image_params.airflow_image_name_with_tag] + return ["docker", "push", image_params.airflow_image_name] def build_cache(image_params: CommonBuildParams, output: Output | None) -> RunCommandResult: @@ -502,6 +503,7 @@ def check_executable_entrypoint_permissions(quiet: bool = False): get_console().print("[success]Executable permissions on entrypoints are OK[/]") +@lru_cache def perform_environment_checks(quiet: bool = False): check_docker_is_running() check_docker_version(quiet) @@ -529,7 +531,6 @@ def warm_up_docker_builder(image_params_list: list[CommonBuildParams]): docker_syntax = get_docker_syntax_version() get_console().print(f"[info]Warming up the {docker_context} builder for syntax: 
{docker_syntax}") warm_up_image_param = copy.deepcopy(image_params_list[0]) - warm_up_image_param.image_tag = "warmup" warm_up_image_param.push = False warm_up_image_param.platform = platform build_command = prepare_base_build_command(image_params=warm_up_image_param) @@ -719,16 +720,13 @@ def execute_command_in_shell( :param command: """ shell_params.backend = "sqlite" - shell_params.executor = SEQUENTIAL_EXECUTOR shell_params.forward_ports = False shell_params.project_name = project_name shell_params.quiet = True shell_params.skip_environment_initialization = True shell_params.skip_image_upgrade_check = True if get_verbose(): - get_console().print(f"[warning]Backend forced to: sqlite and {SEQUENTIAL_EXECUTOR}[/]") get_console().print("[warning]Sqlite DB is cleaned[/]") - get_console().print(f"[warning]Executor forced to {SEQUENTIAL_EXECUTOR}[/]") get_console().print("[warning]Disabled port forwarding[/]") get_console().print(f"[warning]Project name set to: {project_name}[/]") get_console().print("[warning]Forced quiet mode[/]") @@ -770,13 +768,6 @@ def enter_shell(shell_params: ShellParams, output: Output | None = None) -> RunC ) bring_compose_project_down(preserve_volumes=False, shell_params=shell_params) - if shell_params.backend == "sqlite" and shell_params.executor != SEQUENTIAL_EXECUTOR: - get_console().print( - f"\n[warning]backend: sqlite is not " - f"compatible with executor: {shell_params.executor}. " - f"Changing the executor to {SEQUENTIAL_EXECUTOR}.\n" - ) - shell_params.executor = SEQUENTIAL_EXECUTOR if shell_params.restart: bring_compose_project_down(preserve_volumes=False, shell_params=shell_params) if shell_params.include_mypy_volume: diff --git a/scripts/ci/pre_commit/check_providers_init.py b/dev/breeze/src/airflow_breeze/utils/functools_cache.py old mode 100755 new mode 100644 similarity index 66% rename from scripts/ci/pre_commit/check_providers_init.py rename to dev/breeze/src/airflow_breeze/utils/functools_cache.py index 33def71253f34..bb88924624538 --- a/scripts/ci/pre_commit/check_providers_init.py +++ b/dev/breeze/src/airflow_breeze/utils/functools_cache.py @@ -1,5 +1,3 @@ -#!/usr/bin/env python -# # Licensed to the Apache Software Foundation (ASF) under one # or more contributor license agreements. See the NOTICE file # distributed with this work for additional information @@ -18,14 +16,17 @@ # under the License. from __future__ import annotations -import sys -from pathlib import Path +cached_functions = [] + + +def clearable_cache(func): + from functools import cache + + cached_function = cache(func) + cached_functions.append(cached_function) + return cached_function -AIRFLOW_SOURCES = Path(__file__).parents[3] -PROVIDERS_INIT_FILE = AIRFLOW_SOURCES / "airflow" / "providers" / "__init__.py" -print(f"Checking if {PROVIDERS_INIT_FILE} exists.") -if PROVIDERS_INIT_FILE.exists(): - print(f"\033[0;31mERROR: {PROVIDERS_INIT_FILE} file should not exist. Deleting it.\033[0m\n") - PROVIDERS_INIT_FILE.unlink() - sys.exit(1) +def clear_all_cached_functions(): + for func in cached_functions: + func.cache_clear() diff --git a/dev/breeze/src/airflow_breeze/utils/github.py b/dev/breeze/src/airflow_breeze/utils/github.py index 7a40643e88de7..47b3f814be413 100644 --- a/dev/breeze/src/airflow_breeze/utils/github.py +++ b/dev/breeze/src/airflow_breeze/utils/github.py @@ -16,8 +16,11 @@ # under the License. 
from __future__ import annotations +import os import re import sys +import tempfile +import zipfile from datetime import datetime, timezone from pathlib import Path from typing import Any @@ -54,6 +57,13 @@ def download_file_from_github(tag: str, path: str, output_file: Path, timeout: i if not get_dry_run(): try: response = requests.get(url, timeout=timeout) + if response.status_code == 403: + get_console().print( + f"[error]The {url} is not accessible. This may be caused by either of:\n" + f" 1. network issues or VPN settings\n" + f" 2. GitHub rate limit" + ) + return False if response.status_code == 404: get_console().print(f"[warning]The {url} has not been found. Skipping") return False @@ -170,3 +180,119 @@ def get_tag_date(tag: str) -> str | None: tag_object.committed_date if hasattr(tag_object, "committed_date") else tag_object.tagged_date ) return datetime.fromtimestamp(timestamp, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ") + + +def download_artifact_from_run_id(run_id: str, output_file: Path, github_repository: str, github_token: str): + """ + Downloads a file from a GitHub Actions artifact + + :param run_id: run_id of the workflow + :param output_file: Path where the file should be downloaded + :param github_repository: GitHub repository + :param github_token: GitHub token + """ + import requests + from tqdm import tqdm + + url = f"https://api.github.com/repos/{github_repository}/actions/runs/{run_id}/artifacts" + headers = {"Accept": "application/vnd.github.v3+json"} + + session = requests.Session() + headers["Authorization"] = f"Bearer {github_token}" + artifact_response = requests.get(url, headers=headers) + + if artifact_response.status_code != 200: + get_console().print( + "[error]Describing artifacts failed with status code " + f"{artifact_response.status_code}: {artifact_response.text}", + ) + sys.exit(1) + + download_url = None + file_name = os.path.splitext(os.path.basename(output_file))[0] + for artifact in artifact_response.json()["artifacts"]: + if artifact["name"].startswith(file_name): + download_url = artifact["archive_download_url"] + break + + if not download_url: + get_console().print(f"[error]No artifact found for {file_name}") + sys.exit(1) + + get_console().print(f"[info]Downloading artifact from {download_url} to {output_file}") + + response = session.get(download_url, stream=True, headers=headers) + + if response.status_code != 200: + get_console().print( + "[error]Downloading artifacts failed with status code " + f"{response.status_code}: {response.text}", + ) + sys.exit(1) + + total_size = int(response.headers.get("content-length", 0)) + temp_file = tempfile.NamedTemporaryFile().name + "/file.zip" + os.makedirs(os.path.dirname(temp_file), exist_ok=True) + + with tqdm(total=total_size, unit="B", unit_scale=True, desc=temp_file, ascii=True) as progress_bar: + with open(temp_file, "wb") as f: + for chunk in response.iter_content(chunk_size=1 * 1024 * 1024): + if chunk: + f.write(chunk) + progress_bar.update(len(chunk)) + + with zipfile.ZipFile(temp_file, "r") as zip_ref: + zip_ref.extractall("/tmp/") + + os.remove(temp_file) + + +def download_artifact_from_pr(pr: str, output_file: Path, github_repository: str, github_token: str): + import requests + + pr_number = pr.lstrip("#") + pr_url = f"https://api.github.com/repos/{github_repository}/pulls/{pr_number}" + workflow_run_url = f"https://api.github.com/repos/{github_repository}/actions/runs" + + headers = {"Accept": "application/vnd.github.v3+json"} + + session = requests.Session() + headers["Authorization"]
= f"Bearer {github_token}" + + pull_response = session.get(pr_url, headers=headers) + + if pull_response.status_code != 200: + get_console().print( + "[error]Fetching PR failed with status code " + f"{pull_response.status_code}: {pull_response.text}", + ) + sys.exit(1) + + ref = pull_response.json()["head"]["ref"] + + workflow_runs = session.get( + workflow_run_url, headers=headers, params={"event": "pull_request", "branch": ref} + ) + + if workflow_runs.status_code != 200: + get_console().print( + "[error]Fetching workflow runs failed with status code %s, %s, " + "you might need to provide GITHUB_TOKEN, set it as environment variable", + workflow_runs.status_code, + workflow_runs.content, + ) + sys.exit(1) + + data = workflow_runs.json()["workflow_runs"] + sorted_data = sorted(data, key=lambda x: datetime.fromisoformat(x["created_at"]), reverse=True) + run_id = None + # Filter only workflows with ci.yml, we may get multiple workflows for a PR ex: codeql-analysis.yml, news-fragment.yml + + for run in sorted_data: + if run.get("path").endswith("ci.yml"): + run_id = run["id"] + break + + get_console().print(f"[info]Found run id {run_id} for PR {pr}") + + download_artifact_from_run_id(str(run_id), output_file, github_repository, github_token) diff --git a/dev/breeze/src/airflow_breeze/utils/image.py b/dev/breeze/src/airflow_breeze/utils/image.py index 3adc920a4c97d..cd2301ccf487e 100644 --- a/dev/breeze/src/airflow_breeze/utils/image.py +++ b/dev/breeze/src/airflow_breeze/utils/image.py @@ -29,7 +29,7 @@ from airflow_breeze.params.shell_params import ShellParams from airflow_breeze.utils.ci_group import ci_group from airflow_breeze.utils.console import Output, get_console -from airflow_breeze.utils.mark_image_as_refreshed import mark_image_as_refreshed +from airflow_breeze.utils.mark_image_as_refreshed import mark_image_as_rebuilt from airflow_breeze.utils.parallel import ( DOCKER_PULL_PROGRESS_REGEXP, GenericRegexpProgressMatcher, @@ -53,7 +53,6 @@ def run_pull_in_parallel( python_version_list: list[str], verify: bool, include_success_outputs: bool, - tag_as_latest: bool, wait_for_image: bool, extra_pytest_args: tuple, ): @@ -77,7 +76,6 @@ def get_kwds(index: int, image_param: BuildCiParams | BuildProdParams): d = { "image_params": image_param, "wait_for_image": wait_for_image, - "tag_as_latest": tag_as_latest, "poll_time_seconds": 10.0, "output": outputs[index], } @@ -101,7 +99,6 @@ def get_kwds(index: int, image_param: BuildCiParams | BuildProdParams): def run_pull_image( image_params: CommonBuildParams, wait_for_image: bool, - tag_as_latest: bool, output: Output | None, poll_time_seconds: float = 10.0, max_time_minutes: float = 70, @@ -113,24 +110,23 @@ def run_pull_image( :param output: output to write to :param wait_for_image: whether we should wait for the image to be available - :param tag_as_latest: tag the image as latest :param poll_time_seconds: what's the polling time between checks if images are there (default 10 s) :param max_time_minutes: what's the maximum time to wait for the image to be pulled (default 70 minutes) :return: Tuple of return code and description of the image pulled """ get_console(output=output).print( f"\n[info]Pulling {image_params.image_type} image of airflow python version: " f"{image_params.python} image: {image_params.airflow_image_name} " f"with wait for image: {wait_for_image} and max time to poll {max_time_minutes} minutes[/]\n" ) current_loop = 1 start_time = time.time()
while True: - command_to_run = ["docker", "pull", image_params.airflow_image_name_with_tag] + command_to_run = ["docker", "pull", image_params.airflow_image_name] command_result = run_command(command_to_run, check=False, output=output) if command_result.returncode == 0: command_result = run_command( - ["docker", "inspect", image_params.airflow_image_name_with_tag, "-f", "{{.Size}}"], + ["docker", "inspect", image_params.airflow_image_name, "-f", "{{.Size}}"], capture_output=True, output=output, text=True, @@ -152,22 +148,20 @@ def run_pull_image( command_result.returncode, f"Image Python {image_params.python}", ) - if tag_as_latest: - command_result = tag_image_as_latest(image_params=image_params, output=output) - if command_result.returncode == 0 and isinstance(image_params, BuildCiParams): - mark_image_as_refreshed(image_params) + if isinstance(image_params, BuildCiParams): + mark_image_as_rebuilt(image_params) return command_result.returncode, f"Image Python {image_params.python}" if wait_for_image: if get_verbose() or get_dry_run(): get_console(output=output).print( - f"\n[info]Waiting: #{current_loop} {image_params.airflow_image_name_with_tag}.[/]\n" + f"\n[info]Waiting: #{current_loop} {image_params.airflow_image_name}.[/]\n" ) time.sleep(poll_time_seconds) current_loop += 1 current_time = time.time() if (current_time - start_time) / 60 > max_time_minutes: get_console(output=output).print( - f"\n[error]The image {image_params.airflow_image_name_with_tag} " + f"\n[error]The image {image_params.airflow_image_name} " f"did not appear in {max_time_minutes} minutes. Failing.[/]\n" ) return 1, f"Image Python {image_params.python}" @@ -179,33 +173,9 @@ def run_pull_image( return command_result.returncode, f"Image Python {image_params.python}" -def tag_image_as_latest(image_params: CommonBuildParams, output: Output | None) -> RunCommandResult: - if image_params.airflow_image_name_with_tag == image_params.airflow_image_name: - get_console(output=output).print( - f"[info]Skip tagging {image_params.airflow_image_name} as latest as it is already 'latest'[/]" - ) - return subprocess.CompletedProcess(returncode=0, args=[]) - command = run_command( - [ - "docker", - "tag", - image_params.airflow_image_name_with_tag, - image_params.airflow_image_name + ":latest", - ], - output=output, - capture_output=True, - check=False, - ) - if command.returncode != 0: - get_console(output=output).print(command.stdout) - get_console(output=output).print(command.stderr) - return command - - def run_pull_and_verify_image( image_params: CommonBuildParams, wait_for_image: bool, - tag_as_latest: bool, poll_time_seconds: float, extra_pytest_args: tuple, output: Output | None, @@ -213,7 +183,6 @@ def run_pull_and_verify_image( return_code, info = run_pull_image( image_params=image_params, wait_for_image=wait_for_image, - tag_as_latest=tag_as_latest, output=output, poll_time_seconds=poll_time_seconds, ) @@ -222,7 +191,7 @@ def run_pull_and_verify_image( f"\n[error]Not running verification for {image_params.python} as pulling failed.[/]\n" ) return verify_an_image( - image_name=image_params.airflow_image_name_with_tag, + image_name=image_params.airflow_image_name, image_type=image_params.image_type, output=output, slim_image=False, @@ -239,9 +208,9 @@ def just_pull_ci_image(github_repository, python_version: str) -> tuple[ShellPar skip_image_upgrade_check=True, quiet=True, ) - get_console().print(f"[info]Pulling {shell_params.airflow_image_name_with_tag}.[/]") + get_console().print(f"[info]Pulling 
{shell_params.airflow_image_name}.[/]") pull_command_result = run_command( - ["docker", "pull", shell_params.airflow_image_name_with_tag], + ["docker", "pull", shell_params.airflow_image_name], check=True, ) return shell_params, pull_command_result @@ -259,7 +228,7 @@ def check_if_ci_image_available( quiet=True, ) inspect_command_result = run_command( - ["docker", "inspect", shell_params.airflow_image_name_with_tag], + ["docker", "inspect", shell_params.airflow_image_name], stdout=subprocess.DEVNULL, check=False, ) @@ -273,9 +242,7 @@ def find_available_ci_image(github_repository: str) -> ShellParams: for python_version in ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS: shell_params, inspect_command_result = check_if_ci_image_available(github_repository, python_version) if inspect_command_result.returncode == 0: - get_console().print( - f"[info]Running fix_ownership with {shell_params.airflow_image_name_with_tag}.[/]" - ) + get_console().print(f"[info]Running fix_ownership with {shell_params.airflow_image_name}.[/]") return shell_params shell_params, _ = just_pull_ci_image(github_repository, DEFAULT_PYTHON_MAJOR_MINOR_VERSION) return shell_params diff --git a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py index b9bdc5302bdfc..8525d1e9acc50 100644 --- a/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/kubernetes_utils.py @@ -51,7 +51,7 @@ from airflow_breeze.utils.shared_options import get_dry_run, get_verbose from airflow_breeze.utils.virtualenv_utils import create_pip_command, create_uv_command -K8S_ENV_PATH = BUILD_CACHE_DIR / ".k8s-env" +K8S_ENV_PATH = BUILD_CACHE_DIR / "k8s-env" K8S_CLUSTERS_PATH = BUILD_CACHE_DIR / ".k8s-clusters" K8S_BIN_BASE_PATH = K8S_ENV_PATH / "bin" KIND_BIN_PATH = K8S_BIN_BASE_PATH / "kind" @@ -305,8 +305,10 @@ def _requirements_changed() -> bool: def _install_packages_in_k8s_virtualenv(): if check_if_cache_exists("use_uv"): + get_console().print("[info]Using uv to install k8s env[/]") command = create_uv_command(PYTHON_BIN_PATH) else: + get_console().print("[info]Using pip to install k8s env[/]") command = create_pip_command(PYTHON_BIN_PATH) install_command_no_constraints = [ *command, @@ -406,7 +408,7 @@ def create_virtualenv(force_venv_setup: bool) -> RunCommandResult: f"{venv_command_result.stdout}\n{venv_command_result.stderr}" ) return venv_command_result - get_console().print(f"[info]Reinstalling PIP version in {K8S_ENV_PATH}") + get_console().print(f"[info]Reinstalling pip=={PIP_VERSION} in {K8S_ENV_PATH}") command = create_pip_command(PYTHON_BIN_PATH) pip_reinstall_result = run_command( [*command, "install", f"pip=={PIP_VERSION}"], @@ -419,6 +421,7 @@ def create_virtualenv(force_venv_setup: bool) -> RunCommandResult: f"{pip_reinstall_result.stdout}\n{pip_reinstall_result.stderr}" ) return pip_reinstall_result + get_console().print(f"[info]Reinstalling uv=={UV_VERSION} in {K8S_ENV_PATH}") uv_reinstall_result = run_command( [*command, "install", f"uv=={UV_VERSION}"], check=False, diff --git a/dev/breeze/src/airflow_breeze/utils/mark_image_as_refreshed.py b/dev/breeze/src/airflow_breeze/utils/mark_image_as_refreshed.py index 38a2916fd92e1..aa6a1392d07fe 100644 --- a/dev/breeze/src/airflow_breeze/utils/mark_image_as_refreshed.py +++ b/dev/breeze/src/airflow_breeze/utils/mark_image_as_refreshed.py @@ -26,7 +26,7 @@ from airflow_breeze.params.build_ci_params import BuildCiParams -def mark_image_as_refreshed(ci_image_params: BuildCiParams): +def 
mark_image_as_rebuilt(ci_image_params: BuildCiParams): ci_image_cache_dir = BUILD_CACHE_DIR / ci_image_params.airflow_branch ci_image_cache_dir.mkdir(parents=True, exist_ok=True) touch_cache_file(f"built_{ci_image_params.python}", root_dir=ci_image_cache_dir) diff --git a/dev/breeze/src/airflow_breeze/utils/packages.py b/dev/breeze/src/airflow_breeze/utils/packages.py index 6c4a824140791..1d7e4f1b03c6a 100644 --- a/dev/breeze/src/airflow_breeze/utils/packages.py +++ b/dev/breeze/src/airflow_breeze/utils/packages.py @@ -22,10 +22,11 @@ import os import subprocess import sys +from collections.abc import Iterable from enum import Enum from functools import lru_cache from pathlib import Path -from typing import Any, Iterable, NamedTuple +from typing import Any, NamedTuple from airflow_breeze.global_constants import ( ALLOWED_PYTHON_MAJOR_MINOR_VERSIONS, @@ -34,21 +35,23 @@ REGULAR_DOC_PACKAGES, ) from airflow_breeze.utils.console import get_console +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.path_utils import ( - AIRFLOW_PROVIDERS_ROOT, + AIRFLOW_ORIGINAL_PROVIDERS_DIR, + AIRFLOW_PROVIDERS_DIR, + AIRFLOW_SOURCES_ROOT, BREEZE_SOURCES_ROOT, - DOCS_ROOT, - GENERATED_PROVIDER_PACKAGES_DIR, PROVIDER_DEPENDENCIES_JSON_FILE_PATH, ) from airflow_breeze.utils.publish_docs_helpers import ( - _load_schema, - get_provider_yaml_paths, + NEW_PROVIDER_DATA_SCHEMA_PATH, + OLD_PROVIDER_DATA_SCHEMA_PATH, ) from airflow_breeze.utils.run_utils import run_command +from airflow_breeze.utils.version_utils import remove_local_version_suffix from airflow_breeze.utils.versions import get_version_tag, strip_leading_zeros_from_version -MIN_AIRFLOW_VERSION = "2.7.0" +MIN_AIRFLOW_VERSION = "2.9.0" HTTPS_REMOTE = "apache-https-for-providers" LONG_PROVIDERS_PREFIX = "apache-airflow-providers-" @@ -70,10 +73,13 @@ class PluginInfo(NamedTuple): class ProviderPackageDetails(NamedTuple): provider_id: str + provider_yaml_path: Path + is_new_structure: bool source_date_epoch: int full_package_name: str pypi_package_name: str - source_provider_package_path: Path + root_provider_path: Path + base_provider_package_path: Path documentation_provider_package_path: Path changelog_path: Path provider_description: str @@ -118,32 +124,40 @@ def from_requirement(cls, requirement_string: str) -> PipRequirements: return cls(package=package, version_required=version_required.strip()) +@clearable_cache +def old_provider_yaml_schema() -> dict[str, Any]: + with open(OLD_PROVIDER_DATA_SCHEMA_PATH) as schema_file: + return json.load(schema_file) + + +@clearable_cache +def new_provider_yaml_schema() -> dict[str, Any]: + with open(NEW_PROVIDER_DATA_SCHEMA_PATH) as schema_file: + return json.load(schema_file) + + PROVIDER_METADATA: dict[str, dict[str, Any]] = {} def refresh_provider_metadata_from_yaml_file(provider_yaml_path: Path): - import yaml - - schema = _load_schema() + schema = old_provider_yaml_schema() with open(provider_yaml_path) as yaml_file: - provider = yaml.safe_load(yaml_file) - try: - import jsonschema + import yaml - try: - jsonschema.validate(provider, schema=schema) - except jsonschema.ValidationError as ex: - msg = f"Unable to parse: {provider_yaml_path}. Original error {type(ex).__name__}: {ex}" - raise RuntimeError(msg) - except ImportError: - # we only validate the schema if jsonschema is available. 
This is needed for autocomplete - # to not fail with import error if jsonschema is not installed - pass - PROVIDER_METADATA[get_short_package_name(provider["package-name"])] = provider + provider_yaml_content = yaml.safe_load(yaml_file) + import jsonschema + + try: + jsonschema.validate(provider_yaml_content, schema=schema) + except jsonschema.ValidationError as ex: + msg = f"Unable to parse: {provider_yaml_path}. Original error {type(ex).__name__}: {ex}" + raise RuntimeError(msg) + provider_id = get_short_package_name(provider_yaml_content["package-name"]) + PROVIDER_METADATA[provider_id] = provider_yaml_content def refresh_provider_metadata_with_provider_id(provider_id: str): - provider_yaml_path = get_source_package_path(provider_id) / "provider.yaml" + provider_yaml_path, _ = get_provider_yaml(provider_id) refresh_provider_metadata_from_yaml_file(provider_yaml_path) @@ -152,7 +166,31 @@ def clear_cache_for_provider_metadata(provider_id: str): refresh_provider_metadata_with_provider_id(provider_id) -@lru_cache(maxsize=1) +@clearable_cache +def get_all_provider_yaml_paths() -> list[Path]: + """Returns list of provider.yaml files""" + return sorted(list(AIRFLOW_PROVIDERS_DIR.glob("**/provider.yaml"))) + + +def get_provider_id_from_path(file_path: Path) -> str | None: + """ + Get the provider id from the path of the file it belongs to. + """ + for parent in file_path.parents: + # This works fine for both new and old providers structure - because we moved provider.yaml to + # the top-level of the provider and this code finding "providers" will find the "providers" package + # in old structure and "providers" directory in new structure - in both cases we can determine + # the provider id from the relative folders + if (parent / "provider.yaml").exists(): + for providers_root_candidate in parent.parents: + if providers_root_candidate.name == "providers": + return parent.relative_to(providers_root_candidate).as_posix().replace("/", ".") + else: + return None + return None + + +@clearable_cache def get_provider_packages_metadata() -> dict[str, dict[str, Any]]: """ Load all data from providers files @@ -163,7 +201,7 @@ def get_provider_packages_metadata() -> dict[str, dict[str, Any]]: if PROVIDER_METADATA: return PROVIDER_METADATA - for provider_yaml_path in get_provider_yaml_paths(): + for provider_yaml_path in get_all_provider_yaml_paths(): refresh_provider_metadata_from_yaml_file(provider_yaml_path) return PROVIDER_METADATA @@ -380,16 +418,10 @@ def find_matching_long_package_names( ) -def get_source_package_path(provider_id: str) -> Path: - return AIRFLOW_PROVIDERS_ROOT.joinpath(*provider_id.split(".")) - - -def get_documentation_package_path(provider_id: str) -> Path: - return DOCS_ROOT / f"apache-airflow-providers-{provider_id.replace('.', '-')}" - - -def get_target_root_for_copied_provider_sources(provider_id: str) -> Path: - return GENERATED_PROVIDER_PACKAGES_DIR.joinpath(*provider_id.split(".")) +# We should not remove those old/original package paths as they are used to get changes +# When documentation is generated +def get_original_source_package_path(provider_id: str) -> Path: + return AIRFLOW_ORIGINAL_PROVIDERS_DIR.joinpath(*provider_id.split(".")) def get_pip_package_name(provider_id: str) -> str: @@ -413,7 +445,9 @@ def get_dist_package_name_prefix(provider_id: str) -> str: def apply_version_suffix(install_clause: str, version_suffix: str) -> str: - if install_clause.startswith("apache-airflow") and ">=" in install_clause and version_suffix: + # Need to resolve a version suffix 
based on PyPi versions, but can ignore local version suffix. + pypi_version_suffix = remove_local_version_suffix(version_suffix) + if pypi_version_suffix and install_clause.startswith("apache-airflow") and ">=" in install_clause: # Applies version suffix to the apache-airflow and provider package dependencies to make # sure that pre-release versions have correct limits - this address the issue with how # pip handles pre-release versions when packages are pre-release and refer to each other - we @@ -422,6 +456,8 @@ def apply_version_suffix(install_clause: str, version_suffix: str) -> str: # For example `apache-airflow-providers-fab==2.0.0.dev0` should refer to # `apache-airflow>=2.9.0.dev0` and not `apache-airflow>=2.9.0` because both packages are # released together and >= 2.9.0 is not correct reference for 2.9.0.dev0 version of Airflow. + # This assumes a local release, one where the suffix starts with a plus sign, uses the last + # version of the dependency, so it is not necessary to add the suffix to the dependency. prefix, version = install_clause.split(">=") # If version has a upper limit (e.g. ">=2.10.0,<3.0"), we need to cut this off not to fail if "," in version: @@ -430,9 +466,9 @@ def apply_version_suffix(install_clause: str, version_suffix: str) -> str: base_version = Version(version).base_version # always use `pre-release`+ `0` as the version suffix - version_suffix = version_suffix.rstrip("0123456789") + "0" + pypi_version_suffix = pypi_version_suffix.rstrip("0123456789") + "0" - target_version = Version(str(base_version) + "." + version_suffix) + target_version = Version(str(base_version) + "." + pypi_version_suffix) return prefix + ">=" + str(target_version) return install_clause @@ -462,14 +498,36 @@ def get_package_extras(provider_id: str, version_suffix: str) -> dict[str, list[ :param provider_id: id of the package """ + if provider_id == "providers": return {} if provider_id in get_removed_provider_ids(): return {} + + from packaging.requirements import Requirement + + deps_list = list( + map( + lambda x: Requirement(x).name, + PROVIDER_DEPENDENCIES.get(provider_id)["deps"], + ) + ) + deps = list(filter(lambda x: x.startswith("apache-airflow-providers"), deps_list)) extras_dict: dict[str, list[str]] = { module: [get_pip_package_name(module)] for module in PROVIDER_DEPENDENCIES.get(provider_id)["cross-providers-deps"] } + + to_pop_extras = [] + # remove the keys from extras_dict if the provider is already a required dependency + for k, v in extras_dict.items(): + if v and v[0] in deps: + to_pop_extras.append(k) + + for k in to_pop_extras: + get_console().print(f"[warning]Removing {k} from extras as it is already a required dependency") + del extras_dict[k] + provider_yaml_dict = get_provider_packages_metadata().get(provider_id) additional_extras = provider_yaml_dict.get("additional-extras") if provider_yaml_dict else None if additional_extras: @@ -493,6 +551,25 @@ def get_package_extras(provider_id: str, version_suffix: str) -> dict[str, list[ return extras_dict +def get_provider_yaml(provider_id: str) -> tuple[Path, bool]: + new_structure_provider_path = AIRFLOW_PROVIDERS_DIR / provider_id.replace(".", "/") / "provider.yaml" + if new_structure_provider_path.exists(): + return new_structure_provider_path, True + else: + return ( + AIRFLOW_SOURCES_ROOT / "airflow" / "providers" / provider_id.replace(".", "/") / "provider.yaml", + False, + ) + + +def load_pyproject_toml(pyproject_toml_file_path: Path) -> dict[str, Any]: + try: + import tomllib + except ImportError: + import 
tomli as tomllib + return tomllib.loads(pyproject_toml_file_path.read_text()) + + def get_provider_details(provider_id: str) -> ProviderPackageDetails: provider_info = get_provider_packages_metadata().get(provider_id) if not provider_info: @@ -508,18 +585,27 @@ def get_provider_details(provider_id: str) -> ProviderPackageDetails: class_name=class_name, ) ) + provider_yaml_path, is_new_structure = get_provider_yaml(provider_id) + dependencies = provider_info["dependencies"] + root_provider_path = (AIRFLOW_SOURCES_ROOT / "airflow" / "providers").joinpath(*provider_id.split(".")) + changelog_path = root_provider_path / "CHANGELOG.rst" + base_provider_package_path = root_provider_path + pypi_name = f"apache-airflow-providers-{provider_id.replace('.', '-')}" return ProviderPackageDetails( provider_id=provider_id, + provider_yaml_path=provider_yaml_path, + is_new_structure=is_new_structure, source_date_epoch=provider_info["source-date-epoch"], full_package_name=f"airflow.providers.{provider_id}", - pypi_package_name=f"apache-airflow-providers-{provider_id.replace('.', '-')}", - source_provider_package_path=get_source_package_path(provider_id), - documentation_provider_package_path=get_documentation_package_path(provider_id), - changelog_path=get_source_package_path(provider_id) / "CHANGELOG.rst", + pypi_package_name=pypi_name, + root_provider_path=root_provider_path, + base_provider_package_path=base_provider_package_path, + changelog_path=changelog_path, + documentation_provider_package_path=AIRFLOW_SOURCES_ROOT / "docs" / pypi_name, provider_description=provider_info["description"], - dependencies=provider_info["dependencies"], + dependencies=dependencies, versions=provider_info["versions"], - excluded_python_versions=provider_info.get("excluded-python-versions") or [], + excluded_python_versions=provider_info.get("excluded-python-versions", []), plugins=plugins, removed=provider_info["state"] == "removed", ) @@ -542,7 +628,7 @@ def get_min_airflow_version(provider_id: str) -> str: def get_python_requires(provider_id: str) -> str: - python_requires = "~=3.8" + python_requires = "~=3.9" provider_details = get_provider_details(provider_id=provider_id) for p in provider_details.excluded_python_versions: python_requires += f", !={p}" @@ -582,6 +668,27 @@ def get_cross_provider_dependent_packages(provider_package_id: str) -> list[str] return PROVIDER_DEPENDENCIES[provider_package_id]["cross-providers-deps"] +def format_version_suffix(version_suffix: str) -> str: + """ + Formats the version suffix by adding a dot prefix unless it is a local prefix. If no version suffix is + passed in, an empty string is returned. + + Args: + version_suffix (str): The version suffix to be formatted. + + Returns: + str: The formatted version suffix. + + """ + if version_suffix: + if version_suffix[0] == "." 
or version_suffix[0] == "+": + return version_suffix + else: + return f".{version_suffix}" + else: + return "" + + def get_provider_jinja_context( provider_id: str, current_release_version: str, @@ -601,7 +708,7 @@ def get_provider_jinja_context( "FULL_PACKAGE_NAME": provider_details.full_package_name, "RELEASE": current_release_version, "RELEASE_NO_LEADING_ZEROS": release_version_no_leading_zeros, - "VERSION_SUFFIX": f".{version_suffix}" if version_suffix else "", + "VERSION_SUFFIX": format_version_suffix(version_suffix), "PIP_REQUIREMENTS": get_provider_requirements(provider_details.provider_id), "PROVIDER_DESCRIPTION": provider_details.provider_description, "INSTALL_REQUIREMENTS": get_install_requirements( @@ -611,7 +718,7 @@ def get_provider_jinja_context( provider_id=provider_details.provider_id, version_suffix=version_suffix ), "CHANGELOG_RELATIVE_PATH": os.path.relpath( - provider_details.source_provider_package_path, + provider_details.root_provider_path, provider_details.documentation_provider_package_path, ), "CHANGELOG": changelog, @@ -746,7 +853,7 @@ def tag_exists_for_provider(provider_id: str, current_tag: str) -> bool: provider_details = get_provider_details(provider_id) result = run_command( ["git", "rev-parse", current_tag], - cwd=provider_details.source_provider_package_path, + cwd=provider_details.root_provider_path, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL, check=False, diff --git a/dev/breeze/src/airflow_breeze/utils/parallel.py b/dev/breeze/src/airflow_breeze/utils/parallel.py index cf3e1e6d3f2a4..8da62198ef414 100644 --- a/dev/breeze/src/airflow_breeze/utils/parallel.py +++ b/dev/breeze/src/airflow_breeze/utils/parallel.py @@ -23,13 +23,14 @@ import textwrap import time from abc import ABCMeta, abstractmethod +from collections.abc import Generator from contextlib import contextmanager from enum import Enum from multiprocessing.pool import ApplyResult, Pool from pathlib import Path from tempfile import NamedTemporaryFile from threading import Thread -from typing import Any, Generator, NamedTuple +from typing import Any, NamedTuple from rich.table import Table diff --git a/dev/breeze/src/airflow_breeze/utils/path_utils.py b/dev/breeze/src/airflow_breeze/utils/path_utils.py index b86cb837cbe46..a2b9d1d75e096 100644 --- a/dev/breeze/src/airflow_breeze/utils/path_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/path_utils.py @@ -27,11 +27,11 @@ import subprocess import sys import tempfile -from functools import lru_cache from pathlib import Path from airflow_breeze import NAME from airflow_breeze.utils.console import get_console +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.reinstall import reinstall_breeze, warn_dependencies_changed, warn_non_editable from airflow_breeze.utils.shared_options import get_verbose, set_forced_answer @@ -96,8 +96,12 @@ def get_package_setup_metadata_hash() -> str: from importlib_metadata import distribution # type: ignore[no-redef, assignment] prefix = "Package config hash: " - - for line in distribution("apache-airflow-breeze").metadata.as_string().splitlines(keepends=False): + metadata = distribution("apache-airflow-breeze").metadata + try: + description = metadata.json["description"] # type: ignore[attr-defined] + except AttributeError: + description = metadata.as_string() + for line in description.splitlines(keepends=False): if line.startswith(prefix): return line[len(prefix) :] return "NOT FOUND" @@ -222,7 +226,7 @@ def get_used_airflow_sources() -> Path: return current_sources 
-@lru_cache(maxsize=None) +@clearable_cache def find_airflow_sources_root_to_operate_on() -> Path: """ Find the root of airflow sources we operate on. Handle the case when Breeze is installed via @@ -281,9 +285,9 @@ def find_airflow_sources_root_to_operate_on() -> Path: AIRFLOW_SOURCES_ROOT = find_airflow_sources_root_to_operate_on().resolve() AIRFLOW_WWW_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "www" -TESTS_PROVIDERS_ROOT = AIRFLOW_SOURCES_ROOT / "tests" / "providers" -SYSTEM_TESTS_PROVIDERS_ROOT = AIRFLOW_SOURCES_ROOT / "tests" / "system" / "providers" -AIRFLOW_PROVIDERS_ROOT = AIRFLOW_SOURCES_ROOT / "airflow" / "providers" +AIRFLOW_UI_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "ui" +AIRFLOW_ORIGINAL_PROVIDERS_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "providers" +AIRFLOW_PROVIDERS_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "providers" DOCS_ROOT = AIRFLOW_SOURCES_ROOT / "docs" BUILD_CACHE_DIR = AIRFLOW_SOURCES_ROOT / ".build" GENERATED_DIR = AIRFLOW_SOURCES_ROOT / "generated" @@ -291,6 +295,7 @@ def find_airflow_sources_root_to_operate_on() -> Path: PROVIDER_DEPENDENCIES_JSON_FILE_PATH = GENERATED_DIR / "provider_dependencies.json" PROVIDER_METADATA_JSON_FILE_PATH = GENERATED_DIR / "provider_metadata.json" WWW_CACHE_DIR = BUILD_CACHE_DIR / "www" +UI_CACHE_DIR = BUILD_CACHE_DIR / "ui" AIRFLOW_TMP_DIR_PATH = AIRFLOW_SOURCES_ROOT / "tmp" WWW_ASSET_COMPILE_LOCK = WWW_CACHE_DIR / ".asset_compile.lock" WWW_ASSET_OUT_FILE = WWW_CACHE_DIR / "asset_compile.out" @@ -298,6 +303,12 @@ def find_airflow_sources_root_to_operate_on() -> Path: WWW_ASSET_HASH_FILE = AIRFLOW_SOURCES_ROOT / ".build" / "www" / "hash.txt" WWW_NODE_MODULES_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "www" / "node_modules" WWW_STATIC_DIST_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "www" / "static" / "dist" +UI_ASSET_COMPILE_LOCK = UI_CACHE_DIR / ".asset_compile.lock" +UI_ASSET_OUT_FILE = UI_CACHE_DIR / "asset_compile.out" +UI_ASSET_OUT_DEV_MODE_FILE = UI_CACHE_DIR / "asset_compile_dev_mode.out" +UI_ASSET_HASH_FILE = AIRFLOW_SOURCES_ROOT / ".build" / "ui" / "hash.txt" +UI_NODE_MODULES_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "ui" / "node_modules" +UI_DIST_DIR = AIRFLOW_SOURCES_ROOT / "airflow" / "ui" / "dist" DAGS_DIR = AIRFLOW_SOURCES_ROOT / "dags" FILES_DIR = AIRFLOW_SOURCES_ROOT / "files" FILES_SBOM_DIR = FILES_DIR / "sbom" diff --git a/dev/breeze/src/airflow_breeze/utils/platforms.py b/dev/breeze/src/airflow_breeze/utils/platforms.py index f32ccd20a6426..02cbdfa576c81 100644 --- a/dev/breeze/src/airflow_breeze/utils/platforms.py +++ b/dev/breeze/src/airflow_breeze/utils/platforms.py @@ -21,12 +21,12 @@ from pathlib import Path -def get_real_platform(single_platform: str) -> str: +def get_normalized_platform(single_platform: str) -> str: """ Replace different platform variants of the platform provided platforms with the two canonical ones we - are using: amd64 and arm64. + are using: linux/amd64 and linux/arm64. 
""" - return single_platform.replace("x86_64", "amd64").replace("aarch64", "arm64").replace("/", "-") + return single_platform.replace("x86_64", "amd64").replace("aarch64", "arm64") def _exists_no_permission_error(p: str) -> bool: diff --git a/dev/breeze/src/airflow_breeze/utils/projects_google_spreadsheet.py b/dev/breeze/src/airflow_breeze/utils/projects_google_spreadsheet.py new file mode 100644 index 0000000000000..1a7992e893de8 --- /dev/null +++ b/dev/breeze/src/airflow_breeze/utils/projects_google_spreadsheet.py @@ -0,0 +1,252 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +from __future__ import annotations + +import string +from enum import Enum, auto +from pathlib import Path +from typing import TYPE_CHECKING, Any + +if TYPE_CHECKING: + from googleapiclient.discovery import Resource + +from airflow_breeze.utils.console import get_console + +INTERESTING_OPSF_FIELDS = [ + "Score", + "Code-Review", + "Maintained", + "Dangerous-Workflow", + "Security-Policy", + "Packaging", + "Vulnerabilities", +] + +INTERESTING_OPSF_SCORES = ["OPSF-" + field for field in INTERESTING_OPSF_FIELDS] +INTERESTING_OPSF_DETAILS = ["OPSF-Details-" + field for field in INTERESTING_OPSF_FIELDS] + + +class MetadataFromSpreadsheet(Enum): + KNOWN_REPUTABLE_FOUNDATIONS = auto() + KNOWN_STRONG_COMMUNITIES = auto() + KNOWN_COMPANIES = auto() + KNOWN_STABLE_PROJECTS = auto() + KNOWN_LOW_IMPORTANCE_PROJECTS = auto() + KNOWN_MEDIUM_IMPORTANCE_PROJECTS = auto() + KNOWN_HIGH_IMPORTANCE_PROJECTS = auto() + RELATIONSHIP_PROJECTS = auto() + CONTACTED_PROJECTS = auto() + + +metadata_from_spreadsheet: dict[MetadataFromSpreadsheet, list[str]] = {} + + +def get_project_metadata(metadata_type: MetadataFromSpreadsheet) -> list[str]: + return metadata_from_spreadsheet[metadata_type] + + +# This is a spreadsheet where we store metadata about projects that we want to use in our analysis +METADATA_SPREADSHEET_ID = "1Hg6_B_irfnqNltnu1OUmt7Ph-K6x-DTWF7GZ5t-G0iI" +# This is the named range where we keep metadata +METADATA_RANGE_NAME = "SpreadsheetMetadata" + + +def read_metadata_from_google_spreadsheet(sheets: Resource): + get_console().print( + "[info]Reading metadata from Google Spreadsheet: " + f"https://docs.google.com/spreadsheets/d/{METADATA_SPREADSHEET_ID}" + ) + range = sheets.values().get(spreadsheetId=METADATA_SPREADSHEET_ID, range=METADATA_RANGE_NAME).execute() + metadata_types: list[MetadataFromSpreadsheet] = [] + for metadata_field in range["values"][0]: + metadata_types.append(MetadataFromSpreadsheet[metadata_field]) + metadata_from_spreadsheet[MetadataFromSpreadsheet[metadata_field]] = [] + for row in range["values"][1:]: + for index, value in enumerate(row): + value = value.strip() + if value: + metadata_from_spreadsheet[metadata_types[index]].append(value) + 
get_console().print("[success]Metadata read from Google Spreadsheet.") + + +def authorize_google_spreadsheets(json_credentials_file: Path, token_path: Path) -> Resource: + from google.auth.transport.requests import Request + from google.oauth2.credentials import Credentials + from google_auth_oauthlib.flow import InstalledAppFlow + from googleapiclient.discovery import build + + SCOPES = ["https://www.googleapis.com/auth/spreadsheets"] + creds = None + if token_path.exists(): + creds = Credentials.from_authorized_user_file(token_path.as_posix(), SCOPES) + if not creds or not creds.valid: + if creds and creds.expired and creds.refresh_token: + creds.refresh(Request()) + else: + flow = InstalledAppFlow.from_client_secrets_file(json_credentials_file.as_posix(), SCOPES) + creds = flow.run_local_server(port=0) + # Save the credentials for the next run + token_path.write_text(creds.to_json()) + service = build("sheets", "v4", credentials=creds) + sheets = service.spreadsheets() + return sheets + + +def get_sheets(json_credentials_file: Path) -> Resource: + token_path = Path.home() / ".config" / "gsheet" / "token.json" + sheets = authorize_google_spreadsheets(json_credentials_file, token_path) + return sheets + + +def write_sbom_information_to_google_spreadsheet( + sheets: Resource, + docs: dict[str, str], + google_spreadsheet_id: str, + all_dependencies: list[dict[str, Any]], + fieldnames: list[str], + include_opsf_scorecard: bool = False, +): + # Use only interesting values from the scorecard + cell_field_names = [ + fieldname + for fieldname in fieldnames + if fieldname in INTERESTING_OPSF_SCORES or not fieldname.startswith("OPSF-") + ] + + num_rows = update_field_values(all_dependencies, cell_field_names, google_spreadsheet_id, sheets) + if include_opsf_scorecard: + get_console().print("[info]Updating OPSF detailed comments.") + update_opsf_detailed_comments( + all_dependencies, fieldnames, num_rows, google_spreadsheet_id, docs, sheets + ) + + +def update_opsf_detailed_comments( + all_dependencies: list[dict[str, Any]], + fieldnames: list[str], + num_rows: int, + google_spreadsheet_id: str, + docs: dict[str, str], + sheets: Resource, +): + opsf_details_field_names = [ + fieldname for fieldname in fieldnames if fieldname in INTERESTING_OPSF_DETAILS + ] + start_opsf_column = fieldnames.index(opsf_details_field_names[0]) - 1 + opsf_details = [] + opsf_details.append( + { + "values": [ + {"note": docs[check]} + for check in INTERESTING_OPSF_FIELDS + if check != INTERESTING_OPSF_FIELDS[0] + ] + } + ) + get_console().print("[info]Adding notes to all cells.") + for dependency in all_dependencies: + note_row = convert_sbom_dict_to_spreadsheet_data(opsf_details_field_names, dependency) + opsf_details.append({"values": [{"note": note} for note in note_row]}) + notes = { + "updateCells": { + "range": { + "startRowIndex": 1, + "endRowIndex": num_rows + 1, + "startColumnIndex": start_opsf_column, + "endColumnIndex": start_opsf_column + len(opsf_details_field_names) + 1, + }, + "rows": opsf_details, + "fields": "note", + }, + } + update_note_body = {"requests": [notes]} + get_console().print("[info]Updating notes in google spreadsheet.") + sheets.batchUpdate(spreadsheetId=google_spreadsheet_id, body=update_note_body).execute() + + +def calculate_range(num_columns: int, row: int) -> str: + # Generate column letters + columns = list(string.ascii_uppercase) + if num_columns > 26: + columns += [f"{a}{b}" for a in string.ascii_uppercase for b in string.ascii_uppercase] + + # Calculate the range + end_column = 
columns[num_columns - 1] + return f"A{row}:{end_column}{row}" + + +def convert_sbom_dict_to_spreadsheet_data(headers: list[str], value_dict: dict[str, Any]): + return [value_dict.get(header, "") for header in headers] + + +def update_field_values( + all_dependencies: list[dict[str, Any]], + cell_field_names: list[str], + google_spreadsheet_id: str, + sheets: Resource, +) -> int: + get_console().print(f"[info]Updating {len(all_dependencies)} dependencies in the Google spreadsheet.") + num_fields = len(cell_field_names) + data = [] + top_header = [] + top_opsf_header_added = False + top_actions_header_added = False + possible_action_fields = [field[1] for field in ACTIONS.values()] + for field in cell_field_names: + if field.startswith("OPSF-") and not top_opsf_header_added: + top_header.append("Relevant OPSF Scores and details") + top_opsf_header_added = True + elif field in possible_action_fields and not top_actions_header_added: + top_header.append("Recommended actions") + top_actions_header_added = True + else: + top_header.append("") + + simplified_cell_field_names = [simplify_field_names(field) for field in cell_field_names] + get_console().print("[info]Adding top header.") + data.append({"range": calculate_range(num_fields, 1), "values": [top_header]}) + get_console().print("[info]Adding second header.") + data.append({"range": calculate_range(num_fields, 2), "values": [simplified_cell_field_names]}) + row = 3 + get_console().print("[info]Adding all rows.") + for dependency in all_dependencies: + spreadsheet_row = convert_sbom_dict_to_spreadsheet_data(cell_field_names, dependency) + data.append({"range": calculate_range(num_fields, row), "values": [spreadsheet_row]}) + row += 1 + get_console().print("[info]Writing data.") + body = {"valueInputOption": "RAW", "data": data} + result = sheets.values().batchUpdate(spreadsheetId=google_spreadsheet_id, body=body).execute() + get_console().print( + f"[info]Updated {result.get('totalUpdatedCells')} cells values in the Google spreadsheet." 
+ ) + return row + + +def simplify_field_names(fieldname: str): + if fieldname.startswith("OPSF-"): + return fieldname[5:] + return fieldname + + +ACTIONS: dict[str, tuple[int, str]] = { + "Security-Policy": (9, "Add Security Policy to the repository"), + "Vulnerabilities": (10, "Follow up with vulnerabilities"), + "Packaging": (10, "Propose Trusted Publishing"), + "Dangerous-Workflow": (10, "Follow up with dangerous workflow"), + "Code-Review": (7, "Propose mandatory code review"), +} diff --git a/dev/breeze/src/airflow_breeze/utils/provider_dependencies.py b/dev/breeze/src/airflow_breeze/utils/provider_dependencies.py index cad78f1e6d2d9..4ccc233ceba8d 100644 --- a/dev/breeze/src/airflow_breeze/utils/provider_dependencies.py +++ b/dev/breeze/src/airflow_breeze/utils/provider_dependencies.py @@ -19,11 +19,10 @@ import json -import yaml - from airflow_breeze.utils.console import get_console from airflow_breeze.utils.github import get_tag_date -from airflow_breeze.utils.path_utils import AIRFLOW_PROVIDERS_ROOT, PROVIDER_DEPENDENCIES_JSON_FILE_PATH +from airflow_breeze.utils.packages import get_provider_info_dict +from airflow_breeze.utils.path_utils import PROVIDER_DEPENDENCIES_JSON_FILE_PATH DEPENDENCIES = json.loads(PROVIDER_DEPENDENCIES_JSON_FILE_PATH.read_text()) @@ -66,9 +65,7 @@ def generate_providers_metadata_for_package( airflow_release_dates: dict[str, str], ) -> dict[str, dict[str, str]]: get_console().print(f"[info]Generating metadata for {provider_id}") - provider_yaml_dict = yaml.safe_load( - (AIRFLOW_PROVIDERS_ROOT.joinpath(*provider_id.split(".")) / "provider.yaml").read_text() - ) + provider_yaml_dict = get_provider_info_dict(provider_id) provider_metadata: dict[str, dict[str, str]] = {} last_airflow_version = START_AIRFLOW_VERSION_FROM package_name = "apache-airflow-providers-" + provider_id.replace(".", "-") diff --git a/dev/breeze/src/airflow_breeze/utils/publish_docs_helpers.py b/dev/breeze/src/airflow_breeze/utils/publish_docs_helpers.py index 8c5d63748cb74..b390ab93c440e 100644 --- a/dev/breeze/src/airflow_breeze/utils/publish_docs_helpers.py +++ b/dev/breeze/src/airflow_breeze/utils/publish_docs_helpers.py @@ -17,41 +17,23 @@ from __future__ import annotations -import json import os -from glob import glob from pathlib import Path -from typing import Any -CONSOLE_WIDTH = 180 - -ROOT_DIR = Path(__file__).parents[5].resolve() -PROVIDER_DATA_SCHEMA_PATH = ROOT_DIR / "airflow" / "provider.yaml.schema.json" +from airflow_breeze.utils.path_utils import ( + AIRFLOW_SOURCES_ROOT, +) +CONSOLE_WIDTH = 180 -def _load_schema() -> dict[str, Any]: - with open(PROVIDER_DATA_SCHEMA_PATH) as schema_file: - content = json.load(schema_file) - return content +# TODO(potiuk): remove it when we move all providers to the new structure +OLD_PROVIDER_DATA_SCHEMA_PATH = AIRFLOW_SOURCES_ROOT / "airflow" / "provider.yaml.schema.json" +NEW_PROVIDER_DATA_SCHEMA_PATH = AIRFLOW_SOURCES_ROOT / "airflow" / "new_provider.yaml.schema.json" def _filepath_to_module(filepath: str): - return str(Path(filepath).relative_to(ROOT_DIR)).replace("/", ".") - - -def _filepath_to_system_tests(filepath: str): - return str( - ROOT_DIR - / "tests" - / "system" - / "providers" - / Path(filepath).relative_to(ROOT_DIR / "airflow" / "providers") - ) - - -def get_provider_yaml_paths(): - """Returns list of provider.yaml files""" - return sorted(glob(f"{ROOT_DIR}/airflow/providers/**/provider.yaml", recursive=True)) + # TODO: handle relative to providers project + return 
str(Path(filepath).relative_to(AIRFLOW_SOURCES_ROOT)).replace("/", ".") def pretty_format_path(path: str, start: str) -> str: diff --git a/dev/breeze/src/airflow_breeze/utils/python_versions.py b/dev/breeze/src/airflow_breeze/utils/python_versions.py index d84c4f932ba8d..3ac3f8be30ff3 100644 --- a/dev/breeze/src/airflow_breeze/utils/python_versions.py +++ b/dev/breeze/src/airflow_breeze/utils/python_versions.py @@ -43,16 +43,3 @@ def get_python_version_list(python_versions: str) -> list[str]: ) sys.exit(1) return python_version_list - - -def check_python_version(): - if not sys.version_info >= (3, 9): - get_console().print("[error]At least Python 3.9 is required to prepare reproducible archives.\n") - get_console().print( - "[warning]Please reinstall Breeze using Python 3.9 - 3.12 environment.[/]\n\n" - "If you are using uv:\n\n" - " uv tool install --force --reinstall --python 3.9 -e ./dev/breeze\n\n" - "If you are using pipx:\n\n" - " pipx install --python $(which python3.9) --force -e ./dev/breeze\n" - ) - sys.exit(1) diff --git a/dev/breeze/src/airflow_breeze/utils/run_tests.py b/dev/breeze/src/airflow_breeze/utils/run_tests.py index b34fa3b341020..8b991ff834abe 100644 --- a/dev/breeze/src/airflow_breeze/utils/run_tests.py +++ b/dev/breeze/src/airflow_breeze/utils/run_tests.py @@ -22,16 +22,39 @@ from itertools import chain from subprocess import DEVNULL -from airflow_breeze.global_constants import PIP_VERSION, UV_VERSION +from airflow_breeze.global_constants import ( + ALL_TEST_SUITES, + ALL_TEST_TYPE, + NONE_TEST_TYPE, + PIP_VERSION, + UV_VERSION, + GroupOfTests, + SelectiveCoreTestType, + all_helm_test_packages, +) from airflow_breeze.utils.console import Output, get_console from airflow_breeze.utils.packages import get_excluded_provider_folders, get_suspended_provider_folders -from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT +from airflow_breeze.utils.path_utils import ( + AIRFLOW_PROVIDERS_DIR, + AIRFLOW_SOURCES_ROOT, +) from airflow_breeze.utils.run_utils import run_command from airflow_breeze.utils.virtualenv_utils import create_temp_venv DOCKER_TESTS_ROOT = AIRFLOW_SOURCES_ROOT / "docker_tests" DOCKER_TESTS_REQUIREMENTS = DOCKER_TESTS_ROOT / "requirements.txt" +IGNORE_DB_INIT_FOR_TEST_GROUPS = [ + GroupOfTests.HELM, + GroupOfTests.PYTHON_API_CLIENT, + GroupOfTests.SYSTEM, +] + +IGNORE_WARNING_OUTPUT_FOR_TEST_GROUPS = [ + GroupOfTests.HELM, + GroupOfTests.PYTHON_API_CLIENT, +] + def verify_an_image( image_name: str, @@ -86,7 +109,9 @@ def run_docker_compose_tests( env["DOCKER_IMAGE"] = image_name if skip_docker_compose_deletion: env["SKIP_DOCKER_COMPOSE_DELETION"] = "true" - with create_temp_venv(pip_version=PIP_VERSION, requirements_file=DOCKER_TESTS_REQUIREMENTS) as py_exe: + with create_temp_venv( + pip_version=PIP_VERSION, uv_version=UV_VERSION, requirements_file=DOCKER_TESTS_REQUIREMENTS + ) as py_exe: command_result = run_command( [py_exe, "-m", "pytest", str(test_path), *pytest_args, *extra_pytest_args], env=env, @@ -100,59 +125,41 @@ def file_name_from_test_type(test_type: str): return re.sub("[,.]", "_", test_type_no_brackets)[:30] -def test_paths(test_type: str, backend: str, helm_test_package: str | None) -> tuple[str, str, str]: +def test_paths(test_type: str, backend: str) -> tuple[str, str, str]: file_friendly_test_type = file_name_from_test_type(test_type) - extra_package = f"-{helm_test_package}" if helm_test_package else "" random_suffix = os.urandom(4).hex() - result_log_file = 
f"/files/test_result-{file_friendly_test_type}{extra_package}-{backend}.xml" - warnings_file = f"/files/warnings-{file_friendly_test_type}{extra_package}-{backend}.txt" + result_log_file = f"/files/test_result-{file_friendly_test_type}-{backend}.xml" + warnings_file = f"/files/warnings-{file_friendly_test_type}-{backend}.txt" coverage_file = f"/files/coverage-{file_friendly_test_type}-{backend}-{random_suffix}.xml" return result_log_file, warnings_file, coverage_file -def get_suspended_provider_args() -> list[str]: - pytest_args = [] - suspended_folders = get_suspended_provider_folders() - for providers in suspended_folders: - pytest_args.extend( +def get_ignore_switches_for_provider(provider_folders: list[str]) -> list[str]: + args = [] + for providers in provider_folders: + args.extend( [ - "--ignore", - f"tests/providers/{providers}", - "--ignore", - f"tests/system/providers/{providers}", - "--ignore", - f"tests/integration/providers/{providers}", + f"--ignore=tests/providers/{providers}", + f"--ignore=tests/providers/system/{providers}", + f"--ignore=tests/providers/integration/{providers}", ] ) - return pytest_args + return args + + +def get_suspended_provider_args() -> list[str]: + suspended_folders = get_suspended_provider_folders() + return get_ignore_switches_for_provider(suspended_folders) def get_excluded_provider_args(python_version: str) -> list[str]: - pytest_args = [] excluded_folders = get_excluded_provider_folders(python_version) - for providers in excluded_folders: - pytest_args.extend( - [ - "--ignore", - f"tests/providers/{providers}", - "--ignore", - f"tests/system/providers/{providers}", - "--ignore", - f"tests/integration/providers/{providers}", - ] - ) - return pytest_args + return get_ignore_switches_for_provider(excluded_folders) -TEST_TYPE_MAP_TO_PYTEST_ARGS: dict[str, list[str]] = { +TEST_TYPE_CORE_MAP_TO_PYTEST_ARGS: dict[str, list[str]] = { "Always": ["tests/always"], - "API": ["tests/api", "tests/api_experimental", "tests/api_connexion", "tests/api_internal"], - "BranchPythonVenv": [ - "tests/operators/test_python.py::TestBranchPythonVirtualenvOperator", - ], - "BranchExternalPython": [ - "tests/operators/test_python.py::TestBranchExternalPythonOperator", - ], + "API": ["tests/api", "tests/api_connexion", "tests/api_experimental", "tests/api_internal"], "CLI": ["tests/cli"], "Core": [ "tests/core", @@ -162,145 +169,158 @@ def get_excluded_provider_args(python_version: str) -> list[str]: "tests/ti_deps", "tests/utils", ], - "ExternalPython": [ - "tests/operators/test_python.py::TestExternalPythonOperator", - ], "Integration": ["tests/integration"], - # Operators test type excludes Virtualenv/External tests - they have their own test types - "Operators": ["tests/operators", "--exclude-virtualenv-operator", "--exclude-external-python-operator"], - # this one is mysteriously failing dill serialization. 
It could be removed once - # https://github.com/pytest-dev/pytest/issues/10845 is fixed - "PlainAsserts": [ - "tests/operators/test_python.py::TestPythonVirtualenvOperator::test_airflow_context", - "--assert=plain", - ], - "Providers": ["tests/providers"], - "PythonVenv": [ - "tests/operators/test_python.py::TestPythonVirtualenvOperator", - ], + "Operators": ["tests/operators"], "Serialization": [ "tests/serialization", ], - "System": ["tests/system"], "WWW": [ "tests/www", ], + "OpenAPI": ["clients/python"], +} + +TEST_GROUP_TO_TEST_FOLDERS: dict[GroupOfTests, list[str]] = { + GroupOfTests.CORE: ["tests"], + # TODO(potiuk): remove me when we migrate all providers to new structure + GroupOfTests.PROVIDERS: ["tests/providers"], + GroupOfTests.HELM: ["helm_tests"], + GroupOfTests.INTEGRATION_CORE: ["tests/integration"], + GroupOfTests.INTEGRATION_PROVIDERS: ["tests/providers/integration"], + GroupOfTests.PYTHON_API_CLIENT: ["clients/python"], } -HELM_TESTS = "helm_tests" -INTEGRATION_TESTS = "tests/integration" -SYSTEM_TESTS = "tests/system" # Those directories are already ignored vu pyproject.toml. We want to exclude them here as well. NO_RECURSE_DIRS = [ "tests/_internals", "tests/dags_with_system_exit", - "tests/test_utils", "tests/dags_corrupted", "tests/dags", - "tests/system/providers/google/cloud/dataproc/resources", - "tests/system/providers/google/cloud/gcs/resources", + "providers/tests/system/google/cloud/dataproc/resources", + "providers/tests/system/google/cloud/gcs/resources", ] def find_all_other_tests() -> list[str]: - all_named_test_folders = list(chain.from_iterable(TEST_TYPE_MAP_TO_PYTEST_ARGS.values())) - all_named_test_folders.append(HELM_TESTS) - all_named_test_folders.append(INTEGRATION_TESTS) - all_named_test_folders.append(SYSTEM_TESTS) + all_named_test_folders = list(chain.from_iterable(TEST_TYPE_CORE_MAP_TO_PYTEST_ARGS.values())) + all_named_test_folders.extend(TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.PROVIDERS]) + all_named_test_folders.extend(TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.HELM]) + all_named_test_folders.extend(TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.INTEGRATION_CORE]) + all_named_test_folders.extend(TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.INTEGRATION_PROVIDERS]) + all_named_test_folders.append("tests/system") + all_named_test_folders.append("providers/tests/system") all_named_test_folders.extend(NO_RECURSE_DIRS) - all_curent_test_folders = [ + all_current_test_folders = [ str(path.relative_to(AIRFLOW_SOURCES_ROOT)) for path in AIRFLOW_SOURCES_ROOT.glob("tests/*") if path.is_dir() and path.name != "__pycache__" ] for named_test_folder in all_named_test_folders: - if named_test_folder in all_curent_test_folders: - all_curent_test_folders.remove(named_test_folder) - return sorted(all_curent_test_folders) + if named_test_folder in all_current_test_folders: + all_current_test_folders.remove(named_test_folder) + return sorted(all_current_test_folders) +PROVIDERS_PREFIX = "Providers" PROVIDERS_LIST_PREFIX = "Providers[" PROVIDERS_LIST_EXCLUDE_PREFIX = "Providers[-" -ALL_TEST_SUITES: dict[str, tuple[str, ...]] = { - "All": ("tests",), - "All-Long": ("tests", "-m", "long_running", "--include-long-running"), - "All-Quarantined": ("tests", "-m", "quarantined", "--include-quarantined"), - "All-Postgres": ("tests", "--backend", "postgres"), - "All-MySQL": ("tests", "--backend", "mysql"), -} - def convert_test_type_to_pytest_args( *, + test_group: GroupOfTests, test_type: str, - skip_provider_tests: bool, - python_version: str, - helm_test_package: str | None = None, 
) -> list[str]: if test_type == "None": return [] if test_type in ALL_TEST_SUITES: return [ + *TEST_GROUP_TO_TEST_FOLDERS[test_group], *ALL_TEST_SUITES[test_type], ] - if test_type == "Helm": - if helm_test_package and helm_test_package != "all": - return [f"helm_tests/{helm_test_package}"] - else: - return [HELM_TESTS] - if test_type == "Integration": - if skip_provider_tests: - return [ - "tests/integration/api_experimental", - "tests/integration/cli", - "tests/integration/executors", - "tests/integration/security", - ] + if test_group == GroupOfTests.SYSTEM and test_type != NONE_TEST_TYPE: + get_console().print(f"[error]Only {NONE_TEST_TYPE} should be allowed as test type[/]") + sys.exit(1) + if test_group == GroupOfTests.HELM: + if test_type not in all_helm_test_packages(): + get_console().print(f"[error]Unknown helm test type: {test_type}[/]") + sys.exit(1) + helm_folder = TEST_GROUP_TO_TEST_FOLDERS[test_group][0] + if test_type and test_type != ALL_TEST_TYPE: + return [f"{helm_folder}/{test_type}"] else: - return [INTEGRATION_TESTS] - if test_type == "System": - return [SYSTEM_TESTS] - if skip_provider_tests and test_type.startswith("Providers"): - return [] - if test_type.startswith(PROVIDERS_LIST_EXCLUDE_PREFIX): - excluded_provider_list = test_type[len(PROVIDERS_LIST_EXCLUDE_PREFIX) : -1].split(",") - providers_with_exclusions = TEST_TYPE_MAP_TO_PYTEST_ARGS["Providers"].copy() - for excluded_provider in excluded_provider_list: - providers_with_exclusions.append( - "--ignore=tests/providers/" + excluded_provider.replace(".", "/") - ) - return providers_with_exclusions - if test_type.startswith(PROVIDERS_LIST_PREFIX): - provider_list = test_type[len(PROVIDERS_LIST_PREFIX) : -1].split(",") - providers_to_test = [] - for provider in provider_list: - provider_path = "tests/providers/" + provider.replace(".", "/") - if (AIRFLOW_SOURCES_ROOT / provider_path).is_dir(): - providers_to_test.append(provider_path) - else: - get_console().print( - f"[error]Provider directory {provider_path} does not exist for {provider}. " - f"This is bad. 
Please add it (all providers should have a package in tests)" - ) - sys.exit(1) - return providers_to_test - if test_type == "Other": + return [helm_folder] + if test_type == SelectiveCoreTestType.OTHER.value and test_group == GroupOfTests.CORE: return find_all_other_tests() - test_dirs = TEST_TYPE_MAP_TO_PYTEST_ARGS.get(test_type) + if test_group in [ + GroupOfTests.INTEGRATION_CORE, + GroupOfTests.INTEGRATION_PROVIDERS, + ]: + if test_type != ALL_TEST_TYPE: + get_console().print(f"[error]Unknown test type for {test_group}: {test_type}[/]") + sys.exit(1) + if test_group == GroupOfTests.PROVIDERS: + if test_type.startswith(PROVIDERS_LIST_EXCLUDE_PREFIX): + excluded_provider_list = test_type[len(PROVIDERS_LIST_EXCLUDE_PREFIX) : -1].split(",") + providers_with_exclusions = TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.PROVIDERS].copy() + for excluded_provider in excluded_provider_list: + # TODO(potiuk): remove me when all providers are migrated + providers_with_exclusions.append( + "--ignore=tests/providers/" + excluded_provider.replace(".", "/") + ) + return providers_with_exclusions + if test_type.startswith(PROVIDERS_LIST_PREFIX): + provider_list = test_type[len(PROVIDERS_LIST_PREFIX) : -1].split(",") + providers_to_test = [] + for provider in provider_list: + # TODO(potiuk): remove me when all providers are migrated + provider_path = ( + (AIRFLOW_SOURCES_ROOT / "tests" / "providers") + .joinpath(provider.replace(".", "/")) + .relative_to(AIRFLOW_SOURCES_ROOT) + ) + if provider_path.is_dir(): + providers_to_test.append(provider_path.as_posix()) + else: + old_provider_path = provider_path + provider_path = ( + AIRFLOW_PROVIDERS_DIR.joinpath(provider.replace(".", "/")).relative_to( + AIRFLOW_SOURCES_ROOT + ) + / "tests" + ) + if provider_path.is_dir(): + providers_to_test.append(provider_path.as_posix()) + else: + get_console().print( + f"[error]Neither {old_provider_path} nor {provider_path} exist for {provider} " + "- which means that provider has no tests. This is bad idea. 
" + "Please add it (all providers should have a package in tests)" + ) + sys.exit(1) + return providers_to_test + if not test_type.startswith(PROVIDERS_PREFIX): + get_console().print(f"[error]Unknown test type for {GroupOfTests.PROVIDERS}: {test_type}[/]") + sys.exit(1) + return TEST_GROUP_TO_TEST_FOLDERS[test_group] + if test_group == GroupOfTests.PYTHON_API_CLIENT: + return TEST_GROUP_TO_TEST_FOLDERS[test_group] + if test_group != GroupOfTests.CORE: + get_console().print(f"[error]Only {GroupOfTests.CORE} should be allowed here[/]") + test_dirs = TEST_TYPE_CORE_MAP_TO_PYTEST_ARGS.get(test_type) if test_dirs: - return test_dirs + return test_dirs.copy() get_console().print(f"[error]Unknown test type: {test_type}[/]") sys.exit(1) def generate_args_for_pytest( *, + test_group: GroupOfTests, test_type: str, test_timeout: int, - skip_provider_tests: bool, skip_db_tests: bool, run_db_tests_only: bool, backend: str, @@ -310,38 +330,30 @@ def generate_args_for_pytest( parallelism: int, parallel_test_types_list: list[str], python_version: str, - helm_test_package: str | None, keep_env_variables: bool, no_db_cleanup: bool, - database_isolation: bool, ): - result_log_file, warnings_file, coverage_file = test_paths(test_type, backend, helm_test_package) - if skip_db_tests: - if parallel_test_types_list: - args = convert_parallel_types_to_folders( - parallel_test_types_list, skip_provider_tests, python_version=python_version - ) - else: - args = ["tests"] if test_type != "None" else [] + result_log_file, warnings_file, coverage_file = test_paths(test_type, backend) + if skip_db_tests and parallel_test_types_list: + args = convert_parallel_types_to_folders( + test_group=test_group, + parallel_test_types_list=parallel_test_types_list, + ) else: args = convert_test_type_to_pytest_args( + test_group=test_group, test_type=test_type, - skip_provider_tests=skip_provider_tests, - helm_test_package=helm_test_package, - python_version=python_version, ) - max_fail = 50 args.extend( [ "--verbosity=0", "--strict-markers", "--durations=100", - f"--maxfail={max_fail}", + "--maxfail=50", "--color=yes", f"--junitxml={result_log_file}", # timeouts in seconds for individual tests - "--timeouts-order", - "moi", + "--timeouts-order=moi", f"--setup-timeout={test_timeout}", f"--execution-timeout={test_timeout}", f"--teardown-timeout={test_timeout}", @@ -364,16 +376,22 @@ def generate_args_for_pytest( args.append("--skip-db-tests") if run_db_tests_only: args.append("--run-db-tests-only") - if test_type != "System": - args.append(f"--ignore={SYSTEM_TESTS}") - if test_type != "Integration": - args.append(f"--ignore={INTEGRATION_TESTS}") - if test_type != "Helm": - # do not produce warnings output for helm tests + if test_group not in [GroupOfTests.SYSTEM]: + args.append("--ignore-glob=tests/system/*") + if test_group != GroupOfTests.INTEGRATION_CORE: + for group_folder in TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.INTEGRATION_CORE]: + args.append(f"--ignore-glob={group_folder}/*") + if test_group != GroupOfTests.INTEGRATION_PROVIDERS: + for group_folder in TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.INTEGRATION_PROVIDERS]: + args.append(f"--ignore-glob={group_folder}/*") + if test_group not in IGNORE_WARNING_OUTPUT_FOR_TEST_GROUPS: args.append(f"--warning-output-path={warnings_file}") - args.append(f"--ignore={HELM_TESTS}") - if test_type not in ("Helm", "System"): + for group_folder in TEST_GROUP_TO_TEST_FOLDERS[GroupOfTests.HELM]: + args.append(f"--ignore={group_folder}") + if test_group not in IGNORE_DB_INIT_FOR_TEST_GROUPS: 
args.append("--with-db-init") + if test_group == GroupOfTests.PYTHON_API_CLIENT: + args.append("--ignore-glob=clients/python/tmp/*") args.extend(get_suspended_provider_args()) args.extend(get_excluded_provider_args(python_version)) if use_xdist: @@ -408,26 +426,26 @@ def generate_args_for_pytest( return args -def convert_parallel_types_to_folders( - parallel_test_types_list: list[str], skip_provider_tests: bool, python_version: str -): +def convert_parallel_types_to_folders(test_group: GroupOfTests, parallel_test_types_list: list[str]): args = [] for _test_type in parallel_test_types_list: args.extend( convert_test_type_to_pytest_args( + test_group=test_group, test_type=_test_type, - skip_provider_tests=skip_provider_tests, - helm_test_package=None, - python_version=python_version, ) ) + all_test_prefixes: list[str] = [] # leave only folders, strip --pytest-args that exclude some folders with `-' prefix - folders = [arg for arg in args if arg.startswith("test")] - # remove specific provider sub-folders if "tests/providers" is already in the list + for group_folders in TEST_GROUP_TO_TEST_FOLDERS.values(): + for group_folder in group_folders: + all_test_prefixes.append(group_folder) + folders = [arg for arg in args if any(arg.startswith(prefix) for prefix in all_test_prefixes)] + # remove specific provider sub-folders if "providers/tests" is already in the list # This workarounds pytest issues where it will only run tests from specific subfolders # if both parent and child folders are in the list # The issue in Pytest (changed behaviour in Pytest 8.2 is tracked here # https://github.com/pytest-dev/pytest/issues/12605 - if "tests/providers" in folders: - folders = [folder for folder in folders if not folder.startswith("tests/providers/")] + if "providers/tests" in folders: + folders = [folder for folder in folders if not folder.startswith("providers/tests/")] return folders diff --git a/dev/breeze/src/airflow_breeze/utils/run_utils.py b/dev/breeze/src/airflow_breeze/utils/run_utils.py index f98eedad937a3..ca6060295ecd7 100644 --- a/dev/breeze/src/airflow_breeze/utils/run_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/run_utils.py @@ -28,16 +28,23 @@ import stat import subprocess import sys -from functools import lru_cache +from collections.abc import Mapping from pathlib import Path -from typing import Mapping, Union +from typing import Union from rich.markup import escape from airflow_breeze.utils.ci_group import ci_group from airflow_breeze.utils.console import Output, get_console +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.path_utils import ( AIRFLOW_SOURCES_ROOT, + UI_ASSET_COMPILE_LOCK, + UI_ASSET_HASH_FILE, + UI_ASSET_OUT_DEV_MODE_FILE, + UI_ASSET_OUT_FILE, + UI_DIST_DIR, + UI_NODE_MODULES_DIR, WWW_ASSET_COMPILE_LOCK, WWW_ASSET_HASH_FILE, WWW_ASSET_OUT_DEV_MODE_FILE, @@ -60,7 +67,7 @@ def run_command( no_output_dump_on_exception: bool = False, env: Mapping[str, str] | None = None, cwd: Path | str | None = None, - input: str | None = None, + input: str | bytes | None = None, output: Output | None = None, output_outside_the_group: bool = False, verbose_override: bool | None = None, @@ -84,7 +91,7 @@ def run_command( :param no_output_dump_on_exception: whether to suppress printing logs from output when command fails :param env: mapping of environment variables to set for the run command :param cwd: working directory to set for the command - :param input: input string to pass to stdin of the process + :param input: input string to pass to 
stdin of the process (bytes if text=False, str otherwise) :param output: redirects stderr/stdout to Output if set to Output class. :param output_outside_the_group: if this is set to True, then output of the command will be done outside the "CI folded group" in CI - so that it is immediately visible without unfolding. @@ -211,48 +218,44 @@ def assert_pre_commit_installed(): python_executable = sys.executable get_console().print(f"[info]Checking pre-commit installed for {python_executable}[/]") - command_result = run_command( - ["pre-commit", "--version"], - capture_output=True, - text=True, - check=False, - ) - if command_result.returncode == 0: - if command_result.stdout: - pre_commit_version = command_result.stdout.split(" ")[1].strip() - if Version(pre_commit_version) >= Version(min_pre_commit_version): - get_console().print( - f"\n[success]Package pre_commit is installed. " - f"Good version {pre_commit_version} (>= {min_pre_commit_version})[/]\n" - ) + need_to_reinstall_precommit = False + try: + command_result = run_command( + ["pre-commit", "--version"], + capture_output=True, + text=True, + check=False, + ) + if command_result.returncode == 0: + if command_result.stdout: + pre_commit_version = command_result.stdout.split(" ")[1].strip() + if Version(pre_commit_version) >= Version(min_pre_commit_version): + get_console().print( + f"\n[success]Package pre_commit is installed. " + f"Good version {pre_commit_version} (>= {min_pre_commit_version})[/]\n" + ) + else: + get_console().print( + f"\n[error]Package pre_commit version is wrong. It should be " + f"at least {min_pre_commit_version} and is {pre_commit_version}.[/]\n\n" + ) + sys.exit(1) else: get_console().print( - f"\n[error]Package name pre_commit version is wrong. It should be" - f"aat least {min_pre_commit_version} and is {pre_commit_version}.[/]\n\n" - ) - sys.exit(1) - if "pre-commit-uv" not in command_result.stdout: - get_console().print( - "\n[warning]You can significantly improve speed of installing your pre-commit envs " - "by installing `pre-commit-uv` with it.[/]\n" - ) - get_console().print( - "\n[warning]With uv you can install it with:[/]\n\n" - " uv tool install pre-commit --with pre-commit-uv --force-reinstall\n" - ) - get_console().print( - "\n[warning]With pipx you can install it with:[/]\n\n" - " pipx inject\n" - " pipx inject pre-commit pre-commit-uv\n" + "\n[warning]Could not determine version of pre-commit. You might need to update it![/]\n" ) else: - get_console().print( - "\n[warning]Could not determine version of pre-commit. You might need to update it![/]\n" - ) - else: - get_console().print("\n[error]Error checking for pre-commit-installation:[/]\n") - get_console().print(command_result.stderr) - get_console().print("\nMake sure to run:\n breeze setup self-upgrade\n\n") + need_to_reinstall_precommit = True + get_console().print("\n[error]Error checking for pre-commit-installation:[/]\n") + get_console().print(command_result.stderr) + except FileNotFoundError as e: + need_to_reinstall_precommit = True + get_console().print(f"\n[error]Error checking for pre-commit-installation: [/]\n{e}\n") + if need_to_reinstall_precommit: + get_console().print("[info]Make sure to install pre-commit. 
For example by running:\n") + get_console().print(" uv tool install pre-commit\n") + get_console().print("Or if you prefer pipx:\n") + get_console().print(" pipx install pre-commit") sys.exit(1) @@ -370,7 +373,7 @@ def check_if_buildx_plugin_installed() -> bool: return False -@lru_cache(maxsize=None) +@clearable_cache def commit_sha(): """Returns commit SHA of current repo. Cached for various usages.""" command_result = run_command(["git", "rev-parse", "HEAD"], capture_output=True, text=True, check=False) @@ -390,7 +393,9 @@ def check_if_image_exists(image: str) -> bool: return cmd_result.returncode == 0 -def _run_compile_internally(command_to_execute: list[str], dev: bool) -> RunCommandResult: +def _run_compile_internally( + command_to_execute: list[str], dev: bool, compile_lock: Path, asset_out: Path +) -> RunCommandResult: from filelock import SoftFileLock, Timeout env = os.environ.copy() @@ -403,11 +408,11 @@ def _run_compile_internally(command_to_execute: list[str], dev: bool) -> RunComm env=env, ) else: - WWW_ASSET_COMPILE_LOCK.parent.mkdir(parents=True, exist_ok=True) - WWW_ASSET_COMPILE_LOCK.unlink(missing_ok=True) + compile_lock.parent.mkdir(parents=True, exist_ok=True) + compile_lock.unlink(missing_ok=True) try: - with SoftFileLock(WWW_ASSET_COMPILE_LOCK, timeout=5): - with open(WWW_ASSET_OUT_FILE, "w") as output_file: + with SoftFileLock(compile_lock, timeout=5): + with open(asset_out, "w") as output_file: result = run_command( command_to_execute, check=False, @@ -418,13 +423,13 @@ def _run_compile_internally(command_to_execute: list[str], dev: bool) -> RunComm stdout=output_file, ) if result.returncode == 0: - WWW_ASSET_OUT_FILE.unlink(missing_ok=True) + asset_out.unlink(missing_ok=True) return result except Timeout: get_console().print("[error]Another asset compilation is running. 
Exiting[/]\n") get_console().print("[warning]If you are sure there is no other compilation,[/]") get_console().print("[warning]Remove the lock file and re-run compilation:[/]") - get_console().print(WWW_ASSET_COMPILE_LOCK) + get_console().print(compile_lock) get_console().print() sys.exit(1) @@ -486,7 +491,58 @@ def run_compile_www_assets( if os.getpid() != os.getsid(0): # and create a new process group where we are the leader os.setpgid(0, 0) - _run_compile_internally(command_to_execute, dev) + _run_compile_internally(command_to_execute, dev, WWW_ASSET_COMPILE_LOCK, WWW_ASSET_OUT_FILE) + sys.exit(0) + else: + return _run_compile_internally(command_to_execute, dev, WWW_ASSET_COMPILE_LOCK, WWW_ASSET_OUT_FILE) + + +def clean_ui_assets(): + get_console().print("[info]Cleaning ui assets[/]") + UI_ASSET_HASH_FILE.unlink(missing_ok=True) + shutil.rmtree(UI_NODE_MODULES_DIR, ignore_errors=True) + shutil.rmtree(UI_DIST_DIR, ignore_errors=True) + get_console().print("[success]Cleaned ui assets[/]") + + +def run_compile_ui_assets( + dev: bool, + run_in_background: bool, + force_clean: bool, +): + if force_clean: + clean_ui_assets() + if dev: + get_console().print("\n[warning] The command below will run forever until you press Ctrl-C[/]\n") + get_console().print( + "\n[info]If you want to see output of the compilation command,\n" + "[info]cancel it, go to airflow/ui folder and run 'pnpm dev'.\n" + "[info]However, it requires you to have local pnpm installation.\n" + ) + command_to_execute = [ + "pre-commit", + "run", + "--hook-stage", + "manual", + "compile-ui-assets-dev" if dev else "compile-ui-assets", + "--all-files", + "--verbose", + ] + get_console().print( + "[info]The output of the asset compilation is stored in: [/]" + f"{UI_ASSET_OUT_DEV_MODE_FILE if dev else UI_ASSET_OUT_FILE}\n" + ) + if run_in_background: + pid = os.fork() + if pid: + # Parent process - send signal to process group of the child process + atexit.register(kill_process_group, pid) + else: + # Check if we are not a group leader already (We should not be) + if os.getpid() != os.getsid(0): + # and create a new process group where we are the leader + os.setpgid(0, 0) + _run_compile_internally(command_to_execute, dev, UI_ASSET_COMPILE_LOCK, UI_ASSET_OUT_FILE) sys.exit(0) else: - return _run_compile_internally(command_to_execute, dev) + return _run_compile_internally(command_to_execute, dev, UI_ASSET_COMPILE_LOCK, UI_ASSET_OUT_FILE) diff --git a/dev/breeze/src/airflow_breeze/utils/selective_checks.py b/dev/breeze/src/airflow_breeze/utils/selective_checks.py index f09c74579191f..8e0f84361ca38 100644 --- a/dev/breeze/src/airflow_breeze/utils/selective_checks.py +++ b/dev/breeze/src/airflow_breeze/utils/selective_checks.py @@ -21,8 +21,9 @@ import os import re import sys +from collections import defaultdict from enum import Enum -from functools import cached_property, lru_cache +from functools import cached_property from pathlib import Path from typing import Any, Dict, List, TypeVar @@ -30,7 +31,6 @@ from airflow_breeze.global_constants import ( ALL_PYTHON_MAJOR_MINOR_VERSIONS, APACHE_AIRFLOW_GITHUB_REPOSITORY, - BASE_PROVIDERS_COMPATIBILITY_CHECKS, CHICKEN_EGG_PROVIDERS, COMMITTERS, CURRENT_KUBERNETES_VERSIONS, @@ -44,26 +44,28 @@ DISABLE_TESTABLE_INTEGRATIONS_FROM_CI, HELM_VERSION, KIND_VERSION, + PROVIDERS_COMPATIBILITY_TESTS_MATRIX, RUNS_ON_PUBLIC_RUNNER, RUNS_ON_SELF_HOSTED_ASF_RUNNER, RUNS_ON_SELF_HOSTED_RUNNER, - TESTABLE_INTEGRATIONS, + TESTABLE_CORE_INTEGRATIONS, + TESTABLE_PROVIDERS_INTEGRATIONS, GithubEvents, - 
SelectiveUnitTestTypes, + SelectiveCoreTestType, + SelectiveProvidersTestType, all_helm_test_packages, - all_selective_test_types, - all_selective_test_types_except_providers, + all_selective_core_test_types, + providers_test_type, ) from airflow_breeze.utils.console import get_console from airflow_breeze.utils.exclude_from_matrix import excluded_combos +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.kubernetes_utils import get_kubernetes_python_combos from airflow_breeze.utils.packages import get_available_packages from airflow_breeze.utils.path_utils import ( - AIRFLOW_PROVIDERS_ROOT, + AIRFLOW_PROVIDERS_DIR, AIRFLOW_SOURCES_ROOT, DOCS_DIR, - SYSTEM_TESTS_PROVIDERS_ROOT, - TESTS_PROVIDERS_ROOT, ) from airflow_breeze.utils.provider_dependencies import DEPENDENCIES, get_related_providers from airflow_breeze.utils.run_utils import run_command @@ -77,25 +79,19 @@ FULL_TESTS_NEEDED_LABEL = "full tests needed" INCLUDE_SUCCESS_OUTPUTS_LABEL = "include success outputs" LATEST_VERSIONS_ONLY_LABEL = "latest versions only" +LEGACY_UI_LABEL = "legacy ui" +LEGACY_API_LABEL = "legacy api" NON_COMMITTER_BUILD_LABEL = "non committer build" UPGRADE_TO_NEWER_DEPENDENCIES_LABEL = "upgrade to newer dependencies" USE_PUBLIC_RUNNERS_LABEL = "use public runners" USE_SELF_HOSTED_RUNNERS_LABEL = "use self-hosted runners" +ALL_CI_SELECTIVE_TEST_TYPES = "API Always CLI Core Operators Other Serialization WWW" -ALL_CI_SELECTIVE_TEST_TYPES = ( - "API Always BranchExternalPython BranchPythonVenv " - "CLI Core ExternalPython Operators Other PlainAsserts " - "Providers[-amazon,google] Providers[amazon] Providers[google] " - "PythonVenv Serialization WWW" +ALL_PROVIDERS_SELECTIVE_TEST_TYPES = ( + "Providers[-amazon,google,standard] Providers[amazon] Providers[google] Providers[standard]" ) -ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS = ( - "API Always BranchExternalPython BranchPythonVenv CLI Core " - "ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW" -) -ALL_PROVIDERS_SELECTIVE_TEST_TYPES = "Providers[-amazon,google] Providers[amazon] Providers[google]" - class FileGroupForCi(Enum): ENVIRONMENT_FILES = "environment_files" @@ -104,10 +100,12 @@ class FileGroupForCi(Enum): ALWAYS_TESTS_FILES = "always_test_files" API_TEST_FILES = "api_test_files" API_CODEGEN_FILES = "api_codegen_files" + LEGACY_API_FILES = "legacy_api_files" HELM_FILES = "helm_files" DEPENDENCY_FILES = "dependency_files" DOC_FILES = "doc_files" - WWW_FILES = "www_files" + UI_FILES = "ui_files" + LEGACY_WWW_FILES = "legacy_www_files" SYSTEM_TEST_FILES = "system_tests" KUBERNETES_FILES = "kubernetes_files" ALL_PYTHON_FILES = "all_python_files" @@ -118,9 +116,16 @@ class FileGroupForCi(Enum): ALL_PROVIDER_YAML_FILES = "all_provider_yaml_files" ALL_DOCS_PYTHON_FILES = "all_docs_python_files" TESTS_UTILS_FILES = "test_utils_files" + ASSET_FILES = "asset_files" + + +class AllProvidersSentinel: + pass -T = TypeVar("T", FileGroupForCi, SelectiveUnitTestTypes) +ALL_PROVIDERS_SENTINEL = AllProvidersSentinel() + +T = TypeVar("T", FileGroupForCi, SelectiveCoreTestType) class HashableDict(Dict[T, List[str]]): @@ -149,6 +154,7 @@ def __hash__(self): FileGroupForCi.JAVASCRIPT_PRODUCTION_FILES: [ r"^airflow/.*\.[jt]sx?", r"^airflow/.*\.lock", + r"^airflow/ui/.*\.yaml$", ], FileGroupForCi.API_TEST_FILES: [ r"^airflow/api/", @@ -158,6 +164,9 @@ def __hash__(self): r"^airflow/api_connexion/openapi/v1\.yaml", r"^clients/gen", ], + FileGroupForCi.LEGACY_API_FILES: [ + r"^airflow/api_connexion/", + ], 
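+        # Helm chart and Kubernetes-related sources - used by selective checks to decide whether Helm chart tests need to run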
FileGroupForCi.HELM_FILES: [ r"^chart", r"^airflow/kubernetes", @@ -172,7 +181,6 @@ def __hash__(self): r"^\.github/SECURITY\.rst$", r"^airflow/.*\.py$", r"^chart", - r"^providers", r"^tests/system", r"^CHANGELOG\.txt", r"^airflow/config_templates/config\.yml", @@ -180,7 +188,8 @@ def __hash__(self): r"^chart/values\.schema\.json", r"^chart/values\.json", ], - FileGroupForCi.WWW_FILES: [ + FileGroupForCi.UI_FILES: [r"^airflow/ui/"], + FileGroupForCi.LEGACY_WWW_FILES: [ r"^airflow/www/.*\.ts[x]?$", r"^airflow/www/.*\.js[x]?$", r"^airflow/www/[^/]+\.json$", @@ -197,16 +206,14 @@ def __hash__(self): r".*\.py$", ], FileGroupForCi.ALL_AIRFLOW_PYTHON_FILES: [ - r".*\.py$", + r"airflow/.*\.py$", + r"tests/.*\.py$", ], FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES: [ r"^airflow/providers/.*\.py$", r"^tests/providers/.*\.py$", - r"^tests/system/providers/.*\.py$", - ], - FileGroupForCi.ALL_DOCS_PYTHON_FILES: [ - r"^docs/.*\.py$", ], + FileGroupForCi.ALL_DOCS_PYTHON_FILES: [r"^docs/.*\.py$", r"^providers/.*/docs/.*\.py"], FileGroupForCi.ALL_DEV_PYTHON_FILES: [ r"^dev/.*\.py$", ], @@ -215,6 +222,7 @@ def __hash__(self): r"^airflow", r"^chart", r"^tests", + r"^tests_common", r"^kubernetes_tests", ], FileGroupForCi.SYSTEM_TEST_FILES: [ @@ -228,6 +236,12 @@ def __hash__(self): ], FileGroupForCi.TESTS_UTILS_FILES: [ r"^tests/utils/", + r"^tests_common/.*\.py$", + ], + FileGroupForCi.ASSET_FILES: [ + r"^airflow/assets/", + r"^airflow/models/assets/", + r"^airflow/datasets/", ], } ) @@ -240,8 +254,8 @@ def __hash__(self): r"^airflow/providers/.*", r"^dev/.*", r"^docs/.*", - r"^provider_packages/.*", r"^tests/providers/.*", + r"^tests/integration/providers/.*", r"^tests/system/providers/.*", r"^tests/dags/test_imports.py", ] @@ -249,42 +263,39 @@ def __hash__(self): ) PYTHON_OPERATOR_FILES = [ - r"^airflow/operators/python.py", - r"^tests/operators/test_python.py", + r"^airflow/providers/standard/operators/python.py", + r"^tests/providers/standard/operators/test_python.py", ] TEST_TYPE_MATCHES = HashableDict( { - SelectiveUnitTestTypes.API: [ + SelectiveCoreTestType.API: [ r"^airflow/api/", r"^airflow/api_connexion/", + r"^airflow/api_experimental/", r"^airflow/api_internal/", r"^tests/api/", r"^tests/api_connexion/", + r"^tests/api_experimental/", r"^tests/api_internal/", ], - SelectiveUnitTestTypes.CLI: [ + SelectiveCoreTestType.CLI: [ r"^airflow/cli/", r"^tests/cli/", ], - SelectiveUnitTestTypes.OPERATORS: [ + SelectiveCoreTestType.OPERATORS: [ r"^airflow/operators/", r"^tests/operators/", ], - SelectiveUnitTestTypes.PROVIDERS: [ + SelectiveProvidersTestType.PROVIDERS: [ r"^airflow/providers/", - r"^tests/system/providers/", - r"^tests/providers/", + r"^tests/providers", ], - SelectiveUnitTestTypes.SERIALIZATION: [ + SelectiveCoreTestType.SERIALIZATION: [ r"^airflow/serialization/", r"^tests/serialization/", ], - SelectiveUnitTestTypes.PYTHON_VENV: PYTHON_OPERATOR_FILES, - SelectiveUnitTestTypes.BRANCH_PYTHON_VENV: PYTHON_OPERATOR_FILES, - SelectiveUnitTestTypes.EXTERNAL_PYTHON: PYTHON_OPERATOR_FILES, - SelectiveUnitTestTypes.EXTERNAL_BRANCH_PYTHON: PYTHON_OPERATOR_FILES, - SelectiveUnitTestTypes.WWW: [r"^airflow/www", r"^tests/www"], + SelectiveCoreTestType.WWW: [r"^airflow/www", r"^tests/www"], } ) @@ -293,30 +304,39 @@ def __hash__(self): def find_provider_affected(changed_file: str, include_docs: bool) -> str | None: file_path = AIRFLOW_SOURCES_ROOT / changed_file - # is_relative_to is only available in Python 3.9 - we should simplify this check when we are Python 3.9+ - for provider_root in 
(TESTS_PROVIDERS_ROOT, SYSTEM_TESTS_PROVIDERS_ROOT, AIRFLOW_PROVIDERS_ROOT): - try: - file_path.relative_to(provider_root) - relative_base_path = provider_root + # Check providers in SRC/SYSTEM_TESTS/TESTS/(optionally) DOCS + # TODO(potiuk) - this should be removed once we have all providers in the new structure (OLD + docs) + for provider_root in (AIRFLOW_PROVIDERS_DIR,): + if file_path.is_relative_to(provider_root): + provider_base_path = provider_root break - except ValueError: - pass else: - if include_docs: - try: - relative_path = file_path.relative_to(DOCS_DIR) - if relative_path.parts[0].startswith("apache-airflow-providers-"): - return relative_path.parts[0].replace("apache-airflow-providers-", "").replace("-", ".") - except ValueError: - pass + if include_docs and file_path.is_relative_to(DOCS_DIR): + relative_path = file_path.relative_to(DOCS_DIR) + if relative_path.parts[0].startswith("apache-airflow-providers-"): + return relative_path.parts[0].replace("apache-airflow-providers-", "").replace("-", ".") + # This is neither providers nor provider docs files - not a provider change return None + if not include_docs: + for parent_dir_path in file_path.parents: + if parent_dir_path.name == "docs" and (parent_dir_path.parent / "provider.yaml").exists(): + # Skip Docs changes if include_docs is not set + return None + + # Find if the path under src/system tests/tests belongs to provider or is a common code across + # multiple providers for parent_dir_path in file_path.parents: - if parent_dir_path == relative_base_path: + if parent_dir_path == provider_base_path: + # We have not found any provider specific path up to the root of the provider base folder break - relative_path = parent_dir_path.relative_to(relative_base_path) - if (AIRFLOW_PROVIDERS_ROOT / relative_path / "provider.yaml").exists(): - return str(parent_dir_path.relative_to(relative_base_path)).replace(os.sep, ".") + relative_path = parent_dir_path.relative_to(provider_base_path) + # check if this path belongs to a specific provider + # TODO(potiuk) - this should be removed once we have all providers in the new structure + if (parent_dir_path / "provider.yaml").exists(): + # new providers structure + return str(relative_path).replace(os.sep, ".") + # If we got here it means that some "common" files were modified. 
so we need to test all Providers return "Providers" @@ -334,7 +354,7 @@ def _exclude_files_with_regexps(files: tuple[str, ...], matched_files, exclude_r matched_files.remove(file) -@lru_cache(maxsize=None) +@clearable_cache def _matching_files( files: tuple[str, ...], match_group: FileGroupForCi, match_dict: HashableDict, exclude_dict: HashableDict ) -> list[str]: @@ -585,13 +605,17 @@ def kubernetes_versions_list_as_string(self) -> str: return " ".join(self.kubernetes_versions) @cached_property - def kubernetes_combos_list_as_string(self) -> str: + def kubernetes_combos(self) -> list[str]: python_version_array: list[str] = self.python_versions_list_as_string.split(" ") kubernetes_version_array: list[str] = self.kubernetes_versions_list_as_string.split(" ") combo_titles, short_combo_titles, combos = get_kubernetes_python_combos( kubernetes_version_array, python_version_array ) - return " ".join(short_combo_titles) + return short_combo_titles + + @cached_property + def kubernetes_combos_list_as_string(self) -> str: + return " ".join(self.kubernetes_combos) def _matching_files( self, match_group: FileGroupForCi, match_dict: HashableDict, exclude_dict: HashableDict @@ -663,21 +687,29 @@ def needs_javascript_scans(self) -> bool: def needs_api_tests(self) -> bool: return self._should_be_run(FileGroupForCi.API_TEST_FILES) + @cached_property + def needs_ol_tests(self) -> bool: + return self._should_be_run(FileGroupForCi.ASSET_FILES) + @cached_property def needs_api_codegen(self) -> bool: return self._should_be_run(FileGroupForCi.API_CODEGEN_FILES) + @cached_property + def run_ui_tests(self) -> bool: + return self._should_be_run(FileGroupForCi.UI_FILES) + @cached_property def run_www_tests(self) -> bool: - return self._should_be_run(FileGroupForCi.WWW_FILES) + return self._should_be_run(FileGroupForCi.LEGACY_WWW_FILES) @cached_property def run_amazon_tests(self) -> bool: - if self.parallel_test_types_list_as_string is None: + if self.providers_test_types_list_as_string is None: return False return ( - "amazon" in self.parallel_test_types_list_as_string - or "Providers" in self.parallel_test_types_list_as_string.split(" ") + "amazon" in self.providers_test_types_list_as_string + or "Providers" in self.providers_test_types_list_as_string.split(" ") ) @cached_property @@ -694,8 +726,17 @@ def needs_helm_tests(self) -> bool: @cached_property def run_tests(self) -> bool: + if self._is_canary_run(): + return True + if self.only_new_ui_files: + return False + # we should run all test return self._should_be_run(FileGroupForCi.ALL_SOURCE_FILES) + @cached_property + def run_system_tests(self) -> bool: + return self.run_tests + @cached_property def ci_image_build(self) -> bool: # in case pyproject.toml changed, CI image should be built - even if no build dependencies @@ -714,7 +755,7 @@ def prod_image_build(self) -> bool: return self.run_kubernetes_tests or self.needs_helm_tests def _select_test_type_if_matching( - self, test_types: set[str], test_type: SelectiveUnitTestTypes + self, test_types: set[str], test_type: SelectiveCoreTestType ) -> list[str]: matched_files = self._matching_files(test_type, TEST_TYPE_MATCHES, TEST_TYPE_EXCLUDES) count = len(matched_files) @@ -726,23 +767,22 @@ def _select_test_type_if_matching( def _are_all_providers_affected(self) -> bool: # if "Providers" test is present in the list of tests, it means that we should run all providers tests # prepare all providers packages and build all providers documentation - return "Providers" in self._get_test_types_to_run() + return 
"Providers" in self._get_providers_test_types_to_run() def _fail_if_suspended_providers_affected(self) -> bool: return "allow suspended provider changes" not in self._pr_labels - def _get_test_types_to_run(self, split_to_individual_providers: bool = False) -> list[str]: + def _get_core_test_types_to_run(self) -> list[str]: if self.full_tests_needed: - return list(all_selective_test_types()) + return list(all_selective_core_test_types()) candidate_test_types: set[str] = {"Always"} matched_files: set[str] = set() - for test_type in SelectiveUnitTestTypes: + for test_type in SelectiveCoreTestType: if test_type not in [ - SelectiveUnitTestTypes.ALWAYS, - SelectiveUnitTestTypes.CORE, - SelectiveUnitTestTypes.OTHER, - SelectiveUnitTestTypes.PLAIN_ASSERTS, + SelectiveCoreTestType.ALWAYS, + SelectiveCoreTestType.CORE, + SelectiveCoreTestType.OTHER, ]: matched_files.update(self._select_test_type_if_matching(candidate_test_types, test_type)) @@ -755,15 +795,24 @@ def _get_test_types_to_run(self, split_to_individual_providers: bool = False) -> all_source_files = self._matching_files( FileGroupForCi.ALL_SOURCE_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) + all_providers_source_files = self._matching_files( + FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) test_always_files = self._matching_files( FileGroupForCi.ALWAYS_TESTS_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ) + test_ui_files = self._matching_files( + FileGroupForCi.UI_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + remaining_files = ( set(all_source_files) + - set(all_providers_source_files) - set(matched_files) - set(kubernetes_files) - set(system_test_files) - set(test_always_files) + - set(test_ui_files) ) get_console().print(f"[warning]Remaining non test/always files: {len(remaining_files)}[/]") count_remaining_files = len(remaining_files) @@ -780,29 +829,59 @@ def _get_test_types_to_run(self, split_to_individual_providers: bool = False) -> f"into Core/Other category[/]" ) get_console().print(remaining_files) - candidate_test_types.update(all_selective_test_types_except_providers()) + candidate_test_types.update(all_selective_core_test_types()) else: - if "Providers" in candidate_test_types or "API" in candidate_test_types: - affected_providers = self._find_all_providers_affected( - include_docs=False, - ) - if affected_providers != "ALL_PROVIDERS" and affected_providers is not None: - candidate_test_types.discard("Providers") - if split_to_individual_providers: - for provider in affected_providers: - candidate_test_types.add(f"Providers[{provider}]") - else: - candidate_test_types.add(f"Providers[{','.join(sorted(affected_providers))}]") - elif split_to_individual_providers and "Providers" in candidate_test_types: - candidate_test_types.discard("Providers") - for provider in get_available_packages(): - candidate_test_types.add(f"Providers[{provider}]") get_console().print( "[warning]There are no core/other files. 
Only tests relevant to the changed files are run.[/]" ) # sort according to predefined order sorted_candidate_test_types = sorted(candidate_test_types) - get_console().print("[warning]Selected test type candidates to run:[/]") + get_console().print("[warning]Selected core test type candidates to run:[/]") + get_console().print(sorted_candidate_test_types) + return sorted_candidate_test_types + + def _get_providers_test_types_to_run(self, split_to_individual_providers: bool = False) -> list[str]: + if self._default_branch != "main": + return [] + if self.full_tests_needed: + if split_to_individual_providers: + return list(providers_test_type()) + else: + return ["Providers"] + else: + all_providers_source_files = self._matching_files( + FileGroupForCi.ALL_PROVIDERS_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + assets_source_files = self._matching_files( + FileGroupForCi.ASSET_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + + if ( + len(all_providers_source_files) == 0 + and len(assets_source_files) == 0 + and not self.needs_api_tests + ): + # IF API tests are needed, that will trigger extra provider checks + return [] + else: + affected_providers = self._find_all_providers_affected( + include_docs=False, + ) + candidate_test_types: set[str] = set() + if isinstance(affected_providers, AllProvidersSentinel): + if split_to_individual_providers: + for provider in get_available_packages(): + candidate_test_types.add(f"Providers[{provider}]") + else: + candidate_test_types.add("Providers") + elif affected_providers: + if split_to_individual_providers: + for provider in affected_providers: + candidate_test_types.add(f"Providers[{provider}]") + else: + candidate_test_types.add(f"Providers[{','.join(sorted(affected_providers))}]") + sorted_candidate_test_types = sorted(candidate_test_types) + get_console().print("[warning]Selected providers test type candidates to run:[/]") get_console().print(sorted_candidate_test_types) return sorted_candidate_test_types @@ -817,7 +896,7 @@ def _extract_long_provider_tests(current_test_types: set[str]): in case of Providers[list_of_tests] we need to remove the long tests from the list. 
""" - long_tests = ["amazon", "google"] + long_tests = ["amazon", "google", "standard"] for original_test_type in tuple(current_test_types): if original_test_type == "Providers": current_test_types.remove(original_test_type) @@ -837,10 +916,17 @@ def _extract_long_provider_tests(current_test_types: set[str]): current_test_types.add(f"Providers[{','.join(provider_tests_to_run)}]") @cached_property - def parallel_test_types_list_as_string(self) -> str | None: + def core_test_types_list_as_string(self) -> str | None: if not self.run_tests: return None - current_test_types = set(self._get_test_types_to_run()) + current_test_types = set(self._get_core_test_types_to_run()) + return " ".join(sorted(current_test_types)) + + @cached_property + def providers_test_types_list_as_string(self) -> str | None: + if not self.run_tests: + return None + current_test_types = set(self._get_providers_test_types_to_run()) if self._default_branch != "main": test_types_to_remove: set[str] = set() for test_type in current_test_types: @@ -851,33 +937,19 @@ def parallel_test_types_list_as_string(self) -> str | None: ) test_types_to_remove.add(test_type) current_test_types = current_test_types - test_types_to_remove - self._extract_long_provider_tests(current_test_types) return " ".join(sorted(current_test_types)) @cached_property - def providers_test_types_list_as_string(self) -> str | None: - all_test_types = self.parallel_test_types_list_as_string - if all_test_types is None: - return None - return " ".join( - test_type for test_type in all_test_types.split(" ") if test_type.startswith("Providers") - ) - - @cached_property - def separate_test_types_list_as_string(self) -> str | None: + def individual_providers_test_types_list_as_string(self) -> str | None: if not self.run_tests: return None - current_test_types = set(self._get_test_types_to_run(split_to_individual_providers=True)) + current_test_types = set(self._get_providers_test_types_to_run(split_to_individual_providers=True)) if "Providers" in current_test_types: current_test_types.remove("Providers") current_test_types.update( {f"Providers[{provider}]" for provider in get_available_packages(include_not_ready=True)} ) - if self.skip_provider_tests: - current_test_types = { - test_type for test_type in current_test_types if not test_type.startswith("Providers") - } return " ".join(sorted(current_test_types)) @cached_property @@ -1007,7 +1079,7 @@ def docs_list_as_string(self) -> str | None: include_docs=True, ) if ( - providers_affected == "ALL_PROVIDERS" + isinstance(providers_affected, AllProvidersSentinel) or "docs/conf.py" in self._files or "docs/build_docs.py" in self._files or self._are_all_providers_affected() @@ -1055,8 +1127,17 @@ def skip_pre_commits(self) -> str: # when full tests are needed, we do not want to skip any checks and we should # run all the pre-commits just to be sure everything is ok when some structural changes occurred return ",".join(sorted(pre_commits_to_skip)) - if not self._matching_files(FileGroupForCi.WWW_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES): + if not self._matching_files( + FileGroupForCi.LEGACY_WWW_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ): pre_commits_to_skip.add("ts-compile-format-lint-www") + if not ( + self._matching_files(FileGroupForCi.UI_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES) + or self._matching_files( + FileGroupForCi.API_CODEGEN_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + ): + pre_commits_to_skip.add("ts-compile-format-lint-ui") if not self._matching_files( 
FileGroupForCi.ALL_PYTHON_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES ): @@ -1081,43 +1162,54 @@ def skip_pre_commits(self) -> str: return ",".join(sorted(pre_commits_to_skip)) @cached_property - def skip_provider_tests(self) -> bool: + def skip_providers_tests(self) -> bool: if self._default_branch != "main": return True if self.full_tests_needed: return False - if any(test_type.startswith("Providers") for test_type in self._get_test_types_to_run()): + if self._get_providers_test_types_to_run(): return False return True + @cached_property + def test_groups(self): + if self.skip_providers_tests: + if self.run_tests: + return "['core']" + else: + if self.run_tests: + return "['core', 'providers']" + return "[]" + @cached_property def docker_cache(self) -> str: - return ( - "disabled" - if (self._github_event == GithubEvents.SCHEDULE or DISABLE_IMAGE_CACHE_LABEL in self._pr_labels) - else "registry" - ) + return "disabled" if DISABLE_IMAGE_CACHE_LABEL in self._pr_labels else "registry" @cached_property def debug_resources(self) -> bool: return DEBUG_CI_RESOURCES_LABEL in self._pr_labels + @cached_property + def disable_airflow_repo_cache(self) -> bool: + return self.docker_cache == "disabled" + @cached_property def helm_test_packages(self) -> str: return json.dumps(all_helm_test_packages()) @cached_property - def affected_providers_list_as_string(self) -> str | None: - _ALL_PROVIDERS_LIST = "" + def selected_providers_list_as_string(self) -> str | None: + if self._default_branch != "main": + return None if self.full_tests_needed: - return _ALL_PROVIDERS_LIST + return "" if self._are_all_providers_affected(): - return _ALL_PROVIDERS_LIST + return "" affected_providers = self._find_all_providers_affected(include_docs=True) if not affected_providers: return None - if affected_providers == "ALL_PROVIDERS": - return _ALL_PROVIDERS_LIST + if isinstance(affected_providers, AllProvidersSentinel): + return "" return " ".join(sorted(affected_providers)) @cached_property @@ -1212,10 +1304,10 @@ def is_amd_runner(self) -> bool: """ return any( [ - "amd" == label.lower() - or "amd64" == label.lower() - or "x64" == label.lower() - or "asf-runner" == label + label.lower() == "amd" + or label.lower() == "amd64" + or label.lower() == "x64" + or label == "asf-runner" or ("ubuntu" in label and "arm" not in label.lower()) for label in json.loads(self.runs_on_as_json_public) ] @@ -1232,7 +1324,7 @@ def is_arm_runner(self) -> bool: """ return any( [ - "arm" == label.lower() or "arm64" == label.lower() or "asf-arm" == label + label.lower() == "arm" or label.lower() == "arm64" or label == "asf-arm" for label in json.loads(self.runs_on_as_json_public) ] ) @@ -1261,23 +1353,66 @@ def chicken_egg_providers(self) -> str: return CHICKEN_EGG_PROVIDERS @cached_property - def providers_compatibility_checks(self) -> str: - """Provider compatibility input checks for the current run. Filter out python versions not built""" + def providers_compatibility_tests_matrix(self) -> str: + """Provider compatibility input matrix for the current run. 
Filter out python versions not built""" return json.dumps( [ check - for check in BASE_PROVIDERS_COMPATIBILITY_CHECKS + for check in PROVIDERS_COMPATIBILITY_TESTS_MATRIX if check["python-version"] in self.python_versions ] ) @cached_property - def testable_integrations(self) -> list[str]: - return [ - integration - for integration in TESTABLE_INTEGRATIONS - if integration not in DISABLE_TESTABLE_INTEGRATIONS_FROM_CI - ] + def excluded_providers_as_string(self) -> str: + providers_to_exclude = defaultdict(list) + for provider, provider_info in DEPENDENCIES.items(): + if "excluded-python-versions" in provider_info: + for python_version in provider_info["excluded-python-versions"]: + providers_to_exclude[python_version].append(provider) + sorted_providers_to_exclude = dict( + sorted(providers_to_exclude.items(), key=lambda item: int(item[0].split(".")[1])) + ) # ^ sort by Python minor version + return json.dumps(sorted_providers_to_exclude) + + @cached_property + def only_new_ui_files(self) -> bool: + all_source_files = set( + self._matching_files( + FileGroupForCi.ALL_SOURCE_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + ) + new_ui_source_files = set( + self._matching_files(FileGroupForCi.UI_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES) + ) + remaining_files = all_source_files - new_ui_source_files + + if all_source_files and new_ui_source_files and not remaining_files: + return True + else: + return False + + @cached_property + def testable_core_integrations(self) -> list[str]: + if not self.run_tests: + return [] + else: + return [ + integration + for integration in TESTABLE_CORE_INTEGRATIONS + if integration not in DISABLE_TESTABLE_INTEGRATIONS_FROM_CI + ] + + @cached_property + def testable_providers_integrations(self) -> list[str]: + if not self.run_tests: + return [] + else: + return [ + integration + for integration in TESTABLE_PROVIDERS_INTEGRATIONS + if integration not in DISABLE_TESTABLE_INTEGRATIONS_FROM_CI + ] @cached_property def is_committer_build(self): @@ -1285,8 +1420,8 @@ def is_committer_build(self): return False return self._github_actor in COMMITTERS - def _find_all_providers_affected(self, include_docs: bool) -> list[str] | str | None: - all_providers: set[str] = set() + def _find_all_providers_affected(self, include_docs: bool) -> list[str] | AllProvidersSentinel | None: + affected_providers: set[str] = set() all_providers_affected = False suspended_providers: set[str] = set() @@ -1298,11 +1433,13 @@ def _find_all_providers_affected(self, include_docs: bool) -> list[str] | str | if provider not in DEPENDENCIES: suspended_providers.add(provider) else: - all_providers.add(provider) + affected_providers.add(provider) if self.needs_api_tests: - all_providers.add("fab") + affected_providers.add("fab") + if self.needs_ol_tests: + affected_providers.add("openlineage") if all_providers_affected: - return "ALL_PROVIDERS" + return ALL_PROVIDERS_SENTINEL if suspended_providers: # We check for suspended providers only after we have checked if all providers are affected. 
# No matter if we found that we are modifying a suspended provider individually, @@ -1332,16 +1469,56 @@ def _find_all_providers_affected(self, include_docs: bool) -> list[str] | str | get_console().print( "[info]This PR had `allow suspended provider changes` label set so it will continue" ) - if not all_providers: + if not affected_providers: return None - for provider in list(all_providers): - all_providers.update( + + for provider in list(affected_providers): + affected_providers.update( get_related_providers(provider, upstream_dependencies=True, downstream_dependencies=True) ) - return sorted(all_providers) + return sorted(affected_providers) def _is_canary_run(self): return ( self._github_event in [GithubEvents.SCHEDULE, GithubEvents.PUSH] and self._github_repository == APACHE_AIRFLOW_GITHUB_REPOSITORY ) or CANARY_LABEL in self._pr_labels + + @cached_property + def is_legacy_ui_api_labeled(self) -> bool: + # Selective check for legacy UI/API updates. + # It is to ping the maintainer to add the label and make them aware of the changes. + if self._is_canary_run() or self._github_event not in ( + GithubEvents.PULL_REQUEST, + GithubEvents.PULL_REQUEST_TARGET, + ): + return False + + if ( + self._matching_files( + FileGroupForCi.LEGACY_API_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + and LEGACY_API_LABEL not in self._pr_labels + ): + get_console().print( + f"[error]Please ask maintainer to assign " + f"the '{LEGACY_API_LABEL}' label to the PR in order to continue" + ) + sys.exit(1) + elif ( + self._matching_files( + FileGroupForCi.LEGACY_WWW_FILES, CI_FILE_GROUP_MATCHES, CI_FILE_GROUP_EXCLUDES + ) + and LEGACY_UI_LABEL not in self._pr_labels + ): + get_console().print( + f"[error]Please ask maintainer to assign " + f"the '{LEGACY_UI_LABEL}' label to the PR in order to continue" + ) + sys.exit(1) + else: + return True + + @cached_property + def force_pip(self): + return FORCE_PIP_LABEL in self._pr_labels diff --git a/dev/breeze/src/airflow_breeze/utils/spelling_checks.py b/dev/breeze/src/airflow_breeze/utils/spelling_checks.py index 1dd9d7ec8cb2b..ee4c37e338f44 100644 --- a/dev/breeze/src/airflow_breeze/utils/spelling_checks.py +++ b/dev/breeze/src/airflow_breeze/utils/spelling_checks.py @@ -167,9 +167,9 @@ def display_spelling_error_summary(spelling_errors: dict[str, list[SpellingError """ console.print(msg) console.print() - console.print + console.print() console.print("[red]" + "#" * 30 + " End docs build errors summary " + "#" * 30 + "[/]") - console.print + console.print() def _display_error(error: SpellingError): diff --git a/dev/breeze/src/airflow_breeze/utils/version_utils.py b/dev/breeze/src/airflow_breeze/utils/version_utils.py index 7b41fa46bdc57..4ed887dc1ea51 100644 --- a/dev/breeze/src/airflow_breeze/utils/version_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/version_utils.py @@ -35,3 +35,57 @@ def get_latest_airflow_version(): response.raise_for_status() latest_released_version = response.json()["info"]["version"] return latest_released_version + + +def create_package_version(version_suffix_for_pypi: str, version_suffix_for_local: str) -> str: + """ + Creates a package version by combining the version suffix for PyPI and the version suffix for local. If + either one is an empty string, it is ignored. If the local suffix does not have a leading plus sign, + the leading plus sign will be added. + + Args: + version_suffix_for_pypi (str): The version suffix for PyPI. + version_suffix_for_local (str): The version suffix for local. 
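+
+    Example (illustrative values only): combining "rc1" with "local" yields "rc1+local",
+    while combining "" with "ubuntu" yields "+ubuntu".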
+ + Returns: + str: The combined package version. + + """ + # if there is no local version suffix, return the PyPi version suffix + if not version_suffix_for_local: + return version_suffix_for_pypi + + # ensure the local version suffix starts with a plus sign + if version_suffix_for_local[0] != "+": + version_suffix_for_local = "+" + version_suffix_for_local + + # if there is a PyPi version suffix, return the combined version. Otherwise just return the local version. + if version_suffix_for_pypi: + return version_suffix_for_pypi + version_suffix_for_local + else: + return version_suffix_for_local + + +def remove_local_version_suffix(version_suffix: str) -> str: + if "+" in version_suffix: + return version_suffix.split("+")[0] + else: + return version_suffix + + +def is_local_package_version(version_suffix: str) -> bool: + """ + Check if the given version suffix is a local version suffix. A local version suffix will contain a + plus sign ('+'). This function does not guarantee that the version suffix is a valid local version suffix. + + Args: + version_suffix (str): The version suffix to check. + + Returns: + bool: True if the version suffix contains a '+', False otherwise. Please note this does not + guarantee that the version suffix is a valid local version suffix. + """ + if version_suffix and ("+" in version_suffix): + return True + else: + return False diff --git a/dev/breeze/src/airflow_breeze/utils/versions.py b/dev/breeze/src/airflow_breeze/utils/versions.py index 70dc6ad77d38b..f3601f6386312 100644 --- a/dev/breeze/src/airflow_breeze/utils/versions.py +++ b/dev/breeze/src/airflow_breeze/utils/versions.py @@ -27,7 +27,7 @@ def strip_leading_zeros_from_version(version: str) -> str: :param version: version number in CALVER format (potentially with leading 0s in date and month) :return: string with leading 0s after dot replaced. """ - return ".".join(str(int(i)) for i in version.split(".")) + return ".".join(i.lstrip("0") or "0" for i in version.split(".")) def get_version_tag(version: str, provider_package_id: str, version_suffix: str = ""): diff --git a/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py b/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py index 3c6a175a0fcd0..04622c4f11fec 100644 --- a/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py +++ b/dev/breeze/src/airflow_breeze/utils/virtualenv_utils.py @@ -20,8 +20,8 @@ import contextlib import sys import tempfile +from collections.abc import Generator from pathlib import Path -from typing import Generator from airflow_breeze.utils.cache import check_if_cache_exists from airflow_breeze.utils.console import get_console @@ -59,13 +59,9 @@ def create_venv( if not python_path.exists(): get_console().print(f"\n[errors]Python interpreter is not exist in path {python_path}. 
Exiting!\n") sys.exit(1) - if check_if_cache_exists("use_uv"): - command = create_uv_command(python_path) - else: - command = create_pip_command(python_path) if pip_version: result = run_command( - [*command, "install", f"pip=={pip_version}", "-q"], + [python_path.as_posix(), "-m", "pip", "install", f"pip=={pip_version}", "-q"], check=False, capture_output=False, text=True, @@ -78,7 +74,7 @@ def create_venv( sys.exit(result.returncode) if uv_version: result = run_command( - [*command, "install", f"uv=={uv_version}", "-q"], + [python_path.as_posix(), "-m", "pip", "install", f"uv=={uv_version}", "-q"], check=False, capture_output=False, text=True, @@ -89,6 +85,10 @@ def create_venv( f"{result.stdout}\n{result.stderr}" ) sys.exit(result.returncode) + if check_if_cache_exists("use_uv"): + command = create_uv_command(python_path) + else: + command = create_pip_command(python_path) if requirements_file: requirements_file = Path(requirements_file).absolute().as_posix() result = run_command( diff --git a/dev/breeze/tests/conftest.py b/dev/breeze/tests/conftest.py index 17d7366abdc81..f968e10c812c7 100644 --- a/dev/breeze/tests/conftest.py +++ b/dev/breeze/tests/conftest.py @@ -16,6 +16,10 @@ # under the License. from __future__ import annotations +import pytest + +from airflow_breeze.utils.functools_cache import clear_all_cached_functions + def pytest_configure(config): import sys @@ -27,3 +31,8 @@ def pytest_unconfigure(config): import sys # This was missing from the manual del sys._called_from_test + + +@pytest.fixture(autouse=True) +def clear_clearable_cache(): + clear_all_cached_functions() diff --git a/dev/breeze/tests/test_cache.py b/dev/breeze/tests/test_cache.py index 52ba6ada1e09e..9215c896bb36d 100644 --- a/dev/breeze/tests/test_cache.py +++ b/dev/breeze/tests/test_cache.py @@ -36,6 +36,7 @@ [ ("backend", "mysql", (True, ["sqlite", "mysql", "postgres", "none"]), None), ("backend", "xxx", (False, ["sqlite", "mysql", "postgres", "none"]), None), + ("python_major_minor_version", "3.9", (True, ["3.8", "3.9", "3.10", "3.11", "3.12"]), None), ("python_major_minor_version", "3.8", (True, ["3.8", "3.9", "3.10", "3.11", "3.12"]), None), ("python_major_minor_version", "3.7", (False, ["3.8", "3.9", "3.10", "3.11", "3.12"]), None), ("missing", "value", None, AttributeError), @@ -66,7 +67,7 @@ def test_check_if_cache_exists(path): def test_read_from_cache_file(param): param_value = read_from_cache_file(param.upper()) if param_value is None: - assert None is param_value + assert param_value is None else: allowed, param_list = check_if_values_allowed(param, param_value) if allowed: diff --git a/dev/breeze/tests/test_docker_command_utils.py b/dev/breeze/tests/test_docker_command_utils.py index 731935ec019af..a50c04352637f 100644 --- a/dev/breeze/tests/test_docker_command_utils.py +++ b/dev/breeze/tests/test_docker_command_utils.py @@ -127,6 +127,28 @@ def test_check_docker_version_higher(mock_get_console, mock_run_command, mock_ch mock_get_console.return_value.print.assert_called_with("[success]Good version of Docker: 24.0.0.[/]") +@mock.patch("airflow_breeze.utils.docker_command_utils.check_docker_permission_denied") +@mock.patch("airflow_breeze.utils.docker_command_utils.run_command") +@mock.patch("airflow_breeze.utils.docker_command_utils.get_console") +def test_check_docker_version_higher_rancher_desktop( + mock_get_console, mock_run_command, mock_check_docker_permission_denied +): + mock_check_docker_permission_denied.return_value = False + mock_run_command.return_value.returncode = 0 + 
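+    # Rancher Desktop reports the Docker client version with an "-rd" suffix (e.g. "24.0.0-rd");
+    # the version check should still treat this as a good Docker version.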
mock_run_command.return_value.stdout = "24.0.0-rd" + check_docker_version() + mock_check_docker_permission_denied.assert_called() + mock_run_command.assert_called_with( + ["docker", "version", "--format", "{{.Client.Version}}"], + no_output_dump_on_exception=True, + capture_output=True, + text=True, + check=False, + dry_run_override=False, + ) + mock_get_console.return_value.print.assert_called_with("[success]Good version of Docker: 24.0.0-r.[/]") + + @mock.patch("airflow_breeze.utils.docker_command_utils.run_command") @mock.patch("airflow_breeze.utils.docker_command_utils.get_console") def test_check_docker_compose_version_unknown(mock_get_console, mock_run_command): diff --git a/dev/breeze/tests/test_packages.py b/dev/breeze/tests/test_packages.py index 7a5b22c3bc7e3..4328541a1cffd 100644 --- a/dev/breeze/tests/test_packages.py +++ b/dev/breeze/tests/test_packages.py @@ -16,7 +16,7 @@ # under the License. from __future__ import annotations -from typing import Iterable +from collections.abc import Iterable import pytest @@ -30,7 +30,6 @@ get_available_packages, get_cross_provider_dependent_packages, get_dist_package_name_prefix, - get_documentation_package_path, get_install_requirements, get_long_package_name, get_min_airflow_version, @@ -42,12 +41,11 @@ get_provider_requirements, get_removed_provider_ids, get_short_package_name, - get_source_package_path, get_suspended_provider_folders, get_suspended_provider_ids, validate_provider_info_with_runtime_schema, ) -from airflow_breeze.utils.path_utils import AIRFLOW_PROVIDERS_ROOT, AIRFLOW_SOURCES_ROOT, DOCS_ROOT +from airflow_breeze.utils.path_utils import AIRFLOW_SOURCES_ROOT def test_get_available_packages(): @@ -109,17 +107,17 @@ def test_get_provider_requirements(): def test_get_removed_providers(): # Modify it every time we schedule provider for removal or remove it - assert [] == get_removed_provider_ids() + assert get_removed_provider_ids() == [] def test_get_suspended_provider_ids(): # Modify it every time we suspend/resume provider - assert [] == get_suspended_provider_ids() + assert get_suspended_provider_ids() == [] def test_get_suspended_provider_folders(): # Modify it every time we suspend/resume provider - assert [] == get_suspended_provider_folders() + assert get_suspended_provider_folders() == [] @pytest.mark.parametrize( @@ -150,14 +148,6 @@ def test_find_matching_long_package_name_bad_filter(): find_matching_long_package_names(short_packages=(), filters=("bad-filter-*",)) -def test_get_source_package_path(): - assert get_source_package_path("apache.hdfs") == AIRFLOW_PROVIDERS_ROOT / "apache" / "hdfs" - - -def test_get_documentation_package_path(): - assert get_documentation_package_path("apache.hdfs") == DOCS_ROOT / "apache-airflow-providers-apache-hdfs" - - @pytest.mark.parametrize( "provider, version_suffix, expected", [ @@ -223,7 +213,8 @@ def test_get_documentation_package_path(): ], ) def test_get_install_requirements(provider: str, version_suffix: str, expected: str): - assert get_install_requirements(provider, version_suffix).strip() == expected.strip() + actual = get_install_requirements(provider, version_suffix) + assert actual.strip() == expected.strip() @pytest.mark.parametrize( @@ -236,8 +227,6 @@ def test_get_install_requirements(provider: str, version_suffix: str, expected: "apache.beam": ["apache-airflow-providers-apache-beam", "apache-beam[gcp]"], "apache.cassandra": ["apache-airflow-providers-apache-cassandra"], "cncf.kubernetes": ["apache-airflow-providers-cncf-kubernetes>=7.2.0"], - "common.compat": 
["apache-airflow-providers-common-compat"], - "common.sql": ["apache-airflow-providers-common-sql"], "facebook": ["apache-airflow-providers-facebook>=2.2.0"], "leveldb": ["plyvel"], "microsoft.azure": ["apache-airflow-providers-microsoft-azure"], @@ -261,8 +250,6 @@ def test_get_install_requirements(provider: str, version_suffix: str, expected: "apache.beam": ["apache-airflow-providers-apache-beam", "apache-beam[gcp]"], "apache.cassandra": ["apache-airflow-providers-apache-cassandra"], "cncf.kubernetes": ["apache-airflow-providers-cncf-kubernetes>=7.2.0.dev0"], - "common.compat": ["apache-airflow-providers-common-compat"], - "common.sql": ["apache-airflow-providers-common-sql"], "facebook": ["apache-airflow-providers-facebook>=2.2.0.dev0"], "leveldb": ["plyvel"], "microsoft.azure": ["apache-airflow-providers-microsoft-azure"], @@ -286,8 +273,6 @@ def test_get_install_requirements(provider: str, version_suffix: str, expected: "apache.beam": ["apache-airflow-providers-apache-beam", "apache-beam[gcp]"], "apache.cassandra": ["apache-airflow-providers-apache-cassandra"], "cncf.kubernetes": ["apache-airflow-providers-cncf-kubernetes>=7.2.0b0"], - "common.compat": ["apache-airflow-providers-common-compat"], - "common.sql": ["apache-airflow-providers-common-sql"], "facebook": ["apache-airflow-providers-facebook>=2.2.0b0"], "leveldb": ["plyvel"], "microsoft.azure": ["apache-airflow-providers-microsoft-azure"], @@ -307,26 +292,33 @@ def test_get_install_requirements(provider: str, version_suffix: str, expected: ], ) def test_get_package_extras(version_suffix: str, expected: dict[str, list[str]]): - assert get_package_extras("google", version_suffix=version_suffix) == expected - - -def test_get_provider_details(): - provider_details = get_provider_details("asana") - assert provider_details.provider_id == "asana" - assert provider_details.full_package_name == "airflow.providers.asana" - assert provider_details.pypi_package_name == "apache-airflow-providers-asana" - assert ( - provider_details.source_provider_package_path - == AIRFLOW_SOURCES_ROOT / "airflow" / "providers" / "asana" + actual = get_package_extras("google", version_suffix=version_suffix) + assert actual == expected + + +def test_get_new_provider_details(): + provider_details = get_provider_details("airbyte") + assert provider_details.provider_id == "airbyte" + assert provider_details.full_package_name == "airflow.providers.airbyte" + assert provider_details.pypi_package_name == "apache-airflow-providers-airbyte" + assert provider_details.root_provider_path == AIRFLOW_SOURCES_ROOT.joinpath( + "airflow", + "providers", + "airbyte", ) - assert ( - provider_details.documentation_provider_package_path == DOCS_ROOT / "apache-airflow-providers-asana" + assert provider_details.base_provider_package_path == AIRFLOW_SOURCES_ROOT.joinpath( + "airflow", + "providers", + "airbyte", + ) + assert provider_details.documentation_provider_package_path == AIRFLOW_SOURCES_ROOT.joinpath( + "docs", "apache-airflow-providers-airbyte" ) - assert "Asana" in provider_details.provider_description + assert "Airbyte" in provider_details.provider_description assert len(provider_details.versions) > 11 assert provider_details.excluded_python_versions == [] assert provider_details.plugins == [] - assert provider_details.changelog_path == provider_details.source_provider_package_path / "CHANGELOG.rst" + assert provider_details.changelog_path == provider_details.root_provider_path / "CHANGELOG.rst" assert not provider_details.removed @@ -372,8 +364,8 @@ def 
test_get_dist_package_name_prefix(provider_id: str, expected_package_name: s id="version-with-platform-marker", ), pytest.param( - "backports.zoneinfo>=0.2.1;python_version<'3.9'", - ("backports.zoneinfo", '>=0.2.1; python_version < "3.9"'), + "pendulum>=2.1.2,<4.0;python_version<'3.12'", + ("pendulum", '>=2.1.2,<4.0; python_version < "3.12"'), id="version-with-python-marker", ), pytest.param( @@ -431,8 +423,8 @@ def test_validate_provider_info_with_schema(): @pytest.mark.parametrize( "provider_id, min_version", [ - ("amazon", "2.7.0"), - ("common.io", "2.8.0"), + ("amazon", "2.9.0"), + ("fab", "2.9.0"), ], ) def test_get_min_airflow_version(provider_id: str, min_version: str): @@ -496,7 +488,7 @@ def test_provider_jinja_context(): "CHANGELOG_RELATIVE_PATH": "../../airflow/providers/amazon", "SUPPORTED_PYTHON_VERSIONS": ["3.8", "3.9", "3.10", "3.11", "3.12"], "PLUGINS": [], - "MIN_AIRFLOW_VERSION": "2.7.0", + "MIN_AIRFLOW_VERSION": "2.9.0", "PROVIDER_REMOVED": False, "PROVIDER_INFO": provider_info, } diff --git a/dev/breeze/tests/test_provider_documentation.py b/dev/breeze/tests/test_provider_documentation.py index e2de9fee9fbf3..fa723f9f50e96 100644 --- a/dev/breeze/tests/test_provider_documentation.py +++ b/dev/breeze/tests/test_provider_documentation.py @@ -18,6 +18,7 @@ import random import string +from pathlib import Path import pytest @@ -97,28 +98,47 @@ def test_get_version_tag(version: str, provider_id: str, suffix: str, tag: str): @pytest.mark.parametrize( - "from_commit, to_commit, git_command", + "folder_paths, from_commit, to_commit, git_command", [ - (None, None, ["git", "log", "--pretty=format:%H %h %cd %s", "--date=short", "--", "."]), + (None, None, None, ["git", "log", "--pretty=format:%H %h %cd %s", "--date=short", "--", "."]), ( + None, "from_tag", None, ["git", "log", "--pretty=format:%H %h %cd %s", "--date=short", "from_tag", "--", "."], ), ( + None, "from_tag", "to_tag", ["git", "log", "--pretty=format:%H %h %cd %s", "--date=short", "from_tag...to_tag", "--", "."], ), + ( + [Path("a"), Path("b")], + "from_tag", + "to_tag", + [ + "git", + "log", + "--pretty=format:%H %h %cd %s", + "--date=short", + "from_tag...to_tag", + "--", + "a", + "b", + ], + ), ], ) -def test_get_git_log_command(from_commit: str | None, to_commit: str | None, git_command: list[str]): - assert _get_git_log_command(from_commit, to_commit) == git_command +def test_get_git_log_command( + folder_paths: list[str] | None, from_commit: str | None, to_commit: str | None, git_command: list[str] +): + assert _get_git_log_command(folder_paths, from_commit, to_commit) == git_command def test_get_git_log_command_wrong(): with pytest.raises(ValueError, match=r"to_commit without from_commit"): - _get_git_log_command(None, "to_commit") + _get_git_log_command(None, None, "to_commit") @pytest.mark.parametrize( diff --git a/dev/breeze/tests/test_pytest_args_for_test_types.py b/dev/breeze/tests/test_pytest_args_for_test_types.py index fbb3785949e85..30913dc69db72 100644 --- a/dev/breeze/tests/test_pytest_args_for_test_types.py +++ b/dev/breeze/tests/test_pytest_args_for_test_types.py @@ -18,15 +18,16 @@ import pytest -from airflow_breeze.global_constants import DEFAULT_PYTHON_MAJOR_MINOR_VERSION +from airflow_breeze.global_constants import GroupOfTests from airflow_breeze.utils.run_tests import convert_parallel_types_to_folders, convert_test_type_to_pytest_args @pytest.mark.parametrize( - "test_type, pytest_args, skip_provider_tests", + "test_group, test_type, pytest_args", [ # Those list needs to be updated every 
time we add a new directory to tests/ folder ( + GroupOfTests.CORE, "Core", [ "tests/core", @@ -36,69 +37,54 @@ "tests/ti_deps", "tests/utils", ], - False, ), ( - "Integration", - ["tests/integration"], - False, + GroupOfTests.INTEGRATION_PROVIDERS, + "All", + ["tests/providers/integration"], ), ( - "Integration", - [ - "tests/integration/api_experimental", - "tests/integration/cli", - "tests/integration/executors", - "tests/integration/security", - ], - True, + GroupOfTests.INTEGRATION_CORE, + "All", + ["tests/integration"], ), ( + GroupOfTests.CORE, "API", - ["tests/api", "tests/api_experimental", "tests/api_connexion", "tests/api_internal"], - False, + ["tests/api", "tests/api_connexion", "tests/api_experimental", "tests/api_internal"], ), ( + GroupOfTests.CORE, "Serialization", ["tests/serialization"], - False, - ), - ( - "System", - ["tests/system"], - False, ), ( + GroupOfTests.CORE, "Operators", - ["tests/operators", "--exclude-virtualenv-operator", "--exclude-external-python-operator"], - False, + ["tests/operators"], ), ( + GroupOfTests.PROVIDERS, "Providers", ["tests/providers"], - False, - ), - ( - "Providers", - [], - True, ), ( + GroupOfTests.PROVIDERS, "Providers[amazon]", ["tests/providers/amazon"], - False, ), ( + GroupOfTests.PROVIDERS, "Providers[common.io]", ["tests/providers/common/io"], - False, ), ( + GroupOfTests.PROVIDERS, "Providers[amazon,google,apache.hive]", ["tests/providers/amazon", "tests/providers/google", "tests/providers/apache/hive"], - False, ), ( + GroupOfTests.PROVIDERS, "Providers[-amazon,google,microsoft.azure]", [ "tests/providers", @@ -106,50 +92,19 @@ "--ignore=tests/providers/google", "--ignore=tests/providers/microsoft/azure", ], - False, - ), - ( - "PlainAsserts", - [ - "tests/operators/test_python.py::TestPythonVirtualenvOperator::test_airflow_context", - "--assert=plain", - ], - False, ), ( + GroupOfTests.CORE, "All-Quarantined", ["tests", "-m", "quarantined", "--include-quarantined"], - False, ), ( - "PythonVenv", - [ - "tests/operators/test_python.py::TestPythonVirtualenvOperator", - ], - False, - ), - ( - "BranchPythonVenv", - [ - "tests/operators/test_python.py::TestBranchPythonVirtualenvOperator", - ], - False, - ), - ( - "ExternalPython", - [ - "tests/operators/test_python.py::TestExternalPythonOperator", - ], - False, - ), - ( - "BranchExternalPython", - [ - "tests/operators/test_python.py::TestBranchExternalPythonOperator", - ], - False, + GroupOfTests.PROVIDERS, + "All-Quarantined", + ["tests/providers", "-m", "quarantined", "--include-quarantined"], ), ( + GroupOfTests.CORE, "Other", [ "tests/auth", @@ -172,24 +127,33 @@ "tests/sensors", "tests/task", "tests/template", + "tests/test_utils", "tests/testconfig", "tests/timetables", "tests/triggers", ], - False, + ), + ( + GroupOfTests.HELM, + "All", + ["helm_tests"], + ), + ( + GroupOfTests.HELM, + "airflow_aux", + ["helm_tests/airflow_aux"], ), ], ) def test_pytest_args_for_regular_test_types( + test_group: GroupOfTests, test_type: str, pytest_args: list[str], - skip_provider_tests: bool, ): assert ( convert_test_type_to_pytest_args( + test_group=test_group, test_type=test_type, - skip_provider_tests=skip_provider_tests, - python_version=DEFAULT_PYTHON_MAJOR_MINOR_VERSION, ) == pytest_args ) @@ -198,160 +162,164 @@ def test_pytest_args_for_regular_test_types( def test_pytest_args_for_missing_provider(): with pytest.raises(SystemExit): convert_test_type_to_pytest_args( + test_group=GroupOfTests.PROVIDERS, test_type="Providers[missing.provider]", - skip_provider_tests=False, - 
python_version=DEFAULT_PYTHON_MAJOR_MINOR_VERSION, ) @pytest.mark.parametrize( - "helm_test_package, pytest_args", - [ - ( - None, - ["helm_tests"], - ), - ( - "airflow_aux", - ["helm_tests/airflow_aux"], - ), - ( - "all", - ["helm_tests"], - ), - ], -) -def test_pytest_args_for_helm_test_types(helm_test_package: str, pytest_args: list[str]): - assert ( - convert_test_type_to_pytest_args( - test_type="Helm", - skip_provider_tests=False, - helm_test_package=helm_test_package, - python_version=DEFAULT_PYTHON_MAJOR_MINOR_VERSION, - ) - == pytest_args - ) - - -@pytest.mark.parametrize( - "parallel_test_types, folders, skip_provider_tests", + "test_group, parallel_test_types, folders", [ ( + GroupOfTests.CORE, "API", - ["tests/api", "tests/api_experimental", "tests/api_connexion", "tests/api_internal"], - False, + ["tests/api", "tests/api_connexion", "tests/api_experimental", "tests/api_internal"], ), ( + GroupOfTests.CORE, "CLI", [ "tests/cli", ], - False, ), ( + GroupOfTests.CORE, "API CLI", [ "tests/api", - "tests/api_experimental", "tests/api_connexion", + "tests/api_experimental", "tests/api_internal", "tests/cli", ], - False, ), ( + GroupOfTests.CORE, "Core", ["tests/core", "tests/executors", "tests/jobs", "tests/models", "tests/ti_deps", "tests/utils"], - False, ), ( - "Core Providers", + GroupOfTests.PROVIDERS, + "Providers", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", "tests/providers", ], - False, ), ( - "Core Providers[amazon]", + GroupOfTests.PROVIDERS, + "Providers[amazon]", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", "tests/providers/amazon", ], - False, ), ( - "Core Providers[amazon] Providers[google]", + GroupOfTests.PROVIDERS, + "Providers[amazon] Providers[google]", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", "tests/providers/amazon", "tests/providers/google", ], - False, ), ( - "Core Providers[-amazon,google]", + GroupOfTests.PROVIDERS, + "Providers[-amazon,google]", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", "tests/providers", ], - False, ), ( - "Core Providers[amazon] Providers[google]", + GroupOfTests.PROVIDERS, + "Providers[-amazon,google] Providers[amazon] Providers[google]", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", + "tests/providers", + "tests/providers/amazon", + "tests/providers/google", ], - True, ), ( - "Core Providers[-amazon,google] Providers[amazon] Providers[google]", + GroupOfTests.INTEGRATION_PROVIDERS, + "All", [ - "tests/core", - "tests/executors", - "tests/jobs", - "tests/models", - "tests/ti_deps", - "tests/utils", - "tests/providers", + "tests/providers/integration", ], - False, + ), + ( + GroupOfTests.HELM, + "All", + [ + "helm_tests", + ], + ), + ( + GroupOfTests.INTEGRATION_CORE, + "All", + [ + "tests/integration", + ], + ), + ( + GroupOfTests.SYSTEM, + "None", + [], ), ], ) def test_folders_for_parallel_test_types( - parallel_test_types: str, folders: list[str], skip_provider_tests: bool + test_group: GroupOfTests, parallel_test_types: str, folders: list[str] ): assert ( convert_parallel_types_to_folders( + test_group=test_group, parallel_test_types_list=parallel_test_types.split(" "), - skip_provider_tests=skip_provider_tests, - python_version=DEFAULT_PYTHON_MAJOR_MINOR_VERSION, ) == folders ) + + +@pytest.mark.parametrize( + 
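+    # Each case below pairs a test group with a test type that does not belong to that group;
+    # converting such a combination to folders is expected to exit with an error (see the assertion below).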
"test_group, parallel_test_types", + [ + ( + GroupOfTests.CORE, + "Providers", + ), + ( + GroupOfTests.CORE, + "Helm", + ), + ( + GroupOfTests.PROVIDERS, + "API CLI", + ), + ( + GroupOfTests.PROVIDERS, + "API CLI Providers", + ), + ( + GroupOfTests.HELM, + "API", + ), + ( + GroupOfTests.HELM, + "Providers", + ), + ( + GroupOfTests.INTEGRATION_PROVIDERS, + "API", + ), + ( + GroupOfTests.INTEGRATION_CORE, + "WWW", + ), + ( + GroupOfTests.SYSTEM, + "CLI", + ), + ], +) +def xtest_wrong_types_for_parallel_test_types(test_group: GroupOfTests, parallel_test_types: str): + with pytest.raises(SystemExit): + convert_parallel_types_to_folders( + test_group=test_group, + parallel_test_types_list=parallel_test_types.split(" "), + ) diff --git a/dev/breeze/tests/test_run_test_args.py b/dev/breeze/tests/test_run_test_args.py new file mode 100644 index 0000000000000..48b979a920d13 --- /dev/null +++ b/dev/breeze/tests/test_run_test_args.py @@ -0,0 +1,94 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +import re +from unittest.mock import patch + +import pytest + +from airflow_breeze.commands.testing_commands import _run_test +from airflow_breeze.global_constants import GroupOfTests +from airflow_breeze.params.shell_params import ShellParams + + +@pytest.fixture(autouse=True) +def mock_run_command(): + """We mock run_command to capture its call args; it returns nothing so mock training is unnecessary.""" + with patch("airflow_breeze.commands.testing_commands.run_command") as mck: + yield mck + + +@pytest.fixture(autouse=True) +def mock_get_suspended_provider_folders(): + with patch("airflow_breeze.utils.run_tests.get_suspended_provider_folders") as mck: + mck.return_value = [] + yield mck + + +@pytest.fixture(autouse=True) +def mock_get_excluded_provider_folders(): + with patch("airflow_breeze.utils.run_tests.get_excluded_provider_folders") as mck: + mck.return_value = [] + yield mck + + +@pytest.fixture(autouse=True) +def _mock_sleep(): + """_run_test does a 10-second sleep in CI, so we mock the sleep function to save CI test time.""" + with patch("airflow_breeze.commands.testing_commands.sleep"): + yield + + +@pytest.fixture(autouse=True) +def mock_remove_docker_networks(): + """We mock remove_docker_networks to avoid making actual docker calls during these tests; + it returns nothing so mock training is unnecessary.""" + with patch("airflow_breeze.commands.testing_commands.remove_docker_networks") as mck: + yield mck + + +def test_primary_test_arg_is_excluded_by_extra_pytest_arg(mock_run_command): + test_provider = "http" # "Providers[]" scans the source tree so we need to use a real provider id + test_provider_not_skipped = "ftp" + _run_test( + shell_params=ShellParams( + test_group=GroupOfTests.PROVIDERS, + 
test_type=f"Providers[{test_provider},{test_provider_not_skipped}]", + ), + extra_pytest_args=(f"--ignore=tests/providers/{test_provider}",), + python_version="3.8", + output=None, + test_timeout=60, + skip_docker_compose_down=True, + ) + + assert mock_run_command.call_count > 1 + run_cmd_call = mock_run_command.call_args_list[1] + arg_str = " ".join(run_cmd_call.args[0]) + + # The command pattern we look for is " --verbosity=0 \ + # <*other args we don't care about*> --ignore=providers/tests/" + # The providers/tests/http argument has been eliminated by the code that preps the args; this is a bug, + # bc without a directory or module arg, pytest tests everything (which we don't want!) + # We check "--verbosity=0" to ensure nothing is between the airflow container id and the verbosity arg, + # IOW that the primary test arg is removed + match_pattern = re.compile( + f"airflow tests/providers/{test_provider_not_skipped} --verbosity=0 .+ --ignore=tests/providers/{test_provider}" + ) + + assert match_pattern.search(arg_str) diff --git a/dev/breeze/tests/test_selective_checks.py b/dev/breeze/tests/test_selective_checks.py index 551374bd35502..bfa97dcb1ee50 100644 --- a/dev/breeze/tests/test_selective_checks.py +++ b/dev/breeze/tests/test_selective_checks.py @@ -18,22 +18,21 @@ import json import re -from functools import lru_cache from typing import Any import pytest from rich.console import Console from airflow_breeze.global_constants import ( - BASE_PROVIDERS_COMPATIBILITY_CHECKS, COMMITTERS, DEFAULT_PYTHON_MAJOR_MINOR_VERSION, + PROVIDERS_COMPATIBILITY_TESTS_MATRIX, GithubEvents, ) +from airflow_breeze.utils.functools_cache import clearable_cache from airflow_breeze.utils.packages import get_available_packages from airflow_breeze.utils.selective_checks import ( ALL_CI_SELECTIVE_TEST_TYPES, - ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS, ALL_PROVIDERS_SELECTIVE_TEST_TYPES, SelectiveChecks, ) @@ -43,19 +42,24 @@ ALL_DOCS_SELECTED_FOR_BUILD = "" ALL_PROVIDERS_AFFECTED = "" -LIST_OF_ALL_PROVIDER_TESTS = " ".join(f"Providers[{provider}]" for provider in get_available_packages()) +LIST_OF_ALL_PROVIDER_TESTS = " ".join( + f"Providers[{provider}]" for provider in get_available_packages(include_not_ready=True) +) # commit that is neutral - allows to keep pyproject.toml-changing PRS neutral for unit tests NEUTRAL_COMMIT = "938f0c1f3cc4cbe867123ee8aa9f290f9f18100a" +# for is_legacy_ui_api_labeled tests +LEGACY_UI_LABEL = "legacy ui" +LEGACY_API_LABEL = "legacy api" + def escape_ansi_colors(line): return ANSI_COLORS_MATCHER.sub("", line) -# Can be replaced with cache when we move to Python 3.9 (when 3.8 is EOL) -@lru_cache(maxsize=None) +@clearable_cache def get_rich_console() -> Console: return Console(color_system="truecolor", force_terminal=True) @@ -86,9 +90,9 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): print_in_color("\nOutput received:") print_in_color(received_output_as_dict) print_in_color() - assert received_value == expected_value + assert received_value == expected_value, f"Correct value for {expected_key!r}" else: - print( + print_in_color( f"\n[red]ERROR: The key '{expected_key}' missing but " f"it is expected. 
Expected value:" ) @@ -108,7 +112,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): pytest.param( ("INTHEWILD.md",), { - "affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -119,11 +123,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-amazon-tests": "false", "docs-build": "false", "skip-pre-commits": "check-provider-yaml-valid,flynt,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": None, + "core-test-types-list-as-string": None, "providers-test-types-list-as-string": None, - "separate-test-types-list-as-string": None, + "individual-providers-test-types-list-as-string": None, "needs-mypy": "false", "mypy-checks": "[]", }, @@ -132,54 +136,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): ), ( pytest.param( - ("airflow/api/file.py",), + ("pyproject.toml",), { - "affected-providers-list-as-string": "fab", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "true", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "API Always Providers[fab]", - "providers-test-types-list-as-string": "Providers[fab]", - "separate-test-types-list-as-string": "API Always Providers[fab]", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", }, - id="Only API tests and DOCS and FAB provider should run", - ) - ), - ( - pytest.param( - ("airflow/api_internal/file.py",), - { - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "true", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "API Always", - "separate-test-types-list-as-string": "API Always", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Only API tests and DOCS should run (no provider tests) when only internal_api changed", + id="CI image build and when pyproject.toml change", ) ), ( @@ -197,10 +158,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-amazon-tests": "false", "docs-build": "false", "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "API Always", - 
"separate-test-types-list-as-string": "API Always", + "core-test-types-list-as-string": "API Always", + "providers-test-types-list-as-string": "", + "individual-providers-test-types-list-as-string": "", "needs-mypy": "true", "mypy-checks": "['mypy-airflow']", }, @@ -211,7 +173,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): pytest.param( ("airflow/operators/file.py",), { - "affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -223,51 +185,22 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-amazon-tests": "false", "docs-build": "true", "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Operators", + "core-test-types-list-as-string": "Always Operators", "providers-test-types-list-as-string": "", - "separate-test-types-list-as-string": "Always Operators", + "individual-providers-test-types-list-as-string": "", "needs-mypy": "true", "mypy-checks": "['mypy-airflow']", }, id="Only Operator tests and DOCS should run", ) ), - ( - pytest.param( - ("airflow/operators/python.py",), - { - "affected-providers-list-as-string": None, - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "true", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always BranchExternalPython BranchPythonVenv " - "ExternalPython Operators PythonVenv", - "providers-test-types-list-as-string": "", - "separate-test-types-list-as-string": "Always BranchExternalPython BranchPythonVenv " - "ExternalPython Operators PythonVenv", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Only Python tests", - ) - ), ( pytest.param( ("airflow/serialization/python.py",), { - "affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -279,85 +212,22 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-amazon-tests": "false", "docs-build": "true", "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Serialization", + "core-test-types-list-as-string": "Always Serialization", "providers-test-types-list-as-string": "", - "separate-test-types-list-as-string": "Always Serialization", + "individual-providers-test-types-list-as-string": "", "needs-mypy": "true", "mypy-checks": "['mypy-airflow']", }, id="Only Serialization tests", ) ), - ( - pytest.param( - ( - 
"airflow/api/file.py", - "tests/providers/postgres/file.py", - ), - { - "affected-providers-list-as-string": "amazon common.sql fab google openlineage " - "pgvector postgres", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "true", - "docs-build": "true", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers," - "ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "API Always Providers[amazon] " - "Providers[common.sql,fab,openlineage,pgvector,postgres] Providers[google]", - "providers-test-types-list-as-string": "Providers[amazon] " - "Providers[common.sql,fab,openlineage,pgvector,postgres] Providers[google]", - "separate-test-types-list-as-string": "API Always Providers[amazon] Providers[common.sql] " - "Providers[fab] Providers[google] Providers[openlineage] Providers[pgvector] " - "Providers[postgres]", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow', 'mypy-providers']", - }, - id="API and providers tests and docs should run", - ) - ), - ( - pytest.param( - ("tests/providers/apache/beam/file.py",), - { - "affected-providers-list-as-string": "apache.beam google", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "false", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers," - "ts-compile-format-lint-www", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[apache.beam] Providers[google]", - "providers-test-types-list-as-string": "Providers[apache.beam] Providers[google]", - "separate-test-types-list-as-string": "Always Providers[apache.beam] Providers[google]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Selected Providers and docs should run", - ) - ), ( pytest.param( ("docs/file.rst",), { - "affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -369,10 +239,10 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-amazon-tests": "false", "docs-build": "true", "skip-pre-commits": "check-provider-yaml-valid,flynt,identity,lint-helm-chart,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "run-kubernetes-tests": "false", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": None, + "core-test-types-list-as-string": None, "providers-test-types-list-as-string": None, "needs-mypy": "false", "mypy-checks": "[]", @@ -380,115 +250,15 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): id="Only docs builds should run - no tests needed", ) ), - ( - pytest.param( - ( - "chart/aaaa.txt", - "tests/providers/postgres/file.py", - ), - { - "affected-providers-list-as-string": "amazon 
common.sql google openlineage " - "pgvector postgres", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "true", - "needs-helm-tests": "true", - "run-tests": "true", - "run-amazon-tests": "true", - "docs-build": "true", - "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[amazon] " - "Providers[common.sql,openlineage,pgvector,postgres] Providers[google]", - "providers-test-types-list-as-string": "Providers[amazon] " - "Providers[common.sql,openlineage,pgvector,postgres] Providers[google]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Helm tests, providers (both upstream and downstream)," - "kubernetes tests and docs should run", - ) - ), ( pytest.param( ( "INTHEWILD.md", "chart/aaaa.txt", - "tests/providers/http/file.py", + "foo/other.py", ), { - "affected-providers-list-as-string": "airbyte amazon apache.livy " - "dbt.cloud dingding discord http", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "true", - "needs-helm-tests": "true", - "run-tests": "true", - "run-amazon-tests": "true", - "docs-build": "true", - "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always " - "Providers[airbyte,apache.livy,dbt.cloud,dingding,discord,http] Providers[amazon]", - "providers-test-types-list-as-string": "Providers[airbyte,apache.livy,dbt.cloud,dingding,discord,http] Providers[amazon]", - "separate-test-types-list-as-string": "Always Providers[airbyte] Providers[amazon] " - "Providers[apache.livy] Providers[dbt.cloud] " - "Providers[dingding] Providers[discord] Providers[http]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Helm tests, http and all relevant providers, kubernetes tests and " - "docs should run even if unimportant files were added", - ) - ), - ( - pytest.param( - ( - "INTHEWILD.md", - "chart/aaaa.txt", - "tests/providers/airbyte/file.py", - ), - { - "affected-providers-list-as-string": "airbyte http", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "true", - "needs-helm-tests": "true", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "true", - "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[airbyte,http]", - "providers-test-types-list-as-string": "Providers[airbyte,http]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Helm tests, airbyte/http providers, kubernetes tests and " - "docs should run even if unimportant files were added", - ) - ), - ( - pytest.param( - ( - "INTHEWILD.md", - "chart/aaaa.txt", - "tests/system/utils/file.py", - ), - { - 
"affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -499,14 +269,14 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "run-tests": "true", "docs-build": "true", "skip-pre-commits": "check-provider-yaml-valid,identity,mypy-airflow,mypy-dev," - "mypy-docs,mypy-providers,ts-compile-format-lint-www", + "mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", "run-amazon-tests": "false", "run-kubernetes-tests": "true", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always", + "core-test-types-list-as-string": "Always", "providers-test-types-list-as-string": "", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", + "needs-mypy": "false", + "mypy-checks": "[]", }, id="Docs should run even if unimportant files were added and prod image " "should be build for chart changes", @@ -516,7 +286,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): pytest.param( ("generated/provider_dependencies.json",), { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", @@ -530,7 +300,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -543,7 +313,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): pytest.param( ("generated/provider_dependencies.json",), { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", @@ -557,7 +327,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -565,161 +335,11 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): id="Everything should run and upgrading to newer requirements as dependencies change", ) ), - pytest.param( - ("airflow/providers/amazon/__init__.py",), - { - "affected-providers-list-as-string": "amazon apache.hive cncf.kubernetes " - "common.compat common.sql exasol ftp google http imap microsoft.azure " - "mongo mysql openlineage postgres salesforce ssh 
teradata", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "docs-build": "true", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "run-amazon-tests": "true", - "parallel-test-types-list-as-string": "Always Providers[amazon] " - "Providers[apache.hive,cncf.kubernetes,common.compat,common.sql,exasol,ftp,http," - "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh,teradata] Providers[google]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Providers tests run including amazon tests if amazon provider files changed", - ), - pytest.param( - ("tests/providers/airbyte/__init__.py",), - { - "affected-providers-list-as-string": "airbyte http", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "false", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[airbyte,http]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Providers tests run without amazon tests if no amazon file changed", - ), - pytest.param( - ("airflow/providers/amazon/file.py",), - { - "affected-providers-list-as-string": "amazon apache.hive cncf.kubernetes " - "common.compat common.sql exasol ftp google http imap microsoft.azure " - "mongo mysql openlineage postgres salesforce ssh teradata", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "true", - "docs-build": "true", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[amazon] " - "Providers[apache.hive,cncf.kubernetes,common.compat,common.sql,exasol,ftp,http," - "imap,microsoft.azure,mongo,mysql,openlineage,postgres,salesforce,ssh,teradata] Providers[google]", - "needs-mypy": "true", - "mypy-checks": "['mypy-providers']", - }, - id="Providers tests run including amazon tests if amazon provider files changed", - ), - pytest.param( - ( - "tests/always/test_project_structure.py", - "tests/providers/common/io/operators/__init__.py", - "tests/providers/common/io/operators/test_file_transfer.py", - ), - { - "affected-providers-list-as-string": "common.compat common.io openlineage", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - 
"run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "false", - "run-kubernetes-tests": "false", - "skip-pre-commits": "identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Providers[common.compat,common.io,openlineage]", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow', 'mypy-providers']", - }, - id="Only Always and common providers tests should run when only common.io and tests/always changed", - ), - pytest.param( - ("airflow/operators/bash.py",), - { - "affected-providers-list-as-string": None, - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "true", - "run-kubernetes-tests": "false", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Core Operators Serialization", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Force Core and Serialization tests to run when airflow bash.py changed", - ), - pytest.param( - ("tests/operators/bash.py",), - { - "affected-providers-list-as-string": None, - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "python-versions": "['3.8']", - "python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "run-amazon-tests": "false", - "docs-build": "false", - "run-kubernetes-tests": "false", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "Always Core Operators Serialization", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Force Core and Serialization tests to run when tests bash changed", - ), ( pytest.param( ("tests/utils/test_cli_util.py",), { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -733,7 +353,7 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -741,6 +361,60 @@ def assert_outputs_are_printed(expected_outputs: dict[str, str], stderr: str): id="All tests should be run when tests/utils/ change", ) ), + ( + pytest.param( + ("tests_common/__init__.py",), + { + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + 
"python-versions": "['3.8']", + "python-versions-list-as-string": "3.8", + "ci-image-build": "true", + "prod-image-build": "true", + "needs-helm-tests": "true", + "run-tests": "true", + "run-amazon-tests": "true", + "docs-build": "true", + "full-tests-needed": "true", + "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", + "upgrade-to-newer-dependencies": "false", + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, + "testable-core-integrations": "['celery', 'kerberos']", + "testable-providers-integrations": "['cassandra', 'drill', 'kafka', 'mongo', 'pinot', 'qdrant', 'redis', 'trino', 'ydb']", + "needs-mypy": "true", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", + }, + id="All tests should be run when tests_common/ change", + ) + ), + ( + pytest.param( + ("airflow/ui/src/index.tsx",), + { + "selected-providers-list-as-string": None, + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + "python-versions": "['3.8']", + "python-versions-list-as-string": "3.8", + "ci-image-build": "false", + "prod-image-build": "false", + "needs-helm-tests": "false", + "run-tests": "false", + "run-amazon-tests": "false", + "docs-build": "false", + "full-tests-needed": "false", + "skip-pre-commits": "check-provider-yaml-valid,flynt,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", + "upgrade-to-newer-dependencies": "false", + "needs-mypy": "false", + "mypy-checks": "[]", + "run-ui-tests": "true", + "only-new-ui-files": "true", + }, + id="Run only ui tests for PR with new UI only changes.", + ) + ), ], ) def test_expected_output_pull_request_main( @@ -811,6 +485,20 @@ def test_hatch_build_py_changes(): ) +def test_excluded_providers(): + stderr = SelectiveChecks( + files=(), + github_event=GithubEvents.PULL_REQUEST, + default_branch="main", + ) + assert_outputs_are_printed( + { + "excluded-providers-as-string": json.dumps({"3.12": ["apache.beam", "papermill"]}), + }, + str(stderr), + ) + + @pytest.mark.parametrize( "files, expected_outputs", [ @@ -871,26 +559,27 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed", "all versions"), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-versions": "true", "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "mysql-versions": "['8.0', '8.4']", - "postgres-versions": "['12', '13', '14', '15', '16']", + "postgres-versions": "['13', '14', '15', '16', '17']", "python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", - "kubernetes-versions": "['v1.27.13', 'v1.28.9', 'v1.29.4', 'v1.30.0']", - "kubernetes-versions-list-as-string": "v1.27.13 v1.28.9 v1.29.4 v1.30.0", - "kubernetes-combos-list-as-string": "3.8-v1.27.13 3.9-v1.28.9 3.10-v1.29.4 3.11-v1.30.0 3.12-v1.27.13", + "kubernetes-versions": "['v1.28.15', 'v1.29.12', 'v1.30.8', 'v1.31.4', 'v1.32.0']", + "kubernetes-versions-list-as-string": "v1.28.15 v1.29.12 v1.30.8 v1.31.4 v1.32.0", + "kubernetes-combos-list-as-string": "3.8-v1.28.15 3.9-v1.29.12 3.10-v1.30.8 3.11-v1.31.4 3.12-v1.32.0", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", 
"docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -905,26 +594,28 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed", "default versions only"), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "all-versions": "false", "mysql-versions": "['8.0']", - "postgres-versions": "['12']", + "postgres-versions": "['13']", "python-versions": "['3.8']", "python-versions-list-as-string": "3.8", - "kubernetes-versions": "['v1.27.13']", - "kubernetes-versions-list-as-string": "v1.27.13", - "kubernetes-combos-list-as-string": "3.8-v1.27.13", + "kubernetes-versions": "['v1.28.15']", + "kubernetes-versions-list-as-string": "v1.28.15", + "kubernetes-combos-list-as-string": "3.8-v1.28.15", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -939,26 +630,28 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed",), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "all-versions": "false", "mysql-versions": "['8.0']", - "postgres-versions": "['12']", + "postgres-versions": "['13']", "python-versions": "['3.8']", "python-versions-list-as-string": "3.8", - "kubernetes-versions": "['v1.27.13']", - "kubernetes-versions-list-as-string": "v1.27.13", - "kubernetes-combos-list-as-string": "3.8-v1.27.13", + "kubernetes-versions": "['v1.28.15']", + "kubernetes-versions-list-as-string": "v1.28.15", + "kubernetes-combos-list-as-string": "3.8-v1.28.15", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 
'mypy-dev']", @@ -973,27 +666,29 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed", "latest versions only"), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.12']", "all-python-versions-list-as-string": "3.12", "all-versions": "false", "default-python-version": "3.12", "mysql-versions": "['8.4']", - "postgres-versions": "['16']", + "postgres-versions": "['17']", "python-versions": "['3.12']", "python-versions-list-as-string": "3.12", - "kubernetes-versions": "['v1.30.0']", - "kubernetes-versions-list-as-string": "v1.30.0", - "kubernetes-combos-list-as-string": "3.12-v1.30.0", + "kubernetes-versions": "['v1.32.0']", + "kubernetes-versions-list-as-string": "v1.32.0", + "kubernetes-combos-list-as-string": "3.12-v1.32.0", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -1011,24 +706,26 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "all-versions": "false", "python-versions": "['3.8']", "python-versions-list-as-string": "3.8", - "kubernetes-versions": "['v1.27.13']", - "kubernetes-versions-list-as-string": "v1.27.13", - "kubernetes-combos-list-as-string": "3.8-v1.27.13", + "kubernetes-versions": "['v1.28.15']", + "kubernetes-versions-list-as-string": "v1.28.15", + "kubernetes-combos-list-as-string": "3.8-v1.28.15", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", @@ -1043,33 +740,32 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed",), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "all-versions": "false", "python-versions": "['3.8']", "python-versions-list-as-string": "3.8", - "kubernetes-versions": "['v1.27.13']", - "kubernetes-versions-list-as-string": "v1.27.13", - "kubernetes-combos-list-as-string": 
"3.8-v1.27.13", + "kubernetes-versions": "['v1.28.15']", + "kubernetes-versions-list-as-string": "v1.28.15", + "kubernetes-combos-list-as-string": "3.8-v1.28.15", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "false", + "test-groups": "['core', 'providers']", "docs-build": "true", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "full-tests-needed": "true", "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "providers-test-types-list-as-string": ALL_PROVIDERS_SELECTIVE_TEST_TYPES, - "separate-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv " - "CLI Core ExternalPython Operators Other PlainAsserts " - + LIST_OF_ALL_PROVIDER_TESTS - + " PythonVenv Serialization WWW", + "individual-providers-test-types-list-as-string": LIST_OF_ALL_PROVIDER_TESTS, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, - id="Everything should run including full providers when" + id="Everything should run including full providers when " "full tests are needed even if no files are changed", ) ), @@ -1079,7 +775,6 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ ("full tests needed",), "v2-7-stable", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "python-versions": "['3.8']", @@ -1088,18 +783,15 @@ def test_full_test_needed_when_scripts_changes(files: tuple[str, ...], expected_ "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", "docs-build": "true", "docs-list-as-string": "apache-airflow docker-stack", "full-tests-needed": "true", "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,validate-operators-init", - "skip-provider-tests": "true", "upgrade-to-newer-dependencies": "false", - "parallel-test-types-list-as-string": "API Always BranchExternalPython " - "BranchPythonVenv CLI Core ExternalPython Operators Other PlainAsserts " - "PythonVenv Serialization WWW", - "separate-test-types-list-as-string": "API Always BranchExternalPython " - "BranchPythonVenv CLI Core ExternalPython Operators Other PlainAsserts " - "PythonVenv Serialization WWW", + "core-test-types-list-as-string": "API Always CLI Core Operators Other " + "Serialization WWW", "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-docs', 'mypy-dev']", }, @@ -1131,18 +823,19 @@ def test_expected_output_full_tests_needed( pytest.param( ("INTHEWILD.md",), { - "affected-providers-list-as-string": None, + "selected-providers-list-as-string": None, "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "ci-image-build": "false", "needs-helm-tests": "false", "run-tests": "false", + "skip-providers-tests": "true", + "test-groups": "[]", "docs-build": "false", "docs-list-as-string": None, "full-tests-needed": "false", "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": None, + "core-test-types-list-as-string": None, "needs-mypy": "false", "mypy-checks": "[]", }, @@ -1154,231 +847,57 @@ def 
test_expected_output_full_tests_needed( "tests/providers/google/file.py", ), { - "affected-providers-list-as-string": "amazon apache.beam apache.cassandra cncf.kubernetes " - "common.compat common.sql facebook google hashicorp microsoft.azure microsoft.mssql " - "mysql openlineage oracle postgres presto salesforce samba sftp ssh trino", "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "needs-helm-tests": "false", "ci-image-build": "true", "prod-image-build": "true", "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", "docs-build": "true", "docs-list-as-string": "apache-airflow docker-stack", "full-tests-needed": "false", "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": "Always", - "needs-mypy": "false", - "mypy-checks": "[]", - }, - id="No Helm tests, No providers no lint charts, should run if " - "only chart/providers changed in non-main but PROD image should be built", - ), - pytest.param( - ( - "airflow/cli/test.py", - "chart/aaaa.txt", - "tests/providers/google/file.py", - ), - { - "affected-providers-list-as-string": "amazon apache.beam apache.cassandra " - "cncf.kubernetes common.compat common.sql facebook google " - "hashicorp microsoft.azure microsoft.mssql mysql openlineage oracle postgres " - "presto salesforce samba sftp ssh trino", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "true", - "needs-helm-tests": "false", - "run-tests": "true", - "docs-build": "true", - "docs-list-as-string": "apache-airflow docker-stack", - "full-tests-needed": "false", - "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": "Always CLI", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Only CLI tests and Kubernetes tests should run if cli/chart files changed in non-main branch", - ), - pytest.param( - ( - "airflow/file.py", - "tests/providers/google/file.py", - ), - { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "docs-build": "true", - "docs-list-as-string": "apache-airflow docker-stack", - "full-tests-needed": "false", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv " - "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="All tests except Providers and helm lint pre-commit " - "should run if core file changed in non-main branch", - ), - ], -) -def test_expected_output_pull_request_v2_7( - files: tuple[str, ...], - expected_outputs: dict[str, str], -): - stderr = SelectiveChecks( - files=files, - commit_ref=NEUTRAL_COMMIT, - github_event=GithubEvents.PULL_REQUEST, - pr_labels=(), - default_branch="v2-7-stable", - ) - assert_outputs_are_printed(expected_outputs, str(stderr)) - - -@pytest.mark.parametrize( - "files, expected_outputs,", - [ - pytest.param( - ("INTHEWILD.md",), - { - "affected-providers-list-as-string": None, - "all-python-versions": "['3.8']", - 
"all-python-versions-list-as-string": "3.8", - "ci-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "false", - "docs-build": "false", - "docs-list-as-string": None, - "upgrade-to-newer-dependencies": "false", - "skip-pre-commits": "check-provider-yaml-valid,flynt,identity,lint-helm-chart," - "mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": None, - "needs-mypy": "false", - "mypy-checks": "[]", - }, - id="Nothing should run if only non-important files changed", - ), - pytest.param( - ("tests/system/any_file.py",), - { - "affected-providers-list-as-string": None, - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "docs-build": "true", - "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": "Always", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", - }, - id="Only Always and docs build should run if only system tests changed", - ), - pytest.param( - ( - "airflow/cli/test.py", - "chart/aaaa.txt", - "tests/providers/google/file.py", - ), - { - "affected-providers-list-as-string": "amazon apache.beam apache.cassandra " - "cncf.kubernetes common.compat common.sql " - "facebook google hashicorp microsoft.azure microsoft.mssql mysql " - "openlineage oracle postgres presto salesforce samba sftp ssh trino", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "true", - "needs-helm-tests": "true", - "run-tests": "true", - "docs-build": "true", - "docs-list-as-string": "apache-airflow helm-chart amazon apache.beam apache.cassandra " - "cncf.kubernetes common.compat common.sql facebook google hashicorp microsoft.azure " - "microsoft.mssql mysql openlineage oracle postgres " - "presto salesforce samba sftp ssh trino", - "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "true", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "false", - "parallel-test-types-list-as-string": "Always CLI Providers[amazon] " - "Providers[apache.beam,apache.cassandra,cncf.kubernetes,common.compat,common.sql,facebook," - "hashicorp,microsoft.azure,microsoft.mssql,mysql,openlineage,oracle,postgres,presto," - "salesforce,samba,sftp,ssh,trino] Providers[google]", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow', 'mypy-providers']", - }, - id="CLI tests and Google-related provider tests should run if cli/chart files changed but " - "prod image should be build too and k8s tests too", - ), - pytest.param( - ( - "airflow/cli/file.py", - "airflow/operators/file.py", - "airflow/www/file.py", - "airflow/api/file.py", - ), - { - "affected-providers-list-as-string": "fab", - "all-python-versions": "['3.8']", - "all-python-versions-list-as-string": "3.8", - "ci-image-build": "true", - "prod-image-build": "false", - "needs-helm-tests": "false", - "run-tests": "true", - "docs-build": "true", - "docs-list-as-string": "apache-airflow fab", - "skip-pre-commits": 
"check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "false", - "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "false", - "parallel-test-types-list-as-string": "API Always CLI Operators Providers[fab] WWW", - "needs-mypy": "true", - "mypy-checks": "['mypy-airflow']", + "upgrade-to-newer-dependencies": "false", + "core-test-types-list-as-string": "Always", + "needs-mypy": "false", + "mypy-checks": "[]", }, - id="No providers tests except fab should run if only CLI/API/Operators/WWW file changed", + id="No Helm tests, No providers no lint charts, should run if " + "only chart/providers changed in non-main but PROD image should be built", ), pytest.param( - ("airflow/models/test.py",), + ( + "airflow/cli/test.py", + "chart/aaaa.txt", + "tests/providers/google/file.py", + ), { "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", "ci-image-build": "true", - "prod-image-build": "false", + "prod-image-build": "true", "needs-helm-tests": "false", "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", "docs-build": "true", - "docs-list-as-string": "apache-airflow", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", - "run-kubernetes-tests": "false", + "docs-list-as-string": "apache-airflow docker-stack", + "full-tests-needed": "false", + "run-kubernetes-tests": "true", "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS, + "core-test-types-list-as-string": "Always CLI", "needs-mypy": "true", "mypy-checks": "['mypy-airflow']", }, - id="Tests for all airflow core types except providers should run if model file changed", + id="Only CLI tests and Kubernetes tests should run if cli/chart files changed in non-main branch", ), pytest.param( - ("airflow/file.py",), + ( + "airflow/file.py", + "tests/providers/google/file.py", + ), { "all-python-versions": "['3.8']", "all-python-versions-list-as-string": "3.8", @@ -1386,31 +905,32 @@ def test_expected_output_pull_request_v2_7( "prod-image-build": "false", "needs-helm-tests": "false", "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", "docs-build": "true", - "docs-list-as-string": "apache-airflow", - "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-www", + "docs-list-as-string": "apache-airflow docker-stack", + "full-tests-needed": "false", "run-kubernetes-tests": "false", "upgrade-to-newer-dependencies": "false", - "skip-provider-tests": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES_WITHOUT_PROVIDERS, + "core-test-types-list-as-string": "API Always CLI Core Operators Other Serialization WWW", "needs-mypy": "true", "mypy-checks": "['mypy-airflow']", }, - id="Tests for all airflow core types except providers should run if " - "any other than API/WWW/CLI/Operators file changed.", + id="All tests except Providers and helm lint pre-commit " + "should run if core file changed in non-main branch", ), ], ) -def test_expected_output_pull_request_target( +def test_expected_output_pull_request_v2_7( files: tuple[str, ...], expected_outputs: dict[str, str], ): stderr = SelectiveChecks( files=files, commit_ref=NEUTRAL_COMMIT, - 
github_event=GithubEvents.PULL_REQUEST_TARGET, + github_event=GithubEvents.PULL_REQUEST, pr_labels=(), - default_branch="main", + default_branch="v2-7-stable", ) assert_outputs_are_printed(expected_outputs, str(stderr)) @@ -1423,7 +943,7 @@ def test_expected_output_pull_request_target( (), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "ci-image-build": "true", @@ -1434,7 +954,7 @@ def test_expected_output_pull_request_target( "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "upgrade-to-newer-dependencies": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, @@ -1445,7 +965,6 @@ def test_expected_output_pull_request_target( (), "v2-3-stable", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "ci-image-build": "true", @@ -1456,8 +975,7 @@ def test_expected_output_pull_request_target( "skip-pre-commits": "check-airflow-provider-compatibility,check-extra-packages-references,check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers,validate-operators-init", "docs-list-as-string": "apache-airflow docker-stack", "upgrade-to-newer-dependencies": "true", - "parallel-test-types-list-as-string": "API Always BranchExternalPython BranchPythonVenv " - "CLI Core ExternalPython Operators Other PlainAsserts PythonVenv Serialization WWW", + "core-test-types-list-as-string": "API Always CLI Core Operators Other Serialization WWW", "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-docs', 'mypy-dev']", }, @@ -1469,7 +987,7 @@ def test_expected_output_pull_request_target( (), "main", { - "affected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, + "selected-providers-list-as-string": ALL_PROVIDERS_AFFECTED, "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", "ci-image-build": "true", @@ -1480,7 +998,7 @@ def test_expected_output_pull_request_target( "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, "upgrade-to-newer-dependencies": "true", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, @@ -1504,6 +1022,117 @@ def test_expected_output_push( assert_outputs_are_printed(expected_outputs, str(stderr)) +@pytest.mark.parametrize( + "files, expected_outputs,", + [ + pytest.param( + ("INTHEWILD.md",), + { + "selected-providers-list-as-string": None, + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + "ci-image-build": "false", + "needs-helm-tests": "false", + "run-tests": "false", + "skip-providers-tests": "true", + "test-groups": "[]", + "docs-build": "false", + "docs-list-as-string": None, + "upgrade-to-newer-dependencies": "false", + "skip-pre-commits": 
"check-provider-yaml-valid,flynt,identity,lint-helm-chart," + "mypy-airflow,mypy-dev,mypy-docs,mypy-providers,ts-compile-format-lint-ui,ts-compile-format-lint-www", + "core-test-types-list-as-string": None, + "needs-mypy": "false", + "mypy-checks": "[]", + }, + id="Nothing should run if only non-important files changed", + ), + pytest.param( + ("tests/system/any_file.py",), + { + "selected-providers-list-as-string": None, + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + "ci-image-build": "true", + "prod-image-build": "false", + "needs-helm-tests": "false", + "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", + "docs-build": "true", + "docs-list-as-string": ALL_DOCS_SELECTED_FOR_BUILD, + "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers," + "ts-compile-format-lint-ui,ts-compile-format-lint-www", + "upgrade-to-newer-dependencies": "false", + "core-test-types-list-as-string": "Always", + "needs-mypy": "true", + "mypy-checks": "['mypy-airflow']", + }, + id="Only Always and docs build should run if only system tests changed", + ), + pytest.param( + ("airflow/models/test.py",), + { + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + "ci-image-build": "true", + "prod-image-build": "false", + "needs-helm-tests": "false", + "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", + "docs-build": "true", + "docs-list-as-string": "apache-airflow", + "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers," + "ts-compile-format-lint-ui,ts-compile-format-lint-www", + "run-kubernetes-tests": "false", + "upgrade-to-newer-dependencies": "false", + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "needs-mypy": "true", + "mypy-checks": "['mypy-airflow']", + }, + id="Tests for all airflow core types except providers should run if model file changed", + ), + pytest.param( + ("airflow/file.py",), + { + "all-python-versions": "['3.8']", + "all-python-versions-list-as-string": "3.8", + "ci-image-build": "true", + "prod-image-build": "false", + "needs-helm-tests": "false", + "run-tests": "true", + "skip-providers-tests": "true", + "test-groups": "['core']", + "docs-build": "true", + "docs-list-as-string": "apache-airflow", + "skip-pre-commits": "check-provider-yaml-valid,identity,lint-helm-chart,mypy-airflow,mypy-dev,mypy-docs,mypy-providers," + "ts-compile-format-lint-ui,ts-compile-format-lint-www", + "run-kubernetes-tests": "false", + "upgrade-to-newer-dependencies": "false", + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "needs-mypy": "true", + "mypy-checks": "['mypy-airflow']", + }, + id="Tests for all airflow core types except providers should run if " + "any other than API/WWW/CLI/Operators file changed.", + ), + ], +) +def test_expected_output_pull_request_target( + files: tuple[str, ...], + expected_outputs: dict[str, str], +): + stderr = SelectiveChecks( + files=files, + commit_ref=NEUTRAL_COMMIT, + github_event=GithubEvents.PULL_REQUEST_TARGET, + pr_labels=(), + default_branch="main", + ) + assert_outputs_are_printed(expected_outputs, str(stderr)) + + @pytest.mark.parametrize( "github_event", [ @@ -1532,10 +1161,46 @@ def test_no_commit_provided_trigger_full_build_for_any_event_type(github_event): "run-tests": "true", "docs-build": "true", "skip-pre-commits": 
"identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", - "upgrade-to-newer-dependencies": "true" - if github_event in [GithubEvents.PUSH, GithubEvents.SCHEDULE] - else "false", - "parallel-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "upgrade-to-newer-dependencies": ( + "true" if github_event in [GithubEvents.PUSH, GithubEvents.SCHEDULE] else "false" + ), + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, + "needs-mypy": "true", + "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", + }, + str(stderr), + ) + + +@pytest.mark.parametrize( + "github_event", + [ + GithubEvents.PUSH, + GithubEvents.SCHEDULE, + ], +) +def test_files_provided_trigger_full_build_for_any_event_type(github_event): + stderr = SelectiveChecks( + files=("airflow/ui/src/pages/Run/Details.tsx", "airflow/ui/src/router.tsx"), + commit_ref="", + github_event=github_event, + pr_labels=(), + default_branch="main", + ) + assert_outputs_are_printed( + { + "all-python-versions": "['3.8', '3.9', '3.10', '3.11', '3.12']", + "all-python-versions-list-as-string": "3.8 3.9 3.10 3.11 3.12", + "ci-image-build": "true", + "prod-image-build": "true", + "needs-helm-tests": "true", + "run-tests": "true", + "docs-build": "true", + "skip-pre-commits": "identity,mypy-airflow,mypy-dev,mypy-docs,mypy-providers", + "upgrade-to-newer-dependencies": ( + "true" if github_event in [GithubEvents.PUSH, GithubEvents.SCHEDULE] else "false" + ), + "core-test-types-list-as-string": ALL_CI_SELECTIVE_TEST_TYPES, "needs-mypy": "true", "mypy-checks": "['mypy-airflow', 'mypy-providers', 'mypy-docs', 'mypy-dev']", }, @@ -1613,51 +1278,6 @@ def test_upgrade_to_newer_dependencies( @pytest.mark.parametrize( "files, expected_outputs,", [ - pytest.param( - ("docs/apache-airflow-providers-google/docs.rst",), - { - "docs-list-as-string": "amazon apache.beam apache.cassandra " - "cncf.kubernetes common.compat common.sql facebook google hashicorp " - "microsoft.azure microsoft.mssql mysql openlineage oracle " - "postgres presto salesforce samba sftp ssh trino", - }, - id="Google provider docs changed", - ), - pytest.param( - ("airflow/providers/common/sql/common_sql_python.py",), - { - "docs-list-as-string": "apache-airflow amazon apache.drill apache.druid apache.hive " - "apache.impala apache.pinot common.sql databricks elasticsearch " - "exasol google jdbc microsoft.mssql mysql odbc openlineage " - "oracle pgvector postgres presto slack snowflake sqlite teradata trino vertica ydb", - }, - id="Common SQL provider package python files changed", - ), - pytest.param( - ("docs/apache-airflow-providers-airbyte/docs.rst",), - { - "docs-list-as-string": "airbyte http", - }, - id="Airbyte provider docs changed", - ), - pytest.param( - ("docs/apache-airflow-providers-airbyte/docs.rst", "docs/apache-airflow/docs.rst"), - { - "docs-list-as-string": "apache-airflow airbyte http", - }, - id="Airbyte provider and airflow core docs changed", - ), - pytest.param( - ( - "docs/apache-airflow-providers-airbyte/docs.rst", - "docs/apache-airflow/docs.rst", - "docs/apache-airflow-providers/docs.rst", - ), - { - "docs-list-as-string": "apache-airflow apache-airflow-providers airbyte http", - }, - id="Airbyte provider and airflow core and common provider docs changed", - ), pytest.param( ("docs/apache-airflow/docs.rst",), { @@ -1665,11 +1285,6 @@ def test_upgrade_to_newer_dependencies( }, id="Only Airflow docs changed", ), - pytest.param( - ("airflow/providers/celery/file.py",), - {"docs-list-as-string": "apache-airflow celery 
cncf.kubernetes"}, - id="Celery python files changed", - ), pytest.param( ("docs/conf.py",), { @@ -1761,11 +1376,8 @@ def test_helm_tests_trigger_ci_build(files: tuple[str, ...], expected_outputs: d "apache/airflow", (), dict(), - # TODO: revert it when we fix self-hosted runners '["ubuntu-22.04"]', '["ubuntu-22.04"]', - # '["self-hosted", "Linux", "X64"]', - # TODO: revert it when we fix self-hosted runners "false", "false", # "true", @@ -2080,10 +1692,10 @@ def test_has_migrations(files: tuple[str, ...], has_migrations: bool): pytest.param( (), { - "providers-compatibility-checks": json.dumps( + "providers-compatibility-tests-matrix": json.dumps( [ check - for check in BASE_PROVIDERS_COMPATIBILITY_CHECKS + for check in PROVIDERS_COMPATIBILITY_TESTS_MATRIX if check["python-version"] == DEFAULT_PYTHON_MAJOR_MINOR_VERSION ] ), @@ -2092,7 +1704,7 @@ def test_has_migrations(files: tuple[str, ...], has_migrations: bool): ), pytest.param( ("all versions",), - {"providers-compatibility-checks": json.dumps(BASE_PROVIDERS_COMPATIBILITY_CHECKS)}, + {"providers-compatibility-tests-matrix": json.dumps(PROVIDERS_COMPATIBILITY_TESTS_MATRIX)}, id="full tests", ), ], @@ -2229,6 +1841,26 @@ def test_mypy_matches( ("non committer build",), id="Committer regular PR - forcing non-committer build", ), + pytest.param( + ("README.md",), + { + "docker-cache": "disabled", + "disable-airflow-repo-cache": "true", + }, + "potiuk", + ("disable image cache",), + id="Disabled cache", + ), + pytest.param( + ("README.md",), + { + "docker-cache": "registry", + "disable-airflow-repo-cache": "false", + }, + "potiuk", + (), + id="Standard cache", + ), ], ) def test_pr_labels( @@ -2243,3 +1875,117 @@ def test_pr_labels( pr_labels=pr_labels, ) assert_outputs_are_printed(expected_outputs, str(stderr)) + + +@pytest.mark.parametrize( + "files, pr_labels, github_event, expected_label", + [ + pytest.param( + ("airflow/www/package.json",), + (), + GithubEvents.PULL_REQUEST, + LEGACY_UI_LABEL, + id="Legacy UI file without label", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py", "airflow/www/package.json"), + (LEGACY_UI_LABEL,), + GithubEvents.PULL_REQUEST, + LEGACY_API_LABEL, + id="Legacy API and UI files without one of the labels API missing", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py",), + (), + GithubEvents.PULL_REQUEST, + LEGACY_API_LABEL, + id="Legacy API file without label", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py", "airflow/www/package.json"), + (LEGACY_API_LABEL,), + GithubEvents.PULL_REQUEST, + LEGACY_UI_LABEL, + id="Legacy API and UI files without one of the labels UI missing", + ), + ], +) +def test_is_legacy_ui_api_labeled_should_fail( + files: tuple[str, ...], pr_labels: tuple[str, ...], github_event: GithubEvents, expected_label: str +): + try: + stdout = SelectiveChecks( + files=files, + commit_ref=NEUTRAL_COMMIT, + github_event=github_event, + pr_labels=pr_labels, + default_branch="main", + ) + except SystemExit: + assert ( + f"[error]Please ask maintainer to assign the '{expected_label}' label to the PR in order to continue" + in escape_ansi_colors(str(stdout)) + ) + + +@pytest.mark.parametrize( + "files, pr_labels, github_event, expected_label", + [ + pytest.param( + ("airflow/www/package.json",), + (LEGACY_UI_LABEL,), + GithubEvents.PULL_REQUEST, + LEGACY_UI_LABEL, + id="Legacy UI file with label", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py",), + (LEGACY_API_LABEL,), + 
GithubEvents.PULL_REQUEST, + LEGACY_API_LABEL, + id="Legacy API file with label", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py",), + (), + GithubEvents.SCHEDULE, + LEGACY_API_LABEL, + id="Legacy API file in canary schedule", + ), + pytest.param( + ("airflow/www/package.json",), + (LEGACY_UI_LABEL,), + GithubEvents.SCHEDULE, + LEGACY_API_LABEL, + id="Legacy UI file in canary schedule", + ), + pytest.param( + ("airflow/api_connexion/endpoints/health_endpoint.py",), + (LEGACY_API_LABEL,), + GithubEvents.PUSH, + LEGACY_API_LABEL, + id="Legacy API file in canary push", + ), + pytest.param( + ("airflow/www/package.json",), + (LEGACY_UI_LABEL,), + GithubEvents.PUSH, + LEGACY_UI_LABEL, + id="Legacy UI file in canary push", + ), + ], +) +def test_is_legacy_ui_api_labeled_should_not_fail( + files: tuple[str, ...], pr_labels: tuple[str, ...], github_event: GithubEvents, expected_label: str +): + stdout = SelectiveChecks( + files=files, + commit_ref=NEUTRAL_COMMIT, + github_event=github_event, + pr_labels=pr_labels, + default_branch="main", + ) + assert ( + f"[error]Please ask maintainer to assign the '{expected_label}' label to the PR in order to continue" + not in escape_ansi_colors(str(stdout)) + ) diff --git a/dev/breeze/tests/test_shell_params.py b/dev/breeze/tests/test_shell_params.py index 987884f4c8425..c7ea5109c1c25 100644 --- a/dev/breeze/tests/test_shell_params.py +++ b/dev/breeze/tests/test_shell_params.py @@ -37,7 +37,6 @@ { "DEFAULT_BRANCH": AIRFLOW_BRANCH, "AIRFLOW_CI_IMAGE": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.12", - "AIRFLOW_CI_IMAGE_WITH_TAG": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.12", "PYTHON_MAJOR_MINOR_VERSION": "3.12", }, id="python3.12", @@ -47,21 +46,10 @@ {"python": 3.9}, { "AIRFLOW_CI_IMAGE": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.9", - "AIRFLOW_CI_IMAGE_WITH_TAG": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.9", "PYTHON_MAJOR_MINOR_VERSION": "3.9", }, id="python3.9", ), - pytest.param( - {}, - {"python": 3.9, "image_tag": "a_tag"}, - { - "AIRFLOW_CI_IMAGE": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.9", - "AIRFLOW_CI_IMAGE_WITH_TAG": f"ghcr.io/apache/airflow/{AIRFLOW_BRANCH}/ci/python3.9:a_tag", - "PYTHON_MAJOR_MINOR_VERSION": "3.9", - }, - id="With tag", - ), pytest.param( {}, {"airflow_branch": "v2-7-test"}, @@ -162,14 +150,6 @@ }, id="Unless it's overridden by environment variable", ), - pytest.param( - {}, - {}, - { - "ENABLED_SYSTEMS": "", - }, - id="ENABLED_SYSTEMS empty by default even if they are None in ShellParams", - ), pytest.param( {}, {}, diff --git a/dev/breeze/uv.lock b/dev/breeze/uv.lock index a5a252063646c..c3c0d29e3e0c6 100644 --- a/dev/breeze/uv.lock +++ b/dev/breeze/uv.lock @@ -1,9 +1,13 @@ version = 1 -requires-python = ">=3.8, <4" +requires-python = ">=3.9, <4" +resolution-markers = [ + "python_full_version < '3.13'", + "python_full_version >= '3.13'", +] [[package]] name = "anyio" -version = "4.5.2" +version = "4.6.2.post1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, @@ -11,9 +15,9 @@ dependencies = [ { name = "sniffio" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/4d/f9/9a7ce600ebe7804daf90d4d48b1c0510a4561ddce43a596be46676f82343/anyio-4.5.2.tar.gz", hash = "sha256:23009af4ed04ce05991845451e11ef02fc7c5ed29179ac9a420e5ad0ac7ddc5b", size = 171293 } +sdist = { url = 
"https://files.pythonhosted.org/packages/9f/09/45b9b7a6d4e45c6bcb5bf61d19e3ab87df68e0601fa8c5293de3542546cc/anyio-4.6.2.post1.tar.gz", hash = "sha256:4c8bc31ccdb51c7f7bd251f51c609e038d63e34219b44aa86e47576389880b4c", size = 173422 } wheels = [ - { url = "https://files.pythonhosted.org/packages/1b/b4/f7e396030e3b11394436358ca258a81d6010106582422f23443c16ca1873/anyio-4.5.2-py3-none-any.whl", hash = "sha256:c011ee36bc1e8ba40e5a81cb9df91925c218fe9b778554e0b56a21e1b5d4716f", size = 89766 }, + { url = "https://files.pythonhosted.org/packages/e4/f5/f2b75d2fc6f1a260f340f0e7c6a060f4dd2961cc16884ed851b0d18da06a/anyio-4.6.2.post1-py3-none-any.whl", hash = "sha256:6d170c36fba3bdd840c73d3868c1e777e33676a69c3a72cf0a0d5d6d8009b61d", size = 90377 }, ] [[package]] @@ -27,13 +31,14 @@ dependencies = [ { name = "flit" }, { name = "flit-core" }, { name = "gitpython" }, + { name = "google-api-python-client" }, + { name = "google-auth-httplib2" }, + { name = "google-auth-oauthlib" }, { name = "hatch" }, - { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "inputimeout" }, { name = "jinja2" }, { name = "jsonschema" }, { name = "packaging" }, - { name = "pipx" }, { name = "pre-commit" }, { name = "psutil" }, { name = "pygithub" }, @@ -46,25 +51,28 @@ dependencies = [ { name = "semver" }, { name = "tabulate" }, { name = "tomli", marker = "python_full_version < '3.11'" }, + { name = "tqdm" }, { name = "twine" }, ] [package.metadata] requires-dist = [ { name = "black", specifier = ">=23.11.0" }, - { name = "click", specifier = ">=8.1.7" }, + { name = "click", specifier = ">=8.1.8" }, { name = "filelock", specifier = ">=3.13.0" }, { name = "flit", specifier = "==3.10.1" }, { name = "flit-core", specifier = "==3.10.1" }, { name = "gitpython", specifier = ">=3.1.40" }, - { name = "hatch", specifier = "==1.9.4" }, - { name = "importlib-resources", marker = "python_full_version < '3.9'", specifier = ">=5.2,!=6.2.0,!=6.3.0,!=6.3.1" }, + { name = "google-api-python-client", specifier = ">=2.142.0" }, + { name = "google-auth-httplib2", specifier = ">=0.2.0" }, + { name = "google-auth-oauthlib", specifier = ">=1.2.0" }, + { name = "hatch", specifier = "==1.14.0" }, { name = "inputimeout", specifier = ">=1.0.4" }, { name = "jinja2", specifier = ">=3.1.0" }, { name = "jsonschema", specifier = ">=4.19.1" }, { name = "packaging", specifier = ">=23.2" }, - { name = "pipx", specifier = ">=1.4.1" }, { name = "pre-commit", specifier = ">=3.5.0" }, + { name = "pre-commit-uv", specifier = ">=4.1.3" }, { name = "psutil", specifier = ">=5.9.6" }, { name = "pygithub", specifier = ">=2.1.1" }, { name = "pytest", specifier = ">=8.2,<9" }, @@ -76,18 +84,10 @@ requires-dist = [ { name = "semver", specifier = ">=3.0.2" }, { name = "tabulate", specifier = ">=0.9.0" }, { name = "tomli", marker = "python_full_version < '3.11'", specifier = ">=2.0.1" }, + { name = "tqdm", specifier = ">=4.67.1" }, { name = "twine", specifier = ">=4.0.2" }, ] -[[package]] -name = "argcomplete" -version = "3.5.1" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/5f/39/27605e133e7f4bb0c8e48c9a6b87101515e3446003e0442761f6a02ac35e/argcomplete-3.5.1.tar.gz", hash = "sha256:eb1ee355aa2557bd3d0145de7b06b2a45b0ce461e1e7813f5d066039ab4177b4", size = 82280 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/f7/be/a606a6701d491cfae75583c80a6583f8abe9c36c0b9666e867e7cdd62fe8/argcomplete-3.5.1-py3-none-any.whl", hash = 
"sha256:1a1d148bdaa3e3b93454900163403df41448a248af01b6e849edc5ac08e6c363", size = 43498 }, -] - [[package]] name = "attrs" version = "24.2.0" @@ -108,7 +108,7 @@ wheels = [ [[package]] name = "black" -version = "24.8.0" +version = "24.10.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, @@ -119,29 +119,38 @@ dependencies = [ { name = "tomli", marker = "python_full_version < '3.11'" }, { name = "typing-extensions", marker = "python_full_version < '3.11'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/04/b0/46fb0d4e00372f4a86a6f8efa3cb193c9f64863615e39010b1477e010578/black-24.8.0.tar.gz", hash = "sha256:2500945420b6784c38b9ee885af039f5e7471ef284ab03fa35ecdde4688cd83f", size = 644810 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/47/6e/74e29edf1fba3887ed7066930a87f698ffdcd52c5dbc263eabb06061672d/black-24.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:09cdeb74d494ec023ded657f7092ba518e8cf78fa8386155e4a03fdcc44679e6", size = 1632092 }, - { url = "https://files.pythonhosted.org/packages/ab/49/575cb6c3faee690b05c9d11ee2e8dba8fbd6d6c134496e644c1feb1b47da/black-24.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:81c6742da39f33b08e791da38410f32e27d632260e599df7245cccee2064afeb", size = 1457529 }, - { url = "https://files.pythonhosted.org/packages/7a/b4/d34099e95c437b53d01c4aa37cf93944b233066eb034ccf7897fa4e5f286/black-24.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:707a1ca89221bc8a1a64fb5e15ef39cd755633daa672a9db7498d1c19de66a42", size = 1757443 }, - { url = "https://files.pythonhosted.org/packages/87/a0/6d2e4175ef364b8c4b64f8441ba041ed65c63ea1db2720d61494ac711c15/black-24.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:d6417535d99c37cee4091a2f24eb2b6d5ec42b144d50f1f2e436d9fe1916fe1a", size = 1418012 }, - { url = "https://files.pythonhosted.org/packages/08/a6/0a3aa89de9c283556146dc6dbda20cd63a9c94160a6fbdebaf0918e4a3e1/black-24.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fb6e2c0b86bbd43dee042e48059c9ad7830abd5c94b0bc518c0eeec57c3eddc1", size = 1615080 }, - { url = "https://files.pythonhosted.org/packages/db/94/b803d810e14588bb297e565821a947c108390a079e21dbdcb9ab6956cd7a/black-24.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:837fd281f1908d0076844bc2b801ad2d369c78c45cf800cad7b61686051041af", size = 1438143 }, - { url = "https://files.pythonhosted.org/packages/a5/b5/f485e1bbe31f768e2e5210f52ea3f432256201289fd1a3c0afda693776b0/black-24.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62e8730977f0b77998029da7971fa896ceefa2c4c4933fcd593fa599ecbf97a4", size = 1738774 }, - { url = "https://files.pythonhosted.org/packages/a8/69/a000fc3736f89d1bdc7f4a879f8aaf516fb03613bb51a0154070383d95d9/black-24.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:72901b4913cbac8972ad911dc4098d5753704d1f3c56e44ae8dce99eecb0e3af", size = 1427503 }, - { url = "https://files.pythonhosted.org/packages/a2/a8/05fb14195cfef32b7c8d4585a44b7499c2a4b205e1662c427b941ed87054/black-24.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:7c046c1d1eeb7aea9335da62472481d3bbf3fd986e093cffd35f4385c94ae368", size = 1646132 }, - { url = "https://files.pythonhosted.org/packages/41/77/8d9ce42673e5cb9988f6df73c1c5c1d4e9e788053cccd7f5fb14ef100982/black-24.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:649f6d84ccbae73ab767e206772cc2d7a393a001070a4c814a546afd0d423aed", size = 1448665 }, - { url = 
"https://files.pythonhosted.org/packages/cc/94/eff1ddad2ce1d3cc26c162b3693043c6b6b575f538f602f26fe846dfdc75/black-24.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2b59b250fdba5f9a9cd9d0ece6e6d993d91ce877d121d161e4698af3eb9c1018", size = 1762458 }, - { url = "https://files.pythonhosted.org/packages/28/ea/18b8d86a9ca19a6942e4e16759b2fa5fc02bbc0eb33c1b866fcd387640ab/black-24.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:6e55d30d44bed36593c3163b9bc63bf58b3b30e4611e4d88a0c3c239930ed5b2", size = 1436109 }, - { url = "https://files.pythonhosted.org/packages/9f/d4/ae03761ddecc1a37d7e743b89cccbcf3317479ff4b88cfd8818079f890d0/black-24.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:505289f17ceda596658ae81b61ebbe2d9b25aa78067035184ed0a9d855d18afd", size = 1617322 }, - { url = "https://files.pythonhosted.org/packages/14/4b/4dfe67eed7f9b1ddca2ec8e4418ea74f0d1dc84d36ea874d618ffa1af7d4/black-24.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:b19c9ad992c7883ad84c9b22aaa73562a16b819c1d8db7a1a1a49fb7ec13c7d2", size = 1442108 }, - { url = "https://files.pythonhosted.org/packages/97/14/95b3f91f857034686cae0e73006b8391d76a8142d339b42970eaaf0416ea/black-24.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1f13f7f386f86f8121d76599114bb8c17b69d962137fc70efe56137727c7047e", size = 1745786 }, - { url = "https://files.pythonhosted.org/packages/95/54/68b8883c8aa258a6dde958cd5bdfada8382bec47c5162f4a01e66d839af1/black-24.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:f490dbd59680d809ca31efdae20e634f3fae27fba3ce0ba3208333b713bc3920", size = 1426754 }, - { url = "https://files.pythonhosted.org/packages/13/b2/b3f24fdbb46f0e7ef6238e131f13572ee8279b70f237f221dd168a9dba1a/black-24.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:eab4dd44ce80dea27dc69db40dab62d4ca96112f87996bca68cd75639aeb2e4c", size = 1631706 }, - { url = "https://files.pythonhosted.org/packages/d9/35/31010981e4a05202a84a3116423970fd1a59d2eda4ac0b3570fbb7029ddc/black-24.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3c4285573d4897a7610054af5a890bde7c65cb466040c5f0c8b732812d7f0e5e", size = 1457429 }, - { url = "https://files.pythonhosted.org/packages/27/25/3f706b4f044dd569a20a4835c3b733dedea38d83d2ee0beb8178a6d44945/black-24.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e84e33b37be070ba135176c123ae52a51f82306def9f7d063ee302ecab2cf47", size = 1756488 }, - { url = "https://files.pythonhosted.org/packages/63/72/79375cd8277cbf1c5670914e6bd4c1b15dea2c8f8e906dc21c448d0535f0/black-24.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:73bbf84ed136e45d451a260c6b73ed674652f90a2b3211d6a35e78054563a9bb", size = 1417721 }, - { url = "https://files.pythonhosted.org/packages/27/1e/83fa8a787180e1632c3d831f7e58994d7aaf23a0961320d21e84f922f919/black-24.8.0-py3-none-any.whl", hash = "sha256:972085c618ee94f402da1af548a4f218c754ea7e5dc70acb168bfaca4c2542ed", size = 206504 }, +sdist = { url = "https://files.pythonhosted.org/packages/d8/0d/cc2fb42b8c50d80143221515dd7e4766995bd07c56c9a3ed30baf080b6dc/black-24.10.0.tar.gz", hash = "sha256:846ea64c97afe3bc677b761787993be4991810ecc7a4a937816dd6bddedc4875", size = 645813 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a3/f3/465c0eb5cddf7dbbfe1fecd9b875d1dcf51b88923cd2c1d7e9ab95c6336b/black-24.10.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e6668650ea4b685440857138e5fe40cde4d652633b1bdffc62933d0db4ed9812", size = 1623211 }, + { url = 
"https://files.pythonhosted.org/packages/df/57/b6d2da7d200773fdfcc224ffb87052cf283cec4d7102fab450b4a05996d8/black-24.10.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1c536fcf674217e87b8cc3657b81809d3c085d7bf3ef262ead700da345bfa6ea", size = 1457139 }, + { url = "https://files.pythonhosted.org/packages/6e/c5/9023b7673904a5188f9be81f5e129fff69f51f5515655fbd1d5a4e80a47b/black-24.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:649fff99a20bd06c6f727d2a27f401331dc0cc861fb69cde910fe95b01b5928f", size = 1753774 }, + { url = "https://files.pythonhosted.org/packages/e1/32/df7f18bd0e724e0d9748829765455d6643ec847b3f87e77456fc99d0edab/black-24.10.0-cp310-cp310-win_amd64.whl", hash = "sha256:fe4d6476887de70546212c99ac9bd803d90b42fc4767f058a0baa895013fbb3e", size = 1414209 }, + { url = "https://files.pythonhosted.org/packages/c2/cc/7496bb63a9b06a954d3d0ac9fe7a73f3bf1cd92d7a58877c27f4ad1e9d41/black-24.10.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5a2221696a8224e335c28816a9d331a6c2ae15a2ee34ec857dcf3e45dbfa99ad", size = 1607468 }, + { url = "https://files.pythonhosted.org/packages/2b/e3/69a738fb5ba18b5422f50b4f143544c664d7da40f09c13969b2fd52900e0/black-24.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f9da3333530dbcecc1be13e69c250ed8dfa67f43c4005fb537bb426e19200d50", size = 1437270 }, + { url = "https://files.pythonhosted.org/packages/c9/9b/2db8045b45844665c720dcfe292fdaf2e49825810c0103e1191515fc101a/black-24.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4007b1393d902b48b36958a216c20c4482f601569d19ed1df294a496eb366392", size = 1737061 }, + { url = "https://files.pythonhosted.org/packages/a3/95/17d4a09a5be5f8c65aa4a361444d95edc45def0de887810f508d3f65db7a/black-24.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:394d4ddc64782e51153eadcaaca95144ac4c35e27ef9b0a42e121ae7e57a9175", size = 1423293 }, + { url = "https://files.pythonhosted.org/packages/90/04/bf74c71f592bcd761610bbf67e23e6a3cff824780761f536512437f1e655/black-24.10.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b5e39e0fae001df40f95bd8cc36b9165c5e2ea88900167bddf258bacef9bbdc3", size = 1644256 }, + { url = "https://files.pythonhosted.org/packages/4c/ea/a77bab4cf1887f4b2e0bce5516ea0b3ff7d04ba96af21d65024629afedb6/black-24.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:d37d422772111794b26757c5b55a3eade028aa3fde43121ab7b673d050949d65", size = 1448534 }, + { url = "https://files.pythonhosted.org/packages/4e/3e/443ef8bc1fbda78e61f79157f303893f3fddf19ca3c8989b163eb3469a12/black-24.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:14b3502784f09ce2443830e3133dacf2c0110d45191ed470ecb04d0f5f6fcb0f", size = 1761892 }, + { url = "https://files.pythonhosted.org/packages/52/93/eac95ff229049a6901bc84fec6908a5124b8a0b7c26ea766b3b8a5debd22/black-24.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:30d2c30dc5139211dda799758559d1b049f7f14c580c409d6ad925b74a4208a8", size = 1434796 }, + { url = "https://files.pythonhosted.org/packages/d0/a0/a993f58d4ecfba035e61fca4e9f64a2ecae838fc9f33ab798c62173ed75c/black-24.10.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:1cbacacb19e922a1d75ef2b6ccaefcd6e93a2c05ede32f06a21386a04cedb981", size = 1643986 }, + { url = "https://files.pythonhosted.org/packages/37/d5/602d0ef5dfcace3fb4f79c436762f130abd9ee8d950fa2abdbf8bbc555e0/black-24.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = 
"sha256:1f93102e0c5bb3907451063e08b9876dbeac810e7da5a8bfb7aeb5a9ef89066b", size = 1448085 }, + { url = "https://files.pythonhosted.org/packages/47/6d/a3a239e938960df1a662b93d6230d4f3e9b4a22982d060fc38c42f45a56b/black-24.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ddacb691cdcdf77b96f549cf9591701d8db36b2f19519373d60d31746068dbf2", size = 1760928 }, + { url = "https://files.pythonhosted.org/packages/dd/cf/af018e13b0eddfb434df4d9cd1b2b7892bab119f7a20123e93f6910982e8/black-24.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:680359d932801c76d2e9c9068d05c6b107f2584b2a5b88831c83962eb9984c1b", size = 1436875 }, + { url = "https://files.pythonhosted.org/packages/fe/02/f408c804e0ee78c367dcea0a01aedde4f1712af93b8b6e60df981e0228c7/black-24.10.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:17374989640fbca88b6a448129cd1745c5eb8d9547b464f281b251dd00155ccd", size = 1622516 }, + { url = "https://files.pythonhosted.org/packages/f8/b9/9b706ed2f55bfb28b436225a9c57da35990c9005b90b8c91f03924454ad7/black-24.10.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:63f626344343083322233f175aaf372d326de8436f5928c042639a4afbbf1d3f", size = 1456181 }, + { url = "https://files.pythonhosted.org/packages/0a/1c/314d7f17434a5375682ad097f6f4cc0e3f414f3c95a9b1bb4df14a0f11f9/black-24.10.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfa1d0cb6200857f1923b602f978386a3a2758a65b52e0950299ea014be6800", size = 1752801 }, + { url = "https://files.pythonhosted.org/packages/39/a7/20e5cd9237d28ad0b31438de5d9f01c8b99814576f4c0cda1edd62caf4b0/black-24.10.0-cp39-cp39-win_amd64.whl", hash = "sha256:2cd9c95431d94adc56600710f8813ee27eea544dd118d45896bb734e9d7a0dc7", size = 1413626 }, + { url = "https://files.pythonhosted.org/packages/8d/a7/4b27c50537ebca8bec139b872861f9d2bf501c5ec51fcf897cb924d9e264/black-24.10.0-py3-none-any.whl", hash = "sha256:3bb2b7a1f7b685f85b11fed1ef10f8a9148bceb49853e47a294a3dd963c1dd7d", size = 206898 }, +] + +[[package]] +name = "cachetools" +version = "5.5.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/c3/38/a0f315319737ecf45b4319a8cd1f3a908e29d9277b46942263292115eee7/cachetools-5.5.0.tar.gz", hash = "sha256:2cc24fb4cbe39633fb7badd9db9ca6295d766d9c2995f245725a46715d050f2a", size = 27661 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a4/07/14f8ad37f2d12a5ce41206c21820d8cb6561b728e51fad4530dff0552a67/cachetools-5.5.0-py3-none-any.whl", hash = "sha256:02134e8439cdc2ffb62023ce1debca2944c3f289d66bb17ead3ab3dede74b292", size = 9524 }, ] [[package]] @@ -208,14 +217,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/f1/47/d7145bf2dc04684935d57d67dff9d6d795b2ba2796806bb109864be3a151/cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9", size = 488469 }, { url = "https://files.pythonhosted.org/packages/bf/ee/f94057fa6426481d663b88637a9a10e859e492c73d0384514a17d78ee205/cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d", size = 172475 }, { url = "https://files.pythonhosted.org/packages/7c/fc/6a8cb64e5f0324877d503c854da15d76c1e50eb722e320b15345c4d0c6de/cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a", size = 182009 }, - { url = 
"https://files.pythonhosted.org/packages/48/08/15bf6b43ae9bd06f6b00ad8a91f5a8fe1069d4c9fab550a866755402724e/cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b", size = 182457 }, - { url = "https://files.pythonhosted.org/packages/c2/5b/f1523dd545f92f7df468e5f653ffa4df30ac222f3c884e51e139878f1cb5/cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964", size = 425932 }, - { url = "https://files.pythonhosted.org/packages/53/93/7e547ab4105969cc8c93b38a667b82a835dd2cc78f3a7dad6130cfd41e1d/cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9", size = 448585 }, - { url = "https://files.pythonhosted.org/packages/56/c4/a308f2c332006206bb511de219efeff090e9d63529ba0a77aae72e82248b/cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc", size = 456268 }, - { url = "https://files.pythonhosted.org/packages/ca/5b/b63681518265f2f4060d2b60755c1c77ec89e5e045fc3773b72735ddaad5/cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c", size = 436592 }, - { url = "https://files.pythonhosted.org/packages/bb/19/b51af9f4a4faa4a8ac5a0e5d5c2522dcd9703d07fac69da34a36c4d960d3/cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1", size = 446512 }, - { url = "https://files.pythonhosted.org/packages/e2/63/2bed8323890cb613bbecda807688a31ed11a7fe7afe31f8faaae0206a9a3/cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8", size = 171576 }, - { url = "https://files.pythonhosted.org/packages/2f/70/80c33b044ebc79527447fd4fbc5455d514c3bb840dede4455de97da39b4d/cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1", size = 181229 }, { url = "https://files.pythonhosted.org/packages/b9/ea/8bb50596b8ffbc49ddd7a1ad305035daa770202a6b782fc164647c2673ad/cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16", size = 182220 }, { url = "https://files.pythonhosted.org/packages/ae/11/e77c8cd24f58285a82c23af484cf5b124a376b32644e445960d1a4654c3a/cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36", size = 178605 }, { url = "https://files.pythonhosted.org/packages/ed/65/25a8dc32c53bf5b7b6c2686b42ae2ad58743f7ff644844af7cdb29b49361/cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8", size = 424910 }, @@ -305,21 +306,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d8/90/6af4cd042066a4adad58ae25648a12c09c879efa4849c705719ba1b23d8c/charset_normalizer-3.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ffc519621dce0c767e96b9c53f09c5d215578e10b02c285809f76509a3931482", size = 144970 }, { url = 
"https://files.pythonhosted.org/packages/cc/67/e5e7e0cbfefc4ca79025238b43cdf8a2037854195b37d6417f3d0895c4c2/charset_normalizer-3.4.0-cp313-cp313-win32.whl", hash = "sha256:f19c1585933c82098c2a520f8ec1227f20e339e33aca8fa6f956f6691b784e67", size = 94973 }, { url = "https://files.pythonhosted.org/packages/65/97/fc9bbc54ee13d33dc54a7fcf17b26368b18505500fc01e228c27b5222d80/charset_normalizer-3.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:707b82d19e65c9bd28b81dde95249b07bf9f5b90ebe1ef17d9b57473f8a64b7b", size = 102308 }, - { url = "https://files.pythonhosted.org/packages/86/f4/ccab93e631e7293cca82f9f7ba39783c967f823a0000df2d8dd743cad74f/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:af73657b7a68211996527dbfeffbb0864e043d270580c5aef06dc4b659a4b578", size = 193961 }, - { url = "https://files.pythonhosted.org/packages/94/d4/2b21cb277bac9605026d2d91a4a8872bc82199ed11072d035dc674c27223/charset_normalizer-3.4.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:cab5d0b79d987c67f3b9e9c53f54a61360422a5a0bc075f43cab5621d530c3b6", size = 124507 }, - { url = "https://files.pythonhosted.org/packages/9a/e0/a7c1fcdff20d9c667342e0391cfeb33ab01468d7d276b2c7914b371667cc/charset_normalizer-3.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9289fd5dddcf57bab41d044f1756550f9e7cf0c8e373b8cdf0ce8773dc4bd417", size = 119298 }, - { url = "https://files.pythonhosted.org/packages/70/de/1538bb2f84ac9940f7fa39945a5dd1d22b295a89c98240b262fc4b9fcfe0/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6b493a043635eb376e50eedf7818f2f322eabbaa974e948bd8bdd29eb7ef2a51", size = 139328 }, - { url = "https://files.pythonhosted.org/packages/e9/ca/288bb1a6bc2b74fb3990bdc515012b47c4bc5925c8304fc915d03f94b027/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9fa2566ca27d67c86569e8c85297aaf413ffab85a8960500f12ea34ff98e4c41", size = 149368 }, - { url = "https://files.pythonhosted.org/packages/aa/75/58374fdaaf8406f373e508dab3486a31091f760f99f832d3951ee93313e8/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a8e538f46104c815be19c975572d74afb53f29650ea2025bbfaef359d2de2f7f", size = 141944 }, - { url = "https://files.pythonhosted.org/packages/32/c8/0bc558f7260db6ffca991ed7166494a7da4fda5983ee0b0bfc8ed2ac6ff9/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6fd30dc99682dc2c603c2b315bded2799019cea829f8bf57dc6b61efde6611c8", size = 143326 }, - { url = "https://files.pythonhosted.org/packages/0e/dd/7f6fec09a1686446cee713f38cf7d5e0669e0bcc8288c8e2924e998cf87d/charset_normalizer-3.4.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2006769bd1640bdf4d5641c69a3d63b71b81445473cac5ded39740a226fa88ab", size = 146171 }, - { url = "https://files.pythonhosted.org/packages/4c/a8/440f1926d6d8740c34d3ca388fbd718191ec97d3d457a0677eb3aa718fce/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:dc15e99b2d8a656f8e666854404f1ba54765871104e50c8e9813af8a7db07f12", size = 139711 }, - { url = "https://files.pythonhosted.org/packages/e9/7f/4b71e350a3377ddd70b980bea1e2cc0983faf45ba43032b24b2578c14314/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:ab2e5bef076f5a235c3774b4f4028a680432cded7cad37bba0fd90d64b187d19", size = 148348 }, - { url = 
"https://files.pythonhosted.org/packages/1e/70/17b1b9202531a33ed7ef41885f0d2575ae42a1e330c67fddda5d99ad1208/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:4ec9dd88a5b71abfc74e9df5ebe7921c35cbb3b641181a531ca65cdb5e8e4dea", size = 151290 }, - { url = "https://files.pythonhosted.org/packages/44/30/574b5b5933d77ecb015550aafe1c7d14a8cd41e7e6c4dcea5ae9e8d496c3/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:43193c5cda5d612f247172016c4bb71251c784d7a4d9314677186a838ad34858", size = 149114 }, - { url = "https://files.pythonhosted.org/packages/0b/11/ca7786f7e13708687443082af20d8341c02e01024275a28bc75032c5ce5d/charset_normalizer-3.4.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:aa693779a8b50cd97570e5a0f343538a8dbd3e496fa5dcb87e29406ad0299654", size = 143856 }, - { url = "https://files.pythonhosted.org/packages/f9/c2/1727c1438256c71ed32753b23ec2e6fe7b6dff66a598f6566cfe8139305e/charset_normalizer-3.4.0-cp38-cp38-win32.whl", hash = "sha256:7706f5850360ac01d80c89bcef1640683cc12ed87f42579dab6c5d3ed6888613", size = 94333 }, - { url = "https://files.pythonhosted.org/packages/09/c8/0e17270496a05839f8b500c1166e3261d1226e39b698a735805ec206967b/charset_normalizer-3.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:c3e446d253bd88f6377260d07c895816ebf33ffffd56c1c792b13bff9c3e1ade", size = 101454 }, { url = "https://files.pythonhosted.org/packages/54/2f/28659eee7f5d003e0f5a3b572765bf76d6e0fe6601ab1f1b1dd4cba7e4f1/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:980b4f289d1d90ca5efcf07958d3eb38ed9c0b7676bf2831a54d4f66f9c27dfa", size = 196326 }, { url = "https://files.pythonhosted.org/packages/d1/18/92869d5c0057baa973a3ee2af71573be7b084b3c3d428fe6463ce71167f8/charset_normalizer-3.4.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:f28f891ccd15c514a0981f3b9db9aa23d62fe1a99997512b0491d2ed323d229a", size = 125614 }, { url = "https://files.pythonhosted.org/packages/d6/27/327904c5a54a7796bb9f36810ec4173d2df5d88b401d2b95ef53111d214e/charset_normalizer-3.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8aacce6e2e1edcb6ac625fb0f8c3a9570ccc7bfba1f63419b3769ccf6a00ed0", size = 120450 }, @@ -340,14 +326,14 @@ wheels = [ [[package]] name = "click" -version = "8.1.7" +version = "8.1.8" source = { registry = "https://pypi.org/simple" } dependencies = [ - { name = "colorama", marker = "platform_system == 'Windows'" }, + { name = "colorama", marker = "sys_platform == 'win32'" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/96/d3/f04c7bfcf5c1862a2a5b845c6b2b360488cf47af55dfa79c98f6a6bf98b5/click-8.1.7.tar.gz", hash = "sha256:ca9853ad459e787e2192211578cc907e7594e294c7ccc834310722b41b9ca6de", size = 336121 } +sdist = { url = "https://files.pythonhosted.org/packages/b9/2e/0090cbf739cee7d23781ad4b89a9894a41538e4fcf4c31dcdd705b78eb8b/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a", size = 226593 } wheels = [ - { url = "https://files.pythonhosted.org/packages/00/2e/d53fa4befbf2cfa713304affc7ca780ce4fc1fd8710527771b58311a3229/click-8.1.7-py3-none-any.whl", hash = "sha256:ae74fb96c20a0277a1d615f1e4d73c8414f5a98db8b799a7931d1582f3390c28", size = 97941 }, + { url = "https://files.pythonhosted.org/packages/7e/d4/7ebdbd03970677812aac39c869717059dbb71a4cfc033ca6e5221787892c/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2", size = 98188 }, ] [[package]] @@ -419,20 +405,11 @@ wheels = [ [[package]] name = "docutils" 
-version = "0.20.1" +version = "0.21.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/1f/53/a5da4f2c5739cf66290fac1431ee52aff6851c7c8ffd8264f13affd7bcdd/docutils-0.20.1.tar.gz", hash = "sha256:f08a4e276c3a1583a86dce3e34aba3fe04d02bba2dd51ed16106244e8a923e3b", size = 2058365 } +sdist = { url = "https://files.pythonhosted.org/packages/ae/ed/aefcc8cd0ba62a0560c3c18c33925362d46c6075480bfa4df87b28e169a9/docutils-0.21.2.tar.gz", hash = "sha256:3a6b18732edf182daa3cd12775bbb338cf5691468f91eeeb109deff6ebfa986f", size = 2204444 } wheels = [ - { url = "https://files.pythonhosted.org/packages/26/87/f238c0670b94533ac0353a4e2a1a771a0cc73277b88bff23d3ae35a256c1/docutils-0.20.1-py3-none-any.whl", hash = "sha256:96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6", size = 572666 }, -] - -[[package]] -name = "editables" -version = "0.5" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/37/4a/986d35164e2033ddfb44515168a281a7986e260d344cf369c3f52d4c3275/editables-0.5.tar.gz", hash = "sha256:309627d9b5c4adc0e668d8c6fa7bac1ba7c8c5d415c2d27f60f081f8e80d1de2", size = 14744 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/6b/be/0f2f4a5e8adc114a02b63d92bf8edbfa24db6fc602fca83c885af2479e0e/editables-0.5-py3-none-any.whl", hash = "sha256:61e5ffa82629e0d8bfe09bc44a07db3c1ab8ed1ce78a6980732870f19b5e7d4c", size = 5098 }, + { url = "https://files.pythonhosted.org/packages/8f/d7/9322c609343d929e75e7e5e6255e614fcc67572cfd083959cdef3b7aad79/docutils-0.21.2-py3-none-any.whl", hash = "sha256:dafca5b9e384f0e419294eb4d2ff9fa826435bf15f15b7bd45723e8ad76811b2", size = 587408 }, ] [[package]] @@ -511,6 +488,90 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/e9/bd/cc3a402a6439c15c3d4294333e13042b915bbeab54edc457c723931fed3f/GitPython-3.1.43-py3-none-any.whl", hash = "sha256:eec7ec56b92aad751f9912a73404bc02ba212a23adb2c7098ee668417051a1ff", size = 207337 }, ] +[[package]] +name = "google-api-core" +version = "2.22.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-auth" }, + { name = "googleapis-common-protos" }, + { name = "proto-plus" }, + { name = "protobuf" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/00/c2/425f97c2087affbd452a05d3faa08d97de333f2ca554733e1becab55ee4e/google_api_core-2.22.0.tar.gz", hash = "sha256:26f8d76b96477db42b55fd02a33aae4a42ec8b86b98b94969b7333a2c828bf35", size = 159700 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ba/7b/1137a9811be73d8ff8238eb2d9f60f0bc0bb6a1edd87f9d47557ab937a2b/google_api_core-2.22.0-py3-none-any.whl", hash = "sha256:a6652b6bd51303902494998626653671703c420f6f4c88cfd3f50ed723e9d021", size = 156538 }, +] + +[[package]] +name = "google-api-python-client" +version = "2.151.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-api-core" }, + { name = "google-auth" }, + { name = "google-auth-httplib2" }, + { name = "httplib2" }, + { name = "uritemplate" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/7c/87/5a753c932a962f1ac72403608b6840500187fd9d856127a360b7a30c59ec/google_api_python_client-2.151.0.tar.gz", hash = "sha256:a9d26d630810ed4631aea21d1de3e42072f98240aaf184a8a1a874a371115034", size = 12030480 } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/75/32/675ec68ed1bd27664d74f980cd262504603da0b683c2dd09c8725f576236/google_api_python_client-2.151.0-py2.py3-none-any.whl", hash = "sha256:4427b2f47cd88b0355d540c2c52215f68c337f3bc9d6aae1ceeae4525977504c", size = 12534219 }, +] + +[[package]] +name = "google-auth" +version = "2.35.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "cachetools" }, + { name = "pyasn1-modules" }, + { name = "rsa" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a1/37/c854a8b1b1020cf042db3d67577c6f84cd1e8ff6515e4f5498ae9e444ea5/google_auth-2.35.0.tar.gz", hash = "sha256:f4c64ed4e01e8e8b646ef34c018f8bf3338df0c8e37d8b3bba40e7f574a3278a", size = 267223 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/27/1f/3a72917afcb0d5cd842cbccb81bf7a8a7b45b4c66d8dc4556ccb3b016bfc/google_auth-2.35.0-py2.py3-none-any.whl", hash = "sha256:25df55f327ef021de8be50bad0dfd4a916ad0de96da86cd05661c9297723ad3f", size = 208968 }, +] + +[[package]] +name = "google-auth-httplib2" +version = "0.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-auth" }, + { name = "httplib2" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/56/be/217a598a818567b28e859ff087f347475c807a5649296fb5a817c58dacef/google-auth-httplib2-0.2.0.tar.gz", hash = "sha256:38aa7badf48f974f1eb9861794e9c0cb2a0511a4ec0679b1f886d108f5640e05", size = 10842 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/8a/fe34d2f3f9470a27b01c9e76226965863f153d5fbe276f83608562e49c04/google_auth_httplib2-0.2.0-py2.py3-none-any.whl", hash = "sha256:b65a0a2123300dd71281a7bf6e64d65a0759287df52729bdd1ae2e47dc311a3d", size = 9253 }, +] + +[[package]] +name = "google-auth-oauthlib" +version = "1.2.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "google-auth" }, + { name = "requests-oauthlib" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/cc/0f/1772edb8d75ecf6280f1c7f51cbcebe274e8b17878b382f63738fd96cee5/google_auth_oauthlib-1.2.1.tar.gz", hash = "sha256:afd0cad092a2eaa53cd8e8298557d6de1034c6cb4a740500b5357b648af97263", size = 24970 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/1a/8e/22a28dfbd218033e4eeaf3a0533b2b54852b6530da0c0fe934f0cc494b29/google_auth_oauthlib-1.2.1-py2.py3-none-any.whl", hash = "sha256:2d58a27262d55aa1b87678c3ba7142a080098cbc2024f903c62355deb235d91f", size = 24930 }, +] + +[[package]] +name = "googleapis-common-protos" +version = "1.65.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/53/3b/1599ceafa875ffb951480c8c74f4b77646a6b80e80970698f2aa93c216ce/googleapis_common_protos-1.65.0.tar.gz", hash = "sha256:334a29d07cddc3aa01dee4988f9afd9b2916ee2ff49d6b757155dc0d197852c0", size = 113657 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/ec/08/49bfe7cf737952cc1a9c43e80cc258ed45dad7f183c5b8276fc94cb3862d/googleapis_common_protos-1.65.0-py2.py3-none-any.whl", hash = "sha256:2972e6c496f435b92590fd54045060867f3fe9be2c82ab148fc8885035479a63", size = 220890 }, +] + [[package]] name = "h11" version = "0.14.0" @@ -522,7 +583,7 @@ wheels = [ [[package]] name = "hatch" -version = "1.9.4" +version = "1.14.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "click" }, @@ -538,29 +599,29 @@ dependencies = [ { name = "tomli-w" }, { name = "tomlkit" }, { name = "userpath" }, + { name = "uv" }, { name = 
"virtualenv" }, { name = "zstandard" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/17/98/63bf6c592b65f67201db292489053b86310cfb107eb095d345398e00cbd3/hatch-1.9.4.tar.gz", hash = "sha256:9bb7d1c4a7a51cc1f9e16394875c940b45fa84b698f0291529316b27d74e7f32", size = 689598 } +sdist = { url = "https://files.pythonhosted.org/packages/bc/15/b4e3d50d8177e6e8a243b24d9819e3807f7bfd3b2bebe7b5aef32a9c79cb/hatch-1.14.0.tar.gz", hash = "sha256:351e41bc6c72bc93cb98651212226e495b43549eee27c487832e459e5d0f0eda", size = 5188143 } wheels = [ - { url = "https://files.pythonhosted.org/packages/05/38/ba8f90264d19ed39851f37a22f2a4be8e9644a1203f114b16647f954bb02/hatch-1.9.4-py3-none-any.whl", hash = "sha256:461eb86b4b46249e38a9a621c7239e61285fd8e14b5a1b5a727c394893a25300", size = 110812 }, + { url = "https://files.pythonhosted.org/packages/85/c6/ad910cdb79600af0100b7c4f7093eb4b95a2b44e589e66b6b938b09cc6f9/hatch-1.14.0-py3-none-any.whl", hash = "sha256:b12c7a2f4aaf6db7180e35c476e1a2ad4ec7197c20c4332964599424d4918ded", size = 125763 }, ] [[package]] name = "hatchling" -version = "1.21.1" +version = "1.27.0" source = { registry = "https://pypi.org/simple" } dependencies = [ - { name = "editables" }, { name = "packaging" }, { name = "pathspec" }, { name = "pluggy" }, { name = "tomli", marker = "python_full_version < '3.11'" }, { name = "trove-classifiers" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/d8/a1/7dd1caa87c0b15c04c6291e25112e5d082cce02ee87f221a8be1d594f857/hatchling-1.21.1.tar.gz", hash = "sha256:bba440453a224e7d4478457fa2e8d8c3633765bafa02975a6b53b9bf917980bc", size = 58059 } +sdist = { url = "https://files.pythonhosted.org/packages/8f/8a/cc1debe3514da292094f1c3a700e4ca25442489731ef7c0814358816bb03/hatchling-1.27.0.tar.gz", hash = "sha256:971c296d9819abb3811112fc52c7a9751c8d381898f36533bb16f9791e941fd6", size = 54983 } wheels = [ - { url = "https://files.pythonhosted.org/packages/3a/bb/40528a09a33845bd7fd75c33b3be7faec3b5c8f15f68a58931da67420fb9/hatchling-1.21.1-py3-none-any.whl", hash = "sha256:21e8c13f8458b219a91cb84e5b61c15bf786695d1c4fabc29e91e78f94bfe892", size = 76740 }, + { url = "https://files.pythonhosted.org/packages/08/e7/ae38d7a6dfba0533684e0b2136817d667588ae3ec984c1a4e5df5eb88482/hatchling-1.27.0-py3-none-any.whl", hash = "sha256:d3a2f3567c4f926ea39849cdf924c7e99e6686c9c8e288ae1037c8fa2a5d937b", size = 75794 }, ] [[package]] @@ -576,6 +637,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/06/89/b161908e2f51be56568184aeb4a880fd287178d176fd1c860d2217f41106/httpcore-1.0.6-py3-none-any.whl", hash = "sha256:27b59625743b85577a8c0e10e55b50b5368a4f2cfe8cc7bcfa9cf00829c2682f", size = 78011 }, ] +[[package]] +name = "httplib2" +version = "0.22.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyparsing" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/3d/ad/2371116b22d616c194aa25ec410c9c6c37f23599dcd590502b74db197584/httplib2-0.22.0.tar.gz", hash = "sha256:d7a10bc5ef5ab08322488bde8c726eeee5c8618723fdb399597ec58f3d82df81", size = 351116 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a8/6c/d2fbdaaa5959339d53ba38e94c123e4e84b8fbc4b84beb0e70d7c1608486/httplib2-0.22.0-py3-none-any.whl", hash = "sha256:14ae0a53c1ba8f3d37e9e27cf37eabb0fb9980f435ba405d546948b009dd64dc", size = 96854 }, +] + [[package]] name = "httpx" version = "0.27.2" @@ -634,18 +707,6 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/a0/d9/a1e041c5e7caa9a05c925f4bdbdfb7f006d1f74996af53467bc394c97be7/importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b", size = 26514 }, ] -[[package]] -name = "importlib-resources" -version = "6.4.5" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "zipp", marker = "python_full_version < '3.10'" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/98/be/f3e8c6081b684f176b761e6a2fef02a0be939740ed6f54109a2951d806f3/importlib_resources-6.4.5.tar.gz", hash = "sha256:980862a1d16c9e147a59603677fa2aa5fd82b87f223b6cb870695bcfce830065", size = 43372 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/e1/6a/4604f9ae2fa62ef47b9de2fa5ad599589d28c9fd1d335f32759813dfa91e/importlib_resources-6.4.5-py3-none-any.whl", hash = "sha256:ac29d5f956f01d5e4bb63102a5a19957f1b9175e45649977264a1416783bb717", size = 36115 }, -] - [[package]] name = "iniconfig" version = "2.0.0" @@ -726,9 +787,7 @@ version = "4.23.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "attrs" }, - { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "jsonschema-specifications" }, - { name = "pkgutil-resolve-name", marker = "python_full_version < '3.9'" }, { name = "referencing" }, { name = "rpds-py" }, ] @@ -739,15 +798,14 @@ wheels = [ [[package]] name = "jsonschema-specifications" -version = "2023.12.1" +version = "2024.10.1" source = { registry = "https://pypi.org/simple" } dependencies = [ - { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "referencing" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/f8/b9/cc0cc592e7c195fb8a650c1d5990b10175cf13b4c97465c72ec841de9e4b/jsonschema_specifications-2023.12.1.tar.gz", hash = "sha256:48a76787b3e70f5ed53f1160d2b81f586e4ca6d1548c5de7085d1682674764cc", size = 13983 } +sdist = { url = "https://files.pythonhosted.org/packages/10/db/58f950c996c793472e336ff3655b13fbcf1e3b359dcf52dcf3ed3b52c352/jsonschema_specifications-2024.10.1.tar.gz", hash = "sha256:0f38b83639958ce1152d02a7f062902c41c8fd20d558b0c34344292d417ae272", size = 15561 } wheels = [ - { url = "https://files.pythonhosted.org/packages/ee/07/44bd408781594c4d0a027666ef27fab1e441b109dc3b76b4f836f8fd04fe/jsonschema_specifications-2023.12.1-py3-none-any.whl", hash = "sha256:87e4fdf3a94858b8a2ba2778d9ba57d8a9cafca7c7489c46ba0d30a8bc6a9c3c", size = 18482 }, + { url = "https://files.pythonhosted.org/packages/d1/0f/8910b19ac0670a0f80ce1008e5e751c4a57e14d2c4c13a482aa6079fa9d6/jsonschema_specifications-2024.10.1-py3-none-any.whl", hash = "sha256:a09a0680616357d9a0ecf05c12ad234479f549239d0f5b55f3deea67475da9bf", size = 18459 }, ] [[package]] @@ -756,7 +814,6 @@ version = "25.5.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "importlib-metadata", marker = "python_full_version < '3.12'" }, - { name = "importlib-resources", marker = "python_full_version < '3.9'" }, { name = "jaraco-classes" }, { name = "jaraco-context" }, { name = "jaraco-functools" }, @@ -783,60 +840,70 @@ wheels = [ [[package]] name = "markupsafe" -version = "2.1.5" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/87/5b/aae44c6655f3801e81aa3eef09dbbf012431987ba564d7231722f68df02d/MarkupSafe-2.1.5.tar.gz", hash = "sha256:d283d37a890ba4c1ae73ffadf8046435c76e7bc2247bbb63c00bd1a709c6544b", size = 19384 } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/e4/54/ad5eb37bf9d51800010a74e4665425831a9db4e7c4e0fde4352e391e808e/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:a17a92de5231666cfbe003f0e4b9b3a7ae3afb1ec2845aadc2bacc93ff85febc", size = 18206 }, - { url = "https://files.pythonhosted.org/packages/6a/4a/a4d49415e600bacae038c67f9fecc1d5433b9d3c71a4de6f33537b89654c/MarkupSafe-2.1.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:72b6be590cc35924b02c78ef34b467da4ba07e4e0f0454a2c5907f473fc50ce5", size = 14079 }, - { url = "https://files.pythonhosted.org/packages/0a/7b/85681ae3c33c385b10ac0f8dd025c30af83c78cec1c37a6aa3b55e67f5ec/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e61659ba32cf2cf1481e575d0462554625196a1f2fc06a1c777d3f48e8865d46", size = 26620 }, - { url = "https://files.pythonhosted.org/packages/7c/52/2b1b570f6b8b803cef5ac28fdf78c0da318916c7d2fe9402a84d591b394c/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2174c595a0d73a3080ca3257b40096db99799265e1c27cc5a610743acd86d62f", size = 25818 }, - { url = "https://files.pythonhosted.org/packages/29/fe/a36ba8c7ca55621620b2d7c585313efd10729e63ef81e4e61f52330da781/MarkupSafe-2.1.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae2ad8ae6ebee9d2d94b17fb62763125f3f374c25618198f40cbb8b525411900", size = 25493 }, - { url = "https://files.pythonhosted.org/packages/60/ae/9c60231cdfda003434e8bd27282b1f4e197ad5a710c14bee8bea8a9ca4f0/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:075202fa5b72c86ad32dc7d0b56024ebdbcf2048c0ba09f1cde31bfdd57bcfff", size = 30630 }, - { url = "https://files.pythonhosted.org/packages/65/dc/1510be4d179869f5dafe071aecb3f1f41b45d37c02329dfba01ff59e5ac5/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:598e3276b64aff0e7b3451b72e94fa3c238d452e7ddcd893c3ab324717456bad", size = 29745 }, - { url = "https://files.pythonhosted.org/packages/30/39/8d845dd7d0b0613d86e0ef89549bfb5f61ed781f59af45fc96496e897f3a/MarkupSafe-2.1.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fce659a462a1be54d2ffcacea5e3ba2d74daa74f30f5f143fe0c58636e355fdd", size = 30021 }, - { url = "https://files.pythonhosted.org/packages/c7/5c/356a6f62e4f3c5fbf2602b4771376af22a3b16efa74eb8716fb4e328e01e/MarkupSafe-2.1.5-cp310-cp310-win32.whl", hash = "sha256:d9fad5155d72433c921b782e58892377c44bd6252b5af2f67f16b194987338a4", size = 16659 }, - { url = "https://files.pythonhosted.org/packages/69/48/acbf292615c65f0604a0c6fc402ce6d8c991276e16c80c46a8f758fbd30c/MarkupSafe-2.1.5-cp310-cp310-win_amd64.whl", hash = "sha256:bf50cd79a75d181c9181df03572cdce0fbb75cc353bc350712073108cba98de5", size = 17213 }, - { url = "https://files.pythonhosted.org/packages/11/e7/291e55127bb2ae67c64d66cef01432b5933859dfb7d6949daa721b89d0b3/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:629ddd2ca402ae6dbedfceeba9c46d5f7b2a61d9749597d4307f943ef198fc1f", size = 18219 }, - { url = "https://files.pythonhosted.org/packages/6b/cb/aed7a284c00dfa7c0682d14df85ad4955a350a21d2e3b06d8240497359bf/MarkupSafe-2.1.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5b7b716f97b52c5a14bffdf688f971b2d5ef4029127f1ad7a513973cfd818df2", size = 14098 }, - { url = "https://files.pythonhosted.org/packages/1c/cf/35fe557e53709e93feb65575c93927942087e9b97213eabc3fe9d5b25a55/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:6ec585f69cec0aa07d945b20805be741395e28ac1627333b1c5b0105962ffced", size = 29014 }, - { url = "https://files.pythonhosted.org/packages/97/18/c30da5e7a0e7f4603abfc6780574131221d9148f323752c2755d48abad30/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b91c037585eba9095565a3556f611e3cbfaa42ca1e865f7b8015fe5c7336d5a5", size = 28220 }, - { url = "https://files.pythonhosted.org/packages/0c/40/2e73e7d532d030b1e41180807a80d564eda53babaf04d65e15c1cf897e40/MarkupSafe-2.1.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7502934a33b54030eaf1194c21c692a534196063db72176b0c4028e140f8f32c", size = 27756 }, - { url = "https://files.pythonhosted.org/packages/18/46/5dca760547e8c59c5311b332f70605d24c99d1303dd9a6e1fc3ed0d73561/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0e397ac966fdf721b2c528cf028494e86172b4feba51d65f81ffd65c63798f3f", size = 33988 }, - { url = "https://files.pythonhosted.org/packages/6d/c5/27febe918ac36397919cd4a67d5579cbbfa8da027fa1238af6285bb368ea/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:c061bb86a71b42465156a3ee7bd58c8c2ceacdbeb95d05a99893e08b8467359a", size = 32718 }, - { url = "https://files.pythonhosted.org/packages/f8/81/56e567126a2c2bc2684d6391332e357589a96a76cb9f8e5052d85cb0ead8/MarkupSafe-2.1.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:3a57fdd7ce31c7ff06cdfbf31dafa96cc533c21e443d57f5b1ecc6cdc668ec7f", size = 33317 }, - { url = "https://files.pythonhosted.org/packages/00/0b/23f4b2470accb53285c613a3ab9ec19dc944eaf53592cb6d9e2af8aa24cc/MarkupSafe-2.1.5-cp311-cp311-win32.whl", hash = "sha256:397081c1a0bfb5124355710fe79478cdbeb39626492b15d399526ae53422b906", size = 16670 }, - { url = "https://files.pythonhosted.org/packages/b7/a2/c78a06a9ec6d04b3445a949615c4c7ed86a0b2eb68e44e7541b9d57067cc/MarkupSafe-2.1.5-cp311-cp311-win_amd64.whl", hash = "sha256:2b7c57a4dfc4f16f7142221afe5ba4e093e09e728ca65c51f5620c9aaeb9a617", size = 17224 }, - { url = "https://files.pythonhosted.org/packages/53/bd/583bf3e4c8d6a321938c13f49d44024dbe5ed63e0a7ba127e454a66da974/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:8dec4936e9c3100156f8a2dc89c4b88d5c435175ff03413b443469c7c8c5f4d1", size = 18215 }, - { url = "https://files.pythonhosted.org/packages/48/d6/e7cd795fc710292c3af3a06d80868ce4b02bfbbf370b7cee11d282815a2a/MarkupSafe-2.1.5-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:3c6b973f22eb18a789b1460b4b91bf04ae3f0c4234a0a6aa6b0a92f6f7b951d4", size = 14069 }, - { url = "https://files.pythonhosted.org/packages/51/b5/5d8ec796e2a08fc814a2c7d2584b55f889a55cf17dd1a90f2beb70744e5c/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ac07bad82163452a6884fe8fa0963fb98c2346ba78d779ec06bd7a6262132aee", size = 29452 }, - { url = "https://files.pythonhosted.org/packages/0a/0d/2454f072fae3b5a137c119abf15465d1771319dfe9e4acbb31722a0fff91/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f5dfb42c4604dddc8e4305050aa6deb084540643ed5804d7455b5df8fe16f5e5", size = 28462 }, - { url = "https://files.pythonhosted.org/packages/2d/75/fd6cb2e68780f72d47e6671840ca517bda5ef663d30ada7616b0462ad1e3/MarkupSafe-2.1.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ea3d8a3d18833cf4304cd2fc9cbb1efe188ca9b5efef2bdac7adc20594a0e46b", size = 27869 }, - { url = 
"https://files.pythonhosted.org/packages/b0/81/147c477391c2750e8fc7705829f7351cf1cd3be64406edcf900dc633feb2/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d050b3361367a06d752db6ead6e7edeb0009be66bc3bae0ee9d97fb326badc2a", size = 33906 }, - { url = "https://files.pythonhosted.org/packages/8b/ff/9a52b71839d7a256b563e85d11050e307121000dcebc97df120176b3ad93/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:bec0a414d016ac1a18862a519e54b2fd0fc8bbfd6890376898a6c0891dd82e9f", size = 32296 }, - { url = "https://files.pythonhosted.org/packages/88/07/2dc76aa51b481eb96a4c3198894f38b480490e834479611a4053fbf08623/MarkupSafe-2.1.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:58c98fee265677f63a4385256a6d7683ab1832f3ddd1e66fe948d5880c21a169", size = 33038 }, - { url = "https://files.pythonhosted.org/packages/96/0c/620c1fb3661858c0e37eb3cbffd8c6f732a67cd97296f725789679801b31/MarkupSafe-2.1.5-cp312-cp312-win32.whl", hash = "sha256:8590b4ae07a35970728874632fed7bd57b26b0102df2d2b233b6d9d82f6c62ad", size = 16572 }, - { url = "https://files.pythonhosted.org/packages/3f/14/c3554d512d5f9100a95e737502f4a2323a1959f6d0d01e0d0997b35f7b10/MarkupSafe-2.1.5-cp312-cp312-win_amd64.whl", hash = "sha256:823b65d8706e32ad2df51ed89496147a42a2a6e01c13cfb6ffb8b1e92bc910bb", size = 17127 }, - { url = "https://files.pythonhosted.org/packages/f8/ff/2c942a82c35a49df5de3a630ce0a8456ac2969691b230e530ac12314364c/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:656f7526c69fac7f600bd1f400991cc282b417d17539a1b228617081106feb4a", size = 18192 }, - { url = "https://files.pythonhosted.org/packages/4f/14/6f294b9c4f969d0c801a4615e221c1e084722ea6114ab2114189c5b8cbe0/MarkupSafe-2.1.5-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:97cafb1f3cbcd3fd2b6fbfb99ae11cdb14deea0736fc2b0952ee177f2b813a46", size = 14072 }, - { url = "https://files.pythonhosted.org/packages/81/d4/fd74714ed30a1dedd0b82427c02fa4deec64f173831ec716da11c51a50aa/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f3fbcb7ef1f16e48246f704ab79d79da8a46891e2da03f8783a5b6fa41a9532", size = 26928 }, - { url = "https://files.pythonhosted.org/packages/c7/bd/50319665ce81bb10e90d1cf76f9e1aa269ea6f7fa30ab4521f14d122a3df/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fa9db3f79de01457b03d4f01b34cf91bc0048eb2c3846ff26f66687c2f6d16ab", size = 26106 }, - { url = "https://files.pythonhosted.org/packages/4c/6f/f2b0f675635b05f6afd5ea03c094557bdb8622fa8e673387444fe8d8e787/MarkupSafe-2.1.5-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffee1f21e5ef0d712f9033568f8344d5da8cc2869dbd08d87c84656e6a2d2f68", size = 25781 }, - { url = "https://files.pythonhosted.org/packages/51/e0/393467cf899b34a9d3678e78961c2c8cdf49fb902a959ba54ece01273fb1/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:5dedb4db619ba5a2787a94d877bc8ffc0566f92a01c0ef214865e54ecc9ee5e0", size = 30518 }, - { url = "https://files.pythonhosted.org/packages/f6/02/5437e2ad33047290dafced9df741d9efc3e716b75583bbd73a9984f1b6f7/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:30b600cf0a7ac9234b2638fbc0fb6158ba5bdcdf46aeb631ead21248b9affbc4", size = 29669 }, - { url = "https://files.pythonhosted.org/packages/0e/7d/968284145ffd9d726183ed6237c77938c021abacde4e073020f920e060b2/MarkupSafe-2.1.5-cp38-cp38-musllinux_1_1_x86_64.whl", hash = 
"sha256:8dd717634f5a044f860435c1d8c16a270ddf0ef8588d4887037c5028b859b0c3", size = 29933 }, - { url = "https://files.pythonhosted.org/packages/bf/f3/ecb00fc8ab02b7beae8699f34db9357ae49d9f21d4d3de6f305f34fa949e/MarkupSafe-2.1.5-cp38-cp38-win32.whl", hash = "sha256:daa4ee5a243f0f20d528d939d06670a298dd39b1ad5f8a72a4275124a7819eff", size = 16656 }, - { url = "https://files.pythonhosted.org/packages/92/21/357205f03514a49b293e214ac39de01fadd0970a6e05e4bf1ddd0ffd0881/MarkupSafe-2.1.5-cp38-cp38-win_amd64.whl", hash = "sha256:619bc166c4f2de5caa5a633b8b7326fbe98e0ccbfacabd87268a2b15ff73a029", size = 17206 }, - { url = "https://files.pythonhosted.org/packages/0f/31/780bb297db036ba7b7bbede5e1d7f1e14d704ad4beb3ce53fb495d22bc62/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:7a68b554d356a91cce1236aa7682dc01df0edba8d043fd1ce607c49dd3c1edcf", size = 18193 }, - { url = "https://files.pythonhosted.org/packages/6c/77/d77701bbef72892affe060cdacb7a2ed7fd68dae3b477a8642f15ad3b132/MarkupSafe-2.1.5-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:db0b55e0f3cc0be60c1f19efdde9a637c32740486004f20d1cff53c3c0ece4d2", size = 14073 }, - { url = "https://files.pythonhosted.org/packages/d9/a7/1e558b4f78454c8a3a0199292d96159eb4d091f983bc35ef258314fe7269/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3e53af139f8579a6d5f7b76549125f0d94d7e630761a2111bc431fd820e163b8", size = 26486 }, - { url = "https://files.pythonhosted.org/packages/5f/5a/360da85076688755ea0cceb92472923086993e86b5613bbae9fbc14136b0/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:17b950fccb810b3293638215058e432159d2b71005c74371d784862b7e4683f3", size = 25685 }, - { url = "https://files.pythonhosted.org/packages/6a/18/ae5a258e3401f9b8312f92b028c54d7026a97ec3ab20bfaddbdfa7d8cce8/MarkupSafe-2.1.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4c31f53cdae6ecfa91a77820e8b151dba54ab528ba65dfd235c80b086d68a465", size = 25338 }, - { url = "https://files.pythonhosted.org/packages/0b/cc/48206bd61c5b9d0129f4d75243b156929b04c94c09041321456fd06a876d/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:bff1b4290a66b490a2f4719358c0cdcd9bafb6b8f061e45c7a2460866bf50c2e", size = 30439 }, - { url = "https://files.pythonhosted.org/packages/d1/06/a41c112ab9ffdeeb5f77bc3e331fdadf97fa65e52e44ba31880f4e7f983c/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc1667f8b83f48511b94671e0e441401371dfd0f0a795c7daa4a3cd1dde55bea", size = 29531 }, - { url = "https://files.pythonhosted.org/packages/02/8c/ab9a463301a50dab04d5472e998acbd4080597abc048166ded5c7aa768c8/MarkupSafe-2.1.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5049256f536511ee3f7e1b3f87d1d1209d327e818e6ae1365e8653d7e3abb6a6", size = 29823 }, - { url = "https://files.pythonhosted.org/packages/bc/29/9bc18da763496b055d8e98ce476c8e718dcfd78157e17f555ce6dd7d0895/MarkupSafe-2.1.5-cp39-cp39-win32.whl", hash = "sha256:00e046b6dd71aa03a41079792f8473dc494d564611a8f89bbbd7cb93295ebdcf", size = 16658 }, - { url = "https://files.pythonhosted.org/packages/f6/f8/4da07de16f10551ca1f640c92b5f316f9394088b183c6a57183df6de5ae4/MarkupSafe-2.1.5-cp39-cp39-win_amd64.whl", hash = "sha256:fa173ec60341d6bb97a89f5ea19c85c5643c1e7dedebc22f5181eb73573142c5", size = 17211 }, +version = "3.0.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/b2/97/5d42485e71dfc078108a86d6de8fa46db44a1a9295e89c5d6d4a06e23a62/markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0", size = 20537 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/04/90/d08277ce111dd22f77149fd1a5d4653eeb3b3eaacbdfcbae5afb2600eebd/MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8", size = 14357 }, + { url = "https://files.pythonhosted.org/packages/04/e1/6e2194baeae0bca1fae6629dc0cbbb968d4d941469cbab11a3872edff374/MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158", size = 12393 }, + { url = "https://files.pythonhosted.org/packages/1d/69/35fa85a8ece0a437493dc61ce0bb6d459dcba482c34197e3efc829aa357f/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579", size = 21732 }, + { url = "https://files.pythonhosted.org/packages/22/35/137da042dfb4720b638d2937c38a9c2df83fe32d20e8c8f3185dbfef05f7/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d", size = 20866 }, + { url = "https://files.pythonhosted.org/packages/29/28/6d029a903727a1b62edb51863232152fd335d602def598dade38996887f0/MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb", size = 20964 }, + { url = "https://files.pythonhosted.org/packages/cc/cd/07438f95f83e8bc028279909d9c9bd39e24149b0d60053a97b2bc4f8aa51/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b", size = 21977 }, + { url = "https://files.pythonhosted.org/packages/29/01/84b57395b4cc062f9c4c55ce0df7d3108ca32397299d9df00fedd9117d3d/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c", size = 21366 }, + { url = "https://files.pythonhosted.org/packages/bd/6e/61ebf08d8940553afff20d1fb1ba7294b6f8d279df9fd0c0db911b4bbcfd/MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171", size = 21091 }, + { url = "https://files.pythonhosted.org/packages/11/23/ffbf53694e8c94ebd1e7e491de185124277964344733c45481f32ede2499/MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50", size = 15065 }, + { url = "https://files.pythonhosted.org/packages/44/06/e7175d06dd6e9172d4a69a72592cb3f7a996a9c396eee29082826449bbc3/MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a", size = 15514 }, + { url = "https://files.pythonhosted.org/packages/6b/28/bbf83e3f76936960b850435576dd5e67034e200469571be53f69174a2dfd/MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d", size = 14353 }, + { url = "https://files.pythonhosted.org/packages/6c/30/316d194b093cde57d448a4c3209f22e3046c5bb2fb0820b118292b334be7/MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93", size = 12392 }, + { url = "https://files.pythonhosted.org/packages/f2/96/9cdafba8445d3a53cae530aaf83c38ec64c4d5427d975c974084af5bc5d2/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832", size = 23984 }, + { url = "https://files.pythonhosted.org/packages/f1/a4/aefb044a2cd8d7334c8a47d3fb2c9f328ac48cb349468cc31c20b539305f/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84", size = 23120 }, + { url = "https://files.pythonhosted.org/packages/8d/21/5e4851379f88f3fad1de30361db501300d4f07bcad047d3cb0449fc51f8c/MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca", size = 23032 }, + { url = "https://files.pythonhosted.org/packages/00/7b/e92c64e079b2d0d7ddf69899c98842f3f9a60a1ae72657c89ce2655c999d/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798", size = 24057 }, + { url = "https://files.pythonhosted.org/packages/f9/ac/46f960ca323037caa0a10662ef97d0a4728e890334fc156b9f9e52bcc4ca/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e", size = 23359 }, + { url = "https://files.pythonhosted.org/packages/69/84/83439e16197337b8b14b6a5b9c2105fff81d42c2a7c5b58ac7b62ee2c3b1/MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4", size = 23306 }, + { url = "https://files.pythonhosted.org/packages/9a/34/a15aa69f01e2181ed8d2b685c0d2f6655d5cca2c4db0ddea775e631918cd/MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d", size = 15094 }, + { url = "https://files.pythonhosted.org/packages/da/b8/3a3bd761922d416f3dc5d00bfbed11f66b1ab89a0c2b6e887240a30b0f6b/MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b", size = 15521 }, + { url = "https://files.pythonhosted.org/packages/22/09/d1f21434c97fc42f09d290cbb6350d44eb12f09cc62c9476effdb33a18aa/MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf", size = 14274 }, + { url = "https://files.pythonhosted.org/packages/6b/b0/18f76bba336fa5aecf79d45dcd6c806c280ec44538b3c13671d49099fdd0/MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225", size = 12348 }, + { url = "https://files.pythonhosted.org/packages/e0/25/dd5c0f6ac1311e9b40f4af06c78efde0f3b5cbf02502f8ef9501294c425b/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028", size = 24149 }, + { url = "https://files.pythonhosted.org/packages/f3/f0/89e7aadfb3749d0f52234a0c8c7867877876e0a20b60e2188e9850794c17/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8", size = 23118 }, + { url = 
"https://files.pythonhosted.org/packages/d5/da/f2eeb64c723f5e3777bc081da884b414671982008c47dcc1873d81f625b6/MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c", size = 22993 }, + { url = "https://files.pythonhosted.org/packages/da/0e/1f32af846df486dce7c227fe0f2398dc7e2e51d4a370508281f3c1c5cddc/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557", size = 24178 }, + { url = "https://files.pythonhosted.org/packages/c4/f6/bb3ca0532de8086cbff5f06d137064c8410d10779c4c127e0e47d17c0b71/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22", size = 23319 }, + { url = "https://files.pythonhosted.org/packages/a2/82/8be4c96ffee03c5b4a034e60a31294daf481e12c7c43ab8e34a1453ee48b/MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48", size = 23352 }, + { url = "https://files.pythonhosted.org/packages/51/ae/97827349d3fcffee7e184bdf7f41cd6b88d9919c80f0263ba7acd1bbcb18/MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30", size = 15097 }, + { url = "https://files.pythonhosted.org/packages/c1/80/a61f99dc3a936413c3ee4e1eecac96c0da5ed07ad56fd975f1a9da5bc630/MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87", size = 15601 }, + { url = "https://files.pythonhosted.org/packages/83/0e/67eb10a7ecc77a0c2bbe2b0235765b98d164d81600746914bebada795e97/MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd", size = 14274 }, + { url = "https://files.pythonhosted.org/packages/2b/6d/9409f3684d3335375d04e5f05744dfe7e9f120062c9857df4ab490a1031a/MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430", size = 12352 }, + { url = "https://files.pythonhosted.org/packages/d2/f5/6eadfcd3885ea85fe2a7c128315cc1bb7241e1987443d78c8fe712d03091/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094", size = 24122 }, + { url = "https://files.pythonhosted.org/packages/0c/91/96cf928db8236f1bfab6ce15ad070dfdd02ed88261c2afafd4b43575e9e9/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396", size = 23085 }, + { url = "https://files.pythonhosted.org/packages/c2/cf/c9d56af24d56ea04daae7ac0940232d31d5a8354f2b457c6d856b2057d69/MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79", size = 22978 }, + { url = "https://files.pythonhosted.org/packages/2a/9f/8619835cd6a711d6272d62abb78c033bda638fdc54c4e7f4272cf1c0962b/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a", size = 24208 }, + { url = "https://files.pythonhosted.org/packages/f9/bf/176950a1792b2cd2102b8ffeb5133e1ed984547b75db47c25a67d3359f77/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", 
hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca", size = 23357 }, + { url = "https://files.pythonhosted.org/packages/ce/4f/9a02c1d335caabe5c4efb90e1b6e8ee944aa245c1aaaab8e8a618987d816/MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c", size = 23344 }, + { url = "https://files.pythonhosted.org/packages/ee/55/c271b57db36f748f0e04a759ace9f8f759ccf22b4960c270c78a394f58be/MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1", size = 15101 }, + { url = "https://files.pythonhosted.org/packages/29/88/07df22d2dd4df40aba9f3e402e6dc1b8ee86297dddbad4872bd5e7b0094f/MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f", size = 15603 }, + { url = "https://files.pythonhosted.org/packages/62/6a/8b89d24db2d32d433dffcd6a8779159da109842434f1dd2f6e71f32f738c/MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c", size = 14510 }, + { url = "https://files.pythonhosted.org/packages/7a/06/a10f955f70a2e5a9bf78d11a161029d278eeacbd35ef806c3fd17b13060d/MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb", size = 12486 }, + { url = "https://files.pythonhosted.org/packages/34/cf/65d4a571869a1a9078198ca28f39fba5fbb910f952f9dbc5220afff9f5e6/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c", size = 25480 }, + { url = "https://files.pythonhosted.org/packages/0c/e3/90e9651924c430b885468b56b3d597cabf6d72be4b24a0acd1fa0e12af67/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d", size = 23914 }, + { url = "https://files.pythonhosted.org/packages/66/8c/6c7cf61f95d63bb866db39085150df1f2a5bd3335298f14a66b48e92659c/MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe", size = 23796 }, + { url = "https://files.pythonhosted.org/packages/bb/35/cbe9238ec3f47ac9a7c8b3df7a808e7cb50fe149dc7039f5f454b3fba218/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5", size = 25473 }, + { url = "https://files.pythonhosted.org/packages/e6/32/7621a4382488aa283cc05e8984a9c219abad3bca087be9ec77e89939ded9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a", size = 24114 }, + { url = "https://files.pythonhosted.org/packages/0d/80/0985960e4b89922cb5a0bac0ed39c5b96cbc1a536a99f30e8c220a996ed9/MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9", size = 24098 }, + { url = "https://files.pythonhosted.org/packages/82/78/fedb03c7d5380df2427038ec8d973587e90561b2d90cd472ce9254cf348b/MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6", size = 15208 }, + { url = 
"https://files.pythonhosted.org/packages/4f/65/6079a46068dfceaeabb5dcad6d674f5f5c61a6fa5673746f42a9f4c233b3/MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f", size = 15739 }, + { url = "https://files.pythonhosted.org/packages/a7/ea/9b1530c3fdeeca613faeb0fb5cbcf2389d816072fab72a71b45749ef6062/MarkupSafe-3.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:eaa0a10b7f72326f1372a713e73c3f739b524b3af41feb43e4921cb529f5929a", size = 14344 }, + { url = "https://files.pythonhosted.org/packages/4b/c2/fbdbfe48848e7112ab05e627e718e854d20192b674952d9042ebd8c9e5de/MarkupSafe-3.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48032821bbdf20f5799ff537c7ac3d1fba0ba032cfc06194faffa8cda8b560ff", size = 12389 }, + { url = "https://files.pythonhosted.org/packages/f0/25/7a7c6e4dbd4f867d95d94ca15449e91e52856f6ed1905d58ef1de5e211d0/MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a9d3f5f0901fdec14d8d2f66ef7d035f2157240a433441719ac9a3fba440b13", size = 21607 }, + { url = "https://files.pythonhosted.org/packages/53/8f/f339c98a178f3c1e545622206b40986a4c3307fe39f70ccd3d9df9a9e425/MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88b49a3b9ff31e19998750c38e030fc7bb937398b1f78cfa599aaef92d693144", size = 20728 }, + { url = "https://files.pythonhosted.org/packages/1a/03/8496a1a78308456dbd50b23a385c69b41f2e9661c67ea1329849a598a8f9/MarkupSafe-3.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfad01eed2c2e0c01fd0ecd2ef42c492f7f93902e39a42fc9ee1692961443a29", size = 20826 }, + { url = "https://files.pythonhosted.org/packages/e6/cf/0a490a4bd363048c3022f2f475c8c05582179bb179defcee4766fb3dcc18/MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:1225beacc926f536dc82e45f8a4d68502949dc67eea90eab715dea3a21c1b5f0", size = 21843 }, + { url = "https://files.pythonhosted.org/packages/19/a3/34187a78613920dfd3cdf68ef6ce5e99c4f3417f035694074beb8848cd77/MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3169b1eefae027567d1ce6ee7cae382c57fe26e82775f460f0b2778beaad66c0", size = 21219 }, + { url = "https://files.pythonhosted.org/packages/17/d8/5811082f85bb88410ad7e452263af048d685669bbbfb7b595e8689152498/MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:eb7972a85c54febfb25b5c4b4f3af4dcc731994c7da0d8a0b4a6eb0640e1d178", size = 20946 }, + { url = "https://files.pythonhosted.org/packages/7c/31/bd635fb5989440d9365c5e3c47556cfea121c7803f5034ac843e8f37c2f2/MarkupSafe-3.0.2-cp39-cp39-win32.whl", hash = "sha256:8c4e8c3ce11e1f92f6536ff07154f9d49677ebaaafc32db9db4620bc11ed480f", size = 15063 }, + { url = "https://files.pythonhosted.org/packages/b3/73/085399401383ce949f727afec55ec3abd76648d04b9f22e1c0e99cb4bec3/MarkupSafe-3.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:6e296a513ca3d94054c2c881cc913116e90fd030ad1c656b3869762b754f5f8a", size = 15506 }, ] [[package]] @@ -898,13 +965,22 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314 }, ] +[[package]] +name = "oauthlib" +version = "3.2.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/6d/fa/fbf4001037904031639e6bfbfc02badfc7e12f137a8afa254df6c4c8a670/oauthlib-3.2.2.tar.gz", hash = "sha256:9859c40929662bec5d64f34d01c99e093149682a3f38915dc0655d5a633dd918", size = 177352 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/7e/80/cab10959dc1faead58dc8384a781dfbf93cb4d33d50988f7a69f1b7c9bbe/oauthlib-3.2.2-py3-none-any.whl", hash = "sha256:8139f29aac13e25d502680e9e19963e83f16838d48a0d71c287fe40e7067fbca", size = 151688 }, +] + [[package]] name = "packaging" -version = "24.1" +version = "24.2" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/51/65/50db4dda066951078f0a96cf12f4b9ada6e4b811516bf0262c0f4f7064d4/packaging-24.1.tar.gz", hash = "sha256:026ed72c8ed3fcce5bf8950572258698927fd1dbda10a5e981cdf0ac37f4f002", size = 148788 } +sdist = { url = "https://files.pythonhosted.org/packages/d0/63/68dbb6eb2de9cb10ee4c9c14a0148804425e13c4fb20d61cce69f53106da/packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f", size = 163950 } wheels = [ - { url = "https://files.pythonhosted.org/packages/08/aa/cc0199a5f0ad350994d660967a8efb233fe0416e4639146c089643407ce6/packaging-24.1-py3-none-any.whl", hash = "sha256:5b8f2217dbdbd2f7f384c41c628544e6d52f2d0f53c6d0c3ea61aa5d1d7ff124", size = 53985 }, + { url = "https://files.pythonhosted.org/packages/88/ef/eb23f262cca3c0c4eb7ab1933c3b1f03d021f2c48f54763065b6f0e321be/packaging-24.2-py3-none-any.whl", hash = "sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759", size = 65451 }, ] [[package]] @@ -937,23 +1013,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ef/7d/500c9ad20238fcfcb4cb9243eede163594d7020ce87bd9610c9e02771876/pip-24.3.1-py3-none-any.whl", hash = "sha256:3790624780082365f47549d032f3770eeb2b1e8bd1f7b2e02dace1afa361b4ed", size = 1822182 }, ] -[[package]] -name = "pipx" -version = "1.7.1" -source = { registry = "https://pypi.org/simple" } -dependencies = [ - { name = "argcomplete" }, - { name = "colorama", marker = "sys_platform == 'win32'" }, - { name = "packaging" }, - { name = "platformdirs" }, - { name = "tomli", marker = "python_full_version < '3.11'" }, - { name = "userpath" }, -] -sdist = { url = "https://files.pythonhosted.org/packages/17/21/dd6b9a9c4f0cb659ce3dad991f0e8dde852b2c81922224ef77df4222ab7a/pipx-1.7.1.tar.gz", hash = "sha256:762de134e16a462be92645166d225ecef446afaef534917f5f70008d63584360", size = 291889 } -wheels = [ - { url = "https://files.pythonhosted.org/packages/35/af/66db02a214590a841bcd1df1f02f7ef818dc3f43487acddab0b8c40b25d2/pipx-1.7.1-py3-none-any.whl", hash = "sha256:3933c43bb344e649cb28e10d357e0967ce8572f1c19caf90cf39ae95c2a0afaf", size = 78749 }, -] - [[package]] name = "pkginfo" version = "1.10.0" @@ -963,15 +1022,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/56/09/054aea9b7534a15ad38a363a2bd974c20646ab1582a387a95b8df1bfea1c/pkginfo-1.10.0-py3-none-any.whl", hash = "sha256:889a6da2ed7ffc58ab5b900d888ddce90bce912f2d2de1dc1c26f4cb9fe65097", size = 30392 }, ] -[[package]] -name = "pkgutil-resolve-name" -version = "1.3.10" -source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/70/f2/f2891a9dc37398696ddd945012b90ef8d0a034f0012e3f83c3f7a70b0f79/pkgutil_resolve_name-1.3.10.tar.gz", hash = "sha256:357d6c9e6a755653cfd78893817c0853af365dd51ec97f3d358a819373bbd174", size = 5054 } -wheels = [ - { url = 
"https://files.pythonhosted.org/packages/c9/5c/3d4882ba113fd55bdba9326c1e4c62a15e674a2501de4869e6bd6301f87e/pkgutil_resolve_name-1.3.10-py3-none-any.whl", hash = "sha256:ca27cc078d25c5ad71a9de0a7a330146c4e014c2462d9af19c6b828280649c5e", size = 4734 }, -] - [[package]] name = "platformdirs" version = "4.3.6" @@ -992,7 +1042,7 @@ wheels = [ [[package]] name = "pre-commit" -version = "3.5.0" +version = "4.0.1" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "cfgv" }, @@ -1001,9 +1051,50 @@ dependencies = [ { name = "pyyaml" }, { name = "virtualenv" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/04/b3/4ae08d21eb097162f5aad37f4585f8069a86402ed7f5362cc9ae097f9572/pre_commit-3.5.0.tar.gz", hash = "sha256:5804465c675b659b0862f07907f96295d490822a450c4c40e747d0b1c6ebcb32", size = 177079 } +sdist = { url = "https://files.pythonhosted.org/packages/2e/c8/e22c292035f1bac8b9f5237a2622305bc0304e776080b246f3df57c4ff9f/pre_commit-4.0.1.tar.gz", hash = "sha256:80905ac375958c0444c65e9cebebd948b3cdb518f335a091a670a89d652139d2", size = 191678 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/16/8f/496e10d51edd6671ebe0432e33ff800aa86775d2d147ce7d43389324a525/pre_commit-4.0.1-py2.py3-none-any.whl", hash = "sha256:efde913840816312445dc98787724647c65473daefe420785f885e8ed9a06878", size = 218713 }, +] + +[[package]] +name = "pre-commit-uv" +version = "4.1.4" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pre-commit" }, + { name = "uv" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/b1/6c/c3c1d01698c8abb0b546defc0304971fa7fb2ba84ad35587b9dad095d73f/pre_commit_uv-4.1.4.tar.gz", hash = "sha256:3db606a79b226127b27dbbd8381b78c0e30de3ac775a8492c576a68e9250535c", size = 6493 } wheels = [ - { url = "https://files.pythonhosted.org/packages/6c/75/526915fedf462e05eeb1c75ceaf7e3f9cde7b5ce6f62740fe5f7f19a0050/pre_commit-3.5.0-py2.py3-none-any.whl", hash = "sha256:841dc9aef25daba9a0238cd27984041fa0467b4199fc4852e27950664919f660", size = 203698 }, + { url = "https://files.pythonhosted.org/packages/f1/70/1b65f9118ef64f6ffe5d57a67170bbff25d4f4a3d1cb78e8ed3392e16114/pre_commit_uv-4.1.4-py3-none-any.whl", hash = "sha256:7f01fb494fa1caa5097d20a38f71df7cea0209197b2564699cef9b3f3aa9d135", size = 5578 }, +] + +[[package]] +name = "proto-plus" +version = "1.25.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "protobuf" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/7e/05/74417b2061e1bf1b82776037cad97094228fa1c1b6e82d08a78d3fb6ddb6/proto_plus-1.25.0.tar.gz", hash = "sha256:fbb17f57f7bd05a68b7707e745e26528b0b3c34e378db91eef93912c54982d91", size = 56124 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/dd/25/0b7cc838ae3d76d46539020ec39fc92bfc9acc29367e58fe912702c2a79e/proto_plus-1.25.0-py3-none-any.whl", hash = "sha256:c91fc4a65074ade8e458e95ef8bac34d4008daa7cce4a12d6707066fca648961", size = 50126 }, +] + +[[package]] +name = "protobuf" +version = "5.28.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/74/6e/e69eb906fddcb38f8530a12f4b410699972ab7ced4e21524ece9d546ac27/protobuf-5.28.3.tar.gz", hash = "sha256:64badbc49180a5e401f373f9ce7ab1d18b63f7dd4a9cdc43c92b9f0b481cef7b", size = 422479 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/c5/05163fad52d7c43e124a545f1372d18266db36036377ad29de4271134a6a/protobuf-5.28.3-cp310-abi3-win32.whl", hash = 
"sha256:0c4eec6f987338617072592b97943fdbe30d019c56126493111cf24344c1cc24", size = 419624 }, + { url = "https://files.pythonhosted.org/packages/9c/4c/4563ebe001ff30dca9d7ed12e471fa098d9759712980cde1fd03a3a44fb7/protobuf-5.28.3-cp310-abi3-win_amd64.whl", hash = "sha256:91fba8f445723fcf400fdbe9ca796b19d3b1242cd873907979b9ed71e4afe868", size = 431464 }, + { url = "https://files.pythonhosted.org/packages/1c/f2/baf397f3dd1d3e4af7e3f5a0382b868d25ac068eefe1ebde05132333436c/protobuf-5.28.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:a3f6857551e53ce35e60b403b8a27b0295f7d6eb63d10484f12bc6879c715687", size = 414743 }, + { url = "https://files.pythonhosted.org/packages/85/50/cd61a358ba1601f40e7d38bcfba22e053f40ef2c50d55b55926aecc8fec7/protobuf-5.28.3-cp38-abi3-manylinux2014_aarch64.whl", hash = "sha256:3fa2de6b8b29d12c61911505d893afe7320ce7ccba4df913e2971461fa36d584", size = 316511 }, + { url = "https://files.pythonhosted.org/packages/5d/ae/3257b09328c0b4e59535e497b0c7537d4954038bdd53a2f0d2f49d15a7c4/protobuf-5.28.3-cp38-abi3-manylinux2014_x86_64.whl", hash = "sha256:712319fbdddb46f21abb66cd33cb9e491a5763b2febd8f228251add221981135", size = 316624 }, + { url = "https://files.pythonhosted.org/packages/57/b5/ee3d918f536168def73b3f49edeba065429ab3a7e7b033d33e69c46ddff9/protobuf-5.28.3-cp39-cp39-win32.whl", hash = "sha256:135658402f71bbd49500322c0f736145731b16fc79dc8f367ab544a17eab4535", size = 419648 }, + { url = "https://files.pythonhosted.org/packages/53/54/e1bdf6f1d29828ddb6aca0a83bf208ab1d5f88126f34e17e487b2cd20d93/protobuf-5.28.3-cp39-cp39-win_amd64.whl", hash = "sha256:70585a70fc2dd4818c51287ceef5bdba6387f88a578c86d47bb34669b5552c36", size = 431591 }, + { url = "https://files.pythonhosted.org/packages/ad/c3/2377c159e28ea89a91cf1ca223f827ae8deccb2c9c401e5ca233cd73002f/protobuf-5.28.3-py3-none-any.whl", hash = "sha256:cee1757663fa32a1ee673434fcf3bf24dd54763c79690201208bafec62f19eed", size = 169511 }, ] [[package]] @@ -1030,6 +1121,27 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/22/a6/858897256d0deac81a172289110f31629fc4cee19b6f01283303e18c8db3/ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", size = 13993 }, ] +[[package]] +name = "pyasn1" +version = "0.6.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/ba/e9/01f1a64245b89f039897cb0130016d79f77d52669aae6ee7b159a6c4c018/pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034", size = 145322 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c8/f1/d6a797abb14f6283c0ddff96bbdd46937f64122b8c925cab503dd37f8214/pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629", size = 83135 }, +] + +[[package]] +name = "pyasn1-modules" +version = "0.4.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyasn1" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/1d/67/6afbf0d507f73c32d21084a79946bfcfca5fbc62a72057e9c23797a737c9/pyasn1_modules-0.4.1.tar.gz", hash = "sha256:c28e2dbf9c06ad61c71a075c7e0f9fd0f1b0bb2d2ad4377f240d33ac2ab60a7c", size = 310028 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/77/89/bc88a6711935ba795a679ea6ebee07e128050d6382eaa35a0a47c8032bdc/pyasn1_modules-0.4.1-py3-none-any.whl", hash = "sha256:49bfa96b45a292b711e986f222502c1c9a5e1f4e568fc30e2574a6c7d07838fd", size = 181537 }, +] + [[package]] name = "pycparser" 
version = "2.22" @@ -1099,6 +1211,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/5e/22/d3db169895faaf3e2eda892f005f433a62db2decbcfbc2f61e6517adfa87/PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93", size = 212141 }, ] +[[package]] +name = "pyparsing" +version = "3.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/8c/d5/e5aeee5387091148a19e1145f63606619cb5f20b83fccb63efae6474e7b2/pyparsing-3.2.0.tar.gz", hash = "sha256:cbf74e27246d595d9a74b186b810f6fbb86726dbf3b9532efb343f6d7294fe9c", size = 920984 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/be/ec/2eb3cd785efd67806c46c13a17339708ddc346cbb684eade7a6e6f79536a/pyparsing-3.2.0-py3-none-any.whl", hash = "sha256:93d9577b88da0bbea8cc8334ee8b918ed014968fd2ec383e868fb8afb1ccef84", size = 106921 }, +] + [[package]] name = "pytest" version = "8.3.3" @@ -1180,13 +1301,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597 }, { url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527 }, { url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446 }, - { url = "https://files.pythonhosted.org/packages/74/d9/323a59d506f12f498c2097488d80d16f4cf965cee1791eab58b56b19f47a/PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a", size = 183218 }, - { url = "https://files.pythonhosted.org/packages/74/cc/20c34d00f04d785f2028737e2e2a8254e1425102e730fee1d6396f832577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5", size = 728067 }, - { url = "https://files.pythonhosted.org/packages/20/52/551c69ca1501d21c0de51ddafa8c23a0191ef296ff098e98358f69080577/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d", size = 757812 }, - { url = "https://files.pythonhosted.org/packages/fd/7f/2c3697bba5d4aa5cc2afe81826d73dfae5f049458e44732c7a0938baa673/PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083", size = 746531 }, - { url = "https://files.pythonhosted.org/packages/8c/ab/6226d3df99900e580091bb44258fde77a8433511a86883bd4681ea19a858/PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706", size = 800820 }, - { url = "https://files.pythonhosted.org/packages/a0/99/a9eb0f3e710c06c5d922026f6736e920d431812ace24aae38228d0d64b04/PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a", size = 145514 }, - { url = 
"https://files.pythonhosted.org/packages/75/8a/ee831ad5fafa4431099aa4e078d4c8efd43cd5e48fbc774641d233b683a9/PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff", size = 162702 }, { url = "https://files.pythonhosted.org/packages/65/d8/b7a1db13636d7fb7d4ff431593c510c8b8fca920ade06ca8ef20015493c5/PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d", size = 184777 }, { url = "https://files.pythonhosted.org/packages/0a/02/6ec546cd45143fdf9840b2c6be8d875116a64076218b61d68e12548e5839/PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f", size = 172318 }, { url = "https://files.pythonhosted.org/packages/0e/9a/8cc68be846c972bda34f6c2a93abb644fb2476f4dcc924d52175786932c9/PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290", size = 720891 }, @@ -1200,16 +1314,16 @@ wheels = [ [[package]] name = "readme-renderer" -version = "43.0" +version = "44.0" source = { registry = "https://pypi.org/simple" } dependencies = [ { name = "docutils" }, { name = "nh3" }, { name = "pygments" }, ] -sdist = { url = "https://files.pythonhosted.org/packages/fe/b5/536c775084d239df6345dccf9b043419c7e3308bc31be4c7882196abc62e/readme_renderer-43.0.tar.gz", hash = "sha256:1818dd28140813509eeed8d62687f7cd4f7bad90d4db586001c5dc09d4fde311", size = 31768 } +sdist = { url = "https://files.pythonhosted.org/packages/5a/a9/104ec9234c8448c4379768221ea6df01260cd6c2ce13182d4eac531c8342/readme_renderer-44.0.tar.gz", hash = "sha256:8712034eabbfa6805cacf1402b4eeb2a73028f72d1166d6f5cb7f9c047c5d1e1", size = 32056 } wheels = [ - { url = "https://files.pythonhosted.org/packages/45/be/3ea20dc38b9db08387cf97997a85a7d51527ea2057d71118feb0aa8afa55/readme_renderer-43.0-py3-none-any.whl", hash = "sha256:19db308d86ecd60e5affa3b2a98f017af384678c63c88e5d4556a380e674f3f9", size = 13301 }, + { url = "https://files.pythonhosted.org/packages/e1/67/921ec3024056483db83953ae8e48079ad62b92db7880013ca77632921dd0/readme_renderer-44.0-py3-none-any.whl", hash = "sha256:2fbca89b81a08526aadf1357a8c2ae889ec05fb03f5da67f9769c9a592166151", size = 13310 }, ] [[package]] @@ -1240,6 +1354,19 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928 }, ] +[[package]] +name = "requests-oauthlib" +version = "2.0.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "oauthlib" }, + { name = "requests" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/42/f2/05f29bc3913aea15eb670be136045bf5c5bbf4b99ecb839da9b422bb2c85/requests-oauthlib-2.0.0.tar.gz", hash = "sha256:b3dffaebd884d8cd778494369603a9e7b58d29111bf6b41bdc2dcd87203af4e9", size = 55650 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/5d/63d4ae3b9daea098d5d6f5da83984853c1bbacd5dc826764b249fe119d24/requests_oauthlib-2.0.0-py2.py3-none-any.whl", hash = "sha256:7dd8a5c40426b779b0868c404bdef9768deccf22749cde15852df527e6269b36", size = 24179 }, +] + [[package]] name = "requests-toolbelt" version = "1.0.0" @@ -1347,19 +1474,6 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/87/81/dc30bc449ccba63ad23a0f6633486d4e0e6955f45f3715a130dacabd6ad0/rpds_py-0.20.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:37fe0f12aebb6a0e3e17bb4cd356b1286d2d18d2e93b2d39fe647138458b4bcb", size = 531076 }, { url = "https://files.pythonhosted.org/packages/50/80/fb62ab48f3b5cfe704ead6ad372da1922ddaa76397055e02eb507054c979/rpds_py-0.20.1-cp313-none-win32.whl", hash = "sha256:a624cc00ef2158e04188df5e3016385b9353638139a06fb77057b3498f794782", size = 202804 }, { url = "https://files.pythonhosted.org/packages/d9/30/a3391e76d0b3313f33bdedd394a519decae3a953d2943e3dabf80ae32447/rpds_py-0.20.1-cp313-none-win_amd64.whl", hash = "sha256:b71b8666eeea69d6363248822078c075bac6ed135faa9216aa85f295ff009b1e", size = 220502 }, - { url = "https://files.pythonhosted.org/packages/53/ef/b1883734ea0cd9996de793cdc38c32a28143b04911d1e570090acd8a9162/rpds_py-0.20.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:5b48e790e0355865197ad0aca8cde3d8ede347831e1959e158369eb3493d2191", size = 327757 }, - { url = "https://files.pythonhosted.org/packages/54/63/47d34dc4ddb3da73e78e10c9009dcf8edc42d355a221351c05c822c2a50b/rpds_py-0.20.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3e310838a5801795207c66c73ea903deda321e6146d6f282e85fa7e3e4854804", size = 318785 }, - { url = "https://files.pythonhosted.org/packages/f7/e1/d6323be4afbe3013f28725553b7bfa80b3f013f91678af258f579f8ea8f9/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2249280b870e6a42c0d972339e9cc22ee98730a99cd7f2f727549af80dd5a963", size = 361511 }, - { url = "https://files.pythonhosted.org/packages/ab/d3/c40e4d9ecd571f0f50fe69bc53fe608d7b2c49b30738b480044990260838/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e79059d67bea28b53d255c1437b25391653263f0e69cd7dec170d778fdbca95e", size = 370201 }, - { url = "https://files.pythonhosted.org/packages/f1/b6/96a4a9977a8a06c2c49d90aa571346aff1642abf15066a39a0b4817bf049/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2b431c777c9653e569986ecf69ff4a5dba281cded16043d348bf9ba505486f36", size = 403866 }, - { url = "https://files.pythonhosted.org/packages/cd/8f/702b52287949314b498a311f92b5ee0ba30c702a27e0e6b560e2da43b8d5/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:da584ff96ec95e97925174eb8237e32f626e7a1a97888cdd27ee2f1f24dd0ad8", size = 430163 }, - { url = "https://files.pythonhosted.org/packages/c4/ce/af016c81fda833bf125b20d1677d816f230cad2ab189f46bcbfea3c7a375/rpds_py-0.20.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:02a0629ec053fc013808a85178524e3cb63a61dbc35b22499870194a63578fb9", size = 360776 }, - { url = "https://files.pythonhosted.org/packages/08/a7/988e179c9bef55821abe41762228d65077e0570ca75c9efbcd1bc6e263b4/rpds_py-0.20.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fbf15aff64a163db29a91ed0868af181d6f68ec1a3a7d5afcfe4501252840bad", size = 383008 }, - { url = "https://files.pythonhosted.org/packages/96/b0/e4077f7f1b9622112ae83254aedfb691490278793299bc06dcf54ec8c8e4/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:07924c1b938798797d60c6308fa8ad3b3f0201802f82e4a2c41bb3fafb44cc28", size = 546371 }, - { url = "https://files.pythonhosted.org/packages/e4/5e/1d4dd08ec0352cfe516ea93ea1993c2f656f893c87dafcd9312bd07f65f7/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_i686.whl", hash = 
"sha256:4a5a844f68776a7715ecb30843b453f07ac89bad393431efbf7accca3ef599c1", size = 549809 }, - { url = "https://files.pythonhosted.org/packages/57/ac/a716b4729ff23ec034b7d2ff76a86e6f0753c4098401bdfdf55b2efe90e6/rpds_py-0.20.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:518d2ca43c358929bf08f9079b617f1c2ca6e8848f83c1225c88caeac46e6cbc", size = 528492 }, - { url = "https://files.pythonhosted.org/packages/e0/ed/a0b58a9ecef79918169eacdabd14eb4c5c86ce71184ed56b80c6eb425828/rpds_py-0.20.1-cp38-none-win32.whl", hash = "sha256:3aea7eed3e55119635a74bbeb80b35e776bafccb70d97e8ff838816c124539f1", size = 200512 }, - { url = "https://files.pythonhosted.org/packages/5f/c3/222e25124283afc76c473fcd2c547e82ec57683fa31cb4d6c6eb44e5d57a/rpds_py-0.20.1-cp38-none-win_amd64.whl", hash = "sha256:7dca7081e9a0c3b6490a145593f6fe3173a94197f2cb9891183ef75e9d64c425", size = 218627 }, { url = "https://files.pythonhosted.org/packages/d6/87/e7e0fcbfdc0d0e261534bcc885f6ae6253095b972e32f8b8b1278c78a2a9/rpds_py-0.20.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:b41b6321805c472f66990c2849e152aff7bc359eb92f781e3f606609eac877ad", size = 327867 }, { url = "https://files.pythonhosted.org/packages/93/a0/17836b7961fc82586e9b818abdee2a27e2e605a602bb8c0d43f02092f8c2/rpds_py-0.20.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a90c373ea2975519b58dece25853dbcb9779b05cc46b4819cb1917e3b3215b6", size = 318893 }, { url = "https://files.pythonhosted.org/packages/dc/03/deb81d8ea3a8b974e7b03cfe8c8c26616ef8f4980dd430d8dd0a2f1b4d8e/rpds_py-0.20.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:16d4477bcb9fbbd7b5b0e4a5d9b493e42026c0bf1f06f723a9353f5153e75d30", size = 361664 }, @@ -1399,6 +1513,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/05/c3/10c68a08849f1fa45d205e54141fa75d316013e3d701ef01770ee1220bb8/rpds_py-0.20.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:83cba698cfb3c2c5a7c3c6bac12fe6c6a51aae69513726be6411076185a8b24a", size = 219991 }, ] +[[package]] +name = "rsa" +version = "4.9" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pyasn1" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/aa/65/7d973b89c4d2351d7fb232c2e452547ddfa243e93131e7cfa766da627b52/rsa-4.9.tar.gz", hash = "sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21", size = 29711 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/49/97/fa78e3d2f65c02c8e1268b9aba606569fe97f6c8f7c2d74394553347c145/rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7", size = 34315 }, +] + [[package]] name = "secretstorage" version = "3.3.3" @@ -1468,11 +1594,11 @@ wheels = [ [[package]] name = "tomli-w" -version = "1.0.0" +version = "1.1.0" source = { registry = "https://pypi.org/simple" } -sdist = { url = "https://files.pythonhosted.org/packages/49/05/6bf21838623186b91aedbda06248ad18f03487dc56fbc20e4db384abde6c/tomli_w-1.0.0.tar.gz", hash = "sha256:f463434305e0336248cac9c2dc8076b707d8a12d019dd349f5c1e382dd1ae1b9", size = 6531 } +sdist = { url = "https://files.pythonhosted.org/packages/d4/19/b65f1a088ee23e37cdea415b357843eca8b1422a7b11a9eee6e35d4ec273/tomli_w-1.1.0.tar.gz", hash = "sha256:49e847a3a304d516a169a601184932ef0f6b61623fe680f836a2aa7128ed0d33", size = 6929 } wheels = [ - { url = "https://files.pythonhosted.org/packages/bb/01/1da9c66ecb20f31ed5aa5316a957e0b1a5e786a0d9689616ece4ceaf1321/tomli_w-1.0.0-py3-none-any.whl", hash = 
"sha256:9f2a07e8be30a0729e533ec968016807069991ae2fd921a78d42f429ae5f4463", size = 5984 }, + { url = "https://files.pythonhosted.org/packages/c4/ac/ce90573ba446a9bbe65838ded066a805234d159b4446ae9f8ec5bbd36cbd/tomli_w-1.1.0-py3-none-any.whl", hash = "sha256:1403179c78193e3184bfaade390ddbd071cba48a32a2e62ba11aae47490c63f7", size = 6440 }, ] [[package]] @@ -1484,6 +1610,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/f9/b6/a447b5e4ec71e13871be01ba81f5dfc9d0af7e473da256ff46bc0e24026f/tomlkit-0.13.2-py3-none-any.whl", hash = "sha256:7a974427f6e119197f670fbbbeae7bef749a6c14e793db934baefc1b5f03efde", size = 37955 }, ] +[[package]] +name = "tqdm" +version = "4.67.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "colorama", marker = "sys_platform == 'win32'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540 }, +] + [[package]] name = "trove-classifiers" version = "2024.10.21.16" @@ -1522,6 +1660,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/26/9f/ad63fc0248c5379346306f8668cda6e2e2e9c95e01216d2b8ffd9ff037d0/typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d", size = 37438 }, ] +[[package]] +name = "uritemplate" +version = "4.1.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/d2/5a/4742fdba39cd02a56226815abfa72fe0aa81c33bed16ed045647d6000eba/uritemplate-4.1.1.tar.gz", hash = "sha256:4346edfc5c3b79f694bccd6d6099a322bbeb628dbf2cd86eea55a456ce5124f0", size = 273898 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/81/c0/7461b49cd25aeece13766f02ee576d1db528f1c37ce69aee300e075b485b/uritemplate-4.1.1-py2.py3-none-any.whl", hash = "sha256:830c08b8d99bdd312ea4ead05994a38e8936266f84b9a7878232db50b044e02e", size = 10356 }, +] + [[package]] name = "urllib3" version = "2.2.3" @@ -1543,6 +1690,31 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/43/99/3ec6335ded5b88c2f7ed25c56ffd952546f7ed007ffb1e1539dc3b57015a/userpath-1.9.2-py3-none-any.whl", hash = "sha256:2cbf01a23d655a1ff8fc166dfb78da1b641d1ceabf0fe5f970767d380b14e89d", size = 9065 }, ] +[[package]] +name = "uv" +version = "0.4.30" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/8e/66/8191736201d0b503f75cc5682e5d1a47e0e4fe55f5616605af8727e2c9de/uv-0.4.30.tar.gz", hash = "sha256:d9de718380e2f167243ca5e1dccea781e06404158442491255fec5955d57fed9", size = 2126167 } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/67/f8eefd7499740fc5c2764574ad2d577a50d925c506e74cd0557c2d64f05b/uv-0.4.30-py3-none-linux_armv6l.whl", hash = "sha256:4ddad09385221fa5c609169e4a0dd5bee27cf56c1dc450d4cdc113122c54bb09", size = 13447487 }, + { url = "https://files.pythonhosted.org/packages/bb/07/9e8f09a4f93fd3cda20e635392994bf15c79ec5c853b5d3fe001b8259ef6/uv-0.4.30-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f63d6646acdf2f38a5afca9fb9eeac62efa663a57f3c134f735a5f575b4e748f", size = 13478492 }, + { url = 
"https://files.pythonhosted.org/packages/67/37/8994c3d0be99851a21a6ee01bbf3cb35ddc4b202a2f6f4014098d5893660/uv-0.4.30-py3-none-macosx_11_0_arm64.whl", hash = "sha256:353617bfcf72e1eabade426d83fb86a69d11273d1612aabc3f4566d41c596c97", size = 12467039 }, + { url = "https://files.pythonhosted.org/packages/0a/bc/c5fc5ede7f073c850fe61d1b35d45d45936bd212a188a513e319d11e450c/uv-0.4.30-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.musllinux_1_1_aarch64.whl", hash = "sha256:dedcae3619f0eb181459b597fefefd99cb21fe5a5a48a530be6f5ad934399bfb", size = 12740841 }, + { url = "https://files.pythonhosted.org/packages/a1/a7/a728622e0990ba8fe5188387c7a21218e605f00297c6466ecd4caff068e4/uv-0.4.30-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:232575f30ed971ea32d4a525b7146c4b088a07ed6e70a31da63792d563fcac44", size = 13257182 }, + { url = "https://files.pythonhosted.org/packages/53/08/eb5283f4fb758537f18d5dfbb0f8dae3198be9f091e7a66d016a6a8c0b5c/uv-0.4.30-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c89f2eff63a08d04e81629611f43b1ffa668af6de0382b95a71599af7d4b77c", size = 13817386 }, + { url = "https://files.pythonhosted.org/packages/43/28/b1b914c67807cd05d0e0ffe682d82335fa9d222ebd271553aa423b34b734/uv-0.4.30-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4d41d09cabba1988728c2d9b9ad25f79233c2aa3d6ecd724c36f4678c4c89711", size = 14417701 }, + { url = "https://files.pythonhosted.org/packages/0a/5d/fa1294dec14271be15affd420bdbba415dbc7e3db5b63719f8fb6d5cef34/uv-0.4.30-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9ed0183e747065b9b1bcfb699ff10df671ebe6259709ce83e709f86cea564aee", size = 14163236 }, + { url = "https://files.pythonhosted.org/packages/ff/ce/e2fedbfcf055f79dd8c6e827d130bb8b9f2fd0841a6a0973baca8bdee242/uv-0.4.30-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9e17a799c6279800996828e10288ca8ccc40cc883d8998802b938aa671dfa9ce", size = 18250185 }, + { url = "https://files.pythonhosted.org/packages/3b/36/592477b62bbd1d652ec2d45a5a6daba7ed5a6ce008690eb0749e18733adb/uv-0.4.30-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:63196143f45018364c450ba94279a5bcff8562c14ba63deb41a92ed30baa6e22", size = 13953259 }, + { url = "https://files.pythonhosted.org/packages/f0/a1/4eb54d4b2809cb6b896881609d8620321f9907d052afee3111f72a50d16c/uv-0.4.30-py3-none-manylinux_2_28_aarch64.whl", hash = "sha256:6395820540f368f622e818735862abd633dfe7e729c450fca56b65bab4b46661", size = 12941390 }, + { url = "https://files.pythonhosted.org/packages/e4/68/e963aa4c235151f8f91442ffeb734642fa9d139630b5bcdb77719c84638f/uv-0.4.30-py3-none-musllinux_1_1_armv7l.whl", hash = "sha256:1a83df281c5d900b4758b1a3969b3cff57231f9027db8508b71dce1f2da78684", size = 13209967 }, + { url = "https://files.pythonhosted.org/packages/86/10/b72965bf44de9f31f5031efe9abad871b22c05884092314da4eb1233d0f0/uv-0.4.30-py3-none-musllinux_1_1_i686.whl", hash = "sha256:4aecd9fb39cf018e129627090a1d35af2b0184bb87078d573c9998f5e4072416", size = 13559034 }, + { url = "https://files.pythonhosted.org/packages/3a/58/2ed027ea9ae017d16a78f0b49e738f2df36ce67d2c1c836fcf442731170c/uv-0.4.30-py3-none-musllinux_1_1_ppc64le.whl", hash = "sha256:444468ad0e94b35cbf6acfc8a28589cfe1247136d43895e60a18955ff89a07ad", size = 15433457 }, + { url = "https://files.pythonhosted.org/packages/2b/db/b45b2d1470e39961e7d612f1f2ecd815de9b0fdd3298fbf14ef770863dbc/uv-0.4.30-py3-none-musllinux_1_1_x86_64.whl", hash = 
"sha256:ea55ca0fe5bdd04e46deaf395b3daf4fa92392f774e83610d066a2b272af5d3f", size = 14062977 }, + { url = "https://files.pythonhosted.org/packages/39/ee/1bac3464ae9c666c974a03e673a8cbb36023783a9c07de24d8a5e0473c4e/uv-0.4.30-py3-none-win32.whl", hash = "sha256:7f09bd6a853767863e2fb905f0eb1a0ed7afa9ea118852e5c02d2b451944e1cf", size = 13377566 }, + { url = "https://files.pythonhosted.org/packages/7b/05/3b42d33752cc0085369b4320e05ff667617de5a570be7cb358c6150ca046/uv-0.4.30-py3-none-win_amd64.whl", hash = "sha256:44c5aeb5b374f9fd1083959934daa9020db3610f0405198c5e3d8ec1f23d961d", size = 15022847 }, +] + [[package]] name = "virtualenv" version = "20.27.1" @@ -1593,16 +1765,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/14/26/93a9fa02c6f257df54d7570dfe8011995138118d11939a4ecd82cb849613/wrapt-1.16.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:418abb18146475c310d7a6dc71143d6f7adec5b004ac9ce08dc7a34e2babdc5c", size = 91738 }, { url = "https://files.pythonhosted.org/packages/a2/5b/4660897233eb2c8c4de3dc7cefed114c61bacb3c28327e64150dc44ee2f6/wrapt-1.16.0-cp312-cp312-win32.whl", hash = "sha256:685f568fa5e627e93f3b52fda002c7ed2fa1800b50ce51f6ed1d572d8ab3e7fc", size = 35568 }, { url = "https://files.pythonhosted.org/packages/5c/cc/8297f9658506b224aa4bd71906447dea6bb0ba629861a758c28f67428b91/wrapt-1.16.0-cp312-cp312-win_amd64.whl", hash = "sha256:dcdba5c86e368442528f7060039eda390cc4091bfd1dca41e8046af7c910dda8", size = 37653 }, - { url = "https://files.pythonhosted.org/packages/fe/9e/d3bc95e75670ba15c5b25ecf07fc49941843e2678d777ca59339348d1c96/wrapt-1.16.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1dd50a2696ff89f57bd8847647a1c363b687d3d796dc30d4dd4a9d1689a706f0", size = 37320 }, - { url = "https://files.pythonhosted.org/packages/72/b5/0c9be75f826c8e8d583a4ab312552d63d9f7c0768710146a22ac59bda4a9/wrapt-1.16.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:44a2754372e32ab315734c6c73b24351d06e77ffff6ae27d2ecf14cf3d229202", size = 38163 }, - { url = "https://files.pythonhosted.org/packages/69/21/b2ba809bafc9b6265e359f9c259c6d9a52a16cf6be20c72d95e76da609dd/wrapt-1.16.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e9723528b9f787dc59168369e42ae1c3b0d3fadb2f1a71de14531d321ee05b0", size = 83535 }, - { url = "https://files.pythonhosted.org/packages/58/43/d72e625edb5926483c9868214d25b5e7d5858ace6a80c9dfddfbadf4d8f9/wrapt-1.16.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:dbed418ba5c3dce92619656802cc5355cb679e58d0d89b50f116e4a9d5a9603e", size = 75975 }, - { url = "https://files.pythonhosted.org/packages/ef/c6/56e718e2c58a4078518c14d97e531ef1e9e8a5c1ddafdc0d264a92be1a1a/wrapt-1.16.0-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:941988b89b4fd6b41c3f0bfb20e92bd23746579736b7343283297c4c8cbae68f", size = 83363 }, - { url = "https://files.pythonhosted.org/packages/34/49/589db6fa2d5d428b71716815bca8b39196fdaeea7c247a719ed2f93b0ab4/wrapt-1.16.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:6a42cd0cfa8ffc1915aef79cb4284f6383d8a3e9dcca70c445dcfdd639d51267", size = 87739 }, - { url = "https://files.pythonhosted.org/packages/c5/40/3eabe06c8dc54fada7364f34e8caa562efe3bf3f769bf3258de9c785a27f/wrapt-1.16.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:1ca9b6085e4f866bd584fb135a041bfc32cab916e69f714a7d1d397f8c4891ca", size = 80700 }, - { url = 
"https://files.pythonhosted.org/packages/15/4e/081f59237b620a124b035f1229f55db40841a9339fdb8ef60b4decc44df9/wrapt-1.16.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d5e49454f19ef621089e204f862388d29e6e8d8b162efce05208913dde5b9ad6", size = 87783 }, - { url = "https://files.pythonhosted.org/packages/3a/ad/9d26a33bc80444ff97b937f94611f3b986fd40f735823558dfdf05ef9db8/wrapt-1.16.0-cp38-cp38-win32.whl", hash = "sha256:c31f72b1b6624c9d863fc095da460802f43a7c6868c5dda140f51da24fd47d7b", size = 35332 }, - { url = "https://files.pythonhosted.org/packages/01/db/4b29ba5f97d2a0aa97ec41eba1036b7c3eaf6e61e1f4639420cec2463a01/wrapt-1.16.0-cp38-cp38-win_amd64.whl", hash = "sha256:490b0ee15c1a55be9c1bd8609b8cecd60e325f0575fc98f50058eae366e01f41", size = 37524 }, { url = "https://files.pythonhosted.org/packages/70/cc/b92e1da2cad6a9f8ee481000ece07a35e3b24e041e60ff8b850c079f0ebf/wrapt-1.16.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9b201ae332c3637a42f02d1045e1d0cccfdc41f1f2f801dafbaa7e9b4797bfc2", size = 37314 }, { url = "https://files.pythonhosted.org/packages/4a/cc/3402bcc897978be00fef608cd9e3e39ec8869c973feeb5e1e277670e5ad2/wrapt-1.16.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:2076fad65c6736184e77d7d4729b63a6d1ae0b70da4868adeec40989858eb3fb", size = 38162 }, { url = "https://files.pythonhosted.org/packages/28/d3/4f079f649c515727c127c987b2ec2e0816b80d95784f2d28d1a57d2a1029/wrapt-1.16.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c5cd603b575ebceca7da5a3a251e69561bec509e0b46e4993e1cac402b7247b8", size = 80235 }, @@ -1698,22 +1860,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/02/90/2633473864f67a15526324b007a9f96c96f56d5f32ef2a56cc12f9548723/zstandard-0.23.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa6ce8b52c5987b3e34d5674b0ab529a4602b632ebab0a93b07bfb4dfc8f8a33", size = 5191299 }, { url = "https://files.pythonhosted.org/packages/b0/4c/315ca5c32da7e2dc3455f3b2caee5c8c2246074a61aac6ec3378a97b7136/zstandard-0.23.0-cp313-cp313-win32.whl", hash = "sha256:a9b07268d0c3ca5c170a385a0ab9fb7fdd9f5fd866be004c4ea39e44edce47dd", size = 430862 }, { url = "https://files.pythonhosted.org/packages/a2/bf/c6aaba098e2d04781e8f4f7c0ba3c7aa73d00e4c436bcc0cf059a66691d1/zstandard-0.23.0-cp313-cp313-win_amd64.whl", hash = "sha256:f3513916e8c645d0610815c257cbfd3242adfd5c4cfa78be514e5a3ebb42a41b", size = 495578 }, - { url = "https://files.pythonhosted.org/packages/fb/96/867dd4f5e9ee6215f83985c43f4134b28c058617a7af8ad9592669f960dd/zstandard-0.23.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:2ef3775758346d9ac6214123887d25c7061c92afe1f2b354f9388e9e4d48acfc", size = 788685 }, - { url = "https://files.pythonhosted.org/packages/19/57/e81579db7740757036e97dc461f4f26a318fe8dfc6b3477dd557b7f85aae/zstandard-0.23.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4051e406288b8cdbb993798b9a45c59a4896b6ecee2f875424ec10276a895740", size = 633665 }, - { url = "https://files.pythonhosted.org/packages/ac/a5/b8c9d79511796684a2a653843e0464dfcc11a052abb5855af7035d919ecc/zstandard-0.23.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2d1a054f8f0a191004675755448d12be47fa9bebbcffa3cdf01db19f2d30a54", size = 4944817 }, - { url = "https://files.pythonhosted.org/packages/fa/59/ee5a3c4f060c431d3aaa7ff2b435d9723c579bffda274d071c981bf08b17/zstandard-0.23.0-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f83fa6cae3fff8e98691248c9320356971b59678a17f20656a9e59cd32cee6d8", size = 5311485 }, - { url = 
"https://files.pythonhosted.org/packages/8a/70/ea438a09d757d49c5bb73a895c13492277b83981c08ed294441b1965eaf2/zstandard-0.23.0-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:32ba3b5ccde2d581b1e6aa952c836a6291e8435d788f656fe5976445865ae045", size = 5340843 }, - { url = "https://files.pythonhosted.org/packages/1c/4b/be9f3f9ed33ff4d5e578cf167c16ac1d8542232d5e4831c49b615b5918a6/zstandard-0.23.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f146f50723defec2975fb7e388ae3a024eb7151542d1599527ec2aa9cacb152", size = 5442446 }, - { url = "https://files.pythonhosted.org/packages/ef/17/55eff9df9004e1896f2ade19981e7cd24d06b463fe72f9a61f112b8185d0/zstandard-0.23.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1bfe8de1da6d104f15a60d4a8a768288f66aa953bbe00d027398b93fb9680b26", size = 4863800 }, - { url = "https://files.pythonhosted.org/packages/59/8c/fe542982e63e1948066bf2adc18e902196eb08f3407188474b5a4e855e2e/zstandard-0.23.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:29a2bc7c1b09b0af938b7a8343174b987ae021705acabcbae560166567f5a8db", size = 4935488 }, - { url = "https://files.pythonhosted.org/packages/38/6c/a54e30864aff0cc065c053fbdb581114328f70f45f30fcb0f80b12bb4460/zstandard-0.23.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:61f89436cbfede4bc4e91b4397eaa3e2108ebe96d05e93d6ccc95ab5714be512", size = 5467670 }, - { url = "https://files.pythonhosted.org/packages/ba/11/32788cc80aa8c1069a9fdc48a60355bd25ac8211b2414dd0ff6ee6bb5ff5/zstandard-0.23.0-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:53ea7cdc96c6eb56e76bb06894bcfb5dfa93b7adcf59d61c6b92674e24e2dd5e", size = 4859904 }, - { url = "https://files.pythonhosted.org/packages/60/93/baf7ad86b2258c08c06bdccdaddeb3d6d0918601e16fa9c73c8079c8c816/zstandard-0.23.0-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:a4ae99c57668ca1e78597d8b06d5af837f377f340f4cce993b551b2d7731778d", size = 4700723 }, - { url = "https://files.pythonhosted.org/packages/95/bd/e65f1c1e0185ed0c7f5bda51b0d73fc379a75f5dc2583aac83dd131378dc/zstandard-0.23.0-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:379b378ae694ba78cef921581ebd420c938936a153ded602c4fea612b7eaa90d", size = 5208667 }, - { url = "https://files.pythonhosted.org/packages/dc/cf/2dfa4610829c6c1dbc3ce858caed6de13928bec78c1e4d0bedfd4b20589b/zstandard-0.23.0-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:50a80baba0285386f97ea36239855f6020ce452456605f262b2d33ac35c7770b", size = 5667083 }, - { url = "https://files.pythonhosted.org/packages/16/f6/d84d95984fb9c8f57747ffeff66677f0a58acf430f9ddff84bc3b9aad35d/zstandard-0.23.0-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:61062387ad820c654b6a6b5f0b94484fa19515e0c5116faf29f41a6bc91ded6e", size = 5195874 }, - { url = "https://files.pythonhosted.org/packages/fc/a6/239f43f2e3ea0360c5641c075bd587c7f2a32b29d9ba53a538435621bcbb/zstandard-0.23.0-cp38-cp38-win32.whl", hash = "sha256:b8c0bd73aeac689beacd4e7667d48c299f61b959475cdbb91e7d3d88d27c56b9", size = 430654 }, - { url = "https://files.pythonhosted.org/packages/d5/b6/16e737301831c9c62379ed466c3d916c56b8a9a95fbce9bf1d7fea318945/zstandard-0.23.0-cp38-cp38-win_amd64.whl", hash = "sha256:a05e6d6218461eb1b4771d973728f0133b2a4613a6779995df557f70794fd60f", size = 495519 }, { url = "https://files.pythonhosted.org/packages/fb/96/4fcafeb7e013a2386d22f974b5b97a0b9a65004ed58c87ae001599bfbd48/zstandard-0.23.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = 
"sha256:3aa014d55c3af933c1315eb4bb06dd0459661cc0b15cd61077afa6489bec63bb", size = 788697 }, { url = "https://files.pythonhosted.org/packages/83/ff/a52ce725be69b86a2967ecba0497a8184540cc284c0991125515449e54e2/zstandard-0.23.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0a7f0804bb3799414af278e9ad51be25edf67f78f916e08afdb983e74161b916", size = 633679 }, { url = "https://files.pythonhosted.org/packages/34/0f/3dc62db122f6a9c481c335fff6fc9f4e88d8f6e2d47321ee3937328addb4/zstandard-0.23.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb2b1ecfef1e67897d336de3a0e3f52478182d6a47eda86cbd42504c5cbd009a", size = 4940416 }, diff --git a/dev/stats/explore_pr_candidates.ipynb b/dev/stats/explore_pr_candidates.ipynb index ead9e977f9bce..afae804ebbc54 100644 --- a/dev/stats/explore_pr_candidates.ipynb +++ b/dev/stats/explore_pr_candidates.ipynb @@ -19,7 +19,7 @@ "metadata": {}, "outputs": [], "source": [ - "file = open(\"prlist\",\"rb\") # open the pickled file\n", + "file = open(\"prlist\", \"rb\") # open the pickled file\n", "selected_prs = pickle.load(file)" ] }, @@ -33,26 +33,26 @@ "\n", "for pr_stat in selected_prs:\n", " data = {\n", - " 'number': [pr_stat.pull_request.number],\n", - " 'url': [pr_stat.pull_request.html_url],\n", - " 'title': [pr_stat.pull_request.title],\n", - " 'overall_score': [pr_stat.score],\n", - " 'label_score': [pr_stat.label_score],\n", - " 'length_score': [pr_stat.length_score],\n", - " 'body_length': [pr_stat.body_length],\n", - " 'comment_length': [pr_stat.comment_length],\n", - " 'interaction_score': [pr_stat.interaction_score],\n", - " 'comments': [pr_stat.num_comments],\n", - " 'reactions': [pr_stat.num_reactions],\n", - " 'reviews': [pr_stat.num_reviews],\n", - " 'num_interacting_users': [pr_stat.num_interacting_users],\n", - " 'change_score': [pr_stat.change_score],\n", - " 'additions': [pr_stat.num_additions],\n", - " 'deletions': [pr_stat.num_deletions],\n", - " 'num_changed_files': [pr_stat.num_changed_files],\n", + " \"number\": [pr_stat.pull_request.number],\n", + " \"url\": [pr_stat.pull_request.html_url],\n", + " \"title\": [pr_stat.pull_request.title],\n", + " \"overall_score\": [pr_stat.score],\n", + " \"label_score\": [pr_stat.label_score],\n", + " \"length_score\": [pr_stat.length_score],\n", + " \"body_length\": [pr_stat.body_length],\n", + " \"comment_length\": [pr_stat.comment_length],\n", + " \"interaction_score\": [pr_stat.interaction_score],\n", + " \"comments\": [pr_stat.num_comments],\n", + " \"reactions\": [pr_stat.num_reactions],\n", + " \"reviews\": [pr_stat.num_reviews],\n", + " \"num_interacting_users\": [pr_stat.num_interacting_users],\n", + " \"change_score\": [pr_stat.change_score],\n", + " \"additions\": [pr_stat.num_additions],\n", + " \"deletions\": [pr_stat.num_deletions],\n", + " \"num_changed_files\": [pr_stat.num_changed_files],\n", " }\n", " df = pd.DataFrame(data)\n", - " rows = pd.concat([df, rows]).reset_index(drop = True)" + " rows = pd.concat([df, rows]).reset_index(drop=True)" ] }, { diff --git a/docs/apache-airflow-providers-openlineage/guides/developer.rst b/docs/apache-airflow-providers-openlineage/guides/developer.rst index 806f2c21455f0..67e355cdc77a4 100644 --- a/docs/apache-airflow-providers-openlineage/guides/developer.rst +++ b/docs/apache-airflow-providers-openlineage/guides/developer.rst @@ -63,16 +63,13 @@ OpenLineage defines a few methods for implementation in Operators. Those are ref .. code-block:: python - def get_openlineage_facets_on_start() -> OperatorLineage: - ... 
+ def get_openlineage_facets_on_start() -> OperatorLineage: ... - def get_openlineage_facets_on_complete(ti: TaskInstance) -> OperatorLineage: - ... + def get_openlineage_facets_on_complete(ti: TaskInstance) -> OperatorLineage: ... - def get_openlineage_facets_on_failure(ti: TaskInstance) -> OperatorLineage: - ... + def get_openlineage_facets_on_failure(ti: TaskInstance) -> OperatorLineage: ... OpenLineage methods get called respectively when task instance changes state to: diff --git a/docs/apache-airflow/core-concepts/dag-run.rst b/docs/apache-airflow/core-concepts/dag-run.rst index abfd27823c83f..68da9a0067574 100644 --- a/docs/apache-airflow/core-concepts/dag-run.rst +++ b/docs/apache-airflow/core-concepts/dag-run.rst @@ -100,6 +100,7 @@ in the configuration file. When turned off, the scheduler creates a DAG run only Code that goes along with the Airflow tutorial located at: https://github.com/apache/airflow/blob/main/airflow/example_dags/tutorial.py """ + from airflow.models.dag import DAG from airflow.operators.bash import BashOperator diff --git a/docs/apache-airflow/core-concepts/dags.rst b/docs/apache-airflow/core-concepts/dags.rst index 39a2816beabcb..feb1db349d5be 100644 --- a/docs/apache-airflow/core-concepts/dags.rst +++ b/docs/apache-airflow/core-concepts/dags.rst @@ -663,6 +663,7 @@ This is especially useful if your tasks are built dynamically from configuration """ ### My great DAG """ + import pendulum dag = DAG( diff --git a/docs/apache-airflow/core-concepts/tasks.rst b/docs/apache-airflow/core-concepts/tasks.rst index e5c52b1d7684b..613eb1b570c30 100644 --- a/docs/apache-airflow/core-concepts/tasks.rst +++ b/docs/apache-airflow/core-concepts/tasks.rst @@ -210,13 +210,11 @@ Examples of ``sla_miss_callback`` function signature: .. code-block:: python - def my_sla_miss_callback(dag, task_list, blocking_task_list, slas, blocking_tis): - ... + def my_sla_miss_callback(dag, task_list, blocking_task_list, slas, blocking_tis): ... .. code-block:: python - def my_sla_miss_callback(*args): - ... + def my_sla_miss_callback(*args): ... Example DAG: diff --git a/pyproject.toml b/pyproject.toml index 0f7d7258aab00..0817479446f12 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -22,13 +22,13 @@ # The dependencies can be automatically upgraded by running: # pre-commit run --hook-stage manual update-build-dependencies --all-files requires = [ - "GitPython==3.1.43", - "gitdb==4.0.11", + "GitPython==3.1.44", + "gitdb==4.0.12", "hatchling==1.27.0", "packaging==24.2", "pathspec==0.12.1", "pluggy==1.5.0", - "smmap==5.0.1", + "smmap==5.0.2", "tomli==2.2.1; python_version < '3.11'", "trove-classifiers==2024.10.21.16", ] @@ -154,46 +154,12 @@ Homepage = "https://airflow.apache.org/" "Release Notes" = "https://airflow.apache.org/docs/apache-airflow/stable/release_notes.html" "Slack Chat" = "https://s.apache.org/airflow-slack" "Source Code" = "https://github.com/apache/airflow" -Twitter = "https://twitter.com/ApacheAirflow" +X = "https://x.com/ApacheAirflow" +LinkedIn = "https://www.linkedin.com/company/apache-airflow/" +Mastodon = "https://fosstodon.org/@airflow" +Bluesky = "https://bsky.app/profile/apache-airflow.bsky.social" YouTube = "https://www.youtube.com/channel/UCSXwxpWZQ7XZ1WL3wqevChA/" -[tool.hatch.envs.default] -python = "3.8" -platforms = ["linux", "macos"] -description = "Default environment with Python 3.8 for maximum compatibility" -features = [] - -[tool.hatch.envs.airflow-38] -python = "3.8" -platforms = ["linux", "macos"] -description = "Environment with Python 3.8. 
No devel installed." -features = [] - -[tool.hatch.envs.airflow-39] -python = "3.9" -platforms = ["linux", "macos"] -description = "Environment with Python 3.9. No devel installed." -features = [] - -[tool.hatch.envs.airflow-310] -python = "3.10" -platforms = ["linux", "macos"] -description = "Environment with Python 3.10. No devel installed." -features = [] - -[tool.hatch.envs.airflow-311] -python = "3.11" -platforms = ["linux", "macos"] -description = "Environment with Python 3.11. No devel installed" -features = [] - -[tool.hatch.envs.airflow-312] -python = "3.12" -platforms = ["linux", "macos"] -description = "Environment with Python 3.12. No devel installed" -features = [] - - [tool.hatch.version] path = "airflow/__init__.py" @@ -263,7 +229,7 @@ extend-select = [ "UP", # Pyupgrade "ASYNC", # subset of flake8-async rules "ISC", # Checks for implicit literal string concatenation (auto-fixable) - "TCH", # Rules around TYPE_CHECKING blocks + "TC", # Rules around TYPE_CHECKING blocks "G", # flake8-logging-format rules "LOG", # flake8-logging rules, most of them autofixable "PT", # flake8-pytest-style rules @@ -309,9 +275,7 @@ ignore = [ "D214", "D215", "E731", # Do not assign a lambda expression, use a def - "TCH003", # Do not move imports from stdlib to TYPE_CHECKING block - "PT004", # Fixture does not return anything, add leading underscore - "PT005", # Fixture returns a value, remove leading underscore + "TC003", # Do not move imports from stdlib to TYPE_CHECKING block "PT006", # Wrong type of names in @pytest.mark.parametrize "PT007", # Wrong type of values in @pytest.mark.parametrize "PT011", # pytest.raises() is too broad, set the match parameter @@ -332,6 +296,7 @@ ignore = [ "COM812", "COM819", "E501", # Formatted code may exceed the line length, leading to line-too-long (E501) errors. + "ASYNC110", # TODO: Use `anyio.Event` instead of awaiting `anyio.sleep` in a `while` loop ] unfixable = [ # PT022 replace empty `yield` to empty `return`. Might be fixed with a combination of PLR1711 @@ -557,6 +522,9 @@ python_files = [ testpaths = [ "tests", ] + +asyncio_default_fixture_loop_scope = "function" + # Keep temporary directories (created by `tmp_path`) for 2 recent runs only failed tests. tmp_path_retention_count = "2" tmp_path_retention_policy = "failed" diff --git a/scripts/ci/cleanup_docker.sh b/scripts/ci/cleanup_docker.sh index a61a77fd45f5e..400cc501541cd 100755 --- a/scripts/ci/cleanup_docker.sh +++ b/scripts/ci/cleanup_docker.sh @@ -16,7 +16,10 @@ # specific language governing permissions and limitations # under the License. function cleanup_docker { - docker system prune --all --force --volumes || true + # This is faster than docker prune + sudo systemctl stop docker + sudo rm -rf /var/lib/docker + sudo systemctl start docker } cleanup_docker diff --git a/scripts/ci/constraints/ci_commit_constraints.sh b/scripts/ci/constraints/ci_commit_constraints.sh index 727ddcf6257f8..c452d7a4a8e91 100755 --- a/scripts/ci/constraints/ci_commit_constraints.sh +++ b/scripts/ci/constraints/ci_commit_constraints.sh @@ -26,9 +26,6 @@ This update in constraints is automatically committed by the CI 'constraints-pus The action that build those constraints can be found at https://github.com/${GITHUB_REPOSITORY}/actions/runs/${GITHUB_RUN_ID}/ -The image tag used for that build was: ${IMAGE_TAG}. You can enter Breeze environment -with this image by running 'breeze shell --image-tag ${IMAGE_TAG}' - All tests passed in this build so we determined we can push the updated constraints. 
See https://github.com/apache/airflow/blob/main/README.md#installing-from-pypi for details. diff --git a/scripts/ci/docker-compose/base.yml b/scripts/ci/docker-compose/base.yml index 5450a4af8c04f..da448652e4b8e 100644 --- a/scripts/ci/docker-compose/base.yml +++ b/scripts/ci/docker-compose/base.yml @@ -17,7 +17,7 @@ --- services: airflow: - image: ${AIRFLOW_CI_IMAGE_WITH_TAG} + image: ${AIRFLOW_CI_IMAGE} pull_policy: never environment: - USER=root diff --git a/scripts/ci/docker-compose/devcontainer.env b/scripts/ci/docker-compose/devcontainer.env index 2b7cddb47eb6a..b63c7cf5fdd22 100644 --- a/scripts/ci/docker-compose/devcontainer.env +++ b/scripts/ci/docker-compose/devcontainer.env @@ -17,7 +17,6 @@ HOME= AIRFLOW_CI_IMAGE="ghcr.io/apache/airflow/main/ci/python3.8:latest" ANSWER= -AIRFLOW_ENABLE_AIP_44="true" AIRFLOW_ENV="development" PYTHON_MAJOR_MINOR_VERSION="3.8" AIRFLOW_EXTRAS= @@ -39,7 +38,6 @@ DEV_MODE="true" DOCKER_IS_ROOTLESS="false" DOWNGRADE_PENDULUM="false" DOWNGRADE_SQLALCHEMY="false" -ENABLED_SYSTEMS= GITHUB_ACTIONS="false" HELM_TEST_PACKAGE="" HOST_USER_ID= @@ -59,11 +57,9 @@ NUM_RUNS= ONLY_MIN_VERSION_UPDATE="false" PACKAGE_FORMAT= POSTGRES_VERSION=10 -PYDANTIC="v2" PYTHONDONTWRITEBYTECODE="true" REMOVE_ARM_PACKAGES="false" RUN_TESTS="false" -RUN_SYSTEM_TESTS="" AIRFLOW_SKIP_CONSTRAINTS="false" SKIP_SSH_SETUP="true" SKIP_ENVIRONMENT_INITIALIZATION="false" diff --git a/scripts/ci/docker-compose/forward-credentials.yml b/scripts/ci/docker-compose/forward-credentials.yml index 03a24f4455063..fcc4e4e67d1dd 100644 --- a/scripts/ci/docker-compose/forward-credentials.yml +++ b/scripts/ci/docker-compose/forward-credentials.yml @@ -29,3 +29,4 @@ services: - ${HOME}/.config:/root/.config:cached - ${HOME}/.docker:/root/.docker:cached - ${HOME}/.snowsql:/root/.snowsql:cached + - ${HOME}/.ssh:/root/.ssh:cached diff --git a/scripts/ci/docker-compose/integration-keycloak.yml b/scripts/ci/docker-compose/integration-keycloak.yml new file mode 100644 index 0000000000000..7373c5fb61773 --- /dev/null +++ b/scripts/ci/docker-compose/integration-keycloak.yml @@ -0,0 +1,62 @@ +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +--- +services: + keycloak: + image: quay.io/keycloak/keycloak:23.0.6 + labels: + breeze.description: "Integration for manual testing of multi-team Airflow." 
+ entrypoint: /opt/keycloak/keycloak-entrypoint.sh + environment: + KC_HOSTNAME: localhost + KC_HOSTNAME_PORT: 48080 + KC_HOSTNAME_STRICT_BACKCHANNEL: false + KC_HTTP_ENABLED: true + KC_HOSTNAME_STRICT: true + + KEYCLOAK_ADMIN: admin + KEYCLOAK_ADMIN_PASSWORD: admin + + KC_DB: postgres + KC_DB_URL: jdbc:postgresql://postgres/keycloak + KC_DB_USERNAME: keycloak + KC_DB_PASSWORD: keycloak + ports: + - 48080:48080 + restart: always + depends_on: + postgres: + condition: service_healthy + volumes: + - ./keycloak/keycloak-entrypoint.sh:/opt/keycloak/keycloak-entrypoint.sh + + postgres: + volumes: + - ./keycloak/init-keycloak-db.sh:/docker-entrypoint-initdb.d/init-keycloak-db.sh + environment: + KC_POSTGRES_DB: keycloak + KC_POSTGRES_USER: keycloak + KC_POSTGRES_PASSWORD: keycloak + healthcheck: + test: ["CMD", "psql", "-h", "localhost", "-U", "keycloak"] + interval: 10s + timeout: 10s + retries: 5 + + airflow: + depends_on: + - keycloak diff --git a/scripts/ci/docker-compose/integration-openlineage.yml b/scripts/ci/docker-compose/integration-openlineage.yml index 22719886576fe..e0d69676c1152 100644 --- a/scripts/ci/docker-compose/integration-openlineage.yml +++ b/scripts/ci/docker-compose/integration-openlineage.yml @@ -17,7 +17,7 @@ --- services: marquez: - image: marquezproject/marquez:0.40.0 + image: marquezproject/marquez:0.49.0 labels: breeze.description: "Integration required for Openlineage hooks." environment: @@ -33,7 +33,7 @@ services: entrypoint: ["./entrypoint.sh"] marquez_web: - image: marquezproject/marquez-web:0.40.0 + image: marquezproject/marquez-web:0.49.0 environment: - MARQUEZ_HOST=marquez - MARQUEZ_PORT=5000 diff --git a/scripts/ci/docker-compose/keycloak/init-keycloak-db.sh b/scripts/ci/docker-compose/keycloak/init-keycloak-db.sh new file mode 100755 index 0000000000000..47df6aede204d --- /dev/null +++ b/scripts/ci/docker-compose/keycloak/init-keycloak-db.sh @@ -0,0 +1,27 @@ +#!/bin/sh + +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +set -eu + +psql -v ON_ERROR_STOP=1 --username "${POSTGRES_USER}" > /dev/null <<-EOSQL + CREATE USER ${KC_POSTGRES_USER}; + ALTER USER ${KC_POSTGRES_USER} WITH PASSWORD '${KC_POSTGRES_PASSWORD}'; + CREATE DATABASE ${KC_POSTGRES_DB}; + GRANT ALL PRIVILEGES ON DATABASE ${KC_POSTGRES_DB} TO ${KC_POSTGRES_USER}; +EOSQL diff --git a/scripts/ci/docker-compose/keycloak/keycloak-entrypoint.sh b/scripts/ci/docker-compose/keycloak/keycloak-entrypoint.sh new file mode 100755 index 0000000000000..e699d858346aa --- /dev/null +++ b/scripts/ci/docker-compose/keycloak/keycloak-entrypoint.sh @@ -0,0 +1,45 @@ +#!/bin/bash + +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. 
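The compose service above publishes Keycloak on port 48080, and the entrypoint below waits for that port before relaxing the SSL requirement on the master realm. As a hedged sketch (assuming the Quarkus-based Keycloak layout, which serves realm metadata at /realms/<name> without the legacy /auth prefix), the service can be sanity-checked from Python once it is up:

    import json
    from urllib.request import urlopen

    # 48080 matches KC_HOSTNAME_PORT in the compose file above.
    with urlopen("http://localhost:48080/realms/master", timeout=5) as resp:
        info = json.load(resp)

    print(info["realm"])  # expected to print "master" once Keycloak has finished booting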
See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +# We exit in case cd fails +cd /opt/keycloak/bin/ || exit + +http_port="${KC_HOSTNAME_PORT}" + +# Start Keycloak in the background +./kc.sh start-dev --http-port="$http_port" & + +# Wait for Keycloak to be ready +echo "Waiting for Keycloak to start on port $http_port..." +while ! (echo > /dev/tcp/localhost/"$http_port") 2>/dev/null; do + echo "keycloak still not started" + sleep 5 +done +sleep 3 +echo "Keycloak is running (probably...)" + +# The below commands are used to disable the ssl requirement to use the admin panel of keycloak +echo "Configuring admin console access without ssl/https" +# Get credentials to make the below update to the realm settings +./kcadm.sh config credentials --server http://localhost:"$http_port" --realm master --user admin --password admin +./kcadm.sh update realms/master -s sslRequired=NONE --server http://localhost:"$http_port" +echo "Configuring complete!" + +# Keep the container running +wait diff --git a/scripts/ci/docker-compose/providers-and-tests-sources.yml b/scripts/ci/docker-compose/providers-and-tests-sources.yml index e792d783dce15..8a06f2fcc0d1f 100644 --- a/scripts/ci/docker-compose/providers-and-tests-sources.yml +++ b/scripts/ci/docker-compose/providers-and-tests-sources.yml @@ -21,6 +21,7 @@ services: tty: true # docker run -t environment: - AIRFLOW__CORE__PLUGINS_FOLDER=/files/plugins + - LINK_PROVIDERS_TO_AIRFLOW_PACKAGE=true # We only mount tests folder volumes: - ../../../.bash_aliases:/root/.bash_aliases:cached @@ -30,8 +31,8 @@ services: - ../../../empty:/opt/airflow/airflow # but keep tests - ../../../tests/:/opt/airflow/tests:cached - # and providers - - ../../../airflow/providers:/opt/airflow/airflow/providers:cached + # Mount providers to make sure that we have the latest providers - both tests and sources + - ../../../providers/:/opt/airflow/providers:cached # and entrypoint and in_container scripts for testing - ../../../scripts/docker/entrypoint_ci.sh:/entrypoint - ../../../scripts/in_container/:/opt/airflow/scripts/in_container diff --git a/scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh b/scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh deleted file mode 100755 index fb02a9d2ef7ca..0000000000000 --- a/scripts/ci/images/ci_start_arm_instance_and_connect_to_docker.sh +++ /dev/null @@ -1,91 +0,0 @@ -#!/usr/bin/env bash -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. 
You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -SCRIPTS_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd)" -# This is an AMI that is based on Basic Amazon Linux AMI with installed and configured docker service -WORKING_DIR="/tmp/armdocker" -INSTANCE_INFO="${WORKING_DIR}/instance_info.json" -ARM_AMI="ami-0e43196369d299715" # AMI ID of latest arm-docker-ami-v* -INSTANCE_TYPE="m7g.medium" # m7g.medium -> 1 vCPUS 4 GB RAM -MARKET_OPTIONS="MarketType=spot,SpotOptions={MaxPrice=0.25,SpotInstanceType=one-time}" -REGION="us-east-2" -EC2_USER="ec2-user" -USER_DATA_FILE="${SCRIPTS_DIR}/initialize.sh" -METADATA_ADDRESS="http://169.254.169.254/latest/meta-data" -MAC_ADDRESS=$(curl -s "${METADATA_ADDRESS}/network/interfaces/macs/" | head -n1 | tr -d '/') -CIDR=$(curl -s "${METADATA_ADDRESS}/network/interfaces/macs/${MAC_ADDRESS}/vpc-ipv4-cidr-block/") - -: "${GITHUB_TOKEN:?Should be set}" - -function start_arm_instance() { - set -x - mkdir -p "${WORKING_DIR}" - cd "${WORKING_DIR}" || exit 1 - aws ec2 run-instances \ - --region "${REGION}" \ - --image-id "${ARM_AMI}" \ - --count 1 \ - --block-device-mappings "[{\"DeviceName\":\"/dev/xvda\",\"Ebs\":{\"VolumeSize\":16}}]" \ - --instance-type "${INSTANCE_TYPE}" \ - --user-data "file://${USER_DATA_FILE}" \ - --instance-market-options "${MARKET_OPTIONS}" \ - --instance-initiated-shutdown-behavior terminate \ - --output json \ - > "${INSTANCE_INFO}" - - INSTANCE_ID=$(jq < "${INSTANCE_INFO}" ".Instances[0].InstanceId" -r) - if [[ ${INSTANCE_ID} == "" ]]; then - echo "ERROR!!!! Failed to start ARM instance. Likely because it could not be allocated on spot market." - exit 1 - fi - AVAILABILITY_ZONE=$(jq < "${INSTANCE_INFO}" ".Instances[0].Placement.AvailabilityZone" -r) - aws ec2 wait instance-status-ok --instance-ids "${INSTANCE_ID}" - INSTANCE_PRIVATE_DNS_NAME=$(aws ec2 describe-instances \ - --filters "Name=instance-state-name,Values=running" "Name=instance-id,Values=${INSTANCE_ID}" \ - --query 'Reservations[*].Instances[*].PrivateDnsName' --output text) - SECURITY_GROUP=$(jq < "${INSTANCE_INFO}" ".Instances[0].NetworkInterfaces[0].Groups[0].GroupId" -r) - rm -f my_key - ssh-keygen -t rsa -f my_key -N "" - aws ec2-instance-connect send-ssh-public-key --instance-id "${INSTANCE_ID}" \ - --availability-zone "${AVAILABILITY_ZONE}" \ - --instance-os-user "${EC2_USER}" \ - --ssh-public-key "file://${WORKING_DIR}/my_key.pub" - aws ec2 authorize-security-group-ingress --region "${REGION}" --group-id "${SECURITY_GROUP}" \ - --protocol tcp --port 22 --cidr "${CIDR}" || true - export AUTOSSH_LOGFILE="${WORKING_DIR}/autossh.log" - autossh -f "-L12357:/var/run/docker.sock" \ - -N -o "IdentitiesOnly=yes" -o "StrictHostKeyChecking=no" \ - -i "${WORKING_DIR}/my_key" "${EC2_USER}@${INSTANCE_PRIVATE_DNS_NAME}" - - bash -c 'echo -n "Waiting port 12357 .."; for _ in `seq 1 40`; do echo -n .; sleep 0.25; nc -z localhost 12357 && echo " Open." && exit ; done; echo " Timeout!" 
>&2; exit 1' -} - -function create_context() { - echo - echo "Creating buildx context: airflow_cache" - echo - docker buildx rm --force airflow_cache || true - docker buildx create --name airflow_cache - docker buildx create --name airflow_cache --append localhost:12357 - docker buildx ls - echo - echo "Context created" - echo -} - -start_arm_instance -create_context diff --git a/scripts/ci/install_breeze.sh b/scripts/ci/install_breeze.sh index aa5a3160060bf..093c8f6db9ce5 100755 --- a/scripts/ci/install_breeze.sh +++ b/scripts/ci/install_breeze.sh @@ -21,13 +21,15 @@ cd "$( dirname "${BASH_SOURCE[0]}" )/../../" PYTHON_ARG="" +PIP_VERSION="24.3.1" +UV_VERSION="0.5.17" if [[ ${PYTHON_VERSION=} != "" ]]; then PYTHON_ARG="--python=$(which python"${PYTHON_VERSION}") " fi -python -m pip install --upgrade pip==24.3.1 -python -m pip install "pipx>=1.4.1" -python -m pipx uninstall apache-airflow-breeze >/dev/null 2>&1 || true +python -m pip install --upgrade "pip==${PIP_VERSION}" +python -m pip install "uv==${UV_VERSION}" +uv tool uninstall apache-airflow-breeze >/dev/null 2>&1 || true # shellcheck disable=SC2086 -python -m pipx install ${PYTHON_ARG} --force --editable ./dev/breeze/ +uv tool install ${PYTHON_ARG} --force --editable ./dev/breeze/ echo '/home/runner/.local/bin' >> "${GITHUB_PATH}" diff --git a/scripts/ci/pre_commit/base_operator_partial_arguments.py b/scripts/ci/pre_commit/base_operator_partial_arguments.py deleted file mode 100755 index 14999e034edbc..0000000000000 --- a/scripts/ci/pre_commit/base_operator_partial_arguments.py +++ /dev/null @@ -1,164 +0,0 @@ -#!/usr/bin/env python -# -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -from __future__ import annotations - -import ast -import itertools -import pathlib -import sys -import typing - -ROOT_DIR = pathlib.Path(__file__).resolve().parents[3] - -BASEOPERATOR_PY = ROOT_DIR.joinpath("airflow", "models", "baseoperator.py") -MAPPEDOPERATOR_PY = ROOT_DIR.joinpath("airflow", "models", "mappedoperator.py") - -IGNORED = { - # These are only used in the worker and thus mappable. - "do_xcom_push", - "email_on_failure", - "email_on_retry", - "post_execute", - "pre_execute", - "multiple_outputs", - # Doesn't matter, not used anywhere. - "default_args", - # Deprecated and is aliased to max_active_tis_per_dag. - "task_concurrency", - # attrs internals. - "HIDE_ATTRS_FROM_UI", - # Only on BaseOperator. - "_dag", - "output", - "partial", - "shallow_copy_attrs", - # Only on MappedOperator. 
- "expand_input", - "partial_kwargs", -} - - -BO_MOD = ast.parse(BASEOPERATOR_PY.read_text("utf-8"), str(BASEOPERATOR_PY)) -MO_MOD = ast.parse(MAPPEDOPERATOR_PY.read_text("utf-8"), str(MAPPEDOPERATOR_PY)) - -BO_CLS = next( - node - for node in ast.iter_child_nodes(BO_MOD) - if isinstance(node, ast.ClassDef) and node.name == "BaseOperator" -) -BO_INIT = next( - node - for node in ast.iter_child_nodes(BO_CLS) - if isinstance(node, ast.FunctionDef) and node.name == "__init__" -) -BO_PARTIAL = next( - node - for node in ast.iter_child_nodes(BO_MOD) - if isinstance(node, ast.FunctionDef) and node.name == "partial" -) -MO_CLS = next( - node - for node in ast.iter_child_nodes(MO_MOD) - if isinstance(node, ast.ClassDef) and node.name == "MappedOperator" -) - - -def _compare(a: set[str], b: set[str], *, excludes: set[str]) -> tuple[set[str], set[str]]: - only_in_a = {n for n in a if n not in b and n not in excludes and n[0] != "_"} - only_in_b = {n for n in b if n not in a and n not in excludes and n[0] != "_"} - return only_in_a, only_in_b - - -def _iter_arg_names(func: ast.FunctionDef) -> typing.Iterator[str]: - func_args = func.args - for arg in itertools.chain(func_args.args, getattr(func_args, "posonlyargs", ()), func_args.kwonlyargs): - yield arg.arg - - -def check_baseoperator_partial_arguments() -> bool: - only_in_init, only_in_partial = _compare( - set(itertools.islice(_iter_arg_names(BO_INIT), 1, None)), - set(itertools.islice(_iter_arg_names(BO_PARTIAL), 1, None)), - excludes=IGNORED, - ) - if only_in_init: - print("Arguments in BaseOperator missing from partial():", ", ".join(sorted(only_in_init))) - if only_in_partial: - print("Arguments in partial() missing from BaseOperator:", ", ".join(sorted(only_in_partial))) - if only_in_init or only_in_partial: - return False - return True - - -def _iter_assignment_to_self_attributes(targets: typing.Iterable[ast.expr]) -> typing.Iterator[str]: - for t in targets: - if isinstance(t, ast.Attribute) and isinstance(t.value, ast.Name) and t.value.id == "self": - yield t.attr # Something like "self.foo = ...". - else: - # Recursively visit nodes in unpacking assignments like "a, b = ...". 
- yield from _iter_assignment_to_self_attributes(getattr(t, "elts", ())) - - -def _iter_assignment_targets(func: ast.FunctionDef) -> typing.Iterator[str]: - for stmt in func.body: - if isinstance(stmt, ast.AnnAssign): - yield from _iter_assignment_to_self_attributes([stmt.target]) - elif isinstance(stmt, ast.Assign): - yield from _iter_assignment_to_self_attributes(stmt.targets) - - -def _is_property(f: ast.FunctionDef) -> bool: - if len(f.decorator_list) != 1: - return False - decorator = f.decorator_list[0] - return isinstance(decorator, ast.Name) and decorator.id == "property" - - -def _iter_member_names(klass: ast.ClassDef) -> typing.Iterator[str]: - for node in ast.iter_child_nodes(klass): - if isinstance(node, ast.AnnAssign) and isinstance(node.target, ast.Name): - yield node.target.id - elif isinstance(node, ast.FunctionDef) and _is_property(node): - yield node.name - elif isinstance(node, ast.Assign): - if len(node.targets) == 1 and isinstance(target := node.targets[0], ast.Name): - yield target.id - - -def check_operator_member_parity() -> bool: - only_in_base, only_in_mapped = _compare( - set(itertools.chain(_iter_assignment_targets(BO_INIT), _iter_member_names(BO_CLS))), - set(_iter_member_names(MO_CLS)), - excludes=IGNORED, - ) - if only_in_base: - print("Members on BaseOperator missing from MappedOperator:", ", ".join(sorted(only_in_base))) - if only_in_mapped: - print("Members on MappedOperator missing from BaseOperator:", ", ".join(sorted(only_in_mapped))) - if only_in_base or only_in_mapped: - return False - return True - - -if __name__ == "__main__": - results = [ - check_baseoperator_partial_arguments(), - check_operator_member_parity(), - ] - sys.exit(not all(results)) diff --git a/scripts/ci/pre_commit/boring_cyborg.py b/scripts/ci/pre_commit/boring_cyborg.py index cf852b12bb6da..ec674485b5457 100755 --- a/scripts/ci/pre_commit/boring_cyborg.py +++ b/scripts/ci/pre_commit/boring_cyborg.py @@ -17,13 +17,11 @@ # under the License. from __future__ import annotations -import subprocess import sys from pathlib import Path import yaml from termcolor import colored -from wcmatch import glob if __name__ not in ("__main__", "__mp_main__"): raise SystemExit( @@ -33,9 +31,8 @@ CONFIG_KEY = "labelPRBasedOnFilePath" -current_files = subprocess.check_output(["git", "ls-files"]).decode().splitlines() -git_root = Path(subprocess.check_output(["git", "rev-parse", "--show-toplevel"]).decode().strip()) -cyborg_config_path = git_root / ".github" / "boring-cyborg.yml" +repo_root = Path(__file__).parent.parent.parent.parent +cyborg_config_path = repo_root / ".github" / "boring-cyborg.yml" cyborg_config = yaml.safe_load(cyborg_config_path.read_text()) if CONFIG_KEY not in cyborg_config: raise SystemExit(f"Missing section {CONFIG_KEY}") @@ -43,12 +40,14 @@ errors = [] for label, patterns in cyborg_config[CONFIG_KEY].items(): for pattern in patterns: - if glob.globfilter(current_files, pattern, flags=glob.G | glob.E): + try: + next(Path(repo_root).glob(pattern)) continue - yaml_path = f"{CONFIG_KEY}.{label}" - errors.append( - f"Unused pattern [{colored(pattern, 'cyan')}] in [{colored(yaml_path, 'cyan')}] section." - ) + except StopIteration: + yaml_path = f"{CONFIG_KEY}.{label}" + errors.append( + f"Unused pattern [{colored(pattern, 'cyan')}] in [{colored(yaml_path, 'cyan')}] section." 
+ ) if errors: print(f"Found {colored(str(len(errors)), 'red')} problems:") diff --git a/scripts/ci/pre_commit/check_cncf_k8s_used_for_k8s_executor_only.py b/scripts/ci/pre_commit/check_cncf_k8s_used_for_k8s_executor_only.py index 0117c07c0c2f8..a452c648cdff3 100755 --- a/scripts/ci/pre_commit/check_cncf_k8s_used_for_k8s_executor_only.py +++ b/scripts/ci/pre_commit/check_cncf_k8s_used_for_k8s_executor_only.py @@ -51,8 +51,6 @@ def get_imports(path: str): errors: list[str] = [] -EXCEPTIONS = ["airflow/cli/commands/kubernetes_command.py"] - def main() -> int: for path in sys.argv[1:]: @@ -62,9 +60,8 @@ def main() -> int: import_count += 1 if len(imp.module) > 3: if imp.module[:4] == ["airflow", "providers", "cncf", "kubernetes"]: - if path not in EXCEPTIONS: - local_error_count += 1 - errors.append(f"{path}: ({'.'.join(imp.module)})") + local_error_count += 1 + errors.append(f"{path}: ({'.'.join(imp.module)})") console.print(f"[blue]{path}:[/] Import count: {import_count}, error_count {local_error_count}") if errors: console.print( diff --git a/scripts/ci/pre_commit/check_common_sql_dependency.py b/scripts/ci/pre_commit/check_common_sql_dependency.py index 9719310a7174d..59bce775aa63e 100755 --- a/scripts/ci/pre_commit/check_common_sql_dependency.py +++ b/scripts/ci/pre_commit/check_common_sql_dependency.py @@ -21,12 +21,15 @@ import ast import pathlib import sys -from typing import Iterable +from collections.abc import Iterable import yaml from packaging.specifiers import SpecifierSet from rich.console import Console +sys.path.insert(0, str(pathlib.Path(__file__).parent.resolve())) +from common_precommit_utils import get_provider_base_dir_from_path + console = Console(color_system="standard", width=200) @@ -36,10 +39,9 @@ MAKE_COMMON_METHOD_NAME: str = "_make_common_data_structure" -def get_classes(file_path: str) -> Iterable[ast.ClassDef]: +def get_classes(file_path: pathlib.Path) -> Iterable[ast.ClassDef]: """Return a list of class declared in the given python file.""" - pathlib_path = pathlib.Path(file_path) - module = ast.parse(pathlib_path.read_text("utf-8"), str(pathlib_path)) + module = ast.parse(file_path.read_text("utf-8"), filename=file_path.as_posix()) for node in ast.walk(module): if isinstance(node, ast.ClassDef): yield node @@ -53,7 +55,7 @@ def is_subclass_of_dbapihook(node: ast.ClassDef) -> bool: return False -def has_make_serializable_method(node: ast.ClassDef) -> bool: +def has_make_common_data_structure_method(node: ast.ClassDef) -> bool: """Return True if the given class implements `_make_common_data_structure` method.""" for body_element in node.body: if isinstance(body_element, ast.FunctionDef) and (body_element.name == MAKE_COMMON_METHOD_NAME): @@ -61,12 +63,7 @@ def has_make_serializable_method(node: ast.ClassDef) -> bool: return False -def determine_provider_yaml_path(file_path: str) -> str: - """Determine the path of the provider.yaml file related to the given python file.""" - return f"{file_path.split('/hooks')[0]}/provider.yaml" - - -def get_yaml_content(file_path: str) -> dict: +def get_yaml_content(file_path: pathlib.Path) -> dict: """Load content of a yaml files.""" with open(file_path) as file: return yaml.safe_load(file) @@ -93,13 +90,16 @@ def do_version_satisfies_constraints( def check_sql_providers_dependency(): error_count: int = 0 - for path in sys.argv[1:]: - if not path.startswith("airflow/providers/"): + for file_passed in sys.argv[1:]: + path = pathlib.Path(file_passed) + if not file_passed.startswith("providers/"): continue for clazz in 
get_classes(path): - if is_subclass_of_dbapihook(node=clazz) and has_make_serializable_method(node=clazz): - provider_yaml_path: str = determine_provider_yaml_path(file_path=path) + if is_subclass_of_dbapihook(node=clazz) and has_make_common_data_structure_method(node=clazz): + provider_yaml_path: pathlib.Path = ( + get_provider_base_dir_from_path(file_path=path) / "provider.yaml" + ) provider_metadata: dict = get_yaml_content(file_path=provider_yaml_path) if version_constraint := get_common_sql_constraints(provider_metadata=provider_metadata): diff --git a/scripts/ci/pre_commit/check_deferrable_default.py b/scripts/ci/pre_commit/check_deferrable_default.py deleted file mode 100755 index bfde61f231643..0000000000000 --- a/scripts/ci/pre_commit/check_deferrable_default.py +++ /dev/null @@ -1,128 +0,0 @@ -#!/usr/bin/env python -# -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. -from __future__ import annotations - -import ast -import glob -import itertools -import os -import sys -from typing import Iterator - -import libcst as cst -from libcst.codemod import CodemodContext -from libcst.codemod.visitors import AddImportsVisitor - -ROOT_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir, os.pardir, os.pardir)) - -DEFERRABLE_DOC = ( - "https://github.com/apache/airflow/blob/main/docs/apache-airflow/" - "authoring-and-scheduling/deferring.rst#writing-deferrable-operators" -) - - -class DefaultDeferrableVisitor(ast.NodeVisitor): - def __init__(self, *args, **kwargs) -> None: - super().__init__(*args, *kwargs) - self.error_linenos: list[int] = [] - - def visit_FunctionDef(self, node: ast.FunctionDef) -> ast.FunctionDef: - if node.name == "__init__": - args = node.args - arguments = reversed([*args.args, *args.posonlyargs, *args.kwonlyargs]) - defaults = reversed([*args.defaults, *args.kw_defaults]) - for argument, default in itertools.zip_longest(arguments, defaults): - # argument is not deferrable - if argument is None or argument.arg != "deferrable": - continue - - # argument is deferrable, but comes with no default value - if default is None: - self.error_linenos.append(argument.lineno) - continue - - # argument is deferrable, but the default value is not valid - if not _is_valid_deferrable_default(default): - self.error_linenos.append(default.lineno) - return node - - -class DefaultDeferrableTransformer(cst.CSTTransformer): - def leave_Param(self, original_node: cst.Param, updated_node: cst.Param) -> cst.Param: - if original_node.name.value == "deferrable": - expected_default_cst = cst.parse_expression( - 'conf.getboolean("operators", "default_deferrable", fallback=False)' - ) - - if updated_node.default and updated_node.default.deep_equals(expected_default_cst): - return updated_node - return 
updated_node.with_changes(default=expected_default_cst) - return updated_node - - -def _is_valid_deferrable_default(default: ast.AST) -> bool: - """Check whether default is 'conf.getboolean("operators", "default_deferrable", fallback=False)'""" - return ast.unparse(default) == "conf.getboolean('operators', 'default_deferrable', fallback=False)" - - -def iter_check_deferrable_default_errors(module_filename: str) -> Iterator[str]: - ast_tree = ast.parse(open(module_filename).read()) - visitor = DefaultDeferrableVisitor() - visitor.visit(ast_tree) - # We check the module using the ast once and then fix it through cst if needed. - # The primary reason we don't do it all through cst is performance. - if visitor.error_linenos: - _fix_invalide_deferrable_default_value(module_filename) - yield from (f"{module_filename}:{lineno}" for lineno in visitor.error_linenos) - - -def _fix_invalide_deferrable_default_value(module_filename: str) -> None: - context = CodemodContext(filename=module_filename) - AddImportsVisitor.add_needed_import(context, "airflow.configuration", "conf") - transformer = DefaultDeferrableTransformer() - - source_cst_tree = cst.parse_module(open(module_filename).read()) - modified_cst_tree = AddImportsVisitor(context).transform_module(source_cst_tree.visit(transformer)) - if not source_cst_tree.deep_equals(modified_cst_tree): - with open(module_filename, "w") as writer: - writer.write(modified_cst_tree.code) - - -def main() -> int: - modules = itertools.chain( - glob.glob(f"{ROOT_DIR}/airflow/**/sensors/**.py", recursive=True), - glob.glob(f"{ROOT_DIR}/airflow/**/operators/**.py", recursive=True), - ) - - errors = [error for module in modules for error in iter_check_deferrable_default_errors(module)] - if errors: - print("Incorrect deferrable default values detected at:") - for error in errors: - print(f" {error}") - print( - """Please set the default value of deferrable to """ - """"conf.getboolean("operators", "default_deferrable", fallback=False)"\n""" - f"See: {DEFERRABLE_DOC}\n" - ) - - return len(errors) - - -if __name__ == "__main__": - sys.exit(main()) diff --git a/scripts/ci/pre_commit/check_deprecations.py b/scripts/ci/pre_commit/check_deprecations.py deleted file mode 100755 index aa01e56065e40..0000000000000 --- a/scripts/ci/pre_commit/check_deprecations.py +++ /dev/null @@ -1,194 +0,0 @@ -#!/usr/bin/env python -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
- -from __future__ import annotations - -import ast -import sys -from functools import lru_cache -from pathlib import Path - -allowed_warnings: dict[str, tuple[str, ...]] = { - "airflow": ( - "airflow.exceptions.RemovedInAirflow3Warning", - "airflow.utils.context.AirflowContextDeprecationWarning", - ), - "providers": ("airflow.exceptions.AirflowProviderDeprecationWarning",), -} -compatible_decorators: frozenset[tuple[str, ...]] = frozenset( - [ - # PEP 702 decorators - ("warnings", "deprecated"), - ("typing_extensions", "deprecated"), - # `Deprecated` package decorators - ("deprecated", "deprecated"), - ("deprecated", "classic", "deprecated"), - ] -) - - -@lru_cache(maxsize=None) -def allowed_group_warnings(group: str) -> tuple[str, tuple[str, ...]]: - group_warnings = allowed_warnings[group] - if len(group_warnings) == 1: - return f"expected {group_warnings[0]} type", group_warnings - else: - return f"expected one of {', '.join(group_warnings)} types", group_warnings - - -def built_import_from(import_from: ast.ImportFrom) -> list[str]: - result: list[str] = [] - module_name = import_from.module - if not module_name: - return result - - imports_levels = module_name.count(".") + 1 - for import_path in compatible_decorators: - if imports_levels >= len(import_path): - continue - if module_name != ".".join(import_path[:imports_levels]): - continue - for name in import_from.names: - if name.name == import_path[imports_levels]: - alias: str = name.asname or name.name - remaining_part = len(import_path) - imports_levels - 1 - if remaining_part > 0: - alias = ".".join([alias, *import_path[-remaining_part:]]) - result.append(alias) - return result - - -def built_import(import_clause: ast.Import) -> list[str]: - result = [] - for name in import_clause.names: - module_name = name.name - imports_levels = module_name.count(".") + 1 - for import_path in compatible_decorators: - if imports_levels > len(import_path): - continue - if module_name != ".".join(import_path[:imports_levels]): - continue - - alias: str = name.asname or module_name - remaining_part = len(import_path) - imports_levels - if remaining_part > 0: - alias = ".".join([alias, *import_path[-remaining_part:]]) - result.append(alias) - return result - - -def found_compatible_decorators(mod: ast.Module) -> tuple[str, ...]: - result = [] - for node in mod.body: - if not isinstance(node, (ast.ImportFrom, ast.Import)): - continue - result.extend(built_import_from(node) if isinstance(node, ast.ImportFrom) else built_import(node)) - return tuple(sorted(set(result))) - - -def resolve_name(obj: ast.Attribute | ast.Name) -> str: - name = "" - while True: - if isinstance(obj, ast.Name): - name = f"{obj.id}.{name}" if name else obj.id - break - elif isinstance(obj, ast.Attribute): - name = f"{obj.attr}.{name}" if name else obj.attr - obj = obj.value # type: ignore[assignment] - else: - msg = f"Expected to got ast.Name or ast.Attribute but got {type(obj).__name__!r}." 
- raise SystemExit(msg) - - return name - - -def resolve_decorator_name(obj: ast.Call | ast.Attribute | ast.Name) -> str: - return resolve_name(obj.func if isinstance(obj, ast.Call) else obj) # type: ignore[arg-type] - - -def check_decorators(mod: ast.Module, file: str, file_group: str) -> int: - if not (decorators_names := found_compatible_decorators(mod)): - # There are no expected decorators into module, exit early - return 0 - - errors = 0 - for node in ast.walk(mod): - if not hasattr(node, "decorator_list"): - continue - - for decorator in node.decorator_list: - decorator_name = resolve_decorator_name(decorator) - if decorator_name not in decorators_names: - continue - - expected_types, warns_types = allowed_group_warnings(file_group) - category_keyword: ast.keyword | None = next( - filter(lambda k: k and k.arg == "category", decorator.keywords), None - ) - if category_keyword is None: - errors += 1 - print( - f"{file}:{decorator.lineno}: Missing `category` keyword on decorator @{decorator_name}, " - f"{expected_types}" - ) - continue - elif not hasattr(category_keyword, "value"): - continue - category_value_ast = category_keyword.value - - warns_types = allowed_warnings[file_group] - if isinstance(category_value_ast, (ast.Name, ast.Attribute)): - category_value = resolve_name(category_value_ast) - if not any(cv.endswith(category_value) for cv in warns_types): - errors += 1 - print( - f"{file}:{category_keyword.lineno}: " - f"category={category_value}, but {expected_types}" - ) - elif isinstance(category_value_ast, ast.Constant): - errors += 1 - print( - f"{file}:{category_keyword.lineno}: " - f"category=Literal[{category_value_ast.value!r}], but {expected_types}" - ) - - return errors - - -def check_file(file: str) -> int: - file_path = Path(file) - if not file_path.as_posix().startswith("airflow"): - # Not expected file, exit early - return 0 - file_group = "providers" if file_path.as_posix().startswith("airflow/providers") else "airflow" - ast_module = ast.parse(file_path.read_text("utf-8"), file) - errors = 0 - errors += check_decorators(ast_module, file, file_group) - return errors - - -def main(*args: str) -> int: - errors = sum(check_file(file) for file in args[1:]) - if not errors: - return 0 - print(f"Found {errors} error{'s' if errors > 1 else ''}.") - return 1 - - -if __name__ == "__main__": - sys.exit(main(*sys.argv)) diff --git a/scripts/ci/pre_commit/check_imports_in_providers.py b/scripts/ci/pre_commit/check_imports_in_providers.py new file mode 100755 index 0000000000000..a1fc17104a2bd --- /dev/null +++ b/scripts/ci/pre_commit/check_imports_in_providers.py @@ -0,0 +1,105 @@ +#!/usr/bin/env python +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
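The module that follows builds its import map with `ruff analyze graph`, which emits a JSON object mapping each analyzed file to the files it imports. A minimal sketch of consuming that output outside the hook (assuming a ruff release that ships the `analyze graph` subcommand, roughly 0.7 or newer, and an illustrative target directory):

    import json
    import subprocess

    # file -> list of files it imports, as reported by ruff's import graph analysis
    graph = json.loads(
        subprocess.check_output(["ruff", "analyze", "graph", "providers/src"])
    )
    for source, imported in graph.items():
        offenders = [f for f in imported if f.endswith("version_compat.py")]
        if offenders:
            print(source, "->", offenders)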
+from __future__ import annotations + +import json +import os.path +import subprocess +import sys +from pathlib import Path + +sys.path.insert(0, str(Path(__file__).parent.resolve())) +from common_precommit_utils import ( + AIRFLOW_SOURCES_ROOT_PATH, + console, + get_provider_base_dir_from_path, + get_provider_id_from_path, +) + +errors_found = False + + +def check_imports(folders_to_check: list[Path]): + global errors_found + cmd = [ + "ruff", + "analyze", + "graph", + *[ + folder_to_check.as_posix() + for folder_to_check in folders_to_check + if (folder_to_check.parent / "pyproject.toml").exists() + ], + ] + console.print("Cmd", cmd) + import_tree_str = subprocess.check_output(cmd) + import_tree = json.loads(import_tree_str) + # Uncomment these if you want to debug strange dependencies and see if ruff gets it right + console.print("Dependencies discovered by ruff:") + console.print(import_tree) + + for importing_file in sys.argv[1:]: + if not importing_file.startswith("providers/"): + console.print(f"[yellow]Skipping non-provider file: {importing_file}") + continue + importing_file_path = Path(importing_file) + console.print(importing_file_path) + imported_files_array = import_tree.get(importing_file, None) + if imported_files_array is None: + if importing_file != "providers/src/airflow/providers/__init__.py": + # providers/__init__.py should be ignored + console.print(f"[red]The file {importing_file} is not discovered by ruff analyze!") + errors_found = True + continue + imported_file_paths = [Path(file) for file in imported_files_array] + for imported_file_path in imported_file_paths: + if imported_file_path.name == "version_compat.py": + # Note - this will check also imports from other places - not only from providers + # Which means that import from tests_common, and airflow will be also banned + common_path = os.path.commonpath([importing_file, imported_file_path.as_posix()]) + imported_file_parent_dir = imported_file_path.parent.as_posix() + if common_path != imported_file_parent_dir: + provider_id = get_provider_id_from_path(importing_file_path) + provider_dir = get_provider_base_dir_from_path(importing_file_path) + console.print( + f"\n[red]Invalid import of `version_compat` module in provider {provider_id} in:\n" + ) + console.print(f"[yellow]{importing_file_path}") + console.print( + f"\n[bright_blue]The AIRFLOW_V_X_Y_PLUS import should be " + f"from the {provider_id} provider root directory ({provider_dir}), but it is currently from:" + ) + console.print(f"\n[yellow]{imported_file_path}\n") + console.print( + f"1. Copy `version_compat`.py to `{provider_dir}/version_compat.py` if not there.\n" + f"2. 
Import the version constants you need as:\n\n" + f"[yellow]from airflow.providers.{provider_id}.version_compat import ...[/]\n" + f"\n" + ) + errors_found = True + + +find_all_source_providers = AIRFLOW_SOURCES_ROOT_PATH.rglob("**/src/") + +check_imports([*find_all_source_providers, AIRFLOW_SOURCES_ROOT_PATH / "tests_common"]) + +if errors_found: + console.print("\n[red]Errors found in imports![/]\n") + sys.exit(1) +else: + console.print("\n[green]All version_compat imports are correct![/]\n") diff --git a/scripts/ci/pre_commit/check_min_python_version.py b/scripts/ci/pre_commit/check_min_python_version.py index ac175eac50e09..825b899241816 100755 --- a/scripts/ci/pre_commit/check_min_python_version.py +++ b/scripts/ci/pre_commit/check_min_python_version.py @@ -26,7 +26,7 @@ from common_precommit_utils import console # update this version when we switch to a newer version of Python -required_version = tuple(map(int, "3.8".split("."))) +required_version = tuple(map(int, "3.9".split("."))) required_version_str = f"{required_version[0]}.{required_version[1]}" global_version = tuple( map( diff --git a/scripts/ci/pre_commit/check_pre_commit_hooks.py b/scripts/ci/pre_commit/check_pre_commit_hooks.py index 727b4d4bc0425..76d21980fea48 100755 --- a/scripts/ci/pre_commit/check_pre_commit_hooks.py +++ b/scripts/ci/pre_commit/check_pre_commit_hooks.py @@ -68,7 +68,7 @@ def get_errors_and_hooks(content: Any, max_length: int) -> tuple[list[str], dict name = hook["name"] if len(name) > max_length: errors.append( - f"Name is too long for hook `{hook_id}` in {PRE_COMMIT_YAML_FILE}. Please shorten it!" + f"Name is too long for hook `{name}` in {PRE_COMMIT_YAML_FILE}. Please shorten it!" ) continue hooks[hook_id].append(name) diff --git a/scripts/ci/pre_commit/check_provider_yaml_files.py b/scripts/ci/pre_commit/check_provider_yaml_files.py index fcbe2512910a3..f848e38afa0b2 100755 --- a/scripts/ci/pre_commit/check_provider_yaml_files.py +++ b/scripts/ci/pre_commit/check_provider_yaml_files.py @@ -17,12 +17,15 @@ # under the License. 
from __future__ import annotations -import os import sys from pathlib import Path sys.path.insert(0, str(Path(__file__).parent.resolve())) -from common_precommit_utils import console, initialize_breeze_precommit, run_command_via_breeze_shell +from common_precommit_utils import ( + initialize_breeze_precommit, + run_command_via_breeze_shell, + validate_cmd_result, +) initialize_breeze_precommit(__name__, __file__) @@ -33,10 +36,4 @@ warn_image_upgrade_needed=True, extra_env={"PYTHONWARNINGS": "default"}, ) -if cmd_result.returncode != 0 and os.environ.get("CI") != "true": - console.print( - "\n[yellow]If you see strange stacktraces above, especially about missing imports " - "run this command:[/]\n" - ) - console.print("[magenta]breeze ci-image build --python 3.8 --upgrade-to-newer-dependencies[/]\n") -sys.exit(cmd_result.returncode) +validate_cmd_result(cmd_result, include_ci_env_check=True) diff --git a/scripts/ci/pre_commit/check_providers_subpackages_all_have_init.py b/scripts/ci/pre_commit/check_providers_subpackages_all_have_init.py index da17f794eaeb6..649e45a2e878d 100755 --- a/scripts/ci/pre_commit/check_providers_subpackages_all_have_init.py +++ b/scripts/ci/pre_commit/check_providers_subpackages_all_have_init.py @@ -19,25 +19,33 @@ import os import sys -from glob import glob from pathlib import Path -ROOT_DIR = os.path.abspath(os.path.join(os.path.dirname(__file__), os.pardir, os.pardir, os.pardir)) -ACCEPTED_NON_INIT_DIRS = ["adr", "doc", "templates"] +ROOT_DIR = Path(__file__).parents[3].resolve() +ACCEPTED_NON_INIT_DIRS = [ + "adr", + "doc", + "templates", + "__pycache__", + "static", +] -def check_dir_init_file(provider_files: list[str]) -> None: +def check_dir_init_file(folders: list[Path]) -> None: missing_init_dirs: list[Path] = [] - for candidate_path in provider_files: - if candidate_path.endswith("/__pycache__"): - continue - path = Path(candidate_path) - if path.is_dir() and not (path / "__init__.py").exists(): - if path.name not in ACCEPTED_NON_INIT_DIRS: - missing_init_dirs.append(path) + folders = list(folders) + for path in folders: + for root, dirs, files in os.walk(path): + # Edit it in place, so we don't recurse to folders we don't care about + dirs[:] = [d for d in dirs if d not in ACCEPTED_NON_INIT_DIRS] + + if "__init__.py" in files: + continue + + missing_init_dirs.append(Path(root)) if missing_init_dirs: - with open(os.path.join(ROOT_DIR, "scripts/ci/license-templates/LICENSE.txt")) as license: + with ROOT_DIR.joinpath("scripts/ci/license-templates/LICENSE.txt").open() as license: license_txt = license.readlines() prefixed_licensed_txt = [f"# {line}" if line != "\n" else "#\n" for line in license_txt] @@ -51,7 +59,11 @@ def check_dir_init_file(provider_files: list[str]) -> None: if __name__ == "__main__": - all_provider_subpackage_dirs = sorted(glob(f"{ROOT_DIR}/airflow/providers/**/*", recursive=True)) - check_dir_init_file(all_provider_subpackage_dirs) - all_test_provider_subpackage_dirs = sorted(glob(f"{ROOT_DIR}/tests/providers/**/*", recursive=True)) - check_dir_init_file(all_test_provider_subpackage_dirs) + providers_root = Path(f"{ROOT_DIR}/providers") + providers_ns = providers_root.joinpath("src", "airflow", "providers") + providers_tests = providers_root.joinpath("tests") + + providers_pkgs = sorted(map(lambda f: f.parent, providers_ns.rglob("provider.yaml"))) + check_dir_init_file(providers_pkgs) + + check_dir_init_file([providers_root / "tests"]) diff --git a/scripts/ci/pre_commit/check_system_tests.py 
b/scripts/ci/pre_commit/check_system_tests.py index 89e2a9f24ae5c..3d5c743b54f78 100755 --- a/scripts/ci/pre_commit/check_system_tests.py +++ b/scripts/ci/pre_commit/check_system_tests.py @@ -38,13 +38,13 @@ WATCHER_APPEND_INSTRUCTION_SHORT = " >> watcher()" PYTEST_FUNCTION = """ -from tests.system.utils import get_test_run # noqa: E402 +from tests_common.test_utils.system_tests import get_test_run # noqa: E402 # Needed to run the example DAG with pytest (see: tests/system/README.md#run_via_pytest) test_run = get_test_run(dag) """ PYTEST_FUNCTION_PATTERN = re.compile( - r"from tests\.system\.utils import get_test_run(?: # noqa: E402)?\s+" + r"from tests_common\.test_utils\.system_tests import get_test_run(?: # noqa: E402)?\s+" r"(?:# .+\))?\s+" r"test_run = get_test_run\(dag\)" ) @@ -52,14 +52,14 @@ def _check_file(file: Path): content = file.read_text() - if "from tests.system.utils.watcher import watcher" in content: + if "from tests_common.test_utils.watcher import watcher" in content: index = content.find(WATCHER_APPEND_INSTRUCTION_SHORT) if index == -1: errors.append( - f"[red]The example {file} imports tests.system.utils.watcher " + f"[red]The example {file} imports tests_common.test_utils.watcher " f"but does not use it properly![/]\n\n" "[yellow]Make sure you have:[/]\n\n" - f" {WATCHER_APPEND_INSTRUCTION}\n\n" + f" {WATCHER_APPEND_INSTRUCTION_SHORT}\n\n" "[yellow]as the last instruction in your example DAG.[/]\n" ) else: diff --git a/scripts/ci/pre_commit/check_system_tests_hidden_in_index.py b/scripts/ci/pre_commit/check_system_tests_hidden_in_index.py index fde6f38f45a9a..1c1fdb02c1793 100755 --- a/scripts/ci/pre_commit/check_system_tests_hidden_in_index.py +++ b/scripts/ci/pre_commit/check_system_tests_hidden_in_index.py @@ -54,10 +54,10 @@ def check_system_test_entry_hidden(provider_index: Path): :maxdepth: 1 :caption: System tests - System Tests <_api/tests/system/providers/{provider_path}/index> + System Tests <_api/tests/system/{provider_path}/index> """ index_text = provider_index.read_text() - system_tests_path = AIRFLOW_SOURCES_ROOT / "tests" / "system" / "providers" / provider_path + system_tests_path = AIRFLOW_SOURCES_ROOT / "providers" / "tests" / "system" / provider_path index_text_manual = index_text.split( ".. THE REMAINDER OF THE FILE IS AUTOMATICALLY GENERATED. IT WILL BE OVERWRITTEN AT RELEASE TIME!" )[0] diff --git a/scripts/ci/pre_commit/check_template_fields.py b/scripts/ci/pre_commit/check_template_fields.py new file mode 100755 index 0000000000000..da0b60fbd978f --- /dev/null +++ b/scripts/ci/pre_commit/check_template_fields.py @@ -0,0 +1,40 @@ +#!/usr/bin/env python +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
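The new hook below delegates the actual validation to `run_template_fields_check.py` inside the CI image. For orientation, a minimal sketch of the kind of declaration being checked, where every name in `template_fields` is expected to resolve to an attribute on the operator instance (the operator and field names are illustrative only):

    from airflow.models.baseoperator import BaseOperator


    class EchoOperator(BaseOperator):
        # Each name listed here must be an instance attribute so that Jinja
        # templating can render it at runtime.
        template_fields = ("message",)

        def __init__(self, *, message: str, **kwargs):
            super().__init__(**kwargs)
            self.message = message

        def execute(self, context):
            self.log.info("Rendered message: %s", self.message)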
+from __future__ import annotations + +import sys +from pathlib import Path + +sys.path.insert(0, str(Path(__file__).parent.resolve())) +from common_precommit_utils import ( + initialize_breeze_precommit, + run_command_via_breeze_shell, + validate_cmd_result, +) + +initialize_breeze_precommit(__name__, __file__) +py_files_to_test = sys.argv[1:] + +cmd_result = run_command_via_breeze_shell( + ["python3", "/opt/airflow/scripts/in_container/run_template_fields_check.py", *py_files_to_test], + backend="sqlite", + warn_image_upgrade_needed=True, + extra_env={"PYTHONWARNINGS": "default"}, +) + +validate_cmd_result(cmd_result, include_ci_env_check=True) diff --git a/scripts/ci/pre_commit/check_tests_in_right_folders.py b/scripts/ci/pre_commit/check_tests_in_right_folders.py index b4f68baffa1f1..67d20b1f844d6 100755 --- a/scripts/ci/pre_commit/check_tests_in_right_folders.py +++ b/scripts/ci/pre_commit/check_tests_in_right_folders.py @@ -34,6 +34,7 @@ "api_connexion", "api_experimental", "api_internal", + "assets", "auth", "callbacks", "charts", diff --git a/scripts/ci/pre_commit/check_ti_vs_tis_attributes.py b/scripts/ci/pre_commit/check_ti_vs_tis_attributes.py index 1dfc51a0a040e..c8cb358960f17 100755 --- a/scripts/ci/pre_commit/check_ti_vs_tis_attributes.py +++ b/scripts/ci/pre_commit/check_ti_vs_tis_attributes.py @@ -48,10 +48,13 @@ def compare_attributes(path1, path2): "task_instance_note", "dag_run", "trigger", - "execution_date", + "logical_date", "triggerer_job", "note", "rendered_task_instance_fields", + # Storing last heartbeat for historic TIs is not interesting/useful + "last_heartbeat_at", + "dag_version", } # exclude attrs not necessary to be in TaskInstanceHistory if not diff: return diff --git a/scripts/ci/pre_commit/checkout_no_credentials.py b/scripts/ci/pre_commit/checkout_no_credentials.py index 02e8f0a20f77a..02a720eda6d3a 100755 --- a/scripts/ci/pre_commit/checkout_no_credentials.py +++ b/scripts/ci/pre_commit/checkout_no_credentials.py @@ -57,6 +57,13 @@ def check_file(the_file: Path) -> int: # build. This is ok for security, because we are pushing it only in the `main` branch # of the repository and only for unprotected constraints branch continue + if step.get("id") == "checkout-for-backport": + # This is a special case - we are ok with persisting credentials in backport + # step, because we need them to push backport branch back to the repository in + # backport checkout-for-backport step and create pr for cherry-picker. This is ok for + # security, because cherry picker pushing it only in the `main` branch of the repository + # and only for unprotected backport branch + continue persist_credentials = with_clause.get("persist-credentials") if persist_credentials is None: console.print( diff --git a/scripts/ci/pre_commit/common_precommit_utils.py b/scripts/ci/pre_commit/common_precommit_utils.py index a70e5cfc848bd..f406c8116622d 100644 --- a/scripts/ci/pre_commit/common_precommit_utils.py +++ b/scripts/ci/pre_commit/common_precommit_utils.py @@ -214,3 +214,48 @@ def check_list_sorted(the_list: list[str], message: str, errors: list[str]) -> b console.print() errors.append(f"ERROR in {message}. 
The elements are not sorted/unique.") return False + + +def validate_cmd_result(cmd_result, include_ci_env_check=False): + if include_ci_env_check: + if cmd_result.returncode != 0 and os.environ.get("CI") != "true": + console.print( + "\n[yellow]If you see strange stacktraces above, especially about missing imports " + "run this command:[/]\n" + ) + console.print("[magenta]breeze ci-image build --python 3.8 --upgrade-to-newer-dependencies[/]\n") + + elif cmd_result.returncode != 0: + console.print( + "[warning]\nIf you see strange stacktraces above, " + "run `breeze ci-image build --python 3.8` and try again." + ) + sys.exit(cmd_result.returncode) + + +def get_provider_id_from_path(file_path: Path) -> str | None: + """ + Get the provider id from the path of the file it belongs to. + """ + for parent in file_path.parents: + # This works fine for both new and old providers structure - because we moved provider.yaml to + # the top-level of the provider and this code finding "providers" will find the "providers" package + # in old structure and "providers" directory in new structure - in both cases we can determine + # the provider id from the relative folders + if (parent / "provider.yaml").exists(): + for providers_root_candidate in parent.parents: + if providers_root_candidate.name == "providers": + return parent.relative_to(providers_root_candidate).as_posix().replace("/", ".") + else: + return None + return None + + +def get_provider_base_dir_from_path(file_path: Path) -> Path | None: + """ + Get the provider base dir (where provider.yaml is) from the path of the file it belongs to. + """ + for parent in file_path.parents: + if (parent / "provider.yaml").exists(): + return parent + return None diff --git a/scripts/ci/pre_commit/compat_cache_on_methods.py b/scripts/ci/pre_commit/compat_cache_on_methods.py deleted file mode 100755 index 5fee74ff2a4ac..0000000000000 --- a/scripts/ci/pre_commit/compat_cache_on_methods.py +++ /dev/null @@ -1,69 +0,0 @@ -#!/usr/bin/env python -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
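For the two path helpers added to common_precommit_utils.py above, a usage sketch (run from the repository root; the provider path is hypothetical and only needs a provider.yaml somewhere above it for the lookup to succeed):

    import sys
    from pathlib import Path

    sys.path.insert(0, "scripts/ci/pre_commit")
    from common_precommit_utils import (
        get_provider_base_dir_from_path,
        get_provider_id_from_path,
    )

    path = Path("providers/src/airflow/providers/apache/kafka/hooks/client.py")
    # Walks up to the directory containing provider.yaml and expresses it relative
    # to the nearest "providers" ancestor, e.g. "apache.kafka".
    print(get_provider_id_from_path(path))
    # The directory that holds provider.yaml itself.
    print(get_provider_base_dir_from_path(path))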
-from __future__ import annotations - -import ast -import pathlib -import sys - -COMPAT_MODULE = "airflow.compat.functools" - - -def check_test_file(file: str) -> int: - node = ast.parse(pathlib.Path(file).read_text("utf-8"), file) - if not (classes := [c for c in node.body if isinstance(c, ast.ClassDef)]): - # Exit early if module doesn't contain any classes - return 0 - - compat_cache_aliases = [] - for stmt in node.body: - if not isinstance(stmt, ast.ImportFrom) or stmt.module != COMPAT_MODULE: - continue - for alias in stmt.names: - if "cache" in alias.name: - compat_cache_aliases.append(alias.asname or alias.name) - if not compat_cache_aliases: - # Exit early in case if there are no imports from `airflow.compat.functools.cache` - return 0 - - found = 0 - for klass in classes: - for cls_stmt in klass.body: - if not isinstance(cls_stmt, ast.FunctionDef) or not cls_stmt.decorator_list: - continue - for decorator in cls_stmt.decorator_list: - if (isinstance(decorator, ast.Name) and decorator.id in compat_cache_aliases) or ( - isinstance(decorator, ast.Attribute) and decorator.attr in compat_cache_aliases - ): - found += 1 - prefix = f"{file}:{decorator.lineno}:" - print(f"{prefix} Use of `{COMPAT_MODULE}.cache` on methods can lead to memory leaks") - - return found - - -def main(*args: str) -> int: - errors = sum(check_test_file(file) for file in args[1:]) - if not errors: - return 0 - print(f"Found {errors} error{'s' if errors > 1 else ''}.") - return 1 - - -if __name__ == "__main__": - sys.exit(main(*sys.argv)) diff --git a/scripts/ci/pre_commit/compile_ui_assets.py b/scripts/ci/pre_commit/compile_ui_assets.py new file mode 100755 index 0000000000000..cd63b7a5676be --- /dev/null +++ b/scripts/ci/pre_commit/compile_ui_assets.py @@ -0,0 +1,89 @@ +#!/usr/bin/env python3 +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +import hashlib +import os +import re +import shutil +import subprocess +import sys +from pathlib import Path + +# NOTE!. This script is executed from node environment created by pre-commit and this environment +# Cannot have additional Python dependencies installed. We should not import any of the libraries +# here that are not available in stdlib! You should not import common_precommit_utils.py here because +# They are importing rich library which is not available in the node environment. 
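The asset-compilation hooks that follow avoid rebuilding when nothing changed by hashing the source tree and comparing against a cached digest. The same idea as a standalone sketch (paths are illustrative, and the real hook additionally skips node_modules):

    import hashlib
    from pathlib import Path

    def directory_hash(directory: Path) -> str:
        # Hash file contents in a stable, sorted order; skip dotfiles.
        sha = hashlib.sha256()
        for file in sorted(directory.rglob("*")):
            if file.is_file() and not file.name.startswith("."):
                sha.update(file.read_bytes())
        return sha.hexdigest()

    cache_file = Path(".build/ui/hash.txt")
    source_dir = Path("airflow/ui")
    new_hash = directory_hash(source_dir)
    if cache_file.exists() and cache_file.read_text() == new_hash:
        print("UI sources unchanged, skipping asset rebuild.")
    else:
        cache_file.parent.mkdir(parents=True, exist_ok=True)
        # ... run the build here, then record the new hash ...
        cache_file.write_text(new_hash)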
+ +AIRFLOW_SOURCES_PATH = Path(__file__).parents[3].resolve() +UI_HASH_FILE = AIRFLOW_SOURCES_PATH / ".build" / "ui" / "hash.txt" + +INTERNAL_SERVER_ERROR = "500 Internal Server Error" + + +def get_directory_hash(directory: Path, skip_path_regexp: str | None = None) -> str: + files = sorted(directory.rglob("*")) + if skip_path_regexp: + matcher = re.compile(skip_path_regexp) + files = [file for file in files if not matcher.match(os.fspath(file.resolve()))] + sha = hashlib.sha256() + for file in files: + if file.is_file() and not file.name.startswith("."): + sha.update(file.read_bytes()) + return sha.hexdigest() + + +if __name__ not in ("__main__", "__mp_main__"): + raise SystemExit( + "This file is intended to be executed as an executable program. You cannot use it as a module." + f"To run this script, run the ./{__file__} command" + ) + +if __name__ == "__main__": + ui_directory = AIRFLOW_SOURCES_PATH / "airflow" / "ui" + node_modules_directory = ui_directory / "node_modules" + dist_directory = ui_directory / "dist" + UI_HASH_FILE.parent.mkdir(exist_ok=True, parents=True) + if node_modules_directory.exists() and dist_directory.exists(): + old_hash = UI_HASH_FILE.read_text() if UI_HASH_FILE.exists() else "" + new_hash = get_directory_hash(ui_directory, skip_path_regexp=r".*node_modules.*") + if new_hash == old_hash: + print("The UI directory has not changed! Skip regeneration.") + sys.exit(0) + else: + shutil.rmtree(node_modules_directory, ignore_errors=True) + shutil.rmtree(dist_directory, ignore_errors=True) + env = os.environ.copy() + env["FORCE_COLOR"] = "true" + for try_num in range(3): + print(f"### Trying to install yarn dependencies: attempt: {try_num + 1} ###") + result = subprocess.run( + ["pnpm", "install", "--frozen-lockfile", "--config.confirmModulesPurge=false"], + cwd=os.fspath(ui_directory), + text=True, + check=False, + capture_output=True, + ) + if result.returncode == 0: + break + if try_num == 2 or INTERNAL_SERVER_ERROR not in result.stderr + result.stdout: + print(result.stdout + "\n" + result.stderr) + sys.exit(result.returncode) + subprocess.check_call(["pnpm", "run", "build"], cwd=os.fspath(ui_directory), env=env) + new_hash = get_directory_hash(ui_directory, skip_path_regexp=r".*node_modules.*") + UI_HASH_FILE.write_text(new_hash) diff --git a/scripts/ci/pre_commit/compile_ui_assets_dev.py b/scripts/ci/pre_commit/compile_ui_assets_dev.py new file mode 100755 index 0000000000000..d820db8701eba --- /dev/null +++ b/scripts/ci/pre_commit/compile_ui_assets_dev.py @@ -0,0 +1,65 @@ +#!/usr/bin/env python3 +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +import os +import subprocess +from pathlib import Path + +# NOTE!. 
This script is executed from node environment created by pre-commit and this environment +# Cannot have additional Python dependencies installed. We should not import any of the libraries +# here that are not available in stdlib! You should not import common_precommit_utils.py here because +# They are importing rich library which is not available in the node environment. + +if __name__ not in ("__main__", "__mp_main__"): + raise SystemExit( + "This file is intended to be executed as an executable program. You cannot use it as a module." + f"To run this script, run the ./{__file__} command" + ) + +AIRFLOW_SOURCES_PATH = Path(__file__).parents[3].resolve() +UI_CACHE_DIR = AIRFLOW_SOURCES_PATH / ".build" / "ui" +UI_HASH_FILE = UI_CACHE_DIR / "hash.txt" +UI_ASSET_OUT_FILE = UI_CACHE_DIR / "asset_compile.out" +UI_ASSET_OUT_DEV_MODE_FILE = UI_CACHE_DIR / "asset_compile_dev_mode.out" + +if __name__ == "__main__": + ui_directory = AIRFLOW_SOURCES_PATH / "airflow" / "ui" + UI_CACHE_DIR.mkdir(parents=True, exist_ok=True) + if UI_HASH_FILE.exists(): + # cleanup hash of ui so that next compile-assets recompiles them + UI_HASH_FILE.unlink() + env = os.environ.copy() + env["FORCE_COLOR"] = "true" + UI_ASSET_OUT_FILE.unlink(missing_ok=True) + with open(UI_ASSET_OUT_DEV_MODE_FILE, "w") as f: + subprocess.run( + ["pnpm", "install", "--frozen-lockfile", "--config.confirmModulesPurge=false"], + cwd=os.fspath(ui_directory), + check=True, + stdout=f, + stderr=subprocess.STDOUT, + ) + subprocess.run( + ["pnpm", "dev"], + check=True, + cwd=os.fspath(ui_directory), + env=env, + stdout=f, + stderr=subprocess.STDOUT, + ) diff --git a/scripts/ci/pre_commit/compile_www_assets.py b/scripts/ci/pre_commit/compile_www_assets.py index bf2664685ed6c..8e6a845eae0ab 100755 --- a/scripts/ci/pre_commit/compile_www_assets.py +++ b/scripts/ci/pre_commit/compile_www_assets.py @@ -52,6 +52,8 @@ def get_directory_hash(directory: Path, skip_path_regexp: str | None = None) -> f"To run this script, run the ./{__file__} command" ) +INTERNAL_SERVER_ERROR = "500 Internal Server Error" + if __name__ == "__main__": www_directory = AIRFLOW_SOURCES_PATH / "airflow" / "www" node_modules_directory = www_directory / "node_modules" @@ -68,7 +70,20 @@ def get_directory_hash(directory: Path, skip_path_regexp: str | None = None) -> shutil.rmtree(dist_directory, ignore_errors=True) env = os.environ.copy() env["FORCE_COLOR"] = "true" - subprocess.check_call(["yarn", "install", "--frozen-lockfile"], cwd=os.fspath(www_directory)) + for try_num in range(3): + print(f"### Trying to install yarn dependencies: attempt: {try_num + 1} ###") + result = subprocess.run( + ["yarn", "install", "--frozen-lockfile"], + cwd=os.fspath(www_directory), + text=True, + check=False, + capture_output=True, + ) + if result.returncode == 0: + break + if try_num == 2 or INTERNAL_SERVER_ERROR not in result.stderr + result.stdout: + print(result.stdout + "\n" + result.stderr) + sys.exit(result.returncode) subprocess.check_call(["yarn", "run", "build"], cwd=os.fspath(www_directory), env=env) new_hash = get_directory_hash(www_directory, skip_path_regexp=r".*node_modules.*") WWW_HASH_FILE.write_text(new_hash) diff --git a/scripts/ci/pre_commit/decorator_operator_implements_custom_name.py b/scripts/ci/pre_commit/decorator_operator_implements_custom_name.py index f0ce8a5bd2b65..d99ed0a1b0f6b 100755 --- a/scripts/ci/pre_commit/decorator_operator_implements_custom_name.py +++ b/scripts/ci/pre_commit/decorator_operator_implements_custom_name.py @@ -22,7 +22,7 @@ import itertools import 
pathlib
 import sys
-from typing import Iterator
+from collections.abc import Iterator
 
 
 def iter_decorated_operators(source: pathlib.Path) -> Iterator[ast.ClassDef]:
diff --git a/scripts/ci/pre_commit/generate_airflow_diagrams.py b/scripts/ci/pre_commit/generate_airflow_diagrams.py
index f809d566e3c89..27cb4106a10a7 100755
--- a/scripts/ci/pre_commit/generate_airflow_diagrams.py
+++ b/scripts/ci/pre_commit/generate_airflow_diagrams.py
@@ -44,9 +44,19 @@ def main():
         hash_file = source_file.with_suffix(".md5sum")
         if not hash_file.exists() or not hash_file.read_text().strip() == str(checksum).strip():
             console.print(f"[bright_blue]Changes in {source_file}. Regenerating the image.")
-            subprocess.run(
-                [sys.executable, source_file.resolve().as_posix()], check=True, cwd=source_file.parent
+            process = subprocess.run(
+                [sys.executable, source_file.resolve().as_posix()], check=False, cwd=source_file.parent
             )
+            if process.returncode != 0:
+                if sys.platform == "darwin":
+                    console.print(
+                        "[red]Likely you have no graphviz installed[/] "
+                        "Please install the eralchemy2 package to run this script. "
+                        "This will require installing graphviz, "
+                        "and installing graphviz might be difficult on macOS. Please follow: "
+                        "https://pygraphviz.github.io/documentation/stable/install.html#macos ."
+                    )
+                sys.exit(process.returncode)
             hash_file.write_text(str(checksum) + "\n")
         else:
             console.print(f"[bright_blue]No changes in {source_file}. Not regenerating the image.")
diff --git a/scripts/ci/pre_commit/helm_lint.py b/scripts/ci/pre_commit/helm_lint.py
index 7663fb4672051..bb1150949aadb 100755
--- a/scripts/ci/pre_commit/helm_lint.py
+++ b/scripts/ci/pre_commit/helm_lint.py
@@ -33,7 +33,7 @@
     sys.exit(res_setup.returncode)
 
 AIRFLOW_SOURCES_DIR = Path(__file__).parents[3].resolve()
-HELM_BIN_PATH = AIRFLOW_SOURCES_DIR / ".build" / ".k8s-env" / "bin" / "helm"
+HELM_BIN_PATH = AIRFLOW_SOURCES_DIR / ".build" / "k8s-env" / "bin" / "helm"
 
 result = subprocess.run(
     [os.fspath(HELM_BIN_PATH), "lint", ".", "-f", "values.yaml"],
diff --git a/scripts/ci/pre_commit/kubeconform.py b/scripts/ci/pre_commit/kubeconform.py
index 3bfccd2f8d251..2452c2158d53e 100755
--- a/scripts/ci/pre_commit/kubeconform.py
+++ b/scripts/ci/pre_commit/kubeconform.py
@@ -33,7 +33,7 @@
     sys.exit(res_setup.returncode)
 
 AIRFLOW_SOURCES_DIR = Path(__file__).parents[3].resolve()
-HELM_BIN_PATH = AIRFLOW_SOURCES_DIR / ".build" / ".k8s-env" / "bin" / "helm"
+HELM_BIN_PATH = AIRFLOW_SOURCES_DIR / ".build" / "k8s-env" / "bin" / "helm"
 
 ps = subprocess.Popen(
     [os.fspath(HELM_BIN_PATH), "template", ".", "-f", "values.yaml"],
diff --git a/scripts/ci/pre_commit/lint_ui.py b/scripts/ci/pre_commit/lint_ui.py
new file mode 100755
index 0000000000000..bac91b6dfc20f
--- /dev/null
+++ b/scripts/ci/pre_commit/lint_ui.py
@@ -0,0 +1,37 @@
+#!/usr/bin/env python3
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. 
See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +import subprocess +from pathlib import Path + +if __name__ not in ("__main__", "__mp_main__"): + raise SystemExit( + "This file is intended to be executed as an executable program. You cannot use it as a module." + f"To run this script, run the ./{__file__} command" + ) + +if __name__ == "__main__": + dir = Path("airflow") / "ui" + subprocess.check_call(["pnpm", "config", "set", "store-dir", ".pnpm-store"], cwd=dir) + subprocess.check_call( + ["pnpm", "install", "--frozen-lockfile", "--config.confirmModulesPurge=false"], cwd=dir + ) + subprocess.check_call(["pnpm", "codegen"], cwd=dir) + subprocess.check_call(["pnpm", "format"], cwd=dir) + subprocess.check_call(["pnpm", "lint:fix"], cwd=dir) diff --git a/scripts/ci/pre_commit/www_lint.py b/scripts/ci/pre_commit/lint_www.py similarity index 100% rename from scripts/ci/pre_commit/www_lint.py rename to scripts/ci/pre_commit/lint_www.py diff --git a/scripts/ci/pre_commit/migration_reference.py b/scripts/ci/pre_commit/migration_reference.py index 34d3a94c6a90d..505bea5ca91af 100755 --- a/scripts/ci/pre_commit/migration_reference.py +++ b/scripts/ci/pre_commit/migration_reference.py @@ -21,7 +21,11 @@ from pathlib import Path sys.path.insert(0, str(Path(__file__).parent.resolve())) -from common_precommit_utils import console, initialize_breeze_precommit, run_command_via_breeze_shell +from common_precommit_utils import ( + initialize_breeze_precommit, + run_command_via_breeze_shell, + validate_cmd_result, +) initialize_breeze_precommit(__name__, __file__) @@ -29,9 +33,5 @@ ["python3", "/opt/airflow/scripts/in_container/run_migration_reference.py"], backend="sqlite", ) -if cmd_result.returncode != 0: - console.print( - "[warning]\nIf you see strange stacktraces above, " - "run `breeze ci-image build --python 3.8` and try again." - ) -sys.exit(cmd_result.returncode) + +validate_cmd_result(cmd_result) diff --git a/scripts/ci/pre_commit/mypy_folder.py b/scripts/ci/pre_commit/mypy_folder.py index 366c54aae4c12..d674d2c2462e2 100755 --- a/scripts/ci/pre_commit/mypy_folder.py +++ b/scripts/ci/pre_commit/mypy_folder.py @@ -31,7 +31,12 @@ initialize_breeze_precommit(__name__, __file__) -ALLOWED_FOLDERS = ["airflow", "airflow/providers", "dev", "docs"] +ALLOWED_FOLDERS = [ + "airflow", + "providers/src/airflow/providers", + "dev", + "docs", +] if len(sys.argv) < 2: console.print(f"[yellow]You need to specify the folder to test as parameter: {ALLOWED_FOLDERS}\n") @@ -43,12 +48,10 @@ sys.exit(1) arguments = [mypy_folder] -if mypy_folder == "airflow/providers": +if mypy_folder == "providers/src/airflow/providers": arguments.extend( [ - "tests/providers", - "tests/system/providers", - "tests/integration/providers", + "providers/tests", "--namespace-packages", ] ) @@ -57,14 +60,6 @@ arguments.extend( [ "tests", - "--exclude", - "airflow/providers", - "--exclude", - "tests/providers", - "--exclude", - "tests/system/providers", - "--exclude", - "tests/integration/providers", ] ) diff --git a/scripts/ci/pre_commit/sync_init_decorator.py b/scripts/ci/pre_commit/sync_init_decorator.py deleted file mode 100755 index 963e9b9222537..0000000000000 --- a/scripts/ci/pre_commit/sync_init_decorator.py +++ /dev/null @@ -1,204 +0,0 @@ -#!/usr/bin/env python -# -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. 
See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. - -from __future__ import annotations - -import ast -import collections.abc -import itertools -import pathlib -import sys -from typing import TYPE_CHECKING - -PACKAGE_ROOT = pathlib.Path(__file__).resolve().parents[3].joinpath("airflow") -DAG_PY = PACKAGE_ROOT.joinpath("models", "dag.py") -UTILS_TG_PY = PACKAGE_ROOT.joinpath("utils", "task_group.py") -DECOS_TG_PY = PACKAGE_ROOT.joinpath("decorators", "task_group.py") - - -def _find_dag_init(mod: ast.Module) -> ast.FunctionDef: - """Find definition of the ``DAG`` class's ``__init__``.""" - dag_class = next(n for n in ast.iter_child_nodes(mod) if isinstance(n, ast.ClassDef) and n.name == "DAG") - return next( - node - for node in ast.iter_child_nodes(dag_class) - if isinstance(node, ast.FunctionDef) and node.name == "__init__" - ) - - -def _find_dag_deco(mod: ast.Module) -> ast.FunctionDef: - """Find definition of the ``@dag`` decorator.""" - return next(n for n in ast.iter_child_nodes(mod) if isinstance(n, ast.FunctionDef) and n.name == "dag") - - -def _find_tg_init(mod: ast.Module) -> ast.FunctionDef: - """Find definition of the ``TaskGroup`` class's ``__init__``.""" - task_group_class = next( - node - for node in ast.iter_child_nodes(mod) - if isinstance(node, ast.ClassDef) and node.name == "TaskGroup" - ) - return next( - node - for node in ast.iter_child_nodes(task_group_class) - if isinstance(node, ast.FunctionDef) and node.name == "__init__" - ) - - -def _find_tg_deco(mod: ast.Module) -> ast.FunctionDef: - """Find definition of the ``@task_group`` decorator. - - The decorator has multiple overloads, but we want the first one, which - contains task group init arguments. - """ - return next( - node - for node in ast.iter_child_nodes(mod) - if isinstance(node, ast.FunctionDef) and node.name == "task_group" - ) - - -# The new unparse() output is much more readable; fallback to dump() otherwise. -if hasattr(ast, "unparse"): - _reveal = ast.unparse # type: ignore[attr-defined] -else: - _reveal = ast.dump - - -def _match_arguments( - init_def: tuple[str, list[ast.arg]], - deco_def: tuple[str, list[ast.arg]], -) -> collections.abc.Iterator[str]: - init_name, init_args = init_def - deco_name, deco_args = deco_def - for i, (ini, dec) in enumerate(itertools.zip_longest(init_args, deco_args, fillvalue=None)): - if ini is None and dec is not None: - yield f"Argument present in @{deco_name} but missing from {init_name}: {dec.arg}" - return - if dec is None and ini is not None: - yield f"Argument present in {init_name} but missing from @{deco_name}: {ini.arg}" - return - - if TYPE_CHECKING: - assert ini is not None and dec is not None # Because None is only possible as fillvalue. 
- - if ini.arg != dec.arg: - yield f"Argument {i + 1} mismatch: {init_name} has {ini.arg} but @{deco_name} has {dec.arg}" - return - - if getattr(ini, "type_comment", None): # 3.8+ - yield f"Do not use type comments on {init_name} argument: {ini.arg}" - if getattr(dec, "type_comment", None): # 3.8+ - yield f"Do not use type comments on @{deco_name} argument: {dec.arg}" - - # Poorly implemented node equality check. - if ini.annotation and dec.annotation and ast.dump(ini.annotation) != ast.dump(dec.annotation): - yield ( - f"Type annotations differ on argument {ini.arg} between {init_name} and @{deco_name}: " - f"{_reveal(ini.annotation)} != {_reveal(dec.annotation)}" - ) - else: - if not ini.annotation: - yield f"Type annotation missing on {init_name} argument: {ini.arg}" - if not dec.annotation: - yield f"Type annotation missing on @{deco_name} argument: {ini.arg}" - - -def _match_defaults( - arg_names: list[str], - init_def: tuple[str, list[ast.expr]], - deco_def: tuple[str, list[ast.expr]], -) -> collections.abc.Iterator[str]: - init_name, init_defaults = init_def - deco_name, deco_defaults = deco_def - for i, (ini, dec) in enumerate(zip(init_defaults, deco_defaults), 1): - if ast.dump(ini) != ast.dump(dec): # Poorly implemented equality check. - yield ( - f"Argument {arg_names[i]!r} default mismatch: " - f"{init_name} has {_reveal(ini)} but @{deco_name} has {_reveal(dec)}" - ) - - -def check_dag_init_decorator_arguments() -> int: - dag_mod = ast.parse(DAG_PY.read_text("utf-8"), str(DAG_PY)) - - utils_tg = ast.parse(UTILS_TG_PY.read_text("utf-8"), str(UTILS_TG_PY)) - decos_tg = ast.parse(DECOS_TG_PY.read_text("utf-8"), str(DECOS_TG_PY)) - - items_to_check = [ - ("DAG", _find_dag_init(dag_mod), "dag", _find_dag_deco(dag_mod), "dag_id", ""), - ("TaskGroup", _find_tg_init(utils_tg), "task_group", _find_tg_deco(decos_tg), "group_id", None), - ] - - for init_name, init, deco_name, deco, id_arg, id_default in items_to_check: - if getattr(init.args, "posonlyargs", None) or getattr(deco.args, "posonlyargs", None): - print(f"{init_name} and @{deco_name} should not declare positional-only arguments") - return -1 - if init.args.vararg or init.args.kwarg or deco.args.vararg or deco.args.kwarg: - print(f"{init_name} and @{deco_name} should not declare *args and **kwargs") - return -1 - - # Feel free to change this and make some of the arguments keyword-only! 
- if init.args.kwonlyargs or deco.args.kwonlyargs: - print(f"{init_name}() and @{deco_name}() should not declare keyword-only arguments") - return -2 - if init.args.kw_defaults or deco.args.kw_defaults: - print(f"{init_name}() and @{deco_name}() should not declare keyword-only arguments") - return -2 - - init_arg_names = [a.arg for a in init.args.args] - deco_arg_names = [a.arg for a in deco.args.args] - - if init_arg_names[0] != "self": - print(f"First argument in {init_name} must be 'self'") - return -3 - if init_arg_names[1] != id_arg: - print(f"Second argument in {init_name} must be {id_arg!r}") - return -3 - if deco_arg_names[0] != id_arg: - print(f"First argument in @{deco_name} must be {id_arg!r}") - return -3 - - if len(init.args.defaults) != len(init_arg_names) - 2: - print(f"All arguments on {init_name} except self and {id_arg} must have defaults") - return -4 - if len(deco.args.defaults) != len(deco_arg_names): - print(f"All arguments on @{deco_name} must have defaults") - return -4 - if isinstance(deco.args.defaults[0], ast.Constant) and deco.args.defaults[0].value != id_default: - print(f"Default {id_arg} on @{deco_name} must be {id_default!r}") - return -4 - - for init_name, init, deco_name, deco, _, _ in items_to_check: - errors = list(_match_arguments((init_name, init.args.args[1:]), (deco_name, deco.args.args))) - if errors: - break - init_defaults_def = (init_name, init.args.defaults) - deco_defaults_def = (deco_name, deco.args.defaults[1:]) - errors = list(_match_defaults(deco_arg_names, init_defaults_def, deco_defaults_def)) - if errors: - break - - for error in errors: - print(error) - return len(errors) - - -if __name__ == "__main__": - sys.exit(check_dag_init_decorator_arguments()) diff --git a/scripts/ci/pre_commit/update_build_dependencies.py b/scripts/ci/pre_commit/update_build_dependencies.py deleted file mode 100755 index 61cd2bd929cb1..0000000000000 --- a/scripts/ci/pre_commit/update_build_dependencies.py +++ /dev/null @@ -1,110 +0,0 @@ -#!/usr/bin/env python -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
-from __future__ import annotations - -import os -import re -import shutil -import subprocess -import sys -import tempfile -from pathlib import Path - -COMMON_PRECOMMIT_PATH = Path(__file__).parent.resolve() -sys.path.insert(0, COMMON_PRECOMMIT_PATH.as_posix()) # make sure common_precommit_utils is imported -from common_precommit_utils import console - -AIRFLOW_SOURCES = Path(__file__).parents[3].resolve() -PYPROJECT_TOML_FILE = AIRFLOW_SOURCES / "pyproject.toml" - -HATCHLING_MATCH = re.compile(r"hatchling==[0-9.]*") - -FILES_TO_REPLACE_HATCHLING_IN = [ - AIRFLOW_SOURCES / ".pre-commit-config.yaml", - AIRFLOW_SOURCES / "clients" / "python" / "pyproject.toml", - AIRFLOW_SOURCES / "docker_tests" / "requirements.txt", -] - -files_changed = False - - -if __name__ == "__main__": - python38_bin = shutil.which("python3.8") - if not python38_bin: - print("Python 3.8 is required to run this script.") - sys.exit(1) - temp_dir = Path(tempfile.mkdtemp()) - hatchling_spec = "" - try: - subprocess.check_call([python38_bin, "-m", "venv", temp_dir.as_posix()]) - venv_python = temp_dir / "bin" / "python" - subprocess.check_call([venv_python, "-m", "pip", "install", "gitpython", "hatchling"]) - frozen_deps = subprocess.check_output([venv_python, "-m", "pip", "freeze"], text=True) - deps = [dep for dep in sorted(frozen_deps.splitlines()) if not dep.startswith("pip==")] - pyproject_toml_content = PYPROJECT_TOML_FILE.read_text() - result = [] - skipping = False - for line in pyproject_toml_content.splitlines(): - if not skipping: - result.append(line) - if line == "requires = [": - skipping = True - for dep in deps: - # Tomli is only needed for Python < 3.11, otherwise stdlib tomllib is used - if dep.startswith("tomli=="): - dep = dep + "; python_version < '3.11'" - result.append(f' "{dep}",') - if dep.startswith("hatchling=="): - hatchling_spec = dep - if skipping and line == "]": - skipping = False - result.append(line) - result.append("") - new_pyproject_toml_file_content = "\n".join(result) - if new_pyproject_toml_file_content != pyproject_toml_content: - if os.environ.get("SKIP_TROVE_CLASSIFIERS_ONLY", "false").lower() == "true": - diff = set(new_pyproject_toml_file_content.splitlines()) - ( - set(pyproject_toml_content.splitlines()) - ) - if len(diff) == 1 and "trove-classifiers" in next(iter(diff)): - console.print("\n[yellow]Trove classifiers were changed. Please update them manually.\n") - console.print( - "\n[blue]Please run:[/blue]\n\n" - "pre-commit run --hook-stage manual update-build-dependencies --all-files\n" - ) - console.print("\n[blue]Then commit the resulting files.\n") - sys.exit(0) - files_changed = True - PYPROJECT_TOML_FILE.write_text(new_pyproject_toml_file_content) - for file_to_replace_hatchling in FILES_TO_REPLACE_HATCHLING_IN: - old_file_content = file_to_replace_hatchling.read_text() - new_file_content = HATCHLING_MATCH.sub(hatchling_spec, old_file_content, re.MULTILINE) - if new_file_content != old_file_content: - files_changed = True - file_to_replace_hatchling.write_text(new_file_content) - finally: - shutil.rmtree(temp_dir) - - if files_changed: - console.print("\n[red]Build dependencies have changed. 
Please update them manually.\n") - console.print( - "\n[blue]Please run:[/blue]\n\n" - "pre-commit run --hook-stage manual update-build-dependencies --all-files\n" - ) - console.print("\n[blue]Then commit the resulting files.\n") - sys.exit(1) diff --git a/scripts/ci/pre_commit/update_common_sql_api_stubs.py b/scripts/ci/pre_commit/update_common_sql_api_stubs.py index 954302804e6f1..371c758146a2e 100755 --- a/scripts/ci/pre_commit/update_common_sql_api_stubs.py +++ b/scripts/ci/pre_commit/update_common_sql_api_stubs.py @@ -39,10 +39,12 @@ from common_precommit_black_utils import black_format from common_precommit_utils import AIRFLOW_SOURCES_ROOT_PATH -PROVIDERS_ROOT = (AIRFLOW_SOURCES_ROOT_PATH / "airflow" / "providers").resolve(strict=True) +PROVIDERS_ROOT = (AIRFLOW_SOURCES_ROOT_PATH / "providers" / "src" / "airflow" / "providers").resolve( + strict=True +) COMMON_SQL_ROOT = (PROVIDERS_ROOT / "common" / "sql").resolve(strict=True) OUT_DIR = AIRFLOW_SOURCES_ROOT_PATH / "out" -OUT_DIR_PROVIDERS = OUT_DIR / "airflow" / "providers" +OUT_DIR_PROVIDERS = OUT_DIR / PROVIDERS_ROOT.relative_to(AIRFLOW_SOURCES_ROOT_PATH) COMMON_SQL_PACKAGE_PREFIX = "airflow.providers.common.sql." @@ -317,7 +319,7 @@ def compare_stub_files(generated_stub_path: Path, force_override: bool) -> tuple shutil.rmtree(OUT_DIR, ignore_errors=True) subprocess.run( - ["stubgen", *[os.fspath(path) for path in COMMON_SQL_ROOT.rglob("*.py")]], + ["stubgen", f"--out={ OUT_DIR }", COMMON_SQL_ROOT], cwd=AIRFLOW_SOURCES_ROOT_PATH, ) total_removals, total_additions = 0, 0 diff --git a/scripts/ci/pre_commit/update_er_diagram.py b/scripts/ci/pre_commit/update_er_diagram.py index e660b47c6e6ae..c4f3cb797cf21 100755 --- a/scripts/ci/pre_commit/update_er_diagram.py +++ b/scripts/ci/pre_commit/update_er_diagram.py @@ -21,7 +21,11 @@ from pathlib import Path sys.path.insert(0, str(Path(__file__).parent.resolve())) -from common_precommit_utils import console, initialize_breeze_precommit, run_command_via_breeze_shell +from common_precommit_utils import ( + initialize_breeze_precommit, + run_command_via_breeze_shell, + validate_cmd_result, +) initialize_breeze_precommit(__name__, __file__) @@ -36,9 +40,4 @@ }, ) -if cmd_result.returncode != 0: - console.print( - "[warning]\nIf you see strange stacktraces above, " - "run `breeze ci-image build --python 3.8` and try again." 
- ) - sys.exit(cmd_result.returncode) +validate_cmd_result(cmd_result) diff --git a/scripts/ci/pre_commit/update_example_dags_paths.py b/scripts/ci/pre_commit/update_example_dags_paths.py index 8b2c461ec8c1f..17d2a2ccea453 100755 --- a/scripts/ci/pre_commit/update_example_dags_paths.py +++ b/scripts/ci/pre_commit/update_example_dags_paths.py @@ -34,7 +34,7 @@ console = Console(color_system="standard", width=200) AIRFLOW_SOURCES_ROOT = Path(__file__).parents[3].resolve() - +PROVIDERS_SRC = AIRFLOW_SOURCES_ROOT / "providers" / "src" / "airflow" / "providers" EXAMPLE_DAGS_URL_MATCHER = re.compile( r"^(.*)(https://github.com/apache/airflow/tree/(.*)/airflow/providers/(.*)/example_dags)(/?>.*)$" @@ -45,10 +45,7 @@ def get_provider_and_version(url_path: str) -> tuple[str, str]: candidate_folders = url_path.split("/") while candidate_folders: try: - with open( - (AIRFLOW_SOURCES_ROOT / "airflow" / "providers").joinpath(*candidate_folders) - / "provider.yaml" - ) as f: + with PROVIDERS_SRC.joinpath(*candidate_folders, "provider.yaml").open() as f: provider_info = yaml.safe_load(f) version = provider_info["versions"][0] provider = "-".join(candidate_folders) @@ -68,13 +65,11 @@ def replace_match(file: Path, line: str) -> str | None: if match: url_path_to_dir = match.group(4) folders = url_path_to_dir.split("/") - example_dags_folder = (AIRFLOW_SOURCES_ROOT / "airflow" / "providers").joinpath( - *folders - ) / "example_dags" + example_dags_folder = PROVIDERS_SRC.joinpath(*folders, "example_dags") provider, version = get_provider_and_version(url_path_to_dir) proper_system_tests_url = ( f"https://github.com/apache/airflow/tree/providers-{provider}/{version}" - f"/tests/system/providers/{url_path_to_dir}" + f"/providers/tests/system/{url_path_to_dir}" ) if not example_dags_folder.exists(): if proper_system_tests_url in file.read_text(): diff --git a/scripts/ci/pre_commit/update_installers.py b/scripts/ci/pre_commit/update_installers.py deleted file mode 100755 index f55a937df0cdf..0000000000000 --- a/scripts/ci/pre_commit/update_installers.py +++ /dev/null @@ -1,152 +0,0 @@ -#!/usr/bin/env python -# Licensed to the Apache Software Foundation (ASF) under one -# or more contributor license agreements. See the NOTICE file -# distributed with this work for additional information -# regarding copyright ownership. The ASF licenses this file -# to you under the Apache License, Version 2.0 (the -# "License"); you may not use this file except in compliance -# with the License. You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, -# software distributed under the License is distributed on an -# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY -# KIND, either express or implied. See the License for the -# specific language governing permissions and limitations -# under the License. 
-from __future__ import annotations - -import os -import re -import sys -from pathlib import Path - -import requests - -sys.path.insert(0, str(Path(__file__).parent.resolve())) # make sure common_precommit_utils is imported -from common_precommit_utils import AIRFLOW_SOURCES_ROOT_PATH, console - -FILES_TO_UPDATE = [ - AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile", - AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile.ci", - AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "ci" / "install_breeze.sh", - AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "docker" / "common.sh", - AIRFLOW_SOURCES_ROOT_PATH / "pyproject.toml", - AIRFLOW_SOURCES_ROOT_PATH / "dev" / "breeze" / "src" / "airflow_breeze" / "global_constants.py", - AIRFLOW_SOURCES_ROOT_PATH - / "dev" - / "breeze" - / "src" - / "airflow_breeze" - / "commands" - / "release_management_commands.py", -] - - -DOC_FILES_TO_UPDATE: list[Path] = [ - AIRFLOW_SOURCES_ROOT_PATH / "dev/" / "breeze" / "doc" / "ci" / "02_images.md" -] - - -def get_latest_pypi_version(package_name: str) -> str: - response = requests.get(f"https://pypi.org/pypi/{package_name}/json") - response.raise_for_status() # Ensure we got a successful response - data = response.json() - latest_version = data["info"]["version"] # The version info is under the 'info' key - return latest_version - - -AIRFLOW_PIP_PATTERN = re.compile(r"(AIRFLOW_PIP_VERSION=)([0-9.]+)") -AIRFLOW_PIP_QUOTED_PATTERN = re.compile(r"(AIRFLOW_PIP_VERSION = )(\"[0-9.]+\")") -PIP_QUOTED_PATTERN = re.compile(r"(PIP_VERSION = )(\"[0-9.]+\")") -AIRFLOW_PIP_DOC_PATTERN = re.compile(r"(\| *`AIRFLOW_PIP_VERSION` *\| *)(`[0-9.]+`)( *\|)") -AIRFLOW_PIP_UPGRADE_PATTERN = re.compile(r"(python -m pip install --upgrade pip==)([0-9.]+)") - -AIRFLOW_UV_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION=)([0-9.]+)") -AIRFLOW_UV_QUOTED_PATTERN = re.compile(r"(AIRFLOW_UV_VERSION = )(\"[0-9.]+\")") -UV_QUOTED_PATTERN = re.compile(r"(UV_VERSION = )(\"[0-9.]+\")") -AIRFLOW_UV_DOC_PATTERN = re.compile(r"(\| *`AIRFLOW_UV_VERSION` *\| *)(`[0-9.]+`)( *\|)") -UV_GREATER_PATTERN = re.compile(r'"(uv>=)([0-9]+)"') - -UPGRADE_UV: bool = os.environ.get("UPGRADE_UV", "true").lower() == "true" -UPGRADE_PIP: bool = os.environ.get("UPGRADE_PIP", "true").lower() == "true" - - -def replace_group_2_while_keeping_total_length(pattern: re.Pattern[str], replacement: str, text: str) -> str: - def replacer(match): - original_length = len(match.group(2)) - padding = "" - if len(match.groups()) > 2: - padding = match.group(3) - new_length = len(replacement) - diff = new_length - original_length - if diff <= 0: - padding = " " * -diff + padding - else: - padding = padding[diff:] - padded_replacement = match.group(1) + replacement + padding - return padded_replacement.strip() - - return re.sub(pattern, replacer, text) - - -if __name__ == "__main__": - pip_version = get_latest_pypi_version("pip") - console.print(f"[bright_blue]Latest pip version: {pip_version}") - uv_version = get_latest_pypi_version("uv") - console.print(f"[bright_blue]Latest uv version: {uv_version}") - - changed = False - for file in FILES_TO_UPDATE: - console.print(f"[bright_blue]Updating {file}") - file_content = file.read_text() - new_content = file_content - if UPGRADE_PIP: - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_PIP_PATTERN, pip_version, new_content - ) - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_PIP_UPGRADE_PATTERN, pip_version, new_content - ) - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_PIP_QUOTED_PATTERN, f'"{pip_version}"', new_content - 
) - new_content = replace_group_2_while_keeping_total_length( - PIP_QUOTED_PATTERN, f'"{pip_version}"', new_content - ) - if UPGRADE_UV: - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_UV_PATTERN, uv_version, new_content - ) - new_content = replace_group_2_while_keeping_total_length( - UV_GREATER_PATTERN, uv_version, new_content - ) - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_UV_QUOTED_PATTERN, f'"{uv_version}"', new_content - ) - new_content = replace_group_2_while_keeping_total_length( - UV_QUOTED_PATTERN, f'"{uv_version}"', new_content - ) - if new_content != file_content: - file.write_text(new_content) - console.print(f"[bright_blue]Updated {file}") - changed = True - for file in DOC_FILES_TO_UPDATE: - console.print(f"[bright_blue]Updating {file}") - file_content = file.read_text() - new_content = file_content - if UPGRADE_PIP: - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_PIP_DOC_PATTERN, f"`{pip_version}`", new_content - ) - if UPGRADE_UV: - new_content = replace_group_2_while_keeping_total_length( - AIRFLOW_UV_DOC_PATTERN, f"`{uv_version}`", new_content - ) - if new_content != file_content: - file.write_text(new_content) - console.print(f"[bright_blue]Updated {file}") - changed = True - if changed: - sys.exit(1) diff --git a/scripts/ci/pre_commit/update_installers_and_pre_commit.py b/scripts/ci/pre_commit/update_installers_and_pre_commit.py new file mode 100755 index 0000000000000..6bfec6fa012b6 --- /dev/null +++ b/scripts/ci/pre_commit/update_installers_and_pre_commit.py @@ -0,0 +1,189 @@ +#!/usr/bin/env python +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. +from __future__ import annotations + +import os +import re +import sys +from enum import Enum +from pathlib import Path + +import requests + +sys.path.insert(0, str(Path(__file__).parent.resolve())) # make sure common_precommit_utils is imported +from common_precommit_utils import AIRFLOW_SOURCES_ROOT_PATH, console + +# List of files to update and whether to keep total length of the original value when replacing. 
+FILES_TO_UPDATE: list[tuple[Path, bool]] = [ + (AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile", False), + (AIRFLOW_SOURCES_ROOT_PATH / "Dockerfile.ci", False), + (AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "ci" / "install_breeze.sh", False), + (AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "docker" / "common.sh", False), + (AIRFLOW_SOURCES_ROOT_PATH / "scripts" / "tools" / "setup_breeze", False), + (AIRFLOW_SOURCES_ROOT_PATH / "pyproject.toml", False), + (AIRFLOW_SOURCES_ROOT_PATH / "dev" / "breeze" / "src" / "airflow_breeze" / "global_constants.py", False), + ( + AIRFLOW_SOURCES_ROOT_PATH + / "dev" + / "breeze" + / "src" + / "airflow_breeze" + / "commands" + / "release_management_commands.py", + False, + ), + (AIRFLOW_SOURCES_ROOT_PATH / ".github" / "actions" / "install-pre-commit" / "action.yml", False), + (AIRFLOW_SOURCES_ROOT_PATH / "dev/" / "breeze" / "doc" / "ci" / "02_images.md", True), +] + + +def get_latest_pypi_version(package_name: str) -> str: + response = requests.get(f"https://pypi.org/pypi/{package_name}/json") + response.raise_for_status() # Ensure we got a successful response + data = response.json() + latest_version = data["info"]["version"] # The version info is under the 'info' key + return latest_version + + +class Quoting(Enum): + UNQUOTED = 0 + SINGLE_QUOTED = 1 + DOUBLE_QUOTED = 2 + REVERSE_SINGLE_QUOTED = 3 + + +PIP_PATTERNS: list[tuple[re.Pattern, Quoting]] = [ + (re.compile(r"(AIRFLOW_PIP_VERSION=)([0-9.]+)"), Quoting.UNQUOTED), + (re.compile(r"(python -m pip install --upgrade pip==)([0-9.]+)"), Quoting.UNQUOTED), + (re.compile(r"(AIRFLOW_PIP_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(PIP_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(PIP_VERSION=)(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(\| *`AIRFLOW_PIP_VERSION` *\| *)(`[0-9.]+`)( *\|)"), Quoting.REVERSE_SINGLE_QUOTED), +] + +UV_PATTERNS: list[tuple[re.Pattern, Quoting]] = [ + (re.compile(r"(AIRFLOW_UV_VERSION=)([0-9.]+)"), Quoting.UNQUOTED), + (re.compile(r"(uv>=)([0-9]+)"), Quoting.UNQUOTED), + (re.compile(r"(AIRFLOW_UV_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(UV_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(UV_VERSION=)(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(\| *`AIRFLOW_UV_VERSION` *\| *)(`[0-9.]+`)( *\|)"), Quoting.REVERSE_SINGLE_QUOTED), + ( + re.compile( + r"(default: \")([0-9.]+)(\" # Keep this comment to " + r"allow automatic replacement of uv version)" + ), + Quoting.UNQUOTED, + ), +] + +PRE_COMMIT_PATTERNS: list[tuple[re.Pattern, Quoting]] = [ + (re.compile(r"(AIRFLOW_PRE_COMMIT_VERSION=)([0-9.]+)"), Quoting.UNQUOTED), + (re.compile(r"(AIRFLOW_PRE_COMMIT_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(pre-commit>=)([0-9]+)"), Quoting.UNQUOTED), + (re.compile(r"(PRE_COMMIT_VERSION = )(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + (re.compile(r"(PRE_COMMIT_VERSION=)(\"[0-9.]+\")"), Quoting.DOUBLE_QUOTED), + ( + re.compile(r"(\| *`AIRFLOW_PRE_COMMIT_VERSION` *\| *)(`[0-9.]+`)( *\|)"), + Quoting.REVERSE_SINGLE_QUOTED, + ), + ( + re.compile( + r"(default: \")([0-9.]+)(\" # Keep this comment to allow automatic " + r"replacement of pre-commit version)" + ), + Quoting.UNQUOTED, + ), +] + + +def get_replacement(value: str, quoting: Quoting) -> str: + if quoting == Quoting.DOUBLE_QUOTED: + return f'"{value}"' + elif quoting == Quoting.SINGLE_QUOTED: + return f"'{value}'" + elif quoting == Quoting.REVERSE_SINGLE_QUOTED: + return f"`{value}`" + return value + + +UPGRADE_UV: bool 
= os.environ.get("UPGRADE_UV", "true").lower() == "true" +UPGRADE_PIP: bool = os.environ.get("UPGRADE_PIP", "true").lower() == "true" +UPGRADE_PRE_COMMIT: bool = os.environ.get("UPGRADE_PRE_COMMIT", "true").lower() == "true" + + +def replace_version(pattern: re.Pattern[str], version: str, text: str, keep_total_length: bool = True) -> str: + # Assume that the pattern has up to 3 replacement groups: + # 1. Prefix + # 2. Original version + # 3. Suffix + # + # (prefix)(version)(suffix) + # In case "keep_total_length" is set to True, the replacement will be padded with spaces to match + # the original length + def replacer(match): + prefix = match.group(1) + postfix = match.group(3) if len(match.groups()) > 2 else "" + if not keep_total_length: + return prefix + version + postfix + original_length = len(match.group(2)) + new_length = len(version) + diff = new_length - original_length + if diff <= 0: + postfix = " " * -diff + postfix + else: + postfix = postfix[diff:] + padded_replacement = prefix + version + postfix + return padded_replacement.strip() + + return re.sub(pattern, replacer, text) + + +if __name__ == "__main__": + changed = False + for file, keep_length in FILES_TO_UPDATE: + console.print(f"[bright_blue]Updating {file}") + file_content = file.read_text() + new_content = file_content + if UPGRADE_PIP: + pip_version = get_latest_pypi_version("pip") + console.print(f"[bright_blue]Latest pip version: {pip_version}") + for line_pattern, quoting in PIP_PATTERNS: + new_content = replace_version( + line_pattern, get_replacement(pip_version, quoting), new_content, keep_length + ) + if UPGRADE_UV: + uv_version = get_latest_pypi_version("uv") + console.print(f"[bright_blue]Latest uv version: {uv_version}") + for line_pattern, quoting in UV_PATTERNS: + new_content = replace_version( + line_pattern, get_replacement(uv_version, quoting), new_content, keep_length + ) + if UPGRADE_PRE_COMMIT: + pre_commit_version = "3.5.0" # Latest version that supports Python 3.8 + console.print(f"[bright_blue]Latest pre-commit version: {pre_commit_version}") + for line_pattern, quoting in PRE_COMMIT_PATTERNS: + new_content = replace_version( + line_pattern, get_replacement(pre_commit_version, quoting), new_content, keep_length + ) + if new_content != file_content: + file.write_text(new_content) + console.print(f"[bright_blue]Updated {file}") + changed = True + if changed: + sys.exit(1) diff --git a/scripts/ci/pre_commit/update_providers_build_files.py b/scripts/ci/pre_commit/update_providers_build_files.py new file mode 100755 index 0000000000000..e6e9fc81da3e1 --- /dev/null +++ b/scripts/ci/pre_commit/update_providers_build_files.py @@ -0,0 +1,112 @@ +#!/usr/bin/env python +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+from __future__ import annotations + +import subprocess +import sys +from pathlib import Path + +sys.path.insert(0, str(Path(__file__).parent.resolve())) +from common_precommit_utils import console, initialize_breeze_precommit + +initialize_breeze_precommit(__name__, __file__) + +providers: set[str] = set() + +file_list = sys.argv[1:] +console.print(f"[bright_blue]Determining providers to regenerate from: {file_list}\n") + + +# TODO: remove it when we move all providers to the new structure +def _find_old_providers_structure() -> None: + console.print(f"[bright_blue]Looking at {examined_file} for old structure provider.yaml") + # find the folder where provider.yaml is + for parent in Path(examined_file).parents: + console.print(f"[bright_blue]Checking {parent}") + if (parent / "provider.yaml").exists(): + provider_folder = parent + break + else: + console.print(f"[yellow]\nCould not find `provider.yaml` in any parent of {examined_file}[/]") + return + # find base for the provider sources + for parent in provider_folder.parents: + if parent.name == "providers": + base_folder = parent + console.print(f"[bright_blue]Found base folder {base_folder}") + break + else: + console.print(f"[red]\nCould not find old structure base folder for {provider_folder}") + sys.exit(1) + provider_name = ".".join(provider_folder.relative_to(base_folder).as_posix().split("/")) + providers.add(provider_name) + + +def _find_new_providers_structure() -> None: + console.print(f"[bright_blue]Looking at {examined_file} for new structure provider.yaml") + # find the folder where provider.yaml is + for parent in Path(examined_file).parents: + console.print(f"[bright_blue]Checking {parent} for provider.yaml") + if (parent / "provider.yaml").exists(): + console.print(f"[bright_blue]Found {parent} with provider.yaml") + provider_folder = parent + break + else: + console.print(f"[yellow]\nCould not find `provider.yaml` in any parent of {examined_file}[/]") + return + # find base for the provider sources + for parent in provider_folder.parents: + if parent.name == "providers": + base_folder = parent + console.print(f"[bright_blue]Found base folder {base_folder}") + break + else: + console.print(f"[red]\nCould not find new structure base folder for {provider_folder}") + sys.exit(1) + provider_name = ".".join(provider_folder.relative_to(base_folder).as_posix().split("/")) + providers.add(provider_name) + + +# get all folders from arguments +for examined_file in file_list: + if not examined_file.startswith("providers/src"): + _find_new_providers_structure() + else: + _find_old_providers_structure() + +console.print(f"[bright_blue]Regenerating build files for providers: {providers}[/]") + +if not providers: + console.print("[red]\nThe found providers list cannot be empty[/]") + sys.exit(1) + +res = subprocess.run( + [ + "breeze", + "release-management", + "prepare-provider-documentation", + "--reapply-templates-only", + "--skip-git-fetch", + "--only-min-version-update", + *list(providers), + ], + check=False, +) +if res.returncode != 0: + console.print("[red]\nError while regenerating provider init files.") + sys.exit(res.returncode) diff --git a/scripts/ci/pre_commit/validate_operators_init.py b/scripts/ci/pre_commit/validate_operators_init.py index 43404020915fd..0a5f58a9bf87d 100755 --- a/scripts/ci/pre_commit/validate_operators_init.py +++ b/scripts/ci/pre_commit/validate_operators_init.py @@ -85,12 +85,12 @@ def _handle_parent_constructor_kwargs( field. 
TODO: Enhance this function to work with nested inheritance trees through dynamic imports. - :param missing_assignments: List[str] - List of template fields that have not been assigned a value. + :param missing_assignments: list[str] - List of template fields that have not been assigned a value. :param ctor_stmt: ast.Expr - AST node representing the constructor statement. - :param invalid_assignments: List[str] - List of template fields that have been assigned incorrectly. - :param template_fields: List[str] - List of template fields to be assigned. + :param invalid_assignments: list[str] - List of template fields that have been assigned incorrectly. + :param template_fields: list[str] - List of template fields to be assigned. - :return: List[str] - List of template fields that are still missing assignments. + :return: list[str] - List of template fields that are still missing assignments. """ if isinstance(ctor_stmt, ast.Expr): if ( diff --git a/scripts/ci/pre_commit/vendor_k8s_json_schema.py b/scripts/ci/pre_commit/vendor_k8s_json_schema.py index 3348a73840e8d..e4354b522e441 100755 --- a/scripts/ci/pre_commit/vendor_k8s_json_schema.py +++ b/scripts/ci/pre_commit/vendor_k8s_json_schema.py @@ -19,7 +19,7 @@ from __future__ import annotations import json -from typing import Iterator +from collections.abc import Iterator import requests diff --git a/scripts/ci/pre_commit/version_heads_map.py b/scripts/ci/pre_commit/version_heads_map.py index 4277c4656472d..6796819444d8c 100755 --- a/scripts/ci/pre_commit/version_heads_map.py +++ b/scripts/ci/pre_commit/version_heads_map.py @@ -23,21 +23,28 @@ from pathlib import Path import re2 -from packaging.version import parse as parse_version PROJECT_SOURCE_ROOT_DIR = Path(__file__).resolve().parent.parent.parent.parent DB_FILE = PROJECT_SOURCE_ROOT_DIR / "airflow" / "utils" / "db.py" MIGRATION_PATH = PROJECT_SOURCE_ROOT_DIR / "airflow" / "migrations" / "versions" +PROVIDERS_SRC = PROJECT_SOURCE_ROOT_DIR / "providers" / "src" +FAB_DB_FILE = PROVIDERS_SRC / "airflow" / "providers" / "fab" / "auth_manager" / "models" / "db.py" +FAB_MIGRATION_PATH = PROVIDERS_SRC / "airflow" / "providers" / "fab" / "migrations" / "versions" + sys.path.insert(0, str(Path(__file__).parent.resolve())) # make sure common_precommit_utils is importable -def revision_heads_map(): +def revision_heads_map(migration_path): rh_map = {} pattern = r'revision = "[a-fA-F0-9]+"' - airflow_version_pattern = r'airflow_version = "\d+\.\d+\.\d+"' - filenames = os.listdir(MIGRATION_PATH) + version_pattern = None + if migration_path == MIGRATION_PATH: + version_pattern = r'airflow_version = "\d+\.\d+\.\d+"' + elif migration_path == FAB_MIGRATION_PATH: + version_pattern = r'fab_version = "\d+\.\d+\.\d+"' + filenames = os.listdir(migration_path) def sorting_key(filen): prefix = filen.split("_")[0] @@ -46,43 +53,46 @@ def sorting_key(filen): sorted_filenames = sorted(filenames, key=sorting_key) for filename in sorted_filenames: - if not filename.endswith(".py"): + if not filename.endswith(".py") or filename == "__init__.py": continue - with open(os.path.join(MIGRATION_PATH, filename)) as file: + with open(os.path.join(migration_path, filename)) as file: content = file.read() revision_match = re2.search(pattern, content) - airflow_version_match = re2.search(airflow_version_pattern, content) - if revision_match and airflow_version_match: + _version_match = re2.search(version_pattern, content) + if revision_match and _version_match: revision = revision_match.group(0).split('"')[1] - version = 
airflow_version_match.group(0).split('"')[1] - if parse_version(version) >= parse_version("2.0.0"): - rh_map[version] = revision + version = _version_match.group(0).split('"')[1] + rh_map[version] = revision return rh_map if __name__ == "__main__": - with open(DB_FILE) as file: - content = file.read() - - pattern = r"_REVISION_HEADS_MAP = {[^}]+\}" - match = re2.search(pattern, content) - if not match: - print( - f"_REVISION_HEADS_MAP not found in {DB_FILE}. If this has been removed intentionally, " - "please update scripts/ci/pre_commit/version_heads_map.py" - ) - sys.exit(1) - - existing_revision_heads_map = match.group(0) - rh_map = revision_heads_map() - updated_revision_heads_map = "_REVISION_HEADS_MAP = {\n" - for k, v in rh_map.items(): - updated_revision_heads_map += f' "{k}": "{v}",\n' - updated_revision_heads_map += "}" - if existing_revision_heads_map != updated_revision_heads_map: - new_content = content.replace(existing_revision_heads_map, updated_revision_heads_map) - - with open(DB_FILE, "w") as file: - file.write(new_content) - print("_REVISION_HEADS_MAP updated in db.py. Please commit the changes.") - sys.exit(1) + paths = [(DB_FILE, MIGRATION_PATH), (FAB_DB_FILE, FAB_MIGRATION_PATH)] + for dbfile, mpath in paths: + with open(dbfile) as file: + content = file.read() + + pattern = r"_REVISION_HEADS_MAP:\s*dict\[\s*str\s*,\s*str\s*\]\s*=\s*\{[^}]*\}" + match = re2.search(pattern, content) + if not match: + print( + f"_REVISION_HEADS_MAP not found in {dbfile}. If this has been removed intentionally, " + "please update scripts/ci/pre_commit/version_heads_map.py" + ) + sys.exit(1) + + existing_revision_heads_map = match.group(0) + rh_map = revision_heads_map(mpath) + updated_revision_heads_map = "_REVISION_HEADS_MAP: dict[str, str] = {\n" + for k, v in rh_map.items(): + updated_revision_heads_map += f' "{k}": "{v}",\n' + updated_revision_heads_map += "}" + if updated_revision_heads_map == "_REVISION_HEADS_MAP: dict[str, str] = {\n}": + updated_revision_heads_map = "_REVISION_HEADS_MAP: dict[str, str] = {}" + if existing_revision_heads_map != updated_revision_heads_map: + new_content = content.replace(existing_revision_heads_map, updated_revision_heads_map) + + with open(dbfile, "w") as file: + file.write(new_content) + print(f"_REVISION_HEADS_MAP updated in {dbfile}. Please commit the changes.") + sys.exit(1) diff --git a/scripts/ci/testing/run_breeze_command_with_retries.sh b/scripts/ci/testing/run_breeze_command_with_retries.sh new file mode 100755 index 0000000000000..7f2bb56785c4c --- /dev/null +++ b/scripts/ci/testing/run_breeze_command_with_retries.sh @@ -0,0 +1,43 @@ +#!/usr/bin/env bash +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. 
+ +# If you want different number of retries for your breeze command, please set NUMBER_OF_ATTEMPT environment variable. +# Default number of retries is 3 unless NUMBER_OF_ATTEMPT is set. +export COLOR_RED=$'\e[31m' +export COLOR_YELLOW=$'\e[33m' +export COLOR_RESET=$'\e[0m' + +NUMBER_OF_ATTEMPT="${NUMBER_OF_ATTEMPT:-3}" + +for i in $(seq 1 "$NUMBER_OF_ATTEMPT") ; do + breeze down + set +e + if breeze "$@"; then + set -e + exit 0 + else + echo + echo "${COLOR_YELLOW}Breeze Command failed. Retrying again.${COLOR_RESET}" + echo + echo "This could be due to a flaky test, re-running once to re-check it After restarting docker." + echo "Current Attempt: ${i}, Attempt Left: $((NUMBER_OF_ATTEMPT-i))" + echo + fi + set -e + sudo service docker restart +done diff --git a/scripts/ci/testing/run_integration_tests_with_retry.sh b/scripts/ci/testing/run_integration_tests_with_retry.sh index 4fd25a75ecdff..afb3003eceff6 100755 --- a/scripts/ci/testing/run_integration_tests_with_retry.sh +++ b/scripts/ci/testing/run_integration_tests_with_retry.sh @@ -20,33 +20,34 @@ export COLOR_RED=$'\e[31m' export COLOR_YELLOW=$'\e[33m' export COLOR_RESET=$'\e[0m' -if [[ ! "$#" -eq 1 ]]; then - echo "${COLOR_RED}You must provide exactly one argument!.${COLOR_RESET}" +if [[ ! "$#" -eq 2 ]]; then + echo "${COLOR_RED}You must provide 2 arguments. Test group and integration!.${COLOR_RESET}" exit 1 fi -INTEGRATION=${1} +TEST_GROUP=${1} +INTEGRATION=${2} breeze down set +e -breeze testing integration-tests --integration "${INTEGRATION}" +breeze testing "${TEST_GROUP}-integration-tests" --integration "${INTEGRATION}" RESULT=$? set -e if [[ ${RESULT} != "0" ]]; then echo - echo "${COLOR_YELLOW}Integration Tests failed. Retrying once${COLOR_RESET}" + echo "${COLOR_YELLOW}The ${TEST_GROUP} Integration Tests failed. Retrying once${COLOR_RESET}" echo echo "This could be due to a flaky test, re-running once to re-check it After restarting docker." echo sudo service docker restart breeze down set +e - breeze testing integration-tests --integration "${INTEGRATION}" + breeze testing "${TEST_GROUP}-integration-tests" --integration "${INTEGRATION}" RESULT=$? set -e if [[ ${RESULT} != "0" ]]; then echo - echo "${COLOR_RED}The integration tests failed for the second time! Giving up${COLOR_RESET}" + echo "${COLOR_RED}The ${TEST_GROUP} integration tests failed for the second time! Giving up${COLOR_RESET}" echo exit ${RESULT} fi diff --git a/scripts/ci/images/ci_stop_arm_instance.sh b/scripts/ci/testing/run_system_tests.sh similarity index 61% rename from scripts/ci/images/ci_stop_arm_instance.sh rename to scripts/ci/testing/run_system_tests.sh index 57a42c80a3259..4ca876c4d9a0d 100755 --- a/scripts/ci/images/ci_stop_arm_instance.sh +++ b/scripts/ci/testing/run_system_tests.sh @@ -15,16 +15,18 @@ # KIND, either express or implied. See the License for the # specific language governing permissions and limitations # under the License. 
-# This is an AMI that is based on Basic Amazon Linux AMI with installed and configured docker service -WORKING_DIR="/tmp/armdocker" -INSTANCE_INFO="${WORKING_DIR}/instance_info.json" -AUTOSSH_LOGFILE="${WORKING_DIR}/autossh.log" -function stop_arm_instance() { - INSTANCE_ID=$(jq < "${INSTANCE_INFO}" ".Instances[0].InstanceId" -r) - docker buildx rm --force airflow_cache || true - aws ec2 terminate-instances --instance-ids "${INSTANCE_ID}" - cat ${AUTOSSH_LOGFILE} || true -} +export COLOR_RED=$'\e[31m' +export COLOR_YELLOW=$'\e[33m' +export COLOR_RESET=$'\e[0m' -stop_arm_instance +set +e +breeze testing system-tests "${@}" +RESULT=$? +set -e +if [[ ${RESULT} != "0" ]]; then + echo + echo "${COLOR_RED}The ${TEST_GROUP} system test ${TEST_TO_RUN} failed! Giving up${COLOR_RESET}" + echo + exit ${RESULT} +fi diff --git a/scripts/ci/testing/run_unit_tests.sh b/scripts/ci/testing/run_unit_tests.sh new file mode 100755 index 0000000000000..c6f65f558acd3 --- /dev/null +++ b/scripts/ci/testing/run_unit_tests.sh @@ -0,0 +1,140 @@ +#!/usr/bin/env bash +# Licensed to the Apache Software Foundation (ASF) under one +# or more contributor license agreements. See the NOTICE file +# distributed with this work for additional information +# regarding copyright ownership. The ASF licenses this file +# to you under the Apache License, Version 2.0 (the +# "License"); you may not use this file except in compliance +# with the License. You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, +# software distributed under the License is distributed on an +# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY +# KIND, either express or implied. See the License for the +# specific language governing permissions and limitations +# under the License. + +export COLOR_RED=$'\e[31m' +export COLOR_BLUE=$'\e[34m' +export COLOR_YELLOW=$'\e[33m' +export COLOR_RESET=$'\e[0m' + +if [[ ! "$#" -eq 2 ]]; then + echo "${COLOR_RED}You must provide 2 arguments: Group, Scope!.${COLOR_RESET}" + exit 1 +fi + +TEST_GROUP=${1} +TEST_SCOPE=${2} + +function core_tests() { + echo "${COLOR_BLUE}Running core tests${COLOR_RESET}" + set +e + if [[ "${TEST_SCOPE}" == "DB" ]]; then + set -x + breeze testing core-tests --run-in-parallel --run-db-tests-only + RESULT=$? + set +x + elif [[ "${TEST_SCOPE}" == "Non-DB" ]]; then + set -x + breeze testing core-tests --use-xdist --skip-db-tests --no-db-cleanup --backend none + RESULT=$? + set +x + elif [[ "${TEST_SCOPE}" == "All" ]]; then + set -x + breeze testing core-tests --run-in-parallel + RESULT=$? + set +x + elif [[ "${TEST_SCOPE}" == "Quarantined" ]]; then + set -x + breeze testing core-tests --test-type "All-Quarantined" || true + RESULT=$? + set +x + elif [[ "${TEST_SCOPE}" == "ARM collection" ]]; then + set -x + breeze testing core-tests --collect-only --remove-arm-packages --test-type "All" --no-db-reset + RESULT=$? + set +x + elif [[ "${TEST_SCOPE}" == "System" ]]; then + set -x + breeze testing system-tests tests/system/example_empty.py + RESULT=$? + set +x + else + echo "Unknown test scope: ${TEST_SCOPE}" + set -e + exit 1 + fi + set -e + if [[ ${RESULT} != "0" ]]; then + echo + echo "${COLOR_RED}The ${TEST_GROUP} test ${TEST_SCOPE} failed! 
+        echo
+        exit "${RESULT}"
+    fi
+    echo "${COLOR_GREEN}Core tests completed successfully${COLOR_RESET}"
+}
+
+function providers_tests() {
+    echo "${COLOR_BLUE}Running providers tests${COLOR_RESET}"
+    set +e
+    if [[ "${TEST_SCOPE}" == "DB" ]]; then
+        set -x
+        breeze testing providers-tests --run-in-parallel --run-db-tests-only
+        RESULT=$?
+        set +x
+    elif [[ "${TEST_SCOPE}" == "Non-DB" ]]; then
+        set -x
+        breeze testing providers-tests --use-xdist --skip-db-tests --no-db-cleanup --backend none
+        RESULT=$?
+        set +x
+    elif [[ "${TEST_SCOPE}" == "All" ]]; then
+        set -x
+        breeze testing providers-tests --run-in-parallel
+        RESULT=$?
+        set +x
+    elif [[ "${TEST_SCOPE}" == "Quarantined" ]]; then
+        set -x
+        breeze testing providers-tests --test-type "All-Quarantined" || true
+        RESULT=$?
+        set +x
+    elif [[ "${TEST_SCOPE}" == "ARM collection" ]]; then
+        set -x
+        breeze testing providers-tests --collect-only --remove-arm-packages --test-type "All" --no-db-reset
+        RESULT=$?
+        set +x
+    elif [[ "${TEST_SCOPE}" == "System" ]]; then
+        set -x
+        breeze testing system-tests providers/tests/system/example_empty.py
+        RESULT=$?
+        set +x
+    else
+        echo "Unknown test scope: ${TEST_SCOPE}"
+        set -e
+        exit 1
+    fi
+    set -e
+    if [[ ${RESULT} != "0" ]]; then
+        echo
+        echo "${COLOR_RED}The ${TEST_GROUP} test ${TEST_SCOPE} failed! Giving up${COLOR_RESET}"
+        echo
+        exit "${RESULT}"
+    fi
+    echo "${COLOR_GREEN}Providers tests completed successfully${COLOR_RESET}"
+}
+
+
+function run_tests() {
+    if [[ "${TEST_GROUP}" == "core" ]]; then
+        core_tests
+    elif [[ "${TEST_GROUP}" == "providers" ]]; then
+        providers_tests
+    else
+        echo "Unknown test group: ${TEST_GROUP}"
+        exit 1
+    fi
+}
+
+run_tests
diff --git a/scripts/tools/free_up_disk_space.sh b/scripts/tools/free_up_disk_space.sh
new file mode 100644
index 0000000000000..f959f65e9e180
--- /dev/null
+++ b/scripts/tools/free_up_disk_space.sh
@@ -0,0 +1,41 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements. See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership. The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License. You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied. See the License for the
+# specific language governing permissions and limitations
+# under the License.
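+
+# Frees disk space on GitHub-hosted runners by removing large pre-installed toolchains and caches that the Airflow CI jobs do not use.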
+ +COLOR_BLUE=$'\e[34m' +COLOR_RESET=$'\e[0m' + +echo "${COLOR_BLUE}Disk space before cleanup${COLOR_RESET}" +df -h + +echo "${COLOR_BLUE}Freeing up disk space${COLOR_RESET}" +sudo rm -rf /usr/share/dotnet/ +sudo rm -rf /usr/local/graalvm/ +sudo rm -rf /usr/local/.ghcup/ +sudo rm -rf /usr/local/share/powershell +sudo rm -rf /usr/local/share/chromium +sudo rm -rf /usr/local/share/boost +sudo rm -rf /usr/local/lib/android +sudo rm -rf /opt/hostedtoolcache/CodeQL +sudo rm -rf /opt/hostedtoolcache/Ruby +sudo rm -rf /opt/hostedtoolcache/go +sudo rm -rf /opt/ghc +sudo apt-get clean +echo "${COLOR_BLUE}Disk space after cleanup${COLOR_RESET}" +df -h diff --git a/tests/api_connexion/test_auth.py b/tests/api_connexion/test_auth.py index cccd04eb3c1f2..8a8e47739fcef 100644 --- a/tests/api_connexion/test_auth.py +++ b/tests/api_connexion/test_auth.py @@ -27,7 +27,11 @@ from tests.test_utils.db import clear_db_pools from tests.test_utils.www import client_with_login -pytestmark = [pytest.mark.db_test, pytest.mark.skip_if_database_isolation_mode] +pytestmark = [ + pytest.mark.db_test, + pytest.mark.skip_if_database_isolation_mode, + pytest.mark.filterwarnings("default::airflow.exceptions.RemovedInAirflow3Warning"), +] class BaseTestAuth: diff --git a/tests/operators/test_bash.py b/tests/operators/test_bash.py index ac629423e8000..b51def35bbadf 100644 --- a/tests/operators/test_bash.py +++ b/tests/operators/test_bash.py @@ -283,6 +283,7 @@ def test_templated_fields(self, create_task_instance_of_operator): assert task.env == {"FOO": "2024-02-01"} assert task.cwd == Path(__file__).absolute().parent.as_posix() + @pytest.mark.db_test def test_templated_bash_script(self, dag_maker, tmp_path, session): """ Creates a .sh script with Jinja template. diff --git a/tests/plugins/test_plugins_manager.py b/tests/plugins/test_plugins_manager.py index cb59afd36742a..2426352fc8531 100644 --- a/tests/plugins/test_plugins_manager.py +++ b/tests/plugins/test_plugins_manager.py @@ -36,7 +36,10 @@ from tests.test_utils.config import conf_vars from tests.test_utils.mock_plugins import mock_plugin_manager -pytestmark = pytest.mark.db_test +pytestmark = [ + pytest.mark.db_test, + pytest.mark.filterwarnings("default::airflow.exceptions.RemovedInAirflow3Warning"), +] AIRFLOW_SOURCES_ROOT = Path(__file__).parents[2].resolve() diff --git a/tests/sensors/test_external_task_sensor.py b/tests/sensors/test_external_task_sensor.py index 0ca13b343f230..0bb53d65b88fd 100644 --- a/tests/sensors/test_external_task_sensor.py +++ b/tests/sensors/test_external_task_sensor.py @@ -56,7 +56,10 @@ from tests.test_utils.db import clear_db_runs from tests.test_utils.mock_operators import MockOperator -pytestmark = pytest.mark.db_test +pytestmark = [ + pytest.mark.db_test, + pytest.mark.filterwarnings("default::airflow.exceptions.RemovedInAirflow3Warning"), +] DEFAULT_DATE = datetime(2015, 1, 1) diff --git a/tests/system/providers/papermill/input_notebook.ipynb b/tests/system/providers/papermill/input_notebook.ipynb index 6c1d53a5a780c..511ef76ccdc96 100644 --- a/tests/system/providers/papermill/input_notebook.ipynb +++ b/tests/system/providers/papermill/input_notebook.ipynb @@ -91,7 +91,7 @@ } ], "source": [ - "sb.glue('message', msgs)" + "sb.glue(\"message\", msgs)" ] } ],