
kaxil and others added 30 commits September 23, 2025 03:16
Add async support for secrets backends in _async_get_connection using
sync_to_async to prevent event loop blocking. This resolves the
'async_to_sync forbidden in event loop' error when triggers call
get_connection() and secrets backends need to be checked.

Also fix Databricks provider to use async connection methods in async
contexts instead of cached properties that cause sync calls.

Fixes triggerer failures when providers access connections during
async trigger execution.
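The blocking-lookup problem can be sketched with the stdlib alone; the actual change uses asgiref's `sync_to_async`, but `asyncio.to_thread` illustrates the same idea of pushing a blocking secrets-backend call off the event loop (the backend class and connection id here are hypothetical):

```python
import asyncio


class DummySecretsBackend:
    """Hypothetical sync secrets backend that would block the event loop."""

    def get_connection(self, conn_id):
        # imagine a blocking HTTP call to a secrets manager here
        return f"conn:{conn_id}"


async def async_get_connection(conn_id):
    backend = DummySecretsBackend()
    # run the blocking lookup in a worker thread so the triggerer's event
    # loop keeps servicing other triggers (the PR uses asgiref.sync_to_async)
    return await asyncio.to_thread(backend.get_connection, conn_id)


result = asyncio.run(async_get_connection("my_db"))
```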
…gers (apache#55915)

Classes decorated with `@conf_vars` and other context managers were disappearing
during pytest collection, causing tests to be silently skipped. This affected
several test classes including `TestWorkerStart` in the Celery provider tests.

Root cause: `ContextDecorator` transforms decorated classes into callable wrappers.
Since pytest only collects actual type objects as test classes, these wrapped
classes are ignored during collection.

Simple reproduction (no Airflow needed):

```py
import contextlib
import inspect

@contextlib.contextmanager
def simple_cm():
    yield

@simple_cm()
class TestExample:
    def test_method(self):
        pass

print(f'Is class? {inspect.isclass(TestExample)}')  # False - pytest won't collect
```

and then run

```shell
pytest test_example.py --collect-only
```

Airflow reproduction:

```shell
breeze run pytest providers/celery/tests/unit/celery/cli/test_celery_command.py --collect-only -v
```

Solution:
1. Fixed affected test files by replacing class-level `@conf_vars` decorators
   with pytest fixtures
2. Created pytest fixtures to apply configuration changes
3. Used `@pytest.mark.usefixtures` to apply configuration to test classes
4. Added custom linter to prevent future occurrences and integrated it
   into pre-commit hooks

Files changed:
- Fixed 3 test files with problematic class decorators
- Added custom linter with pre-commit integration

This ensures pytest properly collects all test classes and prevents similar
issues in the future through automated detection.
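The failure mode and the fixture-style fix can be contrasted in a few lines (stdlib only; `conf_override` stands in for Airflow's `@conf_vars`):

```python
import contextlib
import inspect


@contextlib.contextmanager
def conf_override():  # stand-in for Airflow's @conf_vars
    yield


# Broken: _GeneratorContextManager is a ContextDecorator, so decorating the
# class replaces it with a callable wrapper that pytest silently skips.
@conf_override()
class TestWrapped:
    def test_method(self):
        pass


# Fixed: keep the class a real class and apply the context manager around
# each test instead (the real fix uses a fixture + @pytest.mark.usefixtures).
class TestCollected:
    def test_method(self):
        with conf_override():
            pass


wrapped_is_class = inspect.isclass(TestWrapped)      # False
collected_is_class = inspect.isclass(TestCollected)  # True
```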
…pache#55975)

This migration enables Airflow downgrades by converting v3 serialized DAGs
back to v2 format. The `upgrade()` is a no-op since the server handles v1/v2/v3
at runtime, but `downgrade()` removes client_defaults sections and updates
version numbers to ensure compatibility with older Airflow versions.

closes apache#55949
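A minimal sketch of what such a `downgrade()` transformation does to one serialized document (the field names here are assumptions for illustration, not the real schema):

```python
def downgrade_serialized_dag(doc):
    """Convert a hypothetical v3 serialized-DAG dict back to v2 shape."""
    doc = dict(doc)                   # don't mutate the caller's copy
    doc.pop("client_defaults", None)  # v3-only section, dropped on downgrade
    doc["__version"] = 2              # pin version so older servers accept it
    return doc


v3_doc = {"__version": 3, "client_defaults": {"retries": 1}, "dag": {}}
v2_doc = downgrade_serialized_dag(v3_doc)
```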
* Improve get dag structure endpoint speed


---------

Co-authored-by: Elad Kalif <[email protected]>
…pache#55850)

This is very useful when debugging: you often want to set _something_ to debug
but not everything, especially not the sqlalchemy logs (they are so noisy
they overwhelm everything else).

This gives users an easy way to do that.

The main functionality to do this was added not long ago when we ported
Airflow over to structlog. This exposes that capability via a config option.

As part of this I needed to make a change to how logs are handled from task
processes -- they were mistakenly being level-filtered _again_, meaning that
if the task decided it wanted to log something at debug but the main process
was at info, it would get dropped and never make it to the log file. This now
changes it so that anything sent by the subprocess makes it to the log file.
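The effect of a per-logger level override can be shown with the stdlib `logging` module (Airflow's implementation goes through structlog, but the semantics are the same):

```python
import logging

# root stays at DEBUG so your own code logs everything...
logging.getLogger().setLevel(logging.DEBUG)
# ...while a noisy library is pinned to WARNING
logging.getLogger("sqlalchemy.engine").setLevel(logging.WARNING)

# loggers without an explicit level inherit the root's effective level
app_debug_enabled = logging.getLogger("my.app").isEnabledFor(logging.DEBUG)
sa_debug_enabled = logging.getLogger("sqlalchemy.engine").isEnabledFor(logging.DEBUG)
```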
…e#55979)

Improves storage efficiency and query performance for DAG serialization
data by using PostgreSQL's native `JSONB` type instead of `JSON`. `JSONB`
provides better compression, faster equality comparisons, and removes
whitespace/duplicate keys. Would have also made the query in apache#55975 simpler.
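One of the `JSONB` normalizations (last duplicate key wins, insignificant whitespace dropped) can be mimicked with Python's `json` module, which applies the same last-key-wins rule when parsing:

```python
import json

raw = '{ "task_id": "a",  "task_id": "b" }'
# json.loads keeps only the last duplicate key, as JSONB does on write;
# compact separators mirror JSONB discarding insignificant whitespace
normalized = json.dumps(json.loads(raw), separators=(",", ":"))
```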
* Only send hostname to celery worker if passed in cli

We were setting the hostname in the celery worker even if the user did not
specify one, which meant we were passing along None. Historically, that still
resulted in the default value being used instead of None, but that changed in
click 8.3.0 (which celery uses in its CLI). Either way, there is no need for
us to pass it!
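The fix pattern can be sketched with a hypothetical helper (not the actual celery CLI code): forward an option only when the user set it, so the downstream default applies instead of an explicit `None`:

```python
def build_worker_kwargs(hostname=None, loglevel="INFO"):
    kwargs = {"loglevel": loglevel}
    # only pass hostname through if the user gave one on the CLI;
    # otherwise let celery/click compute its own default
    if hostname is not None:
        kwargs["hostname"] = hostname
    return kwargs
```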

* Fix test

It'll actually run when apache#55915 is merged, but it will pass when it does
start running.
Replace 'formActions.reset' with proper 'common:reset' translation key (apache#55976)

Fixes apache#55974.

According to the docstrings for these kwargs, powershell should accept
arguments and parameters.
* add CLI token endpoint uses the CLI API expiration time from conf
Less manual work with text blocks
* Add SSO documentation to FAB auth manager

- Add comprehensive SSO documentation with OAuth2 provider examples
- Include configuration for Google, Okta, Azure Entra ID, and generic providers
- Add troubleshooting section and references
- Update FAB auth manager index to include SSO documentation

This addresses the need for clear SSO configuration guidance in the FAB provider.

* CI: install gh and jq early in ci-amd; docs: strip trailing spaces in SSO doc

* Revert workflow changes per review: no workflow edits in docs PR

* docs(spellcheck): allow ‘Entra’ in Sphinx spell list

* docs(fab): fix RST title overline; spellcheck: keep ‘Entra’ sorted in wordlist

* docs(fab): inline spelling allow for ‘Entra’ to satisfy spellcheck

* docs(spellcheck): keep only one ‘Entra’ entry; remove inline spelling directive

* docs(spellcheck): re-add 'Entra' to spelling wordlist - needed for Azure Entra ID references

---------

Co-authored-by: Vincent <[email protected]>
* fix(ui): Reset pagination on filter change

* Update airflow-core/src/airflow/ui/src/utils/useFiltersHandler.ts

* fix(ui): Change SearchParamsKeys from type import to value import

---------

Co-authored-by: Brent Bovenzi <[email protected]>
amoghrajesh and others added 30 commits October 9, 2025 11:43
* Add is_favorite to ui dags list

* Update dagdetails
Updates the release notes for the snowflake provider 6.3.0 to
appropriately flag a breaking change.
* fix openlineage dag state event emit on dag with skipped tasks

* fix tests

* rid of else
* add `DataDog` as company using airflow

* Update INTHEWILD.md

Co-authored-by: Aaron Schutzengel <[email protected]>

* Update INTHEWILD.md

Co-authored-by: HeroCC <[email protected]>

---------

Co-authored-by: Aaron Schutzengel <[email protected]>
Co-authored-by: HeroCC <[email protected]>
…che#56527)

* added how to guide for provider info

* updated provider.yaml to include how to guide path
* init

fx

fx

pre-commit

progress

fx

fx

fx

* fx

* add config

* conf doc

* fx

* fix1

* update version

* fix doc

* prek fix

* updates

* fix

* Update pyproject.toml

* Update pyproject.toml

* fix

* try

* pass

* doc and test

---------

Co-authored-by: Karun Poudel <[email protected]>
Co-authored-by: Karun Poudel <[email protected]>
* Fix broken main after pydantic 2.12.0 - partly cleanup

* Fix broken main after pydantic 2.12.0 - partly cleanup, align task SDK as well

* Regenerate generated code
…g applied to pod spec (apache#56402)

* Fix KubernetesPodOperator to apply termination_grace_period to pod spec

* Fix the description of termination_grace_period
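In manifest terms, the fix means the operator's kwarg must actually land on the pod spec; a minimal dict-based sketch (the field names follow the Kubernetes pod spec, but the builder itself is hypothetical):

```python
def build_pod_manifest(termination_grace_period=None):
    spec = {"containers": [{"name": "base", "image": "busybox"}]}
    if termination_grace_period is not None:
        # copying the kwarg onto the spec is what the bug was missing
        spec["terminationGracePeriodSeconds"] = termination_grace_period
    return {"apiVersion": "v1", "kind": "Pod", "spec": spec}


pod = build_pod_manifest(termination_grace_period=30)
```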
…pache#55106)

* Update webserver secret note in Helm chart and prod guide

* Add note block to clarify API/webserver differences
Pydantic 2.12.0 implemented an experimental Sentinel that requires a newer
version of typing-extensions. Our migration scripts however downgrade
Airflow to 2.11.0, and since Airflow 2.11 does not have pydantic specified
as a required dependency, it does not downgrade it - but it does downgrade
typing-extensions, which is an Airflow dependency.

This causes a mismatch between the version of typing-extensions expected
by Pydantic (4.14.1) and the one installed with Airflow 2.11 (4.13.1).
However - in fact - pydantic is a dependency of Airflow 2.11, because
serialization in 2.11 uses the pydantic serializer, and it fails to import
if typing-extensions is too old.

This is only a problem when downgrading to Airflow 2.11 with constraints
when you do not specify pydantic as an extra. It should be fixed in
2.11.1, as its constraints should include the latest versions of
typing-extensions and pydantic.

For now, the fix is to add pydantic as an extra when downgrading
Airflow to 2.11.0
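The mismatch described above reduces to a simple version comparison (the versions are the ones quoted in this report; the checker itself is just an illustration):

```python
def version_tuple(v):
    """Parse a dotted version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))


def typing_extensions_ok(installed, required_by_pydantic="4.14.1"):
    # Pydantic 2.12.0 needs typing-extensions >= 4.14.1 per this report
    return version_tuple(installed) >= version_tuple(required_by_pydantic)


# the broken downgrade state: Airflow 2.11 constraints leave 4.13.1 installed
broken = typing_extensions_ok("4.13.1")
fixed = typing_extensions_ok("4.14.1")
```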
* Remove airflow.utils deprecations in Edge

* Remove airflow.utils deprecations in Edge

* Change to Airflow 3.1 plus only

* Move to version_compat as central import

* Uuuups import in pytest

* Uuuups import in pytest
After the [move of macros to the Task SDK](https://github.com/apache/airflow/pull/46867/files#diff-854d19db18bae58289f4ce996ca0fb34341bc0f22930620627afccbd9d6facfcL30), the babel dependency is missing. This is probably because in the past the babel dependency was available transitively through the flask-babel dependency. This PR adds the babel dependency to the Task SDK so that macros depending on it work again.

closes: apache#56552