
Commit b71c0f5

RNHTTR authored and kaxil committed
Edit airflow 3 migration guide (#49414)
(cherry picked from commit 1b44508)
1 parent 35e4622 commit b71c0f5

File tree

1 file changed: +33 -36 lines changed

airflow-core/docs/installation/upgrading_to_airflow3.rst

Lines changed: 33 additions & 36 deletions
@@ -18,52 +18,49 @@
 Upgrading to Airflow 3
 =======================
 
-Apache Airflow 3 is a major release. This guide walks you through the steps required to upgrade from Airflow 2.x to Airflow 3.0.
+Apache Airflow 3 is a major release and contains :ref:`breaking changes<breaking-changes>`. This guide walks you through the steps required to upgrade from Airflow 2.x to Airflow 3.0.
 
 Step 1: Take care of prerequisites
 ----------------------------------
 
 - Make sure that you are on Airflow 2.7 or later.
 - Make sure that your Python version is in the supported list. Airflow 3.0.0 supports the following Python versions: Python 3.9, 3.10, 3.11 and 3.12.
-- Ensure that you are not using SubDAGs. These were deprecated in Airflow 2.0 and removed in Airflow 3.
-- For a complete list of breaking changes, which you should note before the upgrade, please check the breaking changes section below.
+- Ensure that you are not using any features or functionality that have been :ref:`removed in Airflow 3<breaking-changes>`.
 
 Step 2: Clean and back up your existing Airflow Instance
 ---------------------------------------------------------
 
-- It is highly recommended to make a backup of your Airflow instance specifically including your Airflow metadata DB before starting the migration process.
-- If you do not have a "hot backup" capability for your DB, you should do it after shutting down your Airflow instances, so that the backup of your database will be consistent.
-- If you did not make a backup and your migration fails, you might end up in a half-migrated state and restoring DB from backup and repeating the migration
-  might be the only easy way out. This can for example be caused by a broken network connection between your CLI and the database while the migration happens, so taking a
-  backup is an important precaution to avoid problems like this.
-- A long running Airflow instance can accumulate a certain amount of silt, in the form of old database entries, which are no longer
-  required. This is typically in the form of old XCom data which is no longer required, and so on. As part of the Airflow 3 upgrade
-  process, there will be schema changes. Based on the size of the Airflow meta-database this can be somewhat time
-  consuming. For a faster, safer migration, we recommend that you clean up your Airflow meta-database before the upgrade.
-  You can use ``airflow db clean`` command for that.
+- It is highly recommended that you make a backup of your Airflow instance, specifically your Airflow metadata database, before starting the migration process.
+
+- If you do not have a "hot backup" capability for your database, you should take the backup after shutting down your Airflow instances, so that the backup of your database will be consistent. For example, if you do not shut down your Airflow instance first, the backup may not include all TaskInstances or DagRuns.
+
+- If you did not make a backup and your migration fails, you might end up in a half-migrated state. This can be caused by, for example, a broken network connection between your Airflow CLI and the database during the migration. Having a backup is an important precaution to avoid problems like this.
+
+- A long running Airflow instance can accumulate a substantial amount of data that is no longer required (for example, old XCom data). Schema changes will be part of the Airflow 3
+  upgrade process. These schema changes can take a long time if the database is large. For a faster, safer migration, we recommend that you clean up your Airflow meta-database before the upgrade.
+  You can use the ``airflow db clean`` :ref:`Airflow CLI command<cli-db-clean>` to trim your Airflow database.
 
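For the Step 2 cleanup, a minimal sketch of a dry run followed by the actual purge; the cutoff timestamp is an example value, and the right retention window varies per deployment:

.. code-block:: bash

   # Preview which rows would be deleted, without changing anything
   airflow db clean --clean-before-timestamp '2025-01-01' --dry-run

   # Purge entries older than the cutoff, skipping the confirmation prompt
   airflow db clean --clean-before-timestamp '2025-01-01' --yes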

 Step 3: DAG Authors - Check your Airflow DAGs for compatibility
 ----------------------------------------------------------------
 
-To minimize friction for users upgrading from prior versions of Airflow, we have created a DAG upgrade check utility using `Ruff <https://docs.astral.sh/ruff/>`_.
+To minimize friction for users upgrading from prior versions of Airflow, we have created a dag upgrade check utility using `Ruff <https://docs.astral.sh/ruff/>`_.
 
-Use the latest available ``ruff`` version to get updates to the rules but at the very least use ``0.11.6``:
+The latest available ``ruff`` version will have the most up-to-date rules, but be sure to use at least version ``0.11.6``. The example below demonstrates how to check
+for dag incompatibilities that need to be fixed before your dags will work as expected on Airflow 3.
 
 .. code-block:: bash
 
    ruff check dag/ --select AIR301
 
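The AIR301 rule set is still evolving, so it can be worth upgrading the linter and rerunning the check; a small sketch:

.. code-block:: bash

   # Pick up the newest upgrade-check rules (0.11.6 is the minimum)
   pip install --upgrade "ruff>=0.11.6"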
-This command above shows you all the errors which need to be fixed before these DAGs can be used on Airflow 3.
-
-Some of these changes are automatically fixable and you can also rerun the command above with the auto-fix option as shown below.
-
-To preview the changes:
+To preview the recommended fixes, run the following command:
 
 .. code-block:: bash
 
   ruff check dag/ --select AIR301 --show-fixes
 
-To auto-fix:
+Some changes can be automatically fixed. To do so, run the following command:
 
 .. code-block:: bash
@@ -72,15 +69,16 @@ To auto-fix:
 Step 4: Install the Standard Providers
 --------------------------------------
 
-- Some of the commonly used Operators which were bundled as part of the Core Airflow OSS package such as the
-  Bash and Python Operators have now been split out into a separate package: ``apache-airflow-providers-standard``.
-- For user convenience, this package can also be installed on Airflow 2.x versions, so that DAGs can be modified to reference these Operators from the Standard Provider package instead of Airflow Core.
+- Some of the commonly used Operators which were bundled as part of the ``airflow-core`` package (for example ``BashOperator`` and ``PythonOperator``)
+  have now been split out into a separate package: ``apache-airflow-providers-standard``.
+- For convenience, this package can also be installed on Airflow 2.x versions, so that DAGs can be modified to reference these Operators from the standard provider
+  package instead of Airflow Core.
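As a hedged illustration of Step 4, the provider package installs with pip, and the new import path follows the standard provider's layout (shown here for ``PythonOperator``):

.. code-block:: bash

   # Works on Airflow 2.x as well, so imports can be migrated before the upgrade
   pip install apache-airflow-providers-standard

In dag files, ``from airflow.providers.standard.operators.python import PythonOperator`` then replaces the old core import.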

 
 Step 5: Deployment Managers - Upgrade your Airflow Instance
 ------------------------------------------------------------
 
-For an easier and safer upgrade process, we have also created a utility to upgrade your Airflow instance configuration as a deployment manager.
+For an easier and safer upgrade process, we have also created a utility to upgrade your Airflow instance configuration.
 
 The first step is to run this configuration check utility as shown below:

@@ -97,8 +95,7 @@ This configuration utility can also update your configuration to automatically b
    airflow config update --fix
 
-The biggest part of an Airflow upgrade is the database upgrade. The database upgrade process for Airflow 3 is the same as for Airflow 2.7 or later.
-
+The biggest part of an Airflow upgrade is the database upgrade. The database upgrade process for Airflow 3 is the same as for Airflow 2.7 or later:
 
 .. code-block:: bash
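A minimal sketch of the database cutover, assuming a PostgreSQL backend and the standard ``airflow db migrate`` command available since Airflow 2.7; the service and database names are hypothetical examples:

.. code-block:: bash

   # Stop all Airflow components so the backup is consistent
   systemctl stop airflow-scheduler airflow-webserver   # hypothetical unit names

   # Back up the metadata database before migrating
   pg_dump airflow_db > airflow_backup.sql              # example database name

   # Apply the Airflow 3 schema migrations
   airflow db migrate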
@@ -111,19 +108,21 @@ You should now be able to start up your Airflow 3 instance.
 Step 6: Changes to your startup scripts
 ---------------------------------------
 
-- In Airflow 3, the Webserver has now become a generic API-server. The api-server can be started up using the following command:
+In Airflow 3, the Webserver has become a generic API server. The API server can be started up using the following command:
 
 .. code-block:: bash
 
    airflow api-server
 
-- The DAG processor must now be started independently, even for local or development setups.
+The dag processor must now be started independently, even for local or development setups:
 
 .. code-block:: bash
 
    airflow dag-processor
 
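Putting Step 6 together, a sketch of a local development startup, assuming the ``scheduler`` and ``triggerer`` commands are unchanged from Airflow 2; run each in its own terminal or under a process manager:

.. code-block:: bash

   airflow api-server      # replaces the Airflow 2 webserver
   airflow scheduler
   airflow dag-processor   # now a required standalone component
   airflow triggerer       # only needed if you use deferrable operators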
+.. _breaking-changes:
+
 Breaking Changes
 ================

@@ -133,9 +132,8 @@ These include:
 - **SubDAGs**: Replaced by TaskGroups, Datasets, and Data Aware Scheduling.
 - **Sequential Executor**: Replaced by LocalExecutor, which can be used with SQLite for local development use cases.
 - **SLAs**: Deprecated and removed; Will be replaced by forthcoming `Deadline Alerts <https://cwiki.apache.org/confluence/x/tglIEw>`_.
-- **Subdir**: Used as an argument on many CLI commands (``--subdir`` or ``-S`` has been superseded by DAG bundles.
-- **Following keys are no longer available in task context. If not replaced, will cause DAG errors**:
-
+- **Subdir**: The ``--subdir`` (``-S``) argument used on many CLI commands has been superseded by :doc:`DAG bundles </administration-and-deployment/dag-bundles>`.
+- **Some Airflow context variables**: The following keys are no longer available in a :ref:`task instance's context <templates:variables>`. If they are not replaced, they will cause dag errors:
 - ``tomorrow_ds``
 - ``tomorrow_ds_nodash``
 - ``yesterday_ds``
@@ -148,10 +146,9 @@ These include:
 - ``next_ds_nodash``
 - ``next_ds``
 - ``execution_date``
-
-- ``catchup_by_default`` is now ``False`` by default.
-- ``create_cron_data_intervals`` is now ``False``. This means that the ``CronTriggerTimetable`` will be used by default instead of the ``CronDataIntervalTimetable``
-- **Simple Auth** is now default ``auth_manager``. To continue using FAB as the Auth Manager, please install the FAB provider and set ``auth_manager`` to
+- The ``catchup_by_default`` configuration option is now ``False`` by default.
+- The ``create_cron_data_intervals`` configuration is now ``False`` by default. This means that the ``CronTriggerTimetable`` will be used by default instead of the ``CronDataIntervalTimetable``.
+- **Simple Auth** is now the default ``auth_manager``. To continue using FAB as the Auth Manager, please install the FAB provider and set ``auth_manager`` to ``FabAuthManager``:
 
 .. code-block:: ini
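Two hedged pointers on the list above: the removed date keys can usually be rebuilt from the surviving context, for example ``{{ macros.ds_add(ds, -1) }}`` in place of ``{{ yesterday_ds }}``, and keeping FAB as the auth manager requires the provider package:

.. code-block:: bash

   pip install apache-airflow-providers-fab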
