Airflow 2.9.0 release notes
Published 4/8/2024
Minor. Contains breaking changes.

Lifecycle events:

- ``on_starting``
- ``before_stopping``

DagRun state change events:

- ``on_dag_run_running``
- ``on_dag_run_success``
- ``on_dag_run_failed``

TaskInstance state change events:

- ``on_task_instance_running``
- ``on_task_instance_success``
- ``on_task_instance_failed``

After a `discussion <https://lists.apache.org/thread/r06j306hldg03g2my1pd4nyjxg78b3h4>`__
and a `voting process <https://lists.apache.org/thread/pgcgmhf6560k8jbsmz8nlyoxosvltph2>`__,
the Airflow PMC and Committers have resolved to no longer maintain MsSQL as a supported database backend.
As of Airflow 2.9.0, MsSQL support for the Airflow database backend has been removed.
A migration script that can help migrate the database before upgrading to Airflow 2.9.0 is available in the
`airflow-mssql-migration repo on GitHub <https://github.com/apache/airflow-mssql-migration>`_.
Note that the migration script is provided without support and warranty.
This does not affect the existing provider packages (operators and hooks); DAGs can still access and process data from MsSQL.
Datasets must use a URI that conforms to the rules laid down in AIP-60, and the value
will be automatically normalized when the DAG file is parsed. See the
`documentation on Datasets <https://airflow.apache.org/docs/apache-airflow/stable/authoring-and-scheduling/datasets.html>`_ for
a more detailed description of the rules.
You may need to change your Dataset identifiers if they look like a URI but are used in a less mainstream way, such as relying on the URI's auth section or having a case-sensitive protocol name.
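The case-sensitivity caveat exists because normalization typically lowercases the URI scheme and host, so identifiers that differ only in case collapse into the same Dataset. A rough stdlib sketch of that kind of normalization (an illustration, not Airflow's actual implementation):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_dataset_uri(uri: str) -> str:
    """Sketch of AIP-60-style normalization: lowercase the scheme and
    host, leave path/query/fragment untouched. Not Airflow's real code."""
    parts = urlsplit(uri)
    return urlunsplit(
        (parts.scheme.lower(), parts.netloc.lower(), parts.path, parts.query, parts.fragment)
    )

print(normalize_dataset_uri("S3://My-Bucket/data/file.csv"))  # s3://my-bucket/data/file.csv
```

A URI that relied on ``S3`` and ``s3`` being distinct protocols would stop being distinct after this step.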
``get_permitted_menu_items`` in ``BaseAuthManager`` has been renamed ``filter_permitted_menu_items`` (#37627).

The Audit Log event name for REST API events will be prepended with ``api.`` or ``ui.``, depending on whether it came from the Airflow UI or externally.
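For custom auth managers, the rename only requires updating the override's name. A simplified sketch (the stand-in base class and signature below are illustrative, not the real ``BaseAuthManager`` API):

```python
# Stand-in base class to illustrate the rename; the real BaseAuthManager
# lives in Airflow's auth manager interface and has a richer signature.
class BaseAuthManager:
    def filter_permitted_menu_items(self, menu_items):
        return menu_items

class MyAuthManager(BaseAuthManager):
    # Before Airflow 2.9 this override was named get_permitted_menu_items.
    def filter_permitted_menu_items(self, menu_items):
        # Hide the hypothetical "Admin" entry from non-admin users.
        return [item for item in menu_items if item != "Admin"]

print(MyAuthManager().filter_permitted_menu_items(["DAGs", "Admin"]))  # ['DAGs']
```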
There are a few caveats though:
Pendulum 2 does not support Python 3.12. For Python 3.12 you need to use
`Pendulum 3 <https://pendulum.eustace.io/blog/announcing-pendulum-3-0-0.html>`_.
The minimum SQLAlchemy version supported when Pandas is installed for Python 3.12 is 1.4.36, released in
April 2022. Airflow 2.9.0 therefore increases the minimum supported version of SQLAlchemy to 1.4.36 for all
Python versions.
Not all providers support Python 3.12. At the initial release of Airflow 2.9.0, the following providers are released without support for Python 3.12:

- ``apache.beam`` - pending on `Apache Beam support for 3.12 <https://github.com/apache/beam/issues/29149>`_
- ``papermill`` - pending on releasing a Python 3.12 compatible papermill client version,
  including `this merged issue <https://github.com/nteract/papermill/pull/771>`_

There's now a limit to the length of data that can be stored in the Rendered Template Fields.
The limit is set to 4096 characters. If the data exceeds this limit, it will be truncated. You can change this limit
by setting the ``[core] max_template_field_length`` configuration option in your Airflow config.
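For example, to raise the limit, the option could be set in ``airflow.cfg`` like this (the value shown is purely illustrative):

```ini
[core]
# Rendered template fields longer than this are truncated (default: 4096)
max_template_field_length = 8192
```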
The XCom table's ``value`` column type has changed from ``blob`` to ``longblob``. This allows you to store relatively large data in XCom, but the migration can take a significant amount of time if you have a lot of large data stored in XCom.

To downgrade from revision ``b4078ac230a1``, ensure that you don't have XCom values larger than 65,535 bytes. Otherwise, you'll need to clean those rows or run ``airflow db clean xcom`` to clean the XCom table.
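Before downgrading, you could check whether a given payload would still fit the old column. A minimal sketch of that size check (the helper name and the JSON-serialization assumption are mine; Airflow may serialize XCom values differently, e.g. pickled):

```python
import json

# Maximum payload of the old `blob` column type (MySQL BLOB: 2**16 - 1 bytes).
BLOB_LIMIT = 65_535

def fits_old_blob_column(value) -> bool:
    """Return True if the JSON-serialized value fits the pre-longblob column."""
    return len(json.dumps(value).encode("utf-8")) <= BLOB_LIMIT

print(fits_old_blob_column({"rows": 10}))            # True
print(fits_old_blob_column({"blob": "x" * 70_000}))  # False
```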
- Matomo as an option for analytics_tool (#38221)
- hashable (#37465)
- ``queuedEvent`` endpoint to get/delete ``DatasetDagRunQueue`` (#37176)
- ``DatasetOrTimeSchedule`` (#36710)
- ``on_skipped_callback`` to ``BaseOperator`` (#36374)
- ``@task.bash`` TaskFlow decorator (#30176, #37875)
- ``ExternalPythonOperator`` use version from ``sys.version_info`` (#38377)
- ``run_id`` column to log table (#37731)
- ``tryNumber`` to grid task instance tooltip (#37911)
- ``ExternalPythonOperator`` (#37409)
- ``Pathlike`` (#36947)
- ``nowait`` and ``skip_locked`` into ``with_row_locks`` (#36889)
- ``dag``/``dagRun`` in the REST API (#36641)
- Connexion from auth manager interface (#36209)
- ``total_entries`` count on the event logs endpoint (#38625)
- ``tz`` in next run ID info (#38482)
- chakra styles to keep dropdowns in filter bar (#38456)
- ``__exit__`` is called in decorator context managers (#38383)
- ``BaseAuthManager.is_authorized_custom_view`` abstract (#37915)
- ``/get_logs_with_metadata`` endpoint (#37756)
- encoding to the SQL engine in SQLAlchemy v2 (#37545)
- ``consuming_dags`` attr eagerly before dataset listener (#36247)
- ``importlib_metadata`` with compat to Python 3.10/3.12 stdlib (#38366)
- ``__new__`` magic method of ``BaseOperatorMeta`` to avoid bad mixing of classic and decorated operators (#37937)
- ``sys.version_info`` for determining Python Major.Minor (#38372)
- ``blinker`` add where it is required (#38140)
- > 39.0.0 (#38112)
- ``assert`` outside of the tests (#37718)
- ``flask._request_ctx_stack`` (#37522)
- ``login`` attribute in ``airflow.__init__.py`` (#37565)
- ``datetime.datetime.utcnow`` by ``airflow.utils.timezone.utcnow`` in core (#35448)
- ``is_authorized_cluster_activity`` from auth manager (#36175)
- ``exception`` to templates ref list (#36656)