
Apache Airflow 3.0: Everything you need to know about the new release

Written by Markus | 15 May 2025

Apache Airflow 3.0 has been publicly available since April 22, 2025. It is the first major release of the popular open source orchestration service in five years and brings with it a large number of changes. What new features are there? What is changing? Should I update my existing system? What do I need to consider when updating? The release raises many questions that we want to answer for you.

Do you know Airflow?

What modern agent-based AI models, machine learning workflows, enterprise reporting and business intelligence have in common is that they all need to be supplied with up-to-date data regularly and reliably. Coordinating these visible and invisible processes at every level and keeping them in sync requires process orchestration: time-based routines and event-driven or dependent tasks must be defined, executed reliably and traceably, and monitored. The widely used open source project Apache Airflow provides the leading code-centric development and operations platform for exactly this. Airflow is scalable, extensible, completely open and, thanks to its code-first approach, can be used without restriction. The areas of application are diverse, ranging from classic data pipelines to machine learning workflows and enterprise reporting.

Apache Airflow is one of the few large open source projects in the data and BI world that does not reserve any features for a paid version. A wide range of integration options and extension modules is available and, thanks to the active community and the stewardship of the Apache Software Foundation, there is long-term planning security.

New features: Airflow in dark mode

The new release brings a whole range of new features. Few are as eye-catching as the new dark mode of the user interface. The new UI is not only (optionally) dark, however, but has been completely overhauled: a new React-based front end and a new web server built on the FastAPI Python library give Apache Airflow 3.0 a fresh coat of paint. Submenus have been integrated into graphical elements in useful places. At first glance it may seem as if many options are suddenly missing, but they are now embedded more cleverly in a leaner overall picture. The new UI still takes some getting used to, but it shortens loading times and requires fewer clicks to reach the same information as before.

Airflow 3.0 introduces a dark-mode user interface, among many other UI and architectural improvements.

 

Decoupling of main system and task executor

In the long term, the visible changes will matter less than the invisible improvements to the system architecture: the new release introduces a new API for communication with distributed worker nodes. Until now, Airflow worker nodes needed direct access to a shared backend database in order to communicate with the Airflow core system, which made distributed execution of tasks across different network zones or even cloud providers particularly difficult. The new Task Execution API removes this limitation: tasks to be executed, configuration parameters, credentials, logs and results are all exchanged via encrypted HTTPS messages. This is also the prerequisite for the future development of worker nodes in programming languages other than Python. The Airflow developers' vision is that tasks can ultimately be written in any programming language and controlled from a central Airflow instance.
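To make this architecture change more tangible, here is a deliberately simplified Python sketch of the pattern. Note that the endpoint, route and token below are hypothetical illustrations of the idea, not Airflow's actual Task Execution API:

# Conceptual sketch only -- NOT Airflow's real wire protocol.
# It illustrates the pattern: a remote worker reports task state to the
# core system over HTTPS instead of writing to a shared backend database.
import requests

API_BASE = "https://airflow-core.example.com/execution"  # hypothetical endpoint
TOKEN = "worker-credential"                              # hypothetical token

def report_task_state(dag_id: str, task_id: str, state: str) -> None:
    """Send a task state update to the central Airflow system via HTTPS."""
    response = requests.post(
        f"{API_BASE}/task-instances/{dag_id}/{task_id}/state",  # hypothetical route
        json={"state": state},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()

report_task_state("daily_etl", "load_orders", "success")

The decisive point of the pattern: a worker in another network zone or cloud only needs outbound HTTPS access to the core system, and no longer needs credentials for a shared backend database.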


DAG versioning

In Airflow, workflows are defined as so-called DAGs (directed acyclic graphs). DAGs are written in Python, which enables strict code-based version control of changes, e.g. using Git and CI/CD pipelines. With Airflow 3.0, it is now possible for the first time to see and track these changes in the user interface as well. If the composition of tasks within a DAG changes, for example because new tasks are added or old ones are deactivated, the different states of the workflow can be displayed. Particularly important for long-lived systems: status information and logs are retained even when the workflow changes, previously a blind spot of the platform.
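If you have never seen a DAG, the following minimal sketch shows the idea; it assumes the TaskFlow-style decorators, which in Airflow 3 live in the new airflow.sdk package:

from datetime import datetime

# Airflow 3 authoring interface; in Airflow 2.x: from airflow.decorators import dag, task
from airflow.sdk import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def daily_etl():
    @task
    def extract() -> list[int]:
        # In a real workflow: read from an API, a database, etc.
        return [1, 2, 3]

    @task
    def load(rows: list[int]) -> None:
        print(f"Loaded {len(rows)} rows")

    load(extract())  # defines the dependency: extract -> load

daily_etl()

Because the whole workflow is an ordinary Python file, every change is a Git commit, and with DAG versioning past runs in the UI now remain traceable to the state of the workflow they actually executed.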

DAG versioning is a major new feature introduced with Airflow 3.0 and finally makes changes to workflows visible in the UI.

Improved event scheduling

With the new major release, the familiar datasets have been renamed to assets and given a broader code base. Assets allow tasks and DAGs to be linked beyond their own context, for example to chain DAGs so that a downstream DAG runs immediately whenever a connecting asset is updated. New watcher tasks can react to events in third-party systems and trigger the execution of Airflow DAGs. Assets also come with a fully-fledged REST API, which allows them to be actively triggered by messages from third-party systems and enables deeper integration into networked system landscapes.
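Here is a minimal sketch of asset-based chaining, with an illustrative asset URI; it assumes the Airflow 3 authoring interface (in 2.x the same concept is called Dataset):

from datetime import datetime

from airflow.sdk import Asset, dag, task

orders = Asset("s3://datalake/orders.parquet")  # illustrative asset URI

@dag(schedule="@hourly", start_date=datetime(2025, 1, 1), catchup=False)
def producer():
    @task(outlets=[orders])  # a successful run marks the asset as updated
    def refresh_orders():
        print("writing fresh order data")

    refresh_orders()

@dag(schedule=[orders], start_date=datetime(2025, 1, 1), catchup=False)
def consumer():  # runs whenever the orders asset is updated
    @task
    def rebuild_report():
        print("rebuilding the sales report")

    rebuild_report()

producer()
consumer()

The same mechanism is what the REST API exposes: a third-party system can mark an asset as updated from outside, and every DAG scheduled on that asset runs immediately.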

In addition to these features that we are highlighting for you, Apache Airflow 3.0 brings countless smaller improvements that may seem inconspicuous at first glance, but which solve specific problems or make new ways of working with Airflow possible.

Smaller UI improvements like scoped DAG dependency views make working with Apache Airflow 3.0 more convenient than ever before.

Requirements for an upgrade

A new major release naturally means that upgrading an existing system requires certain adjustments. The good news: many of the new features were already prepared in the minor releases of the Airflow 2.x code base. Systems on version 2.7 or higher should be compatible with few restrictions. If you are already running Airflow 2.x with a standalone DAG processor, for example, only the services for the new front end need to be adapted and an upgrade should succeed. Nevertheless, it is advisable to run a series of compatibility checks in advance; the Airflow project team provides dedicated tools for this. Start with a local instance and work your way through your system environments along your development processes.

If you want to take a look at Airflow 3 without much effort, you can use the new Docker image to start a local environment with a single command:

docker run -it --rm -p 8080:8080 apache/airflow:3.0.1 airflow standalone

This command downloads a containerized version of Airflow 3.0.1 and starts it in test mode. A randomly generated password for the admin user is printed in the log; copy it and enter it when logging into the web interface:

  • URL: http://localhost:8080
  • Username: admin
  • Password: see the container log

When should you carry out the upgrade?

The list of new features and improvements in Apache Airflow 3.0 is long and, in parts, highly specific, even for us. Whether and when you upgrade from 2.x to the new major version depends heavily on whether you have already hit the limits of your existing system environment. Are you missing connectivity between cloud and on-premise environments? Have you built complicated, error-prone workarounds for event-based scheduling? Then you have probably been waiting for Airflow 3.0 and can start preparing now. If Airflow 2.x fulfills all your requirements, take your time: despite high development quality, bugs and inefficiencies are usually still found in new releases over several months, especially in releases with far-reaching changes. The first patch, version 3.0.1, was released at the beginning of May 2025; further patches are likely to follow at short intervals.

After a long development period, Apache Airflow has positioned itself well for the future with version 3. The release holds exciting changes not only for night owls; our overview merely scratches the surface and cannot cover all the new features in detail. Which features are particularly important for your work with Airflow? Do you want to switch to Airflow 3 as soon as possible, or wait and see?

Perhaps you would be interested in running Airflow as a managed service? If you are unsure and would like to discuss your specific challenges with experienced Airflow experts, get in touch with us. As an outstanding open source project with countless possible applications, Apache Airflow will continue to play an important role in the engine room of data teams around the world for a long time to come. A good network is at least as important for long-term success as the code itself: we look forward to hearing from you!