NextLytics Blog

Apache Airflow Updates 2025: A Deep Dive into Features Added After 3.0

Written by Robin | 08 January 2026

2025 is over, and the turn of the year is a good opportunity to review and assess the changes it has brought to the ever-evolving landscape of open-source data tools. Apache Airflow has established itself as an invaluable component of many companies' data stacks over the last decade, continually evolving and improving its feature set through the tireless work of the many contributors from its active community.

Although there has not been a major version release since our last discussion of Airflow’s feature scope with version 3.0, the platform has continued to evolve, introducing several new and improved features between 3.0 and the current 3.1.x releases that are worth highlighting.

A short refresher on Airflow

Apache Airflow is a modern open-source platform for orchestrating and monitoring complex business and data workflows. In the current business landscape, where analytics, AI, reporting and operational systems intersect and all depend on up-to-date and reliable data, Airflow acts as the control layer that ensures all workflows run at the right time, in the right order and with full transparency. It allows teams to define dependencies, automate recurring tasks and react to any events that might occur during the execution of those data pipelines.

From a strategic perspective, Airflow provides a scalable and future-proof foundation for automation. It is code-centric, highly extensible and completely free to use with its full feature set, enabling organizations to standardize orchestration across teams and technologies without vendor lock-in.
Backed by a large and active open-source community and the knowledge and resources of the Apache Software Foundation, Airflow has proven to offer enterprise-grade reliability, making it suitable for long-term adoption across a wide variety of use cases.


What’s new since 3.0?

Most of the updates Airflow has received between the 3.0 release and today have been bugfixes and stability improvements, but a couple of noteworthy new features have also been introduced since then that we want to share with you.

Human in the Loop (HITL)

The Human-in-the-loop functionality added in version 3.1 allows users to design workflows that conditionally pause and wait for human decision-making. The feature especially lends itself to use cases like approval processes, manual quality checks or other scenarios where human judgement is needed. It is implemented via the new HITLOperator and its variants, which offer a range of interaction patterns for capturing the decision and steering the workflow based on the input of the alerted human.
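As a first impression, an approval gate could look roughly like the following minimal sketch. It assumes the ApprovalOperator variant that ships alongside HITLOperator in the standard provider; parameter names may still change while the feature matures.

```python
# Minimal sketch of a human approval gate, assuming the HITL operators from
# the standard provider (apache-airflow-providers-standard) in Airflow 3.1+.
from airflow.providers.standard.operators.hitl import ApprovalOperator
from airflow.sdk import DAG

with DAG(dag_id="report_signoff", schedule=None):
    # Pauses the run until an authorized user responds in the UI; a rejection
    # fails the task and thereby stops any downstream tasks.
    wait_for_signoff = ApprovalOperator(
        task_id="wait_for_signoff",
        subject="Publish the monthly report?",
    )
```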

A few example use cases would be:

- Manual validation of anomalies found during data quality checks, by alerting a data steward and asking them whether to proceed, rerun upstream tasks or stop the workflow entirely.

- Allowing human review of AI/LLM-generated content, where the output of a task that generates text via an LLM can be checked for correctness before it is passed on to follow-up tasks.

- Collecting executive approval before publishing reports or datasets, requiring explicit sign-off from specific authorized users before the workflow continues.

- Modeling business steps that require accountability as HITL tasks, ensuring that all decisions are captured directly in the workflow execution history.


Example DAG demonstrating how the Human-in-the-loop feature can be used to select the next task from a predefined selection
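In code, a selection step like the one shown above might look roughly like the following sketch. We assume the HITLBranchOperator from the same provider, where the chosen option is interpreted as the task_id of the downstream branch to follow; details may differ as the feature evolves.

```python
# Sketch of a HITL branching step, assuming HITLBranchOperator from the
# standard provider in Airflow 3.1+.
from airflow.providers.standard.operators.empty import EmptyOperator
from airflow.providers.standard.operators.hitl import HITLBranchOperator
from airflow.sdk import dag


@dag(schedule=None, catchup=False)
def hitl_branch_example():
    # Pause the run and ask a human which downstream task to continue with.
    choose_next = HITLBranchOperator(
        task_id="choose_next",
        subject="Select the next processing step",
        options=["full_refresh", "incremental_load"],  # map to downstream task_ids
    )

    full_refresh = EmptyOperator(task_id="full_refresh")
    incremental_load = EmptyOperator(task_id="incremental_load")

    # Only the branch selected in the UI is executed; the other one is skipped.
    choose_next >> [full_refresh, incremental_load]


hitl_branch_example()
```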

These examples only scratch the surface of what is possible with this operator family, and the feature will most likely expand the ways in which Airflow can be integrated into key business processes.

Deadline Alerts

Deadline alerts allow users to configure time thresholds for DAG runs and to automatically trigger notifications when these thresholds are exceeded. To use this feature, you will need to set up the following configuration:

- Reference point: When to start counting from. This could be a fixed datetime, or the queued time or logical date of a DAG run.

- Interval: How far before or after the reference point to trigger the alert.

- Callback: Response action using Airflow Notifiers or custom functions.

This feature can be used to proactively monitor time-critical workflows, for example by automatically sending a Slack message when a critical DAG run has been in the “queued” state for more than 30 minutes, or when a daily ELT process has not completed within the expected time window after its scheduled start.
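A rough sketch of that first scenario could look like the following, based on our reading of the experimental 3.1 API; since the feature is experimental, the class names and import paths shown here may change without notice.

```python
# Experimental sketch: alert when a DAG run is still not finished 30 minutes
# after it was queued. Names follow the Airflow 3.1 deadline docs as we read
# them and may change while the feature remains experimental.
from datetime import timedelta

from airflow.sdk import DAG
from airflow.sdk.definitions.deadline import AsyncCallback, DeadlineAlert, DeadlineReference


def notify_team(**kwargs):
    # Custom callback; in practice this could call a Slack Notifier instead.
    print("DAG run exceeded its deadline!")


with DAG(
    dag_id="daily_elt",
    schedule="@daily",
    deadline=DeadlineAlert(
        reference=DeadlineReference.DAGRUN_QUEUED_AT,  # reference point
        interval=timedelta(minutes=30),  # how long after the reference to alert
        callback=AsyncCallback(notify_team),  # response action
    ),
):
    ...
```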

These alerts bring even more transparency to the platform, enabling data teams to identify performance bottlenecks more easily and to prevent downtime in their process chains.

Deadline Alerts are an experimental feature introduced in Airflow 3.1, which means that their current implementation may change in future versions without further notice.

UI Internationalization & improvements

With Airflow 3.1, internationalization (i18n) support has been added. In practice, this means that the web UI is now available in 17 languages, backed by a robust translation infrastructure for future additions and custom components.

Language Selection dialog in the Airflow UI

It is also now possible to pin and favorite DAGs to improve dashboard organization.
Additionally, the calendar and Gantt views have been rebuilt, offering a new visual design and filtering capabilities.

The new, translated calendar component in the DAG detail view

Rebuilt version of the Gantt chart displaying task runtimes within a DAG

New Trigger Rule

The new task trigger rule ALL_DONE_MIN_ONE_SUCCESS allows for additional control when orchestrating workflows by enabling a downstream task to run once all upstream tasks are completed and at least one of them has succeeded. This covers real-world scenarios where partial completion is sufficient, for example when pulling the same data in parallel from multiple sources, such as a primary system and a fallback archive, as sketched below.
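A minimal sketch of that fallback pattern on an Airflow 3.1+ TaskFlow DAG, using the string form of the trigger rule (the task names and bodies are hypothetical placeholders):

```python
# Sketch: pull the same data from a primary system and a fallback archive in
# parallel, then merge as soon as both attempts are done and at least one
# of them succeeded.
from airflow.sdk import dag, task


@dag(schedule=None, catchup=False)
def fallback_ingest():
    @task
    def fetch_primary():
        ...  # hypothetical: pull from the primary system

    @task
    def fetch_archive():
        ...  # hypothetical: pull the same data from the archive fallback

    # Runs once both upstream tasks are finished and at least one succeeded;
    # "all_done_min_one_success" mirrors TriggerRule.ALL_DONE_MIN_ONE_SUCCESS.
    @task(trigger_rule="all_done_min_one_success")
    def merge():
        ...  # hypothetical: load whichever data arrived

    [fetch_primary(), fetch_archive()] >> merge()


fallback_ingest()
```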

React Plugins architecture

Airflow has long supported extensibility through its plugin system, allowing users to add custom functionality in the form of Python modules, APIs and UI elements directly in the platform. This made it possible to tailor Airflow to specific enterprise environments, adding custom logic and domain-specific integrations on top of a shared orchestration foundation.

With Airflow 3.1, this framework is extended by the introduction of React-based UI components.
While previous plugins could already add server-rendered views and APIs, the new React app capability enables teams to embed modern, client-side applications directly into the Airflow user interface. These apps can appear as standalone pages, embedded dashboards, or contextual views alongside DAGs, runs, and tasks.

This could, for example, allow embedding a dashboard for pipeline health and KPIs derived from business data directly in Airflow, giving stakeholders immediate visibility into critical metrics without leaving the platform.
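A hypothetical sketch of how such an app could be registered through the extended plugin interface is shown below; the react_apps metadata keys follow the 3.1 plugin documentation as we read it and may differ in detail.

```python
# Hypothetical sketch: register a pre-built React bundle as a standalone page
# in the Airflow UI via the extended plugin interface (Airflow 3.1+).
from airflow.plugins_manager import AirflowPlugin


class PipelineHealthPlugin(AirflowPlugin):
    name = "pipeline_health"

    # Each entry points Airflow at a pre-built JavaScript bundle to mount.
    react_apps = [
        {
            "name": "Pipeline Health",  # label shown in the UI
            "url_route": "pipeline-health",  # route where the page is served
            "bundle_url": "/static/pipeline_health.js",  # hypothetical bundle path
            "destination": "nav",  # expose as a standalone page in the navigation
        }
    ]
```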

Apache Airflow 2025 Retrospective: Our conclusion and perspective for the future

Apache Airflow has long since outgrown its origins as a simple scheduler. With each release, it continues to mature into a flexible, production-grade orchestration platform that adapts to real-world data challenges without abandoning the pragmatism that made it successful in the first place. While a major release such as Airflow 4.0 is likely still some distance away, the platform continues to progress steadily. Continuous contributions from a large and active open-source community ensure that new features, enhancements, and refinements are delivered consistently with each release.

The new Airflow features we’ve highlighted are a clear signal of where the platform is headed: more flexibility, cleaner abstractions and fewer compromises when integrating with modern data architectures.

Are you curious how these developments could translate into concrete benefits for your own use case? Whether you are planning the next architectural step or simply reviewing your current setup, we are happy to share our experience.

Feel free to get in touch to discuss how your Airflow pipelines could evolve with these new capabilities.