
Apache Airflow Updates 2025: A Deep Dive into Features Added After 3.0

With 2025 behind us, it is a good opportunity to review and assess the changes the year has brought to the ever-evolving landscape of open source data tools. Apache Airflow has established itself as an invaluable component of many companies' data stacks over the last decade, while continually evolving and improving its feature set through the tireless work of many contributors from the active community.

Although there has not been a major version release since our last discussion of Airflow’s feature scope with version 3.0, the platform has continued to evolve, introducing several new and improved features between 3.0 and the current 3.1.x releases that are worth highlighting.

A short refresher on Airflow

Apache Airflow is a modern open-source platform for orchestrating and monitoring complex business and data workflows. In the current business landscape, where analytics, AI, reporting and operational systems intersect and all depend on up-to-date and reliable data, Airflow acts as the control layer that ensures all workflows run at the right time, in the right order and with full transparency. It allows teams to define dependencies, automate recurring tasks and react to any events that might occur during the execution of those data pipelines.

From a strategic perspective, Airflow provides a scalable and future-proof foundation for automation. It is code-centric, highly extensible and completely free to use with its full feature set, enabling organizations to standardize orchestration across teams and technologies without vendor lock-in.
Backed by a large and active open-source community and the knowledge and resources of the Apache Foundation, Airflow has proven to offer enterprise-grade reliability, making it suitable for long-term adoption across a wide variety of use cases.
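The core idea behind this orchestration model can be sketched without Airflow itself: tasks form a directed acyclic graph, and each task runs only after all of its upstream tasks have finished. The following minimal illustration uses only the Python standard library; the task names are hypothetical and stand in for real pipeline steps:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: "transform" depends on "extract",
# and both "load" and "report" depend on "transform".
dependencies = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"transform"},
}

# static_order() yields an execution order that respects
# every declared dependency.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

Airflow generalizes exactly this pattern, adding scheduling, retries, monitoring and distributed execution on top of the dependency graph.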



What’s new since 3.0?

Most of the updates Airflow has received between the 3.0 release and today have been bugfixes and stability improvements, but a couple of noteworthy new features have been introduced since then that we want to share with you.

Human in the Loop (HITL)

The Human-in-the-Loop functionality added in version 3.1 allows users to design workflows that conditionally pause and wait for human decision-making. The feature especially lends itself to use cases like approval processes, manual quality checks or other scenarios where human judgement is needed. It is implemented via the new HITLOperator, which offers a variety of interaction options for capturing the human judgement and for controlling how the workflow continues after the input is received.

A few example use cases would be:

- Manual validation of data quality anomalies during data quality checks by alerting a data steward and asking them whether to proceed, rerun upstream tasks or stop the workflow entirely.

- Allowing human review of AI/LLM-generated content, where the output of a task that generates text via an LLM can be checked for correctness before passing it to follow-up tasks.

- Collecting executive approval before publishing reports or datasets by requiring explicit sign-off from specific authorized users before continuing.

- Modeling business steps that require accountability as HITL tasks, ensuring that all decisions are captured directly in the workflow execution history.


Figure: Example DAG demonstrating how the Human-in-the-Loop feature can be used to select the next task from a predefined selection

These examples only scratch the surface of what is possible with this operator and this feature will most likely expand the way Airflow can be integrated into key business processes.
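The decision-routing pattern behind such a selection step can be illustrated with plain Python. This is only a sketch of the underlying logic, not Airflow's HITLOperator API; the option names and task ids below are hypothetical:

```python
def route_after_review(decision: str, routes: dict[str, str]) -> str:
    """Map a human reviewer's decision to the id of the next task to run."""
    if decision not in routes:
        raise ValueError(f"Unexpected decision: {decision!r}")
    return routes[decision]

# Hypothetical options a data steward might be presented with
# after a data quality check flags an anomaly.
routes = {
    "proceed": "load_to_warehouse",
    "rerun": "extract_source_data",
    "stop": "abort_pipeline",
}

next_task = route_after_review("proceed", routes)
```

In Airflow, the operator handles alerting the reviewer, collecting the response and resuming the DAG; the mapping above is only the decision rule at the center of that flow.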

Deadline Alerts

Deadline alerts allow users to configure time thresholds for DAG runs and to automatically trigger notifications when these thresholds are exceeded. To use this feature, you will need to set up the following configuration:

- Reference point: When to start counting from. This could be a fixed datetime or the queued time or logical date of a DAG.

- Interval: How far before or after the reference point to trigger the alert.

- Callback: Response action using Airflow Notifiers or custom functions.

This feature can be used to proactively monitor time-critical tasks, for example by automatically sending a Slack message when a critical task has been in the “queued” status for more than 30 minutes or when daily ELT processes haven't completed within an expected window after their scheduled time.
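The reference point / interval / callback triple can be mirrored in a few lines of plain Python. This is a sketch of the underlying check under our own simplified assumptions, not Airflow's actual Deadline Alert API:

```python
from datetime import datetime, timedelta
from typing import Callable

def check_deadline(reference: datetime, interval: timedelta,
                   now: datetime, callback: Callable[[], None]) -> bool:
    """Fire `callback` once `now` has passed reference + interval."""
    if now >= reference + interval:
        callback()
        return True
    return False

# Hypothetical example: alert if a task queued at 08:00 is still
# waiting 45 minutes later, with a 30-minute threshold.
queued_at = datetime(2025, 6, 1, 8, 0)
alerted = check_deadline(queued_at, timedelta(minutes=30),
                         datetime(2025, 6, 1, 8, 45),
                         lambda: print("Task exceeded its deadline"))
```

In Airflow, the callback slot is where a Notifier (e.g. for Slack or email) or a custom function would be plugged in.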

These alerts will bring even more transparency to the platform, enabling data teams to identify performance bottlenecks more easily and prevent downtime in their process chains.

Deadline Alerts are an experimental feature introduced in Airflow 3.1. This means that this feature in its current implementation may be subject to changes in future versions without further warning.

UI Internationalization & improvements

With Airflow 3.1, internationalization (i18n) support has been added. In practice, this means the web UI is now available in 17 languages, with a robust translation infrastructure for future additions and custom components.

Figure: Language selection dialog in the Airflow UI

It is also now possible to pin and favorite DAGs to improve dashboard organization.
Additionally, the calendar and Gantt views have been rebuilt, offering a new visual design and filtering capabilities.

Figure: Translated new calendar component from the DAG detail view

Figure: Rebuilt version of the Gantt chart displaying task runtimes within a DAG

New Trigger Rule

The new task trigger rule ALL_DONE_MIN_ONE_SUCCESS allows for additional control when orchestrating workflows, by enabling a downstream task to run once all upstream tasks are completed and at least one of them has succeeded. This covers real-world scenarios where partial completion is sufficient, for example when pulling the same data in parallel from multiple sources, such as a primary system and a fallback archive.
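The semantics of the rule can be expressed in a few lines of plain Python. This sketch paraphrases the behaviour as described above; Airflow's internal state handling (e.g. `upstream_failed`) is more nuanced:

```python
# States in which an upstream task is considered finished
# (simplified for illustration).
TERMINAL_STATES = {"success", "failed", "skipped"}

def all_done_min_one_success(upstream_states: list[str]) -> bool:
    """True once every upstream task is finished and at least one succeeded."""
    all_done = all(state in TERMINAL_STATES for state in upstream_states)
    return all_done and "success" in upstream_states

all_done_min_one_success(["success", "failed"])   # primary ok, fallback failed: run
all_done_min_one_success(["failed", "failed"])    # nothing succeeded: skip
all_done_min_one_success(["success", "running"])  # fallback still running: wait
```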

React Plugins architecture

Airflow has long supported extensibility through its plugin system, allowing users to add custom functionality by adding Python modules, APIs, and UI elements directly into the platform. This made it possible to customize Airflow for specific enterprise environments, adding custom logic and domain-specific integrations on top of a shared orchestration foundation.

With Airflow 3.1, this framework is extended by the introduction of React-based UI components.
While previous plugins could already add server-rendered views and APIs, the new React app capability enables teams to embed modern, client-side applications directly into the Airflow user interface. These apps can appear as standalone pages, embedded dashboards, or contextual views alongside DAGs, runs, and tasks.

This could for example allow embedding a dashboard for pipeline health and KPIs derived from business data directly in Airflow, giving stakeholders immediate visibility over critical metrics directly within the platform.
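As a rough sketch of the kind of metadata such an extension carries, consider the following stand-in. The class and attribute names and the bundle path are purely illustrative assumptions, not Airflow's actual plugin interface:

```python
# Illustrative only: a stand-in for the metadata describing a
# React-based dashboard view. Attribute names and the bundle path
# are hypothetical, not Airflow's real plugin API.
class PipelineHealthDashboard:
    name = "pipeline_health_dashboard"
    # Where the compiled React bundle would be served from (hypothetical path).
    bundle_url = "/static/pipeline_health/bundle.js"
    # Where the app should appear: a standalone page, an embedded
    # dashboard, or a contextual view next to a DAG, run or task.
    destination = "standalone_page"
```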

Apache Airflow 2025 Retrospective: Our conclusion and perspective for the future

Apache Airflow has long since outgrown its origins as a simple scheduler. With each release, it continues to mature into a flexible, production-grade orchestration platform that adapts to real-world data challenges without abandoning the pragmatism that made it successful in the first place. While a major release such as Airflow 4.0 is likely still some distance away, the platform continues to progress steadily. Continuous contributions from a large and active open-source community ensure that new features, enhancements, and refinements are delivered consistently with each release.

The new Airflow features we’ve highlighted are a clear signal of where the platform is headed: more flexibility, cleaner abstractions and fewer compromises when integrating modern data architectures.

Are you curious how these developments could translate into concrete benefits for your own use case? Whether you are planning the next architectural step or simply reviewing your current setup, we are happy to share our experience.

Feel free to get in touch to discuss how your Airflow pipelines could evolve with these new capabilities.

 


 

 

FAQ - Apache Airflow 2025 Retrospective

Here you can find some of the most frequently asked questions about Apache Airflow's updates in 2025 and its latest features.

What role does Apache Airflow play in modern data architectures? Apache Airflow acts as the central control layer for orchestrating, monitoring, and governing complex data and business workflows.
Has Apache Airflow evolved since the 3.0 release? Yes. Even without a new major version, Airflow introduced several notable features in version 3.1.
What is Human-in-the-Loop (HITL) in Airflow? HITL allows workflows to pause and wait for human decisions, enabling approvals, validations, or manual quality checks within DAGs.
Which are some possible use cases for HITL? Typical scenarios include data quality validation, approval of AI-generated outputs, and executive sign-off before publishing data or reports.
What are Deadline Alerts in Airflow? Deadline Alerts notify users when DAG runs exceed defined time thresholds, helping teams identify delays and performance issues early.
Is the Deadline Alert function production-ready? No. Deadline Alerts are currently experimental and may change in future Airflow releases.
What UI improvements were introduced after Airflow 3.0? Airflow now supports 17 UI languages, allows pinning and favoriting DAGs, and offers redesigned calendar and Gantt views.
How do React Plugins extend Airflow? React Plugins allow embedding modern, client-side applications directly into the Airflow UI, such as dashboards or custom monitoring views.


Robin

Robin Brandt is a consultant for Machine Learning and Data Engineering. With many years of experience in software and data engineering, he has expertise in automation, data transformation and database management - especially in the area of open source solutions. He spends his free time making music or creating spicy dishes.
