
Apache Airflow 3.0: Everything you need to know about the new release

Apache Airflow 3.0 has been publicly available since April 22, 2025. It is the first major release of the popular open source orchestration service in five years and brings with it a large number of changes. What new features are there? What is changing? Should I update my existing system? What do I need to consider when updating? The release raises many questions that we want to answer for you.

Do you know Airflow?

What modern agent-based AI models, machine learning workflows, enterprise reporting and business intelligence have in common is that they need to be supplied with up-to-date data regularly and reliably. Coordinating these visible or invisible processes at all levels and keeping them in sync requires process orchestration: time-controlled routines and event-based or dependent tasks must be defined, executed reliably and traceably, and monitored. The widely used open source project Apache Airflow provides the leading code-centric development and operating platform for all of this. Airflow is scalable, extensible, completely open and, thanks to its code-first approach, flexible enough for almost any use case, from machine learning workflows to enterprise reporting and business intelligence.

Apache Airflow is one of the few large open source projects in the data and BI world that does not hold back any features in the free version. There is a wide range of integration options and extension modules and, thanks to the active community and professional stewardship by the Apache Software Foundation, long-term planning security.
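The core idea behind all of this, tasks arranged in a dependency graph and executed in a valid order, can be illustrated with Python's standard library alone. This minimal sketch (task names are purely illustrative) uses graphlib.TopologicalSorter to stand in for what Airflow does at production scale:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms, which feed a report.
# Each key depends on the tasks in its value set, mirroring a DAG definition.
dependencies = {
    "transform_sales": {"extract"},
    "transform_costs": {"extract"},
    "report": {"transform_sales", "transform_costs"},
}

# static_order() yields the tasks in a dependency-respecting execution order:
# "extract" always comes first, "report" always last.
order = list(TopologicalSorter(dependencies).static_order())
print(order)
```

Airflow layers scheduling, retries, logging and monitoring on top of exactly this kind of graph.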

New features: Airflow in dark mode

The new release brings a whole range of new features. Few are as eye-catching as the new dark mode of the user interface. However, the new UI is not only (optionally) dark, it has been completely overhauled: a new front end built with the React framework and a new web server based on the FastAPI Python library give Apache Airflow 3.0 a fresh coat of paint. Submenus have been integrated into graphical elements in useful places. At first glance it seems as if many options are suddenly missing, but they are now embedded more cleverly in a leaner overall picture. The new UI still takes some getting used to, but it shortens loading times and requires fewer clicks than before to reach the same information.

Airflow 3.0 introduces a dark mode user interface among many other UI and architectural improvements.

 

Decoupling of main system and task executor

In the long term, the visible changes will matter less than the invisible improvements to the system architecture: the new release introduces a new API for communication with distributed worker nodes. Until now, Airflow worker nodes had to have direct access to a shared backend database in order to communicate with the Airflow core system, which made distributed execution of tasks across different network zones or even cloud providers particularly difficult. The new Task Execution API removes this limitation: task dispatch, configuration parameters, credentials, logs and results are all exchanged via encrypted HTTPS messages. This is also the prerequisite for the future development of worker nodes in programming languages other than Python. The Airflow developers' vision is that tasks can ultimately be written in any programming language and controlled from a central Airflow instance.



DAG versioning

In Airflow, workflows are defined as so-called DAGs (directed acyclic graphs). DAGs are written in the Python programming language and therefore lend themselves to strict code-based version control, e.g. using Git and CI/CD pipelines. With Airflow 3.0, it is now possible for the first time to see and track these changes in the user interface. If the composition of tasks within a DAG changes, for example when new tasks are added or old tasks are deactivated, the different states of the workflow can be displayed. Particularly important for long-lived systems: status information and logs are retained even if the workflow changes, previously a blind spot of the platform.
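As a reminder of what actually gets versioned, here is a minimal DAG sketch. It assumes an installed Apache Airflow 3.x and uses the airflow.sdk import path introduced as the public interface with 3.0; all names are illustrative, and the file only takes effect inside an Airflow deployment. Every Git commit of such a file now corresponds to a DAG version that is visible in the UI:

```python
# Minimal DAG sketch for Airflow 3.x (illustrative, not a complete pipeline).
# airflow.sdk is the public authoring interface introduced with 3.0.
from airflow.sdk import dag, task

@dag(schedule="@daily")
def sales_pipeline():
    @task
    def extract():
        return [1, 2, 3]  # illustrative payload

    @task
    def load(rows):
        print(f"loading {len(rows)} rows")

    # TaskFlow wires the dependency: extract runs before load.
    load(extract())

sales_pipeline()
```

Renaming or removing one of the @task functions in a later commit would now show up as a new DAG version in the UI, with the old runs' logs preserved.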

DAG versioning is a major new feature introduced with Airflow 3.0 and finally makes changes to workflows visible in the UI.

Improved event scheduling

With the new major release, the familiar datasets are renamed to assets and given a broader code base. Assets allow tasks and DAGs to be linked beyond their own context, for example chaining DAGs so that one runs immediately whenever a connecting asset is updated. New watcher tasks can react to events in third-party systems and trigger the execution of Airflow DAGs. Assets also have a fully-fledged REST API, which allows third-party systems to actively trigger updates via messages and enables further integration into networked system landscapes.
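Asset-driven chaining can be sketched in a few lines. This is a hedged illustration assuming Airflow 3.x and the airflow.sdk interface; the asset URI and DAG names are invented for the example:

```python
# Event-driven scheduling sketch for Airflow 3.x.
# Asset replaces the Dataset class known from Airflow 2.x.
from airflow.sdk import Asset, dag, task

orders = Asset("s3://datalake/orders.parquet")  # illustrative URI

@dag(schedule="@hourly")
def producer():
    @task(outlets=[orders])
    def refresh_orders():
        ...  # completing this task emits an update event for the asset
    refresh_orders()

@dag(schedule=[orders])  # runs whenever the orders asset is updated
def consumer():
    @task
    def build_report():
        ...
    build_report()

producer()
consumer()
```

The consumer DAG needs no cron expression at all; the asset update itself is its trigger.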

In addition to these features that we are highlighting for you, Apache Airflow 3.0 brings countless smaller improvements that may seem inconspicuous at first glance, but which solve specific problems or make new ways of working with Airflow possible.

Smaller UI improvements like scoped DAG dependency views make working with Apache Airflow 3.0 more convenient than ever before.

Requirements for an upgrade

A new major release naturally means that upgrading existing systems requires certain adjustments. The good news: many of the new features were already prepared in the later minor releases of the Airflow 2.x code base. Systems on version 2.7 or higher should be compatible with few restrictions. If you are already running Airflow 2.x with a standalone DAG processor, for example, only the services for the new frontend need to be adapted and an upgrade should succeed. Nevertheless, it is advisable to run a series of compatibility checks in advance; the Airflow project team provides its own tools for this. Start with a local instance and work your way through your system environments along your development processes.
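One concrete form such a compatibility check can take is static linting of your DAG code. A sketch, assuming the ruff linter with its Airflow-specific AIR rule group (the exact rule codes and preview status are assumptions to verify against the current ruff and Airflow upgrade documentation) and DAG files in a local dags/ directory:

```shell
# Flag Airflow 2.x constructs that were removed or renamed in 3.0.
# AIR301/AIR302 are ruff's Airflow-upgrade rule codes (preview rules).
pip install ruff
ruff check dags/ --select AIR301,AIR302 --preview
```

Running this against your repository before touching any environment gives a quick inventory of DAG files that need attention.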

If you want to take a look at Airflow 3 without much effort, you can use the new Docker image to start a local environment with just one line of code:

docker run -it --rm -p 8080:8080 apache/airflow:3.0.1 airflow standalone

This command downloads a container version of Airflow 3.0.1 and starts it in test mode. A randomly generated password for the admin user is displayed in the log, which you must copy and enter when logging into the web interface:

  • URL: http://localhost:8080
  • Username: admin
  • Password can be found in the container log

When should you carry out the upgrade?

The list of new features and improvements in Apache Airflow 3.0 is long and, in parts, highly specific. Whether and when you upgrade from 2.x to the new major version depends heavily on whether you have already reached the limits of your existing system environment. Do you lack connectivity between cloud and on-premise environments? Have you implemented complicated, error-prone workarounds for event-based scheduling? Then you have probably been waiting for Airflow 3.0 and can start preparing. If Airflow 2.x fulfills all your requirements, take it slow: despite high development quality, bugs and inefficiencies usually surface in new releases over several months, especially in releases with far-reaching changes. The first patch, version 3.0.1, was released at the beginning of May 2025, and further patches are likely to follow at short intervals.

After a long development period, Apache Airflow has positioned itself well for the future with version 3. The release holds exciting changes not only for night owls: our overview merely scratches the surface and cannot go into all the new features in detail. Which features are particularly important for your work with Airflow? Do you want to switch to Airflow 3 as soon as possible, or wait and see?

Perhaps you are also interested in running Airflow as a managed service? If you are unsure and would like to discuss your specific challenges with experienced Airflow experts, contact us. As an outstanding open source project with countless possible applications, Apache Airflow will continue to play an important role in the engine room of data teams around the world for a long time to come. A good network is at least as important for long-term success as the code itself: we look forward to hearing from you!


Markus

Markus has been a Senior Consultant for Machine Learning and Data Engineering at NextLytics AG since 2022. With extensive experience as a system architect and team leader in data engineering, he is an expert in microservices, databases and workflow orchestration, especially in the field of open source solutions. In his spare time he tries to optimize the complex system of growing vegetables in his own garden.
