Today's day-to-day business is fully intertwined with digital processes. The number of these processes and their implementation as workflows is growing rapidly, not least because of the rising importance of machine learning applications. Starting analyses and forecasts manually is acceptable only at the prototype stage; a productive system relies on automation. Here, the choice of workflow management platform is a key factor for long-term success.
The challenge is that these digital processes must be managed and organized centrally. Especially for business-critical processes, reliable execution and flexibility in workflow design are essential. Beyond mere execution, the monitoring and optimization of workflows and the handling of errors are equally important. Ideally, the processes are also designed so that they can be scaled up easily.
Acceptance and sustainable integration of digital processes into the daily work routine can only be achieved if both the technical and the business side of the user base are involved. The implementation as workflows should therefore be as simple and comprehensible as possible.
Creating advanced workflows in Python
In Apache Airflow, workflows are created in the Python programming language, so the barrier to entry is low. Within a few minutes you can define even complex workflows with dependencies on third-party systems and conditional branches.
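A minimal sketch of such a workflow, assuming Airflow 2.x; the DAG ID, task names, and check logic are illustrative placeholders, not a prescribed pattern:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import BranchPythonOperator, PythonOperator


def _check_source():
    # Hypothetical probe: pick the branch depending on whether new data arrived.
    new_data_available = True  # placeholder for a real check against a source system
    return "process_data" if new_data_available else "skip_run"


def _process_data():
    print("processing new records ...")


def _skip_run():
    print("nothing to do for this run")


with DAG(
    dag_id="example_branching",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    check = BranchPythonOperator(task_id="check_source", python_callable=_check_source)
    process = PythonOperator(task_id="process_data", python_callable=_process_data)
    skip = PythonOperator(task_id="skip_run", python_callable=_skip_run)

    # The branch operator returns the task_id of the path to follow.
    check >> [process, skip]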
Schedule, execute and monitor workflows
Programmatic scheduling, execution, and monitoring of workflows run smoothly thanks to the interplay of Airflow's components. Performance and availability can be adapted to even your most demanding requirements.
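A sketch of the scheduling side, again assuming Airflow 2.x: retry behavior and failure alerting are declared per DAG. The cron expression, recipient address, and task are illustrative, and e-mail alerts assume a configured SMTP backend:

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Retry and alerting parameters that support reliable, monitored execution.
default_args = {
    "retries": 2,                         # re-run a failed task automatically
    "retry_delay": timedelta(minutes=5),  # wait between attempts
    "email_on_failure": True,             # assumes SMTP is configured
    "email": ["ops@example.com"],         # hypothetical recipient
}

with DAG(
    dag_id="example_scheduling",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * 1-5",  # every weekday at 06:00
    default_args=default_args,
    catchup=False,
) as dag:
    BashOperator(task_id="refresh_report", bash_command="echo 'refreshing report'")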
Best suited for Machine Learning
Machine learning requirements are met here in the best possible way: even complex ML workflows can be orchestrated and managed with Apache Airflow, and differing software and hardware requirements are easy to accommodate.
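For illustration, a sketch of a simple ML pipeline, assuming Airflow 2.x with the CeleryExecutor; the task bodies and the "gpu" queue are assumptions (routing individual tasks to dedicated worker queues is one way to meet differing hardware requirements):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_features():
    print("extracting features ...")


def train_model():
    print("training model ...")


def evaluate_model():
    print("evaluating model ...")


with DAG(
    dag_id="ml_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@weekly",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_features", python_callable=extract_features)
    train = PythonOperator(
        task_id="train_model",
        python_callable=train_model,
        queue="gpu",  # hypothetical queue served by workers with suitable hardware
    )
    evaluate = PythonOperator(task_id="evaluate_model", python_callable=evaluate_model)

    extract >> train >> evaluate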
Robust orchestration of third-party systems
The standard installation of Apache Airflow already includes numerous integrations with common third-party systems, so you can realize a robust connection in no time. And without risk: the connection credentials are stored encrypted in the backend.
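As a sketch of such an integration, using the Postgres provider (this assumes the apache-airflow-providers-postgres package; the connection ID "warehouse_db" is a hypothetical entry maintained in the web interface under Admin -> Connections):

from airflow.providers.postgres.hooks.postgres import PostgresHook


def count_sales_records():
    # Credentials are resolved at runtime from the encrypted metadata database;
    # nothing sensitive has to appear in the DAG code itself.
    hook = PostgresHook(postgres_conn_id="warehouse_db")
    return hook.get_records("SELECT count(*) FROM sales")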
Ideal for the Enterprise Context
Thanks to its excellent scalability, the requirements of start-ups and large corporations are met equally well. As a top-level project of the Apache Software Foundation with its origins at Airbnb, Apache Airflow was designed for economical large-scale deployment from the very beginning.
Modern web interface
A major advantage of Apache Airflow is its modern, comprehensive web interface. With role-based authentication, it gives you a quick overview and serves as a convenient access point for managing and monitoring workflows.
Flexibility through customization
Adaptability is provided by numerous plugins, macros, and custom classes. Since Airflow is based entirely on Python, the platform can in principle be modified down to its foundations. Adapt Apache Airflow to your current needs at any time.
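As a sketch of such a custom class (the operator name and logic are illustrative), subclassing BaseOperator and overriding execute() is the standard extension point:

from airflow.models.baseoperator import BaseOperator


class GreetOperator(BaseOperator):
    """Illustrative custom operator that logs a greeting."""

    def __init__(self, recipient: str, **kwargs):
        super().__init__(**kwargs)
        self.recipient = recipient

    def execute(self, context):
        # execute() is called on the worker when the task instance runs.
        self.log.info("Hello, %s!", self.recipient)
        return self.recipient

Once the class is importable, it can be used in a DAG like any built-in operator.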
Truly scalable
Scaling with common systems such as Celery and Kubernetes is possible at any time. In this context, workloads can run in lightweight containers.
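A sketch of the relevant airflow.cfg entries for switching to the CeleryExecutor; the broker and result-backend URLs are hypothetical and depend on your infrastructure:

[core]
executor = CeleryExecutor

[celery]
# Hypothetical endpoints; adjust to your environment.
broker_url = redis://redis:6379/0
result_backend = db+postgresql://airflow:airflow@postgres/airflow

The same settings can also be supplied as environment variables (for example AIRFLOW__CORE__EXECUTOR), which is the usual approach in containerized deployments.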
Completely free of charge
The workflow management platform is available quickly, without license fees and with minimal installation effort. You can always use the latest version to its full extent, free of charge.
Benefit from a whole community
As the de facto standard for workflow management, Apache Airflow has a community that includes not only users: the platform also benefits from dedicated developers around the world. Current ideas and their implementation in code can be found online.
Agility through simplicity
Workflow definition is greatly accelerated by the implementation in Python, and the workflows benefit from the flexibility this offers. Thanks to the web interface's excellent usability, troubleshooting and changes to workflows can be carried out quickly.
The new major release of Apache Airflow offers a modern user interface and new functions.
By leveraging Apache Airflow for SAP BW/4HANA change data capture, organizations can optimize the processing of data changes in their SAP systems.
By tracking Airflow metrics, you can obtain important information about the progress and performance of your workflows.
We compare the flagship open-source orchestration service, Apache Airflow, with the Microsoft cloud product Azure Data Factory.
Read all about data-aware scheduling with Apache Airflow Datasets in today's blog post.
Datasphere supports federated governance and a self-service data platform and can therefore be an important part of your data mesh landscape.
Read in today's blog post how the Apache Airflow Celery Executor can be used for parallel computing.
Discover in our new blog post how a data mesh architecture can help you make important data accessible to as many people in a company as possible.
In this article, we show you how to run a benchmark and help you better estimate the energy consumption of your machine learning algorithms.
In our blog post, we explain how you can introduce carbon accounting in your company and present suitable metrics.
The emissions caused by artificial intelligence are growing steadily. We have collected recommendations for greater corporate sustainability.