Databricks AutoML for Time-Series: Fast, Reliable Sales Forecasting

Steering your business based on evidence and insight from data promises great competitive advantage and large potential for optimized and improved operations. To reap these benefits, the right data first needs to be collected and then fed into the right analytical models, for example to predict turnover or sales for future periods. Business Intelligence software like Power BI or SAP Analytics Cloud supports simple prediction methods and planning but quickly runs out of steam as use cases become more complex.

Databricks is a comprehensive machine learning, data science, and AI development platform that can be used to create innovative and highly precise prediction applications. Between the two extremes of simple BI forecasting and fully custom data science, predictive analytics of moderate complexity can solve many business problems. "AutoML" seeks to cover this middle ground, delivering accurate predictions without requiring a doctorate in higher mathematics. We showcase the Databricks AutoML capabilities for time-series forecasting to demonstrate how the platform can help turn business data into actionable insight with little to no programming effort.

Databricks AutoML

Databricks brings data engineering, analytics, and machine learning together on the Lakehouse - unifying scalable compute, reliable storage, and governed collaboration. In a previous article on Databricks & MLflow, we tracked experiments, versioned models, and streamlined deployment. We’ll build on that foundation here: starting from a simple sales dataset, we’ll first visualize trend and seasonality, then let Databricks AutoML explore strong forecasting candidates. AutoML not only trains and tunes models at scale, it also generates transparent notebooks and registers results with MLflow, so you can reproduce, customize, and promote the best model into production.

[Figure 1: Databricks AutoML flowchart]

AutoML highlights we’ll leverage:

  • Automated time-series featurization
    Auto-generated lags, moving averages, holiday signals, and calendar features to capture trend and seasonality without manual engineering.

  • Model search + tuning, out of the box
    A plethora of models like Prophet, ARIMA/SARIMA, gradient-boosted regressors, and deep learners (where available), with scalable hyperparameter optimization.

  • Built-for-humans notebooks
    Auto-generated, readable notebooks showing data prep, feature creation, training code, and evaluation—so you can audit, customize, and rerun.

  • Robust evaluation & backtesting
    Rolling-window validation, forecast plots, error metrics (MAPE, RMSE), and baseline comparisons to ensure lift over naïve models.

  • One-click operationalization
    Seamless MLflow tracking, model registry versioning, and straightforward promotion to batch scoring or real-time Model Serving.
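
The first highlight, automated featurization, can be illustrated in plain pandas. This is a minimal sketch of the kind of lag, moving-average, and calendar features AutoML generates on its own; the column names, lag choices, and data are our own illustrative assumptions, not AutoML's actual output.

```python
import numpy as np
import pandas as pd

# Illustrative monthly sales series (hypothetical data).
df = pd.DataFrame({
    "month": pd.date_range("2021-01-01", periods=24, freq="MS"),
    "sales": np.random.default_rng(0).integers(80, 120, 24),
})

# Lag features: sales 1 and 12 months ago (capture trend and yearly seasonality).
df["lag_1"] = df["sales"].shift(1)
df["lag_12"] = df["sales"].shift(12)

# Moving average over the previous 3 months.
df["ma_3"] = df["sales"].shift(1).rolling(3).mean()

# Calendar features derived from the timestamp column.
df["month_of_year"] = df["month"].dt.month
df["quarter"] = df["month"].dt.quarter

print(df.tail(3))
```

AutoML builds features along these lines (plus holiday signals) for every candidate model, so none of this has to be hand-crafted.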


Databricks & AutoML end-to-end example

Our goal is to forecast total monthly sales of a retail business. We start by visualizing the sales dataset in order to identify trends, patterns and seasonality.

[Figure 2: Monthly sales of the retail business, visualized]

There is a clear pattern in the dataset: sales peak around March and dip in the months right after, followed by a slow recovery over the rest of the year. We can now proceed with training and optimizing the forecasting models. We define the experiment parameters, such as the time and target columns, the forecast horizon, and the storage locations for model registration and predictions.
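
In code, such a configuration could look like the following sketch. The `databricks.automl.forecast` call is only available inside a Databricks runtime, so we guard the import; the column names, horizon, and placeholder dataset are hypothetical assumptions for illustration.

```python
import pandas as pd

train_df = pd.DataFrame()  # placeholder; would hold the historical sales data

# Hypothetical experiment configuration.
params = {
    "target_col": "sales",      # column to forecast
    "time_col": "month",        # timestamp column
    "horizon": 7,               # forecast 7 periods ahead
    "frequency": "month",       # monthly granularity
    "primary_metric": "smape",  # metric used to rank candidate models
}

try:
    from databricks import automl  # available only on Databricks clusters
    summary = automl.forecast(train_df, **params)
    print(summary.best_trial.metrics)
except ImportError:
    summary = None  # running outside Databricks; the call is illustrative only
```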

[Figure 3: AutoML forecasting experiment configuration]



Now we are ready to start the experiment and let AutoML do its tricks. Data preprocessing, model tuning and training will be completed in a matter of minutes without user intervention and the experiment runs will be available for further inspection. Moreover, the best model will be stored in Unity Catalog, ready for deployment and serving.
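
Retrieving that registered best model later for scoring could look like the following sketch. The three-level Unity Catalog model name and version are hypothetical, and the MLflow calls only resolve against a configured Databricks workspace, so the whole lookup is guarded.

```python
# Hypothetical Unity Catalog model name and version.
model_uri = "models:/main.forecasting.monthly_sales/1"

try:
    import mlflow
    mlflow.set_registry_uri("databricks-uc")     # Unity Catalog as model registry
    model = mlflow.pyfunc.load_model(model_uri)  # ready for batch scoring
except Exception:
    model = None  # outside a Databricks workspace: illustrative only
```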

[Figure 4: AutoML experiment runs and best model]

Several time series prediction models and algorithms are explored, scored and optimized in order to acquire the most accurate predictions based on metrics like MAPE (mean absolute percentage error), RMSE (root mean square error) and others. A Python notebook will be automatically generated, letting us execute the model and make predictions for future periods.
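
Both scoring metrics are simple to state; here is a quick sketch with illustrative numbers showing how they are computed.

```python
import math

def mape(actual, predicted):
    """Mean absolute percentage error, as a percentage."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root mean square error, in the units of the target."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

actual = [100.0, 200.0, 400.0]
predicted = [110.0, 190.0, 400.0]
print(round(mape(actual, predicted), 2))  # → 5.0
print(round(rmse(actual, predicted), 2))  # → 8.16
```

MAPE is scale-free and easy to communicate ("we are off by 5% on average"), while RMSE penalizes large misses more heavily, which is why AutoML reports both.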

[Figure 5: Auto-generated forecasting notebook]

The optimal model is loaded and predictions for the next seven months are generated. We can get a glimpse of the future by looking at the plot generated by AutoML, which indicates that the forecast is quite accurate and close to the actual figures, capturing the trends and patterns found in the historical data.
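
"Close to the actual figures" can be quantified with a rolling-origin backtest against a seasonal-naive baseline (repeat the value from twelve months earlier), which is the kind of rolling-window validation AutoML runs internally. A minimal sketch on synthetic monthly data, with all numbers purely illustrative:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
months = pd.date_range("2020-01-01", periods=48, freq="MS")
# Synthetic sales: a March peak each year plus noise, loosely mimicking the article's pattern.
season = np.where(months.month == 3, 150, 100)
sales = pd.Series(season + rng.normal(0, 5, 48), index=months)

horizon = 7
errors = []
for origin in range(36, 48 - horizon + 1, 2):  # roll the forecast origin forward
    actuals = sales.iloc[origin:origin + horizon]
    # Seasonal-naive forecast: the value observed 12 months earlier.
    forecast = sales.shift(12).iloc[origin:origin + horizon]
    errors.append(float(np.mean(np.abs((actuals - forecast) / actuals)) * 100))

print(f"seasonal-naive MAPE over {len(errors)} windows: {np.mean(errors):.1f}%")
```

A trained model only earns its keep if its backtested error beats this kind of naive baseline, which is exactly the comparison AutoML reports.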

[Figure 6: Batch inference forecast plot]

The sales forecast accurately predicted a significant spring sales surge, followed by a sudden decline. This dip was then succeeded by a moderate, gradual recovery. This demonstrates the model's ability to capture both cyclical and transient market dynamics, offering valuable insights for business planning.

AutoML in SAP Business Data Cloud

SAP systems are widespread among organizations and one of the most prolific sources of operational and business data. Running advanced analytics outside the SAP cloud ecosystem is always possible but may require third-party tooling or custom coding to bring data from SAP to the analytical environment and back. SAP Business Data Cloud integrates the SAP Databricks workspace for scenarios such as the AutoML time-series forecast above. Data products can now be shared between SAP Datasphere and SAP Databricks without replication. SAP Databricks can be used to run AutoML or custom machine learning applications, and results can then be shared back into the SAP analytics ecosystem or with outside BI tools like Microsoft Fabric or Power BI.

[Figure 7: SAP Business Data Cloud with the SAP Databricks workspace]

SAP systems rely heavily on low-code/no-code user interfaces. While this is convenient for less technically-minded personnel, functionality is always limited in comparison to a pro-code system like Databricks. With the AutoML features we have demonstrated above, SAP Databricks bridges this gap between user experience philosophies and encourages experimentation and innovation on reliable business data straight from the primary operational source systems.

Databricks AutoML: Our Conclusion

Databricks AutoML makes time-series forecasting practical, transparent, and production-ready in one sweep: it automates smart featurization and model search, generates human-readable notebooks you can audit and adapt, evaluates rigorously with rolling backtests and metrics like MAPE/RMSE, and registers the winner for seamless serving via MLflow/Unity Catalog.

In our sales example, that end-to-end flow not only surfaced an accurate model but also captured the expected peak, post-peak dip, and gradual recovery - evidence that you can move from exploration to reliable forecasts without hand-crafting every step. If you need fast, trustworthy sales predictions that you can explain and deploy, AutoML on the Lakehouse is a compelling default.

Do you have questions about this or another topic? Our experts are here to help - get a free consultation by getting in touch with us!

 

Learn more about Databricks

 

FAQ - Databricks AutoML

These are some of the most frequently asked questions about Databricks AutoML.

What is Databricks AutoML used for?
Databricks AutoML is a tool for automating the process of time-series forecasting. It simplifies the creation of predictive models by automatically generating features, training models, and optimizing them for forecasting tasks, such as sales predictions.

How does AutoML handle time-series data?
AutoML automates time-series featurization, including generating lags, moving averages, holiday signals, and calendar features. This allows it to capture trends and seasonality in the data without manual intervention, enabling robust forecasting models.

What models are used in Databricks AutoML for forecasting?
Databricks AutoML explores a wide range of forecasting models, including Prophet, ARIMA/SARIMA, gradient-boosted regressors, and deep learning models, with built-in hyperparameter tuning for optimal performance.

How does Databricks AutoML evaluate model performance?
AutoML uses robust evaluation techniques such as rolling-window validation, forecast plots, error metrics like MAPE and RMSE, and baseline comparisons to ensure that the model provides a significant improvement over simple naïve models.

What is the benefit of the generated notebooks in AutoML?
The generated notebooks are human-readable and contain the full process from data preparation to model training and evaluation. This transparency allows users to audit, customize, and rerun the models, making it easier to understand and adapt the forecasting solution.

How can the best model be deployed in Databricks?
The best model found through AutoML is stored in Unity Catalog and can be seamlessly operationalized via MLflow. It can be deployed for batch scoring or real-time model serving, providing an easy transition from development to production.



Apostolos

Apostolos has been a Data Engineering Consultant at NextLytics AG since 2022. He has experience in research projects on deep learning methodologies and their applications in fintech, as well as a background in backend development. In his spare time he enjoys playing the guitar and staying up to date with the latest news on technology and economics.

Got a question about this blog?
Ask Apostolos
