
How Delta Sharing Enables a New Level of Data Platform Interoperability

With the launch of SAP Business Data Cloud (BDC) in 2025, one thing has become unmistakably clear: the paradigm shift from traditional data warehouse system architectures to the data lakehouse model is complete. Proprietary, monolithic database systems are being replaced by loosely coupled, open service architectures - a trend popularized on a large scale in recent years, particularly by Databricks. Independently scalable compute and storage resources are gradually replacing ever-larger, increasingly expensive in-memory technologies.

Even SAP has now made object storage a central component of the BDC, implicitly confirming what competitors have long argued: the lakehouse model is simply more efficient and more cost-effective to operate.

However, we see an even more far-reaching implication in this technical convergence across the data platform market: by adopting fundamentally open technologies and protocols such as Apache Iceberg, Delta Lake, and Unity Catalog, new integration possibilities are emerging between the products of different platform providers. SAP, Microsoft Fabric, Dremio, Databricks, and many others could soon access each other’s data with minimal friction - without massive replication or complex synchronization mechanisms.

This opens the door to valuable combinations of software from different vendors. SAP’s gradual evolution from its traditional warehouse systems toward the quiet “lakehouse revolution” embodied in BDC is a prime example of this convergence.

Is SAP Business Data Cloud a Data Lakehouse?

Whether BDC qualifies as a lakehouse or not is probably of little concern to SAP. Historically, SAP has treated its data warehousing and analytics products as extensions of its core process and business software offerings. The technologies involved have always been subordinate to a broader portfolio strategy - a trend that can be traced all the way back to the introduction of Business Warehouse (BW).

SAP's journey

The evolution of SAP’s data warehouse offerings reflects its overall product strategy: from classic relational database systems as the foundation, to the HANA in-memory database, then to the cloud, and now to a comprehensive AI-first strategy - which has defined the company’s positioning since at least 2024.

Business Data Cloud follows in this line, succeeding BW, BW/4HANA, and the hastily rebranded Data Warehouse Cloud (now Datasphere). BDC is not a rupture, but rather an integration layer: it embeds Datasphere and SAP Analytics Cloud and extends them with additional components - notably the BDC Object Storage for SAP “data products,” SAP Databricks as a full-fledged developer environment for machine learning and AI applications, and a growing suite of AI components (Joule, Analytics Agents, and others).

While Datasphere - powered by the HANA database - remains a central element, another new feature could soon become even more important: SAP Data Products. These are automatically populated from various SAP application modules and stored in object storage - the lakehouse layer of BDC.
Thanks to the openness of modern storage and access protocols, these data products can be exchanged across a wide range of systems with minimal effort - without CSV dumps, REST API bottlenecks, or synchronization headaches.




Delta Sharing and BDC Partner Connect

The technological cornerstone that makes seamless interoperability possible is the Delta Sharing protocol. It underpins the much-cited “zero-copy” principle, which eliminates the need to replicate data across environments.

Delta Sharing is an open protocol co-developed and promoted by Databricks. It acts as a mediation layer between the technical data catalogs of lakehouse architectures. A client requesting access to shared data connects to a sharing server (essentially the data catalog), which validates access permissions. The client then receives short-lived access tokens granting direct access to the underlying raw data in object storage. We have provided a hands-on example in a recent blog post, where data is shared between SAP Databricks and SAP BDC without copying.

The result: your Databricks workspace, for example, can query SAP data products in real time - without physical replication - while maintaining full data governance.
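From the consumer side, this flow can be sketched with the open-source Python `delta-sharing` client. The endpoint, token, and table names below are placeholders, and the actual read call is commented out because it requires a live sharing server - this only illustrates the profile file and table addressing scheme the protocol uses.

```python
import json
import tempfile

# A Delta Sharing "profile" holds the sharing server endpoint and a bearer
# token issued by the data provider (values below are placeholders).
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "<token-from-provider>",
}

with tempfile.NamedTemporaryFile("w", suffix=".share", delete=False) as f:
    json.dump(profile, f)
    profile_path = f.name

# Shared tables are addressed as <profile-file>#<share>.<schema>.<table>
table_url = f"{profile_path}#sap_share.masterdata.customers"

# With a live sharing server, the open-source client reads the table
# directly from object storage - no copy is materialized on the server:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(table_url)

share, schema, table = table_url.split("#")[1].split(".")
print(share, schema, table)
```

The sharing server never streams the payload itself; it only validates the token and hands back signed, short-lived URLs into object storage, which is what makes the access "zero-copy".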

This forms the foundation of the BDC Partner Connect program, which enables bidirectional data access between BDC and enterprise Databricks workspaces on Google Cloud, AWS, or Azure, and - starting in 2026 - directly with Google BigQuery and Snowflake as well.


Beyond the already announced BDC Connect partners, further integrations are likely. Due to the widespread adoption of the Delta Sharing protocol, technical interoperability with numerous providers and tools already exists. The open sharing ecosystem from Databricks enables data exchange today with other lakehouse platforms like Dremio and Starburst, as well as with BI applications such as Tableau and Power BI.

Open-source clients are available for all major programming languages, along with connectors for Microsoft Excel and Google Sheets.

From a technological standpoint, there is therefore no reason why SAP data products in BDC should not soon interact seamlessly with all these counterparts. It seems only a matter of time - and of SAP’s licensing and go-to-market strategy - before frictionless data exchange with all major ecosystem players becomes reality.

Delta Sharing in Practice: Multi-Cloud, Multi-Tool Architecture

Where can this new level of interoperability be applied in practice?

In our webinar on November 11th, 2025 “Bridging Business and Analytics: The Plug-and-Play Future of Data Platforms”, we illustrated the benefits using a realistic example scenario: imagine a company that needs to combine central SAP ERP master data with high-volume IIoT streaming data from its factories - while ensuring that sensitive HR and compliance data remain in an EU-hosted, compliant cloud or even on-premises.

In this setup:

  • SAP Business Data Cloud (BDC) serves as the authoritative source for SAP master data.
  • Databricks acts as the stream-processing engine, efficiently handling the high-frequency IIoT data streams.
  • Dremio processes the sensitive data, running either on-premises or in an EU-hosted cloud to meet data sovereignty requirements, and provides SQL-based self-service analytics access.


BDC shares the master data as data products (via the Delta Sharing protocol) with Databricks, where it is joined with the IIoT data extracts. The aggregated results are then shared with Dremio, where they are combined with sensitive data.

End users can then access the fully integrated dataset directly through Dremio’s SQL interface or visualize it in their preferred BI tools.

This architecture combines the strengths of all three systems while minimizing replication, latency, and synchronization overhead.
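The zero-copy chain described above can be illustrated with a toy model in plain Python - no real platforms or Delta Sharing involved, just the core idea that each system passes a governed reference into shared object storage instead of a copy of the data:

```python
# Toy model of the zero-copy sharing chain (illustration only): one object
# store, three platforms that exchange references instead of copies.

object_store = {}  # path -> rows, stands in for cloud object storage

def publish(path, rows):
    object_store[path] = rows
    return path  # a "share" is just a governed reference to this path

def consume(path):
    return object_store[path]

# BDC publishes SAP master data as a data product
master_ref = publish("bdc/data-products/customers", [{"id": 1, "region": "EU"}])

# Databricks consumes the share, joins it with IIoT aggregates, publishes result
iiot = [{"id": 1, "avg_temp": 21.4}]
joined = [{**m, **s} for m in consume(master_ref) for s in iiot if m["id"] == s["id"]]
joined_ref = publish("dbx/aggregates/customers_iiot", joined)

# Dremio consumes the downstream share for SQL self-service access
result = consume(joined_ref)
print(result)

# Only one physical copy of the master data exists throughout the chain:
print(consume(master_ref) is object_store["bdc/data-products/customers"])
```

However simplified, this captures the design choice: governance lives in the catalog that hands out references, while the data itself stays put in object storage.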

Delta Sharing - Our Conclusion: A Strong Foundation for Interoperable Data Architectures

The industry’s technological shift toward the data lakehouse model - and the adoption of open standards by vendors such as SAP - means that the foundation for future plug-and-play data platforms is now in place.

The lakehouse is becoming the de facto standard for exchanging large volumes of data - not only between analytical systems, but increasingly with operational systems such as SAP business modules.
In the coming months, it will become clear how quickly SAP can overcome its licensing and product strategy hurdles to enable true interoperability - laying the groundwork for ever faster innovation on ever-growing datasets.

Competitors like Databricks and Dremio, which have long built their strategies around open-source components, open protocols, and efficient data exchange, may hold a strategic advantage in this emerging era of interconnected lakehouse ecosystems.

Every organization’s data landscape is unique - let’s explore together how your business can benefit from an open, future-proof lakehouse architecture. Book a personal, non-binding consultation here.


FAQ - Delta Sharing for Data Platform Interoperability

These are some of the most frequently asked questions about how Delta Sharing enables a new level of data platform interoperability.

What problem does Delta Sharing solve in modern data platforms?
Delta Sharing removes the need for heavy data replication and complex ETL pipelines between platforms. Instead of copying data, consumers access shared datasets directly in object storage via a secure, open protocol.

How does SAP Business Data Cloud (BDC) fit into the lakehouse trend?
BDC extends SAP’s warehouse lineage (BW, BW/4HANA, Datasphere) into a lakehouse-style architecture. It introduces BDC Object Storage for SAP “data products,” embeds Datasphere and SAP Analytics Cloud, and adds AI components and SAP Databricks as a full developer environment.

What are SAP “data products” in BDC?
SAP data products are curated, application-driven datasets automatically populated from SAP modules and stored in object storage. Thanks to open formats and protocols, they can be shared with other platforms with minimal friction using Delta Sharing.

What exactly is Delta Sharing?
Delta Sharing is an open data-sharing protocol co-developed by Databricks. A client connects to a sharing server (the catalog), which checks permissions and issues short-lived tokens. The client then reads the underlying files in object storage directly - secure, governed, and without copying the data.

How does BDC Partner Connect use Delta Sharing?
BDC Partner Connect uses Delta Sharing to enable bidirectional data access between BDC and partners such as Databricks workspaces (AWS, Azure, GCP). Other platforms like BigQuery and Snowflake are expected to follow, making “zero-copy” data exchange the default.

Which tools and platforms can already consume Delta Sharing data?
Beyond SAP and Databricks, Delta Sharing is supported by platforms such as Dremio, Starburst and BI tools like Tableau and Power BI. There are also open-source clients for common programming languages and connectors for Excel and Google Sheets.

What does a typical multi-cloud, multi-tool architecture look like in practice?
A typical setup might use BDC as the source of SAP master data, Databricks for IIoT stream processing, and Dremio in an EU cloud or on-prem for sensitive HR/compliance data. BDC shares master data to Databricks; Databricks shares aggregated results to Dremio - all via Delta Sharing.

What is the strategic impact for enterprises?
By standardizing on lakehouse technologies and open protocols, enterprises gain a plug-and-play ecosystem: best-of-breed tools can be combined without vendor lock-in or massive replication. Vendors like Databricks and Dremio, which built around openness from the start, may gain an edge in this new interoperable landscape.


Markus

Markus has been a Senior Consultant for Machine Learning and Data Engineering at NextLytics AG since 2022. With significant experience as a system architect and team leader in data engineering, he is an expert in microservices, databases and workflow orchestration - especially in the field of open-source solutions. In his spare time he tries to optimize the complex system of growing vegetables in his own garden.
