With the launch of SAP Business Data Cloud (BDC) in 2025, one thing has become unmistakably clear: the paradigm shift from traditional data warehouse system architectures to the data lakehouse model is complete. Proprietary, monolithic database systems are being replaced by loosely coupled, open service architectures - a trend popularized on a large scale in recent years, particularly by Databricks. Independently scalable compute and storage resources are gradually replacing ever-larger, increasingly expensive in-memory technologies.
Even SAP has now made object storage a central component of the BDC, implicitly confirming what competitors have long argued: the lakehouse model is simply more efficient and more cost-effective to operate.
However, we see an even more far-reaching implication in this technical convergence across the data platform market: by adopting fundamentally open technologies and protocols such as Apache Iceberg, Delta Lake, and Unity Catalog, new integration possibilities are emerging between the products of different platform providers. SAP, Microsoft Fabric, Dremio, Databricks, and many others could soon access each other’s data with minimal friction - without massive replication or complex synchronization mechanisms.
This opens the door to valuable combinations of software from different vendors. SAP’s gradual evolution from its traditional warehouse systems toward the quiet “lakehouse revolution” embodied in BDC is a prime example of this convergence.
Is SAP Business Data Cloud a Data Lakehouse?
Whether BDC qualifies as a lakehouse or not is probably of little concern to SAP. Historically, SAP has treated its data warehousing and analytics products as extensions of its core process and business software offerings. The technologies involved have always been subordinate to a broader portfolio strategy - a trend that can be traced all the way back to the introduction of Business Warehouse (BW).

The evolution of SAP’s data warehouse offerings reflects its overall product strategy: from classic relational database systems as the foundation, to the HANA in-memory database, then to the cloud, and now to a comprehensive AI-first strategy - which has defined the company’s positioning since at least 2024.
Business Data Cloud follows in this line, succeeding BW, BW/4HANA, and the Data Warehouse Cloud (hastily rebranded as Datasphere). BDC is not a rupture, but rather an integration layer: it embeds Datasphere and SAP Analytics Cloud and extends them with additional components - notably the BDC Object Storage for SAP “data products,” SAP Databricks as a full-fledged developer environment for machine learning and AI applications, and a growing suite of AI components (Joule, Analytics Agents, and others).
While Datasphere - powered by the HANA database - remains a central element, another new feature could soon become even more important: SAP Data Products. These are automatically populated from various SAP application modules and stored in object storage - the lakehouse layer of BDC.
Thanks to the openness of modern storage and access protocols, these data products can be exchanged across a wide range of systems with minimal effort - without CSV dumps, REST API bottlenecks, or synchronization headaches.
Watch the recording of our webinar: “Bridging Business and Analytics: The Plug-and-Play Future of Data Platforms”
Delta Sharing and BDC Partner Connect
The technological cornerstone that makes seamless interoperability possible is the Delta Sharing protocol. It underpins the much-cited “zero-copy” principle, which eliminates the need to replicate data across environments.
Delta Sharing is an open protocol co-developed and promoted by Databricks. We provided a hands-on example in one of our recent blog posts, where data is shared between SAP Databricks and SAP BDC without copying. The protocol acts as a mediation layer between the technical data catalogs of lakehouse architectures. A client requesting access to shared data connects to a sharing server (essentially the data catalog), which validates access permissions. The client then receives short-lived access tokens granting direct access to the underlying raw data in object storage.
The result: your Databricks workspace, for example, can query SAP data products in real time - without physical replication - while maintaining full data governance.
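To make the flow concrete, here is a minimal consumer-side sketch using the open-source delta-sharing Python client; the profile path and the share, schema, and table names are purely illustrative:

```python
import delta_sharing

# Profile file issued by the data provider; it contains the sharing
# server endpoint and a bearer token (path is illustrative).
profile = "sap_bdc_profile.share"

# Ask the sharing server which shares, schemas, and tables we may see.
client = delta_sharing.SharingClient(profile)
for table in client.list_all_tables():
    print(table)

# Load one shared table into a pandas DataFrame. Under the hood, the
# server validates our permissions and returns short-lived, pre-signed
# URLs, which the client uses to read the underlying Parquet files
# directly from the provider's object storage - no copy on our side.
df = delta_sharing.load_as_pandas(
    f"{profile}#erp_share.masterdata.material_master"
)
```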
This forms the foundation of the BDC Partner Connect program, which enables bidirectional data access between BDC and enterprise Databricks workspaces on Google Cloud, AWS, or Azure, and - starting in 2026 - directly with Google BigQuery and Snowflake as well.

Beyond the already announced BDC Partner Connect integrations, further ones are likely. Due to the widespread adoption of the Delta Sharing protocol, technical interoperability with numerous providers and tools already exists. The open sharing ecosystem from Databricks enables data exchange today with other lakehouse platforms like Dremio and Starburst, as well as with BI applications such as Tableau and Power BI.
Open-source clients are available for all major programming languages, along with connectors for Microsoft Excel and Google Sheets.
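In a Spark environment, for instance, the open-source Delta Sharing connector reads a shared table using the same coordinates (again, the profile path and table names are placeholders, and `spark` is assumed to be an active SparkSession, as in a Databricks notebook):

```python
# Requires the delta-sharing-spark connector on the cluster.
orders = (
    spark.read.format("deltaSharing")
    .load("sap_bdc_profile.share#erp_share.sales.orders")
)
orders.show(5)
```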
From a technological standpoint, there is therefore no reason why SAP data products in BDC should not soon interact seamlessly with all these counterparts. It seems only a matter of time - and of SAP’s licensing and go-to-market strategy - before frictionless data exchange with all major ecosystem players becomes reality.
Delta Sharing in Practice: Multi-Cloud, Multi-Tool Architecture
Where can this new level of interoperability be applied in practice?
In our webinar on November 11th, 2025, “Bridging Business and Analytics: The Plug-and-Play Future of Data Platforms”, we illustrated the benefits using a realistic example scenario: imagine a company that needs to combine central SAP ERP master data with high-volume IIoT streaming data from its factories - while ensuring that sensitive HR and compliance data remain in an EU-hosted, compliant cloud or even on-premises.
In this setup:
- SAP Business Data Cloud (BDC) serves as the authoritative source for SAP master data.
- Databricks acts as the stream-processing engine, efficiently handling the high-frequency IIoT data streams.
- Dremio processes the sensitive data, running either on-premises or in an EU-hosted cloud to meet data sovereignty requirements, and provides SQL-based self-service analytics access.

BDC shares the master data as data products (via the Delta Sharing protocol) with Databricks, where it is joined with the IIoT data extracts. The aggregated results are then shared with Dremio, where they are combined with sensitive data.
End users can then access the fully integrated dataset directly through Dremio’s SQL interface or visualize it in their preferred BI tools.
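A sketch of the Databricks step under these assumptions might look as follows (all table and share names are hypothetical, and `spark` is the ambient Databricks session):

```python
from pyspark.sql import functions as F

# SAP master data shared by BDC as a data product via Delta Sharing.
materials = (
    spark.read.format("deltaSharing")
    .load("sap_bdc_profile.share#erp_share.masterdata.material_master")
)

# High-frequency IIoT readings, assumed to land in a Delta table.
readings = spark.read.table("iiot.sensor_readings")

# Aggregate the stream data, then enrich it with the SAP master data.
enriched = (
    readings.groupBy("plant_id", "material_id")
    .agg(
        F.avg("temperature").alias("avg_temperature"),
        F.count("*").alias("reading_count"),
    )
    .join(materials, on="material_id", how="left")
)

# Persist the result; this table can itself be added to a Delta Share
# that Dremio consumes - again without physical replication.
enriched.write.mode("overwrite").saveAsTable("analytics.iiot_enriched")
```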
This architecture combines the strengths of all three systems while minimizing replication, latency, and synchronization overhead.
Delta Sharing - Our Conclusion: A Strong Foundation for Interoperable Data Architectures
The industry’s technological shift toward the data lakehouse model - and the adoption of open standards by vendors such as SAP - means that the foundation for future plug-and-play data platforms is now in place.
The lakehouse is becoming the de facto standard for exchanging large volumes of data - not only between analytical systems, but increasingly with operational systems such as SAP business modules.
In the coming months, it will become clear how quickly SAP can overcome its licensing and product strategy hurdles to enable true interoperability - laying the groundwork for ever faster innovation on ever-growing datasets.
Competitors like Databricks and Dremio, which have long built their strategies around open-source components, open protocols, and efficient data exchange, may hold a strategic advantage in this emerging era of interconnected lakehouse ecosystems.
Every organization’s data landscape is unique - let’s explore together how your business can benefit from an open, future-proof lakehouse architecture. Book a personal, non-binding consultation here.
FAQ - Delta Sharing for Data Platform Interoperability
These are some of the most frequently asked questions about how Delta Sharing enables a new level of data platform interoperability.