At SAP’s TechEd conference, the company showcased major upgrades to Datasphere, positioning it as a more powerful tool for enterprise data management. The new AI and data capabilities cater to the increasing demand for intelligent, context-aware data use across industries.

Meet Joule: SAP’s new AI copilot ready to transform your data

One of the main announcements at TechEd was SAP’s integration of AI agents into its Datasphere ecosystem through Joule, a generative AI copilot. Joule is designed to optimize data workflows, making it easier for enterprises to interact with large datasets and extract actionable insights.

AI has become central to SAP’s toolset, with a focus on automating repetitive tasks and improving decision-making processes across the organization.

Joule’s role within Datasphere is to improve AI-driven decision-making by leveraging generative AI models. It analyzes vast amounts of data in real time, helping teams identify patterns, trends, and anomalies that would otherwise be missed.

This empowers C-suite executives and their teams to make data-backed decisions faster, mitigating risks and capitalizing on emerging opportunities. Growing reliance on AI across business functions underscores its importance, with AI-driven processes offering major productivity and efficiency gains.

SAP showcases the next generation of enterprise data tools

In addition to Joule, SAP introduced new data management features at TechEd, designed to improve how enterprises store and process their data. These updates include the embedded data lake and the knowledge graph engine, both of which are key in expanding the capabilities of SAP’s Datasphere.

The new data lake architecture lets enterprises manage large-scale data without sacrificing its original context or structure, enabling better integration across datasets while preserving the integrity of the information.

Enterprises benefit from faster processing times and more accurate insights, especially in mission-critical areas like real-time risk management, supply chain operations, and customer behavior analytics. These tools are key for decision-makers seeking to maximize the potential of their data without overhauling their current systems.

SAP’s latest data lake and knowledge graph innovations

SAP’s new data lake and knowledge graph engine represent key innovations aimed at addressing the challenges of modern enterprise data management. By simplifying data storage and improving context retention, SAP helps businesses leverage their data more effectively across operations.

SAP’s BTP gives you the ultimate toolkit for data-driven innovation

SAP’s Business Technology Platform (BTP) is designed to provide a comprehensive suite of tools that unify data management, analytics, AI, and automation capabilities. The platform lets businesses build and extend applications while integrating disparate systems within SAP’s cloud ecosystem.

BTP is key for organizations looking to streamline their data processes—supporting a range of use cases, from building customer-facing applications to automating back-office operations.

By consolidating critical functions into one platform, SAP gives development teams the flexibility to innovate while maintaining seamless integration with existing systems. The result is greater agility for enterprises, which can adapt to changing business needs without major disruption.

Datasphere takes center stage in SAP’s data management revolution

At the core of SAP’s data management approach lies Datasphere, which acts as the hub for data storage, management, and analysis across SAP and non-SAP systems. The tool connects different data sources, making it easier for businesses to manage vast datasets without compromising accuracy or efficiency.

Datasphere is engineered to work with SAP Analytics Cloud and other advanced tools, providing a cohesive environment for analyzing data in real time.

This helps businesses improve decision-making across all departments, from finance and HR to supply chain and logistics. By integrating non-SAP data systems, Datasphere lets companies leverage external data sources while maintaining a consistent framework for analysis.

SAP’s embedded data lake keeps your data accurate, scalable, and fast

SAP’s previous HANA Cloud Data Lake required businesses to assign a separate Datasphere space, which led to challenges in preserving the original data context. The new embedded data lake improves on this by offering a more seamless solution that retains the full context of structured, semi-structured, and unstructured data.

The object store in the new data lake simplifies the onboarding of data from SAP applications like SAP S/4HANA and SAP BW, preserving data integrity while allowing flexible storage at scale.

With Spark compute, businesses can transform and process data in real time, streamlining operations. SQL on files further improves this by enabling query access without data replication, speeding up decision-making and reducing overhead.
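
To make the idea concrete, here is a minimal PySpark sketch of the general pattern described above: files landed in an object store are transformed with Spark and then queried in place with SQL, with no copy into a separate database. The bucket path, column names, and view name are hypothetical, and this is a generic Spark illustration rather than SAP’s own API.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a Spark session (inside Datasphere this compute would be managed for you).
spark = SparkSession.builder.appName("object-store-sql-on-files").getOrCreate()

# Hypothetical order data landed as Parquet files in an object store bucket.
raw_orders = spark.read.parquet("s3a://example-bucket/raw/s4hana_orders/")

# Transform with Spark: keep open orders and derive a net value per order line.
open_orders = (
    raw_orders
    .filter(F.col("status") == "OPEN")
    .withColumn("net_value", F.col("quantity") * F.col("unit_price"))
)

# Expose the result as a view so it can be queried with SQL directly over the
# files, without replicating the data into another store.
open_orders.createOrReplaceTempView("open_orders")

top_customers = spark.sql("""
    SELECT customer_id, SUM(net_value) AS total_open_value
    FROM open_orders
    GROUP BY customer_id
    ORDER BY total_open_value DESC
    LIMIT 10
""")

top_customers.show()
```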

Find hidden connections with the new knowledge graph engine

The knowledge graph engine introduced by SAP helps companies uncover complex relationships between data points. Built on the Resource Description Framework (RDF), the engine organizes data into interconnected facts by storing each fact as a triple: a subject, a predicate describing the relationship, and an object.

The technology also supports AI-specific use cases, such as grounding AI models so they deliver more accurate, context-aware insights.

Enterprises can use these insights to refine decision-making processes, from financial forecasting to supply chain optimization. The knowledge graph also supports SPARQL, a query language that simplifies interactions with the graph, making it easier to extract valuable insights without the need for manual data modeling.
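
As a rough illustration of the triple model and SPARQL querying described above, the snippet below uses Python’s rdflib library to store a handful of facts and query them. The namespace, entities, and relationships are invented for the example; SAP’s knowledge graph engine exposes its own data and interfaces.

```python
from rdflib import Graph, Namespace, Literal

# Hypothetical namespace for a small enterprise graph.
EX = Namespace("http://example.org/enterprise/")

g = Graph()

# Each fact is a triple: subject, predicate (the relationship), object.
g.add((EX.SupplierA, EX.supplies, EX.ComponentX))
g.add((EX.SupplierB, EX.supplies, EX.ComponentX))
g.add((EX.ComponentX, EX.usedIn, EX.ProductLine1))
g.add((EX.SupplierA, EX.locatedIn, Literal("Germany")))

# SPARQL query: which suppliers ultimately feed into ProductLine1?
query = """
PREFIX ex: <http://example.org/enterprise/>
SELECT ?supplier
WHERE {
    ?supplier ex:supplies ?component .
    ?component ex:usedIn ex:ProductLine1 .
}
"""

for row in g.query(query):
    print(row.supplier)
```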

Get ahead of risk with Compass for real-time analysis in SAP Analytics Cloud

To further strengthen decision-making capabilities, SAP introduced Compass, a new feature within SAP Analytics Cloud that models complex risk scenarios and simulates their outcomes, giving businesses predictive insights that help them navigate uncertainty.

Compass relies on Monte Carlo simulation, a statistical technique that runs many randomized trials over uncertain inputs to estimate the probability of different outcomes. This is particularly useful for companies managing dynamic challenges such as supply chain disruptions or fluctuating commodity prices.

The user-friendly interface presents results in a visual format, making it accessible to both technical and non-technical users. Decision-makers can quickly interpret probability distributions and boundaries, letting them take action before risks materialize.
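
For readers unfamiliar with the method, the sketch below shows a generic Monte Carlo simulation in Python: it draws many random scenarios for two uncertain inputs (a hypothetical commodity price and demand level) and summarizes the resulting cost distribution. It illustrates the general technique Compass is described as using, not SAP’s implementation, and all figures are made up.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_trials = 100_000

# Hypothetical uncertain inputs: commodity price (lognormal) and demand (normal).
price_per_unit = rng.lognormal(mean=np.log(50), sigma=0.15, size=n_trials)
demand_units = rng.normal(loc=10_000, scale=1_500, size=n_trials).clip(min=0)

# Total procurement cost in each simulated scenario.
total_cost = price_per_unit * demand_units

# Summarize the distribution: expected cost, boundaries, and risk of busting a budget.
expected = total_cost.mean()
p5, p95 = np.percentile(total_cost, [5, 95])
budget = 600_000
prob_over_budget = (total_cost > budget).mean()

print(f"Expected cost:   {expected:,.0f}")
print(f"90% interval:    {p5:,.0f} to {p95:,.0f}")
print(f"P(cost > {budget:,}): {prob_over_budget:.1%}")
```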

SAP’s upcoming feature roadmap

SAP plans to roll out its data lake capabilities by the end of Q4 2024, with the knowledge graph and Compass features becoming available in the first half of 2025. These updates will offer businesses more advanced data management and risk analysis tools, helping them stay ahead in a competitive market.

Expect strong ROI from SAP’s data lake improvements

According to a GigaOm case study, enterprises using SAP’s Datasphere business data fabric reported a three-year total cost of ownership (TCO) 42% lower than that of a DIY data management implementation.

Final thoughts

Are you equipping your business with the right tools to stay ahead? With innovations like SAP’s Datasphere shaping the future of data-driven decision-making, now is the time to rethink your approach.

How will you leverage these advancements to transform your data into a competitive advantage? It’s no longer only about managing information—it’s about driving smarter, faster decisions that keep your brand agile in an ever-evolving market.

Tim Boesen

October 21, 2024

6 Min