Mainframe data holds immense value for organizations, but accessing it is notoriously difficult. According to a Rocket Software report, over three-quarters of professionals in data analytics, management, engineering, and architecture say that getting to the data and its contextual metadata from mainframes is a considerable challenge. The difficulty stems from the complex, outdated technologies involved, which make it hard for modern data tools to interact with these legacy systems.

Nearly two-thirds of professionals also face issues when integrating mainframe data with cloud data sources, highlighting the technological friction between legacy systems and cloud environments.

How mainframe insights drive business forward

Mainframe data is uniquely valuable for businesses because it captures the real-time flow of transactions. More than half of the respondents in the Rocket Software report believe that business growth and the creation of advanced analytical tools depend on access to this data.

Unlike other types of data that might offer high-level overviews or historical snapshots, mainframe data provides the granular, day-to-day information that is essential for business operations. It serves as a live feed of the company’s financials, customer behaviors, and supply chain activities, making it a goldmine for generating business insights.

Despite this potential, only 25% of surveyed professionals report that their organizations are fully able to leverage these mainframe data assets. This gap between the availability of data and its utilization is a key concern for executives looking to drive innovation and growth.

The data itself has been locked in legacy systems for years, codified through decades of business decisions, relationships, and integrations. Extracting that data in a useful form is challenging, but it’s increasingly necessary as companies seek to remain competitive in a data-driven market.

Mainframe data is invaluable

The core value of mainframe data is its real-time nature. While other systems might offer only historical data, mainframes store real-time transactional data, making them one of the most actionable sources of business information.

Whether it’s banking transactions, supply chain movements, or retail sales, these systems capture critical activities as they happen. Real-time data flow gives businesses the ability to react to market changes and operational demands instantaneously, providing a competitive edge.

IBM reports that 70% of global transactions by value still run through mainframe systems, underscoring their continued importance. The sheer volume and value of transactions stored on mainframes make them indispensable for businesses that rely on up-to-the-minute accuracy.

Transactions provide both operational data and the foundation for predictive analytics and AI applications. Businesses looking to drive forward with digital transformation efforts are increasingly focused on integrating this data into newer, cloud-based systems.

Cloud isn’t always king: when hybrid models outshine migration

While cloud migration dominates many tech strategies, moving mainframe data to the cloud isn’t always the optimal solution. Rather than pouring vast amounts of mainframe data into cloud systems, many businesses are opting for hybrid models where specific AI tools and tasks are handled on-premises.

Such an approach allows companies to maintain the security and compliance controls necessary for proprietary or sensitive data, a key concern in industries such as finance and healthcare. According to research from Kyndryl, running task-specific AI models on-premises reduces the risk of data exposure and compliance violations, a major advantage over full cloud migration.
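To make that pattern concrete, the sketch below shows what a task-specific, on-premises model could look like in practice. It is a minimal illustration, not drawn from the Kyndryl research: it assumes a small open-weight classifier served locally through Hugging Face's transformers library, so the text being scored never leaves the data center. The model choice and example input are illustrative only.

```python
# Minimal sketch: a task-specific model running on local hardware instead of
# a hosted LLM API. Model choice and example input are illustrative only.
from transformers import pipeline

# Load a compact open-weight classifier; inference happens on-premises,
# so sensitive records are never sent to an external service.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# Score a (made-up) transaction note locally.
print(classifier("Wire transfer flagged for manual review"))
```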

Cloud computing is effective for handling massive datasets and running analytics at scale, but for many companies, the value of keeping sensitive mainframe data in-house is clear.

Hybrid environments, where AI and analytics tools are brought to the data (instead of the other way around), help avoid the challenges of transferring enormous amounts of proprietary data to the cloud.

Such an approach minimizes latency issues and improves security, making it a more pragmatic choice for businesses that need to maintain control over their mainframe systems while still leveraging modern technology.
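One concrete way to bring the work to the data is query pushdown: aggregate on the mainframe's own relational store and move only the summary across the network. The sketch below assumes a PEP 249 (DB-API) driver such as ibm_db_dbi for Db2 for z/OS; the connection string, table, and column names are hypothetical.

```python
# Minimal sketch of query pushdown against a mainframe-resident database.
# Assumes the ibm_db_dbi DB-API driver; all identifiers are hypothetical.
import ibm_db_dbi

conn = ibm_db_dbi.connect(
    "DATABASE=PRODDB;HOSTNAME=mainframe.example.com;PORT=446;"
    "PROTOCOL=TCPIP;UID=analyst;PWD=secret;"
)
cur = conn.cursor()

# The GROUP BY executes where the data lives; one summary row per region
# crosses the network instead of millions of raw transaction records.
cur.execute(
    "SELECT region, COUNT(*) AS txn_count, SUM(amount) AS total_amount "
    "FROM ledger.transactions "
    "WHERE txn_date = CURRENT DATE "
    "GROUP BY region"
)
for region, txn_count, total_amount in cur.fetchall():
    print(region, txn_count, total_amount)

cur.close()
conn.close()
```

The same idea scales up: pipelines that ship summaries and deltas instead of full tables keep both latency and data exposure low.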

Revitalizing mainframes with generative AI tools

Generative AI coding tools are proving to be a useful asset in modernizing legacy COBOL applications, which remain a staple in many mainframe systems. These AI tools help reduce the time and effort required to update and improve these older applications without a complete rewrite.

Full rewrites of legacy systems are often prone to failure, a risk noted by Forrester in a report commissioned by Rocket Software. As Rocket Software’s Michael Curry puts it, “Rewriting is the nuclear option,” and many organizations have faced setbacks when attempting to go down that path.

Instead, refactoring (updating parts of the code without altering its core functionality) offers a more reliable way forward. AI can help by automating portions of this process, letting teams focus on higher-level tasks such as integrating modern analytics tools.
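As a rough illustration of what automating portions of that process can mean, the sketch below splits COBOL source into paragraphs, flags control-flow hotspots (here, anything containing GO TO), and routes each one to a review step. The paragraph splitter is a heuristic, not a parser, and suggest_refactor is a hypothetical stand-in for a call to a generative AI coding tool.

```python
# Minimal sketch of incremental, AI-assisted refactoring triage for COBOL.
import re

def split_paragraphs(source: str) -> dict[str, str]:
    """Very rough COBOL paragraph splitter: a lone name ending in a
    period starts a paragraph (heuristic, not a real parser)."""
    paragraphs: dict[str, str] = {}
    current = None
    for line in source.splitlines():
        m = re.match(r"^(\w[\w-]*)\.\s*$", line.strip())
        if m:
            current = m.group(1)
            paragraphs[current] = ""
        elif current:
            paragraphs[current] += line + "\n"
    return paragraphs

def suggest_refactor(name: str, body: str) -> str:
    # Hypothetical: in practice this would prompt an LLM to produce a
    # behavior-preserving rewrite of just this paragraph.
    return f"TODO: review AI suggestion for {name}"

source = """\
MAIN-PARA.
    PERFORM READ-TXN
    GO TO MAIN-PARA.
READ-TXN.
    READ TXN-FILE AT END MOVE 'Y' TO EOF-FLAG.
"""

for name, body in split_paragraphs(source).items():
    if "GO TO" in body:  # flag control-flow hotspots for refactoring
        print(name, "->", suggest_refactor(name, body))
```

Working paragraph by paragraph like this keeps each change small and reviewable, which is the point of refactoring over rewriting.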

By using AI to modernize applications incrementally, companies can extract more value from their mainframe systems without the risk and cost associated with full system replacements.

Executive demands vs. IT realities

IT departments manage the bulk of mainframe operations, but business leaders are increasingly demanding that the valuable data stored in these systems be integrated into modern analytics and cloud platforms.

Executives see the results of cloud-based data initiatives and wonder why their most valuable data, often stored in mainframes, isn’t being used in similar ways. This disconnect between the C-suite’s vision and IT’s day-to-day realities creates tension.

On one hand, IT departments are cautious about migrating or significantly altering mainframe systems due to the risks involved in tampering with critical business processes. On the other hand, business leaders want to extract more value from the data trapped in these legacy systems. Bridging this gap requires collaboration across teams, as well as investments in modernization strategies that allow for gradual integration without disrupting core operations.

Key takeaways

The challenges of accessing mainframe data are significant, but the potential value for businesses makes overcoming them a priority. As organizations explore hybrid models and generative AI to modernize legacy systems, they must balance the need for security with the demand for real-time insights.

While executives push for more data integration, IT leaders are cautious, navigating the complexities of outdated technologies. Using mainframe data effectively will require new tools, greater collaboration between cloud and mainframe teams, and a strategic approach to modernization that minimizes disruption while maximizing data use.

Alexander Procter

October 23, 2024
