The global financial services sector is grappling with a paradox of technological advancement: while customer-facing interfaces have reached a state of sleek, high-speed digitalization, the underlying back-end infrastructure remains anchored in decades-old architecture. For many of the world’s largest banking institutions, the core systems responsible for processing trillions of dollars in transactions are still running on COBOL (Common Business-Oriented Language), a programming language that dates back to 1959. As these legacy systems become increasingly brittle and the pool of experts who understand them continues to shrink through retirement, the industry is turning toward Agentic Artificial Intelligence (AI) as a potential solution to a problem that has historically resisted traditional modernization efforts.
For years, the banking industry has pursued a strategy of "digital wrapping," where modern mobile apps and web portals are layered on top of ancient mainframes. While this approach satisfied consumer demand for mobile banking and real-time notifications, it left the core processing logic—the "brain" of the bank—untouched. This technical debt has created a significant bottleneck, preventing traditional banks from competing effectively with "born-in-the-cloud" fintech challengers. However, a new approach spearheaded by IT solutions provider Mphasis suggests that the solution lies not in simple code translation, but in the reconstruction of institutional knowledge through Agentic AI.
The Historical Burden of Legacy Systems
To understand the scale of the challenge, one must look at the history of financial computing. Banks were among the first enterprises to adopt large-scale computing in the 1960s and 1970s. During this era, COBOL was the gold standard for business applications due to its ability to handle complex decimal arithmetic and massive file processing. Over the decades, these systems were modified, patched, and expanded by generations of programmers. In many cases, the original documentation for these systems has been lost, and the logic governing essential functions—such as interest rate calculations or risk assessments—is buried deep within millions of lines of monolithic code.
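COBOL's strength in "complex decimal arithmetic" is worth making concrete: binary floating point cannot represent most decimal fractions exactly, which is unacceptable for ledgers, whereas COBOL's fixed-point `PIC` fields compute in exact decimal. A minimal sketch of the same distinction in Python, using the standard-library `decimal` module (the field name and rate are illustrative):

```python
from decimal import Decimal, ROUND_HALF_UP

# Binary floating point drifts on decimal fractions -- a non-starter for money.
binary_sum = 0.1 + 0.2                          # 0.30000000000000004, not 0.3
exact_sum = Decimal("0.10") + Decimal("0.20")   # exactly Decimal('0.30')

# COBOL-style fixed-point money: two decimal places, conventional rounding,
# roughly analogous to a PIC 9(7)V99 field.
def to_money(value: Decimal) -> Decimal:
    return value.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# One month of interest on a 1,000.00 balance at a 5.25% annual rate.
interest = to_money(Decimal("1000.00") * Decimal("0.0525") / Decimal("12"))
```

This exactness by default, not raw speed, is a large part of why the language became entrenched in transaction processing.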
According to industry estimates, approximately 70% to 80% of global financial transactions are still processed by COBOL-based systems. The cost of maintaining these systems is immense, with some estimates suggesting that large banks spend up to 75% of their IT budgets simply on maintaining existing infrastructure rather than innovating. Past attempts to "rip and replace" these systems have frequently resulted in high-profile failures, leading to service outages and regulatory scrutiny. Consequently, many bank boards have become risk-averse, viewing back-end modernization as a "career-ending" project for Chief Information Officers (CIOs).
Srikumar Ramanathan, Chief Solutions Officer at Mphasis, notes that the primary obstacle is no longer a lack of modern programming languages like Java or Python, but a lack of understanding regarding the original business logic. "They struggle to get budget for back-end modernization because of past failures," Ramanathan explains. The fear is that any change to the back end could "break" a system that, while old, is currently functional and reliable.
From Code Translation to Knowledge Reconstruction
The traditional approach to modernization often involved automated tools designed to translate code from one language to another—for instance, converting COBOL to Java. However, Ramanathan argues that this method is fundamentally flawed. In a monolithic legacy application, the processing logic, data structures, and output formats are often tightly intertwined. Simply translating the code results in a "Java monolith," which retains all the rigidity and complexity of the original system without the benefits of modern, modular architecture.
The emergence of Agentic AI—AI systems capable of autonomous reasoning and goal-oriented task execution—offers a different path. Rather than acting as a simple translator, Agentic AI can be used to perform "knowledge reconstruction." This process involves an AI agent "reading" the legacy code not just to replicate it, but to understand its intent.
Prior to the advent of Large Language Models (LLMs) and Agentic AI, extracting business logic required an "army" of human analysts to manually review code and interview the few remaining legacy programmers. This was a slow, error-prone process where gaps in domain knowledge often led to project failure. With modern AI, the objective is to translate code into a "business ontology"—a structured representation of the business rules and concepts that the code is intended to execute.
The NeoZeta and Ontosphere Framework
Mphasis has addressed this challenge through its NeoZeta platform and the creation of "Mphasis Ontosphere." The NeoZeta platform is designed to extract business logic from existing systems by using LLMs to interpret the intent behind the code and express it in human-readable, business-centric language. For example, where a legacy system might contain an opaque expression like "C+M," the AI can identify from context that it refers to a specific loan interest rate calculation.
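The output of this kind of extraction can be pictured as structured records that pair each legacy identifier with its recovered business meaning and its provenance in the source. The schema below is purely illustrative, not NeoZeta's actual format; the field names, the file name, and the description are assumptions built around the article's "C+M" example:

```python
from dataclasses import dataclass

# Hypothetical shape for one reconstructed business rule -- an illustrative
# sketch, not the platform's real schema.
@dataclass
class BusinessRule:
    name: str                # business-centric name assigned during extraction
    legacy_identifier: str   # what the COBOL actually calls it
    description: str         # human-readable intent, recovered from context
    source_location: str     # provenance pointer back into the legacy code

rule = BusinessRule(
    name="effective_loan_rate",
    legacy_identifier="C+M",
    description="A loan interest rate calculation identified from context.",
    source_location="LOANCALC.cbl",  # hypothetical file name
)
```

Keeping the provenance pointer is the important design choice: it lets a human reviewer trace every extracted rule back to the exact legacy code it came from.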
The extracted logic is then stored in a "knowledge graph." A knowledge graph is a sophisticated database that maps relationships between different entities—such as customer data, regulatory requirements, and transaction logic—in a way that both humans and machines can understand. By creating this machine-readable knowledge base, banks can interrogate their own code using conversational English, asking questions about how specific processes work before attempting to change them.
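The idea of a knowledge graph can be sketched with nothing more than subject–relation–object triples and a pattern-matching query. Real platforms use graph databases, and every entity name below is invented for illustration, but the principle of interrogating mapped relationships is the same:

```python
# A toy knowledge graph: (subject, relation, object) triples linking business
# rules, data, and regulatory requirements. All names are illustrative.
triples = [
    ("monthly_interest_rule", "reads", "outstanding_balance"),
    ("monthly_interest_rule", "reads", "annual_rate"),
    ("monthly_interest_rule", "implements", "rate_disclosure_requirement"),
    ("outstanding_balance", "stored_in", "LOAN-MASTER"),
]

def query(subject=None, relation=None, obj=None):
    """Return every triple matching the pattern; None acts as a wildcard."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (relation is None or t[1] == relation)
            and (obj is None or t[2] == obj)]

# "What data does the monthly interest rule depend on?"
deps = [o for _, _, o in query("monthly_interest_rule", "reads")]
```

A conversational interface over such a graph is then a matter of translating an English question into a pattern like the one in the last line.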
Once the knowledge is extracted and mapped, the "Mphasis Ontosphere" engine drives the forward-engineering process. This allows for the creation of new, modular code in languages like Python or Java, built directly from the verified logic in the knowledge graph. This approach ensures that "today’s new code" does not become "tomorrow’s legacy code," as the intelligence powering the system remains accessible and updatable within the knowledge graph.
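Forward engineering from a knowledge graph can be illustrated in miniature: a verified rule record is re-emitted as a small, standalone function instead of being re-entangled in a monolith. The rule, template, and generated function below are all hypothetical, a sketch of the pattern rather than how Ontosphere actually works:

```python
# A verified rule as it might sit in the knowledge graph (illustrative).
RULE = {
    "name": "monthly_loan_interest",
    "inputs": ["outstanding_balance", "annual_rate"],
    "formula": "outstanding_balance * annual_rate / 12",
}

# Code template for the modular target language (here, Python itself).
TEMPLATE = (
    "def {name}({args}):\n"
    '    """Generated from verified rule: {name}."""\n'
    "    return {formula}\n"
)

source = TEMPLATE.format(
    name=RULE["name"],
    args=", ".join(RULE["inputs"]),
    formula=RULE["formula"],
)

# Materialise the generated function. Safe only because the formula comes
# from a human-verified knowledge base, never from untrusted input.
namespace = {}
exec(source, namespace)
monthly_loan_interest = namespace["monthly_loan_interest"]
```

Because the generator reads from the knowledge graph rather than from the old code, updating a rule in the graph and re-emitting the function is how "today's new code" avoids becoming "tomorrow's legacy code."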
Supporting Data and Case Studies
The efficacy of this AI-driven approach is supported by recent pilot projects conducted by Mphasis. In one notable instance, the NeoZeta platform was deployed to modernize a cards platform that consisted of 50 million lines of COBOL and Assembler code. This platform was critical to the institution’s operations, supporting 1.4 billion accounts and processing 25 billion authorizations annually.
The results of the pilot were significant:
- Operational Efficiency: The AI-driven analysis and re-architecture improved operational efficiency by over 40%.
- Timeline Compression: Modernization tasks that previously took months were completed in weeks.
- Accuracy: The use of LLMs to extract logic reduced the "knowledge gap" that typically plagues manual reviews.
Furthermore, industry data suggests that the move toward modular, cloud-ready platforms can reduce infrastructure costs by 30% to 50% while increasing the speed of new product launches (time-to-market) by as much as 60%. For a major bank, these improvements translate into hundreds of millions of dollars in annual savings and a significantly improved competitive position.
Industry Reactions and Broader Implications
The shift toward Agentic AI in banking has drawn interest from various stakeholders, including regulatory bodies and cloud service providers. Regulators are particularly interested in how these tools can improve "operational resilience." In many jurisdictions, such as the UK and the EU, banks are now required to demonstrate that they can recover quickly from IT failures. By moving away from opaque legacy monoliths toward transparent, modular systems, banks can better meet these regulatory requirements.
Financial analysts suggest that the "agentification" of modernization processes will likely change the labor market for IT professionals in the sector. The demand for "manual code readers" is expected to decline, replaced by a need for "AI Orchestrators" and "Domain Architects" who can guide AI agents through the knowledge extraction process.
However, the transition is not without risks. Critics of AI-driven modernization point to the potential for "hallucinations" in LLMs, where an AI might misinterpret a piece of legacy logic. To mitigate this, Mphasis emphasizes a "human-in-the-loop" model where senior architects verify the outputs of the NeoZeta platform, ensuring that the knowledge graph remains an accurate source of truth.
A New Era for Financial Technology
The arrival of Agentic AI represents a potential turning point for the financial services industry. For decades, the "back-office problem" was viewed as an intractable burden that banks simply had to live with. The ability to transform "frozen" legacy code into a dynamic, machine-readable knowledge base offers a way out of the cycle of failed migrations and mounting technical debt.
As Ramanathan concludes, the primary benefit of this technology is economic. By automating the most labor-intensive and high-risk parts of the modernization process—knowledge extraction and logic mapping—banks can finally afford to address their legacy systems. This "agentification" of the back end does more than just update the code; it creates a foundation for future innovation, allowing banks to build new AI agents on top of a clear and accessible understanding of their own business processes.
In the coming years, the success of traditional banks may depend less on their customer-facing apps and more on their ability to liberate the business logic trapped within their mainframes. With tools like NeoZeta and Ontosphere, the "rip and replace" fear is being replaced by a "reconstruct and evolve" strategy, potentially ending the era of the frozen banking back end.
