Should US banks be moving to next-generation core banking platforms?

Most traditional banks realize they need to move faster, and many have picked up the pace of innovation and delivery through new talent, technologies, and ways of working. But banks are usually hobbled by legacy back-end core technology systems, many of which were designed in the 1980s and 1990s. These technological monoliths, hardwired to handle the entirety of business needs, are mostly stable and can process transactions speedily, but they can be inflexible and slow to change.

In planning technology investments, many banks have deprioritized modernizing their core systems in favor of the technological front ends, including websites, mobile apps, and channel experiences. Several have been “hollowing out” the core, or extracting smaller apps and services to extend the service life of their existing core banking system. Very few have truly moved to a more flexible back end. That prioritization may now need to change.

The logical next step for many banks is a next-generation core banking system that allows them to operate with the speed and agility required in an increasingly fast-paced and complex world. This is not a simple undertaking. The core banking system is the heart of a bank and immensely difficult and expensive to migrate. Also, the truly next-gen systems are still maturing, so committing to change now can be misplaced, costly, and slow. To navigate these challenges, banks need to take concurrent actions: hollow out the existing core while selecting multiple next-gen cores to test on slices of their product portfolio in a real-world test-and-learn process before deciding on a core and migrating fully. This is a delicate and complex exercise with often intangible returns on investment, but it is likely better than the other options and is fast becoming a cost of doing business. Most banks will need to view their systems strategically and technically to decide their path forward.

A rapidly evolving banking landscape is making new cores a necessity

The banking industry has been undergoing technological shifts that offer great potential for improving profitability and customer experience:

  • From closed systems to ecosystems. The banking industry is shifting away from closed systems, in which each bank handles all the work of providing services directly to its customers, toward a model of operating as part of a larger ecosystem. In the ecosystem approach, market participants make their products and services available through other providers’ channels and offer third-party offerings through their own channels. This open-architecture approach is, of course, predicated on banks’ ability to make discrete components of their core systems available to other systems and to use other core systems’ components. Examples of this approach are numerous and include US Bank’s partnership with State Farm, Goldman Sachs’s deal with Apple, and Stripe’s integration with Shopify. Another is Capital One’s DevExchange, which enables the use of application programming interfaces (APIs).1
  • From downstream to upstream. Banks were previously most active at the end of the customer journey, after commerce occurred and financing was requested. Leading banks are now moving to the top of the funnel and into the dynamic and complex search experience to take their customers along personalized journeys. For example, some have embedded credit products at point of search, rather than offering credit options after the point of sale.
  • From batch to real time. Data and decisioning are moving to real time or on demand, rather than waiting for information to be pulled in predetermined batches. Real-time data ingestion and automatic updates of modular components (for example, asset value reporting and analytics) are significantly upgrading the experiences of customers and relationship managers.
  • From reporting to advanced analytics. Banks are starting to unlock more from one of their most valuable assets: proprietary data. Requirements discovery, product selection, portfolio analysis, and service delivery are all becoming more intelligent and personalized. In one case, a fintech’s data-driven personalized customer engagement is embedded in a bank’s mobile app, providing customers with suggestions based on their transactions.
  • From perimeter security to a zero-trust security model. Most banks still rely on perimeter security to guard the entry and exit points with firewalls, proxy servers, and other intrusion prevention tools. Leading banks are moving to a zero-trust model, which is based on the principle of “Never trust, always verify.” This model requires encryption of data both in transit and at rest, multifactor authentication, access monitoring, and features such as tokenized data (see the sketch after this list). These capabilities are difficult to achieve in traditional architectures and require special expertise to arrive at a holistic solution.
  • From one size fits all to customized offerings. Until recently, banks expected customers to choose from a few basic retail and business products. Now, however, retail and corporate banking customers alike are placing more importance on banks’ ability to personalize by making small changes (for example, configuring a credit product to appeal to franchisees or customizing reports for institutional clients) or offering new products, such as gas cards and foreign-exchange capabilities.
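
For illustration, here is a minimal sketch of two of the zero-trust building blocks named above: tokenizing sensitive data so that only a surrogate value is stored at rest, and verifying every request rather than trusting anything inside the network perimeter. The vault, caller names, and entitlement check are hypothetical, not a reference design.

```python
import secrets

# Hypothetical in-memory token vault mapping surrogate tokens to sensitive values.
# In a real deployment this would be a hardened, access-audited tokenization service.
_VAULT: dict = {}

def tokenize(card_number: str) -> str:
    """Replace a sensitive value with a random surrogate; only the token is stored at rest."""
    token = "tok_" + secrets.token_urlsafe(16)
    _VAULT[token] = card_number
    return token

def detokenize(token: str, caller: str, mfa_verified: bool) -> str:
    """Zero-trust access: every request is verified, regardless of where it originates."""
    if not mfa_verified:
        raise PermissionError("multifactor authentication required")
    if caller not in {"fraud-engine", "dispute-service"}:  # illustrative entitlement check
        raise PermissionError(f"caller '{caller}' is not entitled to detokenize")
    return _VAULT[token]

token = tokenize("4111 1111 1111 1111")
print("stored at rest:", token)
print("retrieved by entitled caller:", detokenize(token, "fraud-engine", mfa_verified=True))
```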

What makes the new systems ‘next-gen’?

At heart, next-gen cores are modular: the truly core function of transaction management is separated from banking services such as opening new accounts, servicing loans, calculating interest, processing deposits and withdrawals, and other customer relationship management activities (Exhibit 1). With a modular architecture, localized changes are faster and easier to test and launch. In addition, associated services can be provided via APIs, which wrap their logic and data into a single endpoint that can connect to any other endpoint. The result is greater configurability.

Exhibit 1: Next-gen core systems versus traditional systems.
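
As a rough illustration of this modularity (the class and method names are assumptions, not any vendor's design), the sketch below separates a thin transaction ledger from an interest-calculation service that interacts with it only through a narrow interface, so the service can be changed or replaced without touching the core.

```python
from dataclasses import dataclass, field

@dataclass
class TransactionLedger:
    """The 'thin core': records postings and exposes balances, nothing else."""
    postings: list = field(default_factory=list)

    def post(self, account: str, amount: float) -> None:
        self.postings.append((account, amount))

    def balance(self, account: str) -> float:
        return sum(amt for acct, amt in self.postings if acct == account)

class InterestService:
    """A separate module: it talks to the ledger only through its public interface,
    so it can be tested, modified, or swapped out without changing the core."""
    def __init__(self, ledger: TransactionLedger, annual_rate: float):
        self.ledger = ledger
        self.annual_rate = annual_rate

    def accrue_monthly(self, account: str) -> None:
        interest = self.ledger.balance(account) * self.annual_rate / 12
        self.ledger.post(account, interest)

ledger = TransactionLedger()
ledger.post("ACC-1", 10_000.00)
InterestService(ledger, annual_rate=0.03).accrue_monthly("ACC-1")
print(round(ledger.balance("ACC-1"), 2))  # 10025.0
```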

In contrast to traditional platforms, where batch processing and tight coupling limit the ability to provide the right customer experience, next-gen core platforms enable real-time (or near real-time) processing, settlement, and posting of transactions through real-time data flows built on a fully event-driven architecture (EDA). Instead of batching data at set time intervals, the system uses events to trigger those pushes or pulls in real time. In an EDA, applications “publish” events that asynchronously trigger separate downstream applications; the downstream applications are “unaware” of upstream ones and are connected only by a contract established between a producer and any subscriber application. Traditional platforms, by contrast, rely on direct connections between applications that are “aware” of each other (tight coupling) for request-response interactions defined in a way that is customized to those couplings. The resulting interdependencies make it difficult to orchestrate variations, which is why legacy architectures default to batching data.
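
The publish-and-subscribe contract described above can be sketched in a few lines. This toy, single-process event bus stands in for the message broker a real EDA would use, and the event name and payload are invented for illustration.

```python
from collections import defaultdict

# A toy in-process event bus standing in for a real message broker.
_subscribers = defaultdict(list)

def subscribe(event_type, handler):
    _subscribers[event_type].append(handler)

def publish(event_type, payload):
    # The producer knows only the event contract, not who (if anyone) is listening.
    for handler in _subscribers[event_type]:
        handler(payload)

# Two independent consumers react to the same event without being coupled to the producer.
subscribe("transaction.posted", lambda e: print(f"fraud check on {e['account']}"))
subscribe("transaction.posted", lambda e: print(f"update real-time balance by {e['amount']}"))

publish("transaction.posted", {"account": "ACC-1", "amount": 250.00})
```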

In addition, data compression techniques ensure that all the details for an account—including structured data (for example, transactions, customer data) and unstructured data (forms, counter receipts)—are available at transaction level for analysis or usage anywhere at any time.

Because EDA provides dynamism by allowing processes to respond flexibly to an easily configured set of events, it better suits the thinner-core concept. Traditional thick cores comprise multiple components working as a single engine. Typical elements include a general-ledger functionality and the customer data that feed it, workflow logic, document hubs, a rules engine, a pricing engine, account analyses, reporting, security, entitlements, and channel interfaces for call centers, branches, or downstream partners. Wrapping these layers into an integrated platform was once considered the most stable, comprehensive, low-latency, and consistent way to operate. When stability was the main objective, it worked just fine. However, stability now needs to be twinned with adaptability. This requires that the core architecture be thinned into its different components, which are then loosely coupled to work together rather than hardwired out of the box, without losing benefits like low latency.

Another advantage of next-gen cores is that the codebase is usually written in modern languages such as Java, Go, and Python. These languages offer greater productivity and code quality, and they make full use of agile and DevSecOps practices, modern architectures, larger data sets, faster networking, containers, and the cloud.

The data structure for next-gen cores also is different. An example formulation is to set up a customer-based taxonomy and append products and transactions to the customer (as some brokerage systems do for stock trades). Through this approach, customers tagged with multiple products can also be offered multiple personalized interest rates, based on discounts or specific deal terms. This contrasts with the more traditional approach of defining a product and assigning customer accounts to it.
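
A minimal sketch of this customer-based taxonomy, with illustrative field names: products and transactions are appended to the customer record, and pricing such as a personalized rate lives on each customer-product relationship rather than on a shared product definition.

```python
from dataclasses import dataclass, field

@dataclass
class ProductHolding:
    product_code: str          # e.g., "SAVINGS" or "SMB_CREDIT_LINE"
    personalized_rate: float   # rate discounted or negotiated for this customer
    transactions: list = field(default_factory=list)

@dataclass
class Customer:
    customer_id: str
    holdings: dict = field(default_factory=dict)   # product_code -> ProductHolding

    def add_product(self, code: str, rate: float) -> None:
        self.holdings[code] = ProductHolding(code, rate)

# The customer is the root of the taxonomy; products and transactions hang off it.
cust = Customer("CUST-42")
cust.add_product("SAVINGS", rate=0.031)          # personalized deposit rate
cust.add_product("SMB_CREDIT_LINE", rate=0.089)  # deal-specific lending rate
cust.holdings["SAVINGS"].transactions.append({"type": "deposit", "amount": 5_000})
print(cust)
```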

Most next-gen cores use an RDBMS for structured data and NoSQL for data-intensive use cases (for example, high-scale, real-time use cases) to accelerate the generation of insights and actions across complex business needs. Some also leverage graph databases, which enable more complex relationships and easier maintenance of data schemas.
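
To make the polyglot-persistence point concrete, here is a hedged sketch in which SQLite stands in for the relational store and a JSON document stands in for a NoSQL document store; the table and document shapes are invented for illustration.

```python
import json
import sqlite3

# Relational store for structured, strongly typed records (SQLite stands in for the RDBMS).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE postings (account TEXT, amount REAL, posted_at TEXT)")
conn.execute("INSERT INTO postings VALUES (?, ?, ?)", ("ACC-1", 250.0, "2024-01-15T09:30:00Z"))

# Flexible, document-shaped data would normally land in a NoSQL store;
# a JSON string stands in for that here.
event_document = json.dumps({
    "type": "counter_receipt",
    "account": "ACC-1",
    "fields": {"branch": "0042", "teller": "jdoe", "free_text": "cash deposit"},
})

total = conn.execute("SELECT SUM(amount) FROM postings WHERE account = ?", ("ACC-1",)).fetchone()[0]
print(total, event_document)
```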

It is worth noting here that some banks operate an intermediate second-generation core, which is thicker than a next-gen core and does not encompass all the functionality described here. Unlike next-gen cores, second-generation cores cannot fully leverage cloud-native infrastructure, so they are more limited in achieving economies of scale and cannot be accessed on a pay-as-you-go basis. But they’re a step in the right direction, and most are fully capable of handling a complex bank’s needs today, so they offer a tempting option for banks looking to modernize. However, many banks recognize that these systems might be obsolete by the time they finish migrating to them.

Those seeking to illustrate the benefits of next-gen cores point to fintechs and their modern architecture. Based on our analysis of four large fintechs that publish their financials, we estimate that the operating costs of fintech banks powered by next-gen core platforms are around 10 percent of the operating costs of traditional banks. Revenues are similarly low (a result of limited and simple product suites), but those operating costs have shown signs of scaling well with volume and extending into more complex products. However, the comparisons end there. Aside from their lack of complexity, fintechs have different histories: they were built on these cores, whereas banks must migrate from existing legacy codebases, and that fact raises a whole different set of issues.

How to move forward despite the uncertainties

While most banks are still in the experimentation phase, some best practices are emerging. Addressing the following questions can guide decision makers as they consider how to launch on this journey and can help banks mitigate some of the risks:

  1. Where precisely will a next-generation core system deliver value (see Exhibit 2)? For example, will value result from speed, efficiency, and flexibility? From meeting specific business needs by product?
Exhibit 2: Next-gen core banking systems can offer varied differentiating benefits.
  2. What functionalities will the new platform deliver?
  3. What implications will these functionalities have for the architecture? For example, if the required functionality is real-time data at low latency, will a modular architecture be able to deliver it?
  4. What does the architecture imply for whether modules should be built or bought?
  5. How do we determine which vendors to work with? What is their offering today, and how are they likely to be positioned in two to three years, when the road map requires them to deliver new functionality? It is possible to test and learn with vendors in a sandbox or lab environment to provide more certainty.
  6. How risky are the projected benefits of the solution option, and what mitigants can be constructed to derisk them? For example, the bank might migrate simpler businesses or functionalities to the new platform in a modular fashion.
  7. How long will it take to complete the move to the new platform?
  8. What is the likely cost?
  9. What other requirements, especially for talent, must be met?
  10. What operating model will be needed to function effectively? Consider, for example, the mix of vendors and employees, objectives and key results, outcome-based incentives, and the location of vendors and employees.

Answering these questions can clarify the path ahead. For example, at one bank, we heard the concern that all of its COBOL experts would soon be retiring. We explored the issue together and found that the remaining time to retirement was still several years; in the meantime, there was a bench of loyal, long-tenured employees who were willing to develop COBOL skills for the bank’s sake. That eased the talent constraint somewhat, so the bank could take a more thoughtful path to the migration.

Once banks have answered these questions, they can proceed down one of two parallel paths.

The first path is to accelerate the hollowing out of the existing core into the more modular functionality required in the future. This can include refactoring the codebase into a modern language, such as Java, Go, or Python, and then modularizing the code into components based on the new target architecture. One way to streamline the replacement process is to leverage the orchestrator approach, which insulates the existing platform while extracting data in real time, making it available for use, and mapping it to the new core to enable incremental migration (see sidebar, “The orchestrator approach”).
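
The sidebar describes the orchestrator approach in detail; as a rough sketch of the idea (the record layouts and class names here are assumptions, not the sidebar's design), a thin layer consumes change events from the legacy core, maps each legacy record to the new core's model, and forwards it, keeping the old platform insulated while its data becomes available in real time.

```python
def map_legacy_record(legacy: dict) -> dict:
    """Translate a legacy, product-centric record into the new core's customer-centric shape.
    The field names on both sides are illustrative."""
    return {
        "customer_id": legacy["CUST_NO"].strip(),
        "product_code": legacy["PROD_CD"],
        "amount": float(legacy["TXN_AMT"]),
    }

class Orchestrator:
    """Insulates the legacy core: it only reads change events and writes to the new core."""
    def __init__(self, new_core_writer):
        self.write = new_core_writer

    def on_legacy_change(self, legacy_record: dict) -> None:
        self.write(map_legacy_record(legacy_record))

# In practice the change feed would come from change-data capture on the legacy database;
# here a single hand-built record stands in for it.
orchestrator = Orchestrator(new_core_writer=lambda rec: print("posted to new core:", rec))
orchestrator.on_legacy_change({"CUST_NO": "000042 ", "PROD_CD": "DDA", "TXN_AMT": "125.50"})
```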

The second path involves carving out certain product segments (say, small-business deposits) and testing product concepts on some next-gen cores. One bank tested a digital attacker in one country on one next-gen core system and tested existing products on another next-gen core in a different country. This test-and-learn approach helped the bank build a deeper understanding of the next-gen core providers, their capabilities, and their fit with the bank’s needs and culture.

Following this path, a bank can launch the sidecar product it uses to test the core by acquiring new customers for that product and gradually migrating the existing customer base over time. In the first six to 12 months, the new core is installed for the targeted products and integrated with existing channels and databases. The bank then acquires new customers for those products on the new core, with selected data entered into the old system, which will eventually act as a data store for both old and new systems. Over time, the bank can connect other systems, such as the general ledger, directly to the new core. By this point, there will already have been some natural attrition among the existing core’s customers, and the remaining customers can be migrated to the new core. Sometimes this approach is used to launch a new business line—for example, an out-of-footprint attacker bank. If so, the new revenues can help make the business case for the next-gen system.
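
A minimal sketch of the routing logic this phased approach implies (the migration set, product code, and core names are hypothetical): requests for the targeted product from migrated or newly acquired customers go to the next-gen core, while everything else stays on the legacy core until migration completes.

```python
# Hypothetical migration state: customers in the targeted product already cut over.
MIGRATED_TO_NEW_CORE = {"CUST-101", "CUST-102"}
TARGET_PRODUCT = "SMB_DEPOSITS"

def route_request(customer_id: str, product_code: str) -> str:
    """Decide which core serves a request during the phased migration."""
    if product_code == TARGET_PRODUCT and customer_id in MIGRATED_TO_NEW_CORE:
        return "next-gen core"
    return "legacy core"

# New customers for the targeted product are opened directly on the new core,
# so they join the migrated set at onboarding time.
MIGRATED_TO_NEW_CORE.add("CUST-500")

print(route_request("CUST-500", "SMB_DEPOSITS"))  # next-gen core
print(route_request("CUST-007", "SMB_DEPOSITS"))  # legacy core (not yet migrated)
print(route_request("CUST-101", "MORTGAGE"))      # legacy core (product not in scope)
```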


Banks today urgently need a new core platform, but the path to building one is time-consuming, expensive, and uncertain. To solve the strategic conundrum, banks should first consider and debate a set of strategic questions to get executives aligned on a target state. For making the transition, we recommend a two-track process: on one, banks accelerate their current efforts to hollow out the existing core; on the other, they experiment with possible next-gen cores until settling on the best one for the eventual migration. This journey can seem daunting, but inaction can create even greater challenges.
