Core banking systems have sat at the heart of banks’ operations for decades. But with the banking landscape changing faster than ever – thanks to new digital innovations, regulations, and consumer expectations – many of these legacy platforms now struggle to keep pace.
The result? Banks are increasingly finding that their existing cores are holding them back from delivering the experiences that today’s customers demand, largely due to limitations around:
- Complexity – Acquisitions and divestments have resulted in convoluted and sub-optimal core banking architectures, with newer regulatory requirements only adding to this complexity.
- Flexibility – Core banking systems originally built with reliability in mind are now stuck on aging mainframe architectures and rely on dwindling pools of specialists, resulting in tech stacks that are often prohibitively slow and expensive to change.
- Capability – Changing consumer demands are pushing legacy cores to the limits of their capabilities. Many are unable to meet modern expectations regarding real-time notifications of transactions, deeper insights into spending habits, and rapid product development. Consumers are rightly asking why established banks can’t offer the same service as digital challengers.
Faced with these challenges, bank operations leaders agree that there is an urgent need for change. Accenture’s North America Banking Operations Survey revealed that 80% of bank leaders believe that their organisation’s existence could be threatened if they do not update their technology to be more flexible and capable of supporting rapid innovation.
Some banks have attempted to bridge the innovation gap by investing in short-term incremental gains on legacy platforms – or “hollowing out the core” – helping them buy time as the market continues to innovate. However, in the last few years, a new frontrunner has emerged in the race to find a more permanent solution: cloud-native core banking. This approach addresses many of the concerns I’ve highlighted above, providing banks with a highly scalable API-led core that is nimble enough to enable them to compete with new challenger banks.
So, if cloud-native core banking is the destination, the next question is: What steps need to be taken to get there?
Beyond “big bang”
One of the most complex and expensive elements of any core banking transformation is the migration from legacy to target systems, accounting for, on average, around 40% of overall programme spend.
This unenviable task is a key barrier to realising the benefits of new technologies and breaking free of the constraints of legacy systems.
Accenture has teamed up with fintech Thought Machine, provider of the next-generation cloud-native core banking solution Vault, to consider how a cloud-native core could not only benefit banks in their day-to-day operations, but also de-risk the migration journey required to get there. Considering Vault as a use case, we have identified the following advantages of migrating onto a cloud-native core:
- Automated API loads: the data loader service is a single migration API that fully automates data loads. It removes otherwise manual steps and pauses in the load process, reducing risk and greatly simplifying the migration event schedule.
- Managed data dependencies: the data loader automatically orchestrates the sequence of data loads based on a highly configurable dependency system, allowing banks to send all migrated data at once. By removing the need to load data files sequentially, this approach reduces the complexity and length of the migration event.
- Real-time reconciliations: the streaming APIs provide messages for all database changes, including loads and activations, in real time. This cuts the time taken to produce financial and operational reconciliations, freeing up more time for teams to analyse and act on the results ahead of go-live.
- Time-travel and simulation testing: the platform can simulate both historic and future scheduled product behaviour (e.g. interest accrual, end-of-day processing, incoming transactions) within compressed tests. As a result, banks can reduce the time needed to test and prove complex multi-day cycles. They can also find and address functional or data issues that typically require lengthy tests in production-like environments, providing greater confidence going into live migration events.
- Creative cutover design: this allows data to be loaded into production in a dormant state, maintained through delta updates, and activated at an individual account level in flexible tranches. This minimises customer impacts and migration complexity by decoupling the loading of data from making that data operationally “live” to customers and colleagues.
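To make the dependency-managed loading idea above concrete, the sketch below shows one general way a loader might derive a safe load order from a configurable dependency map, using a topological sort. This is a hypothetical illustration only – the resource names and the `load_order` function are invented for this example and do not represent Vault’s actual API or configuration format.

```python
from collections import deque

def load_order(dependencies):
    """Return an order in which each resource is loaded only after
    everything it depends on has loaded (Kahn's algorithm)."""
    # Track unmet dependencies per resource.
    pending = {res: set(deps) for res, deps in dependencies.items()}
    # Reverse map: which resources are waiting on each dependency.
    dependants = {}
    for res, deps in dependencies.items():
        for dep in deps:
            dependants.setdefault(dep, set()).add(res)

    ready = deque(sorted(r for r, d in pending.items() if not d))
    order = []
    while ready:
        res = ready.popleft()
        order.append(res)
        for child in sorted(dependants.get(res, ())):
            pending[child].discard(res)
            if not pending[child]:
                ready.append(child)
    if len(order) != len(dependencies):
        raise ValueError("circular dependency in migration config")
    return order

# Hypothetical dependency map: accounts need customers and products,
# postings need accounts.
deps = {
    "customers": [],
    "products": [],
    "accounts": ["customers", "products"],
    "postings": ["accounts"],
}
print(load_order(deps))  # ['customers', 'products', 'accounts', 'postings']
```

With the order derived automatically, a bank can hand the loader its full data set at once and let the orchestration, rather than a manually sequenced runbook, decide what loads when.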
Taking the first step
Even as we speak, banks are carefully balancing their desire to be early adopters in the move to the cloud with the risk of high-profile customer-impacting issues if they are not successful. The fallout from well-publicised failed migrations is still fresh in many bankers’ minds – which is why adopting the right tools and strategy is key to making the journey a smooth one.
Forward-thinking bankers recognise that the move to cloud will generate enormous cost efficiencies and create better customer experiences. They also know that they cannot afford to delay this transformation. With the pace of change and customer demands only increasing, banks need to act now to be ready for what’s ahead. Partnering with an experienced systems integrator can accelerate and de-risk cloud-native core banking adoption, helping banks realise the benefits faster.
Special thanks to Gareth Richardson, Thought Machine’s COO, who contributed to this blog.