German bank NORD/LB embraces rapid change with Kafka and Confluent

(Image from Confluent website)

NORD/LB is one of Germany’s largest commercial banks, with total assets of over €126 billion, and is partly owned by the German government. The bank is carrying out an extensive transformation program, triggered by the 2008 financial crisis, and is investing half a billion euros through to 2024 to adopt new systems and ways of working.

Key to this transformation is the implementation of new IT systems and tools, and here NORD/LB decided to adopt Apache Kafka, running on Confluent Cloud. The goal of using Confluent is to free data from the systems in which it is stored, allowing the organization to create products and services faster and adapt more effectively to rapid change.

Although NORD/LB was not directly affected by the 2008 financial crisis, given that it had not invested heavily in real estate, the long-term fallout was significant. The bank has a large stake in maritime investments, which suffered as shipping demand declined in the years that followed.

Things came to a head in 2017, when NORD/LB needed to be recapitalized because of its cash flow position. However, as the bank is partly owned by the German government, it could not simply issue new shares like other private banks, nor receive taxpayers’ money, due to EU state aid rules.

As a result, the government had to act as if it were a private investor and implement drastic measures. Speaking at the Kafka Summit in London this week, Sven Wilbert, Chief Data Officer at NORD/LB, said:

The effect was that 50% of people were made redundant, there was a 50% reduction in costs and a total renewal of the IT infrastructure. It was actually good for us, it was a good thing. The core banking system is changing. We are getting rid of the old centralized Db2 data warehouse and replacing it with a financial services data model.

We are changing approximately 60% of all of our IT infrastructure and systems, either modifying them significantly or replacing them with new ones. And we’re also thinking about new ways to integrate data.

Part of that is moving to an event-driven architecture based on Confluent, giving the bank access to siloed data. Wilbert said:

Of course, we have a large data warehouse, based on Cloudera. Of course, we have different relational databases. But what we’re doing now is using Kafka as a transport system to transfer data from all of our systems into our new central SAP banking application.

By doing so, we can also transport the same data into our Big Data platform. Because we’re not just streaming data out of the system in raw, bare form, we’re also changing how data integration works.

Normally you get the data from the system, at some point you transform it, then you have standardized stuff, then you ship everything back. What we want to do is distribute the data integration work throughout the system.
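In practical terms, the pattern Wilbert describes looks roughly like the sketch below, written here against the open source confluent-kafka Python client: a source system publishes its records to a Kafka topic in raw form, and each consuming system applies its own transformation as it reads the stream, rather than relying on a central integration team. The topic name, fields and connection details are illustrative assumptions, not details of NORD/LB’s actual setup.

import json
from confluent_kafka import Producer, Consumer

# A source system publishes its records to Kafka in raw form.
producer = Producer({"bootstrap.servers": "localhost:9092"})
raw_event = {"account_id": "4711", "amount": 250.0, "currency": "EUR"}
producer.produce("raw-transactions", value=json.dumps(raw_event).encode("utf-8"))
producer.flush()

# A downstream system (here, a hypothetical analytics platform) taps the same
# stream and reshapes the data into whatever form it needs.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "analytics-platform",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["raw-transactions"])
msg = consumer.poll(10.0)
if msg is not None and msg.error() is None:
    record = json.loads(msg.value())
    # The transformation lives with the consumer, not in a central ETL step.
    transformed = {"account": record["account_id"], "amount_eur": record["amount"]}
    print(transformed)
consumer.close()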

Change is no longer scary

The Kafka architecture allows a producer to ship a data object through the Kafka system, and any consumer of that data can then tap into the stream and transform it into whatever it needs. NORD/LB started using Kafka on-premises, but has made the decision to adopt Confluent Cloud by the end of the year. Wilbert added:

We know that we cannot do it ourselves on site. We don’t have the manpower or the skills, and it’s just more expensive to do it ourselves. That doesn’t mean the cloud is cheaper, but in our case, going with Kafka-as-a-service [Confluent Cloud] is cheaper for us in different ways.

Confluent comes into play because we can’t do it ourselves. We need someone trustworthy as a partner, not only for the implementation phase, but also when we get to managing it. Because if this thing fails, we won’t be able to do anything, because it will transport all our data between systems.

We want to get rid of the point-to-point connections between systems – not by force, but by having the data in Kafka and making it more readily available. Other systems can connect to the Kafka stream they need and just take the data they need, at the rate they need. And we don’t have to do it ourselves.
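What makes this decoupling possible is Kafka’s consumer group mechanism: each downstream system subscribes to the shared topic under its own group ID and keeps its own offsets, so it can read at its own pace with no point-to-point interface. A minimal sketch, again with assumed group and topic names rather than NORD/LB’s real configuration:

from confluent_kafka import Consumer

def make_consumer(group_id: str) -> Consumer:
    # Each consumer group tracks its own offsets, so different systems can
    # read the same topic independently and at different rates.
    return Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": group_id,
        "auto.offset.reset": "earliest",
    })

core_banking_feed = make_consumer("core-banking-feed")
analytics_feed = make_consumer("analytics-platform")
for consumer in (core_banking_feed, analytics_feed):
    consumer.subscribe(["raw-transactions"])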

In the past, there was a centralized team of about 50 to 70 people at the bank that maintained this whole environment. Going forward, once Confluent Cloud is adopted, it will shrink to a small center of excellence of four or five people. They will be involved in supporting projects and other teams using Kafka.

The end ambition, as NORD/LB heads towards the close of its current investment program in 2024, is that the bank is able to build quickly, and that change is no longer viewed as “taboo” within the organization. Wilbert said:

The biggest data challenge we face today is getting access to data. So once you have access to the data and then build your app, it’s so much faster when you can easily access the data.

You can easily connect to the Kafka feed, whatever you use, and then data scientists can immediately start building the model. Then you have the infrastructure-as-code element, where you build your environment in a containerized form and it’s easy to deploy. So all of that “having to prepare for the event of putting something into production”, all of that just isn’t there anymore.

In the past, change was something dangerous. Now change is something we want to do, we want to be faster. My boss made a great statement at the management meeting last week: change is becoming the new normal. We are more accepting of change, and technology is helping us get there.
