Customer Story

A shift in vendor strategy and technical requirements led the First International Bank of Israel to adopt a new data platform. Read about our work with the bank, which encompassed end-to-end pipeline construction, performance tuning, and building bespoke ETL services for event streaming. All of the work was carried out whilst meeting and exceeding strict regulatory and security requirements.

Big Data Platform Modernization
Data Platform Security, Compliance and Monitoring
Kafka Integration and Optimization

The Client

First International Bank of Israel (FIBI) is a long-standing Israeli financial institution. For half a century, FIBI has served a mix of corporate and individual customers, setting itself apart from the field with outstanding customer service and pioneering financial technology solutions. Today, FIBI serves both local and international clientele, operating a global network of staff and offices.

The Project

FIBI is a sophisticated user of various database and big data technologies, which underpin its day-to-day activities as a financial services institution. In 2020, these were primarily MongoDB databases and numerous Hadoop clusters.
However, Cloudera’s acquisition of Hortonworks in 2018 presented an issue for FIBI. Cloudera was not only altering the licensing model for its newly acquired Hortonworks Data Platform and Hadoop customers, but also changing how the technology was deployed. Additionally, Cloudera was seeking to pull its disparate customer base onto a new, unified platform.

“Amongst other things, the new management UI that Cloudera proposed for Hortonworks customers just wasn’t going to work for us… this presented us with a unique chance to try something new”

states Denis, a lead DBA at FIBI who is responsible for big data technologies.

FIBI selected Kafka as the platform for its upcoming big data projects, but the move wasn’t set to be straightforward. Hadoop had been in use at FIBI for many years, and the ETL processes that had been established internally could not be reused on the new Kafka-based platform.

FIBI’s technology and infrastructure teams also lacked experience with Kafka, which would make delivering a project of this complexity on their own impossible. There were also numerous security integrations to consider, given that FIBI operates in a heavily regulated industry.

Selecting the right partner

“BigData Boutique’s recommendation came straight from Confluent”,

a member of the IT infrastructure team says.

Given the complexity of the project, FIBI knew it had to bring in a partner renowned in the big data space. The bank also needed consultants who understood its compliance requirements and could factor them into the solution. Most importantly, whichever partner FIBI selected would have to transfer knowledge of the new Kafka solution throughout the engagement.

“We needed someone who wouldn’t just complete the project, but would explain and familiarize our engineers with every technical aspect of the platform,”

says a member of FIBI’s database team.

The bank’s technical team was not familiar with Kafka, and so would be relying on its trusted partner to bring it up to speed.

“After having met with a few other potential partners, it became clear that BigData Boutique’s vision for the platform was the right one for FIBI”.

A brand new big data platform

As the Kafka platform would be totally independent of the existing Hadoop infrastructure, BigData Boutique advised that a brand new architecture based on Kafka and modern storage systems would be the best approach.

“This allowed us to have the best solution possible, without compromising to allow for migration activities,”

says a member of the big data team.

In consultation with FIBI’s infrastructure and compliance teams, BigData Boutique designed a brand new architecture for the Kafka-based big data solution. The new platform was complex in design, but it was solving a complex problem. First, data services had to be established to egress data from FIBI’s mainframe environment and prepare it for ingestion into Kafka. Within Kafka, the architecture gave FIBI the flexibility to run its own ETL services on the platform; those services streamed Kafka events and made structural changes to the data before writing it to MongoDB (a sketch of this pattern appears at the end of this section).

“It was very difficult to get this right, given how complex the problem was,”

says a member of FIBI’s infrastructure team. The MongoDB database underpins many of FIBI’s core applications as well as its customer-facing site. BigData Boutique ensured that the data structure produced by the Kafka solution was identical to that of the older Hadoop platform, preventing any application re-architecture.

“They were able to design and build all of the data pipelines to meet and beat our requirements, and automate the entire process using DevOps principles”.
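To make that consume-transform-write pattern concrete, here is a minimal Python sketch of such an ETL service. All topic, broker, database, and field names are hypothetical stand-ins, not FIBI’s actual implementation:

```python
import json

from kafka import KafkaConsumer   # pip install kafka-python
from pymongo import MongoClient   # pip install pymongo

# Hypothetical topic, broker, and database names -- placeholders only.
consumer = KafkaConsumer(
    "mainframe.accounts",
    bootstrap_servers=["kafka-broker:9092"],
    group_id="etl-accounts",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=False,  # commit offsets only after a successful write
)
accounts = MongoClient("mongodb://mongo:27017")["bank"]["accounts"]

for message in consumer:
    event = message.value
    # Restructure the streamed event so the stored document matches the
    # schema the older Hadoop pipeline produced (illustrative field names).
    doc = {
        "_id": event["account_id"],
        "balance": event["balance"],
        "updated_at": event["event_time"],
    }
    accounts.replace_one({"_id": doc["_id"]}, doc, upsert=True)
    consumer.commit()  # mark the event processed only once MongoDB has it
```

Committing offsets only after the MongoDB write succeeds is one common way to avoid losing events if the service fails mid-stream.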

Workload streamlining

In addition to building the complex big data pipeline, BigData Boutique designed a new bespoke data service to query data directly from the pre-existing MongoDB databases. Previously, application queries were carried out on the mainframe. The new data service significantly reduced latency for those query-heavy applications, which end-users experienced as a faster, more responsive service (a sketch of the idea follows below).

“This has become a very important component of the data ingestion and processing for the entire system”

states Denis.
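As a rough illustration of the idea behind such a data service, the sketch below serves reads directly from MongoDB rather than via a mainframe call. The connection string, collection, and field names are assumptions for illustration only:

```python
from pymongo import MongoClient

# Hypothetical connection string and collection names.
db = MongoClient("mongodb://mongo:27017")["bank"]

# Index the lookup key once so point queries stay fast under load.
db["accounts"].create_index("customer_id")

def get_accounts(customer_id: str) -> list[dict]:
    """Serve the read straight from MongoDB instead of querying the mainframe."""
    return list(db["accounts"].find({"customer_id": customer_id}, {"_id": 0}))
```

Serving reads from a database that already holds the pipeline’s output avoids a round trip to the mainframe entirely, which is where the latency reduction comes from.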

The security angle

“Because we’re a bank, it can make projects like this quite challenging because of the regulatory demands from the Israeli central bank”

admits a member of FIBI’s big data team.

The new data pipeline and the Kafka platform were responsible for handling significant amounts of sensitive data, which was a key consideration throughout the engagement.

“Securing our data was the number one concern when working with BigData Boutique on this project,”

said one DBA.

Given that FIBI’s mainframe environment was the source of the data, BigData Boutique had to construct complex integrations to interact with the mainframe’s operating system and security controls.

“We had to have very strict authentication on every MongoDB instance, and that was typically done with BigData Boutique’s work with Active Directory,”

says Denis.

BigData Boutique carried out substantial work to ensure the security and compliance of FIBI’s Active Directory authentication, establishing secrets, new security parameters, and SSL connections between services.

“Those were the security demands we put on the table for BigData Boutique, and that’s what they were able to do”.
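As an illustration of what demands like these can translate to in practice, the sketch below opens a MongoDB connection that authenticates through Active Directory over LDAP and encrypts traffic with TLS. This is the generic MongoDB Enterprise PLAIN/LDAP pattern, not FIBI’s actual configuration; every hostname, account, and path is a placeholder:

```python
from pymongo import MongoClient

# Illustrative only: MongoDB Enterprise can proxy authentication to
# Active Directory over LDAP using the PLAIN mechanism against the
# $external virtual database. All values below are placeholders.
client = MongoClient(
    "mongodb://mongo.internal:27017",
    username="svc_etl@CORP.EXAMPLE",            # hypothetical AD service account
    password="<fetched-from-a-secrets-store>",  # never hard-code credentials
    authMechanism="PLAIN",
    authSource="$external",
    tls=True,                                # encrypt the connection
    tlsCAFile="/etc/ssl/corp-root-ca.pem",   # trust the internal CA
)
client.admin.command("ping")  # fail fast if auth or TLS is misconfigured
```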

Looking to the future

There is no doubt that BigData Boutique’s engagement with FIBI was a challenge. From mainframe integrations to regulatory requirements, completing it to such a high standard demanded real expertise.

“What was so impressive about BigData Boutique wasn’t just the work that they did, but that it was delivered during COVID-19 remotely”

states a member of FIBI’s IT team. This was the first time that FIBI had experienced a fully remote engagement with a technology partner, and they weren’t disappointed.

FIBI continues to work with BigData Boutique and has agreed to a bank of hours it can flexibly tap into whenever it needs specialist assistance and advice.

“It’s fair to say that there are complex tasks where we absolutely need and will use BigData Boutique,”

says Denis. There are also new projects, including cloud migration and data services tasks, on which BigData Boutique is engaged across the bank.
