Big Data solution

Big Data solution for online bank in Eastern Europe

20%

fewer fraud attempts

35%

improved risk management analysis

+$2M

annual revenue boost

Challenge

Limited fraud detection, customer insights, and market trend analysis prompted a need for Big Data solutions.

Solution

Development of an advanced analytics platform that could handle the increasing volume and complexity of data and provide real-time insights for informed decision-making.

Tech stack

Hadoop, MapReduce, YARN, PowerBI, Hive, HBase, AWS, Azure, GCP, Kafka, ClickHouse, ScyllaDB.

Client

Our client is one of the leading online banks in Eastern Europe, operating for almost a decade and serving millions of customers across the region. As a fast-growing financial institution, it faces intense competition from other established banks and from fintech startups disrupting the market with innovative products and services.

Challenge

To stay ahead of the competition, the client realised the importance of leveraging Big Data solutions to improve risk management analysis, fraud detection, and customer satisfaction. The bank was facing several challenges in these areas, including the inability to detect and prevent fraud in real time, limited visibility into customer behavior and preferences, and a lack of timely insights into market trends and risk factors.

Seeking a resolution to these problems, the bank approached the Modsen team to develop a Big Data solution to combat the issues causing a loss of revenue and customers. The client understood that they needed an advanced analytics platform that could handle the increasing volume and complexity of data and provide real-time insights to make informed decisions and take appropriate actions.


Team

1

Project manager

4

QA testers

1

UI/UX designer

1

Business analyst

1

Solution architect

6

Software engineers

3

Big Data engineers

Project team

Process

The development of the Big Data solution for the client’s online banking operations involved several stages, each of which required careful planning and execution.

1. Initiating the Big Data solution: A collaborative effort between Modsen and the client

To commence the Big Data solution development, Modsen put together a dedicated team comprising a project manager, a business analyst, a solution architect, three Big Data engineers, a UI/UX designer, six software engineers, and four QA testers. The initiation stage involved close collaboration with the client to define the project scope, goals, and objectives. To ensure a smooth flow of communication, the team identified and established the communication channels and protocols that would be used throughout the project, and roles and responsibilities were defined in a project governance structure.

The client was primarily focused on enhancing customer satisfaction, improving fraud detection, and minimising the risks causing revenue loss. The Big Data solution would be able to analyse large volumes of data in real time, generating actionable insights that could be utilised to enhance business processes and services. As part of the solution development process, each of these objectives was treated separately and given equal attention.

2. Building a comprehensive plan: Our approach to discovery and planning phases

Business requirements

  • R1: To speed up the implementation of new Big Data analytics applications (business goal)
  • R2: To be able to test new data analytics tools and algorithms outside the bank premises whilst ensuring the maximum level of security and privacy (business goal)
  • R3: To enable third parties to efficiently implement and test new tools and algorithms without accessing real data (business goal)
  • R4: To ensure the accuracy and reliability of the analytics process (quality business goal)
  • R5: To improve the efficiency of the analytics process (quality business goal)
  • R6: Time efficiency
  • R7: Cost reduction

User requirements

  • R8: Data is collected from several different sources (ATMs, online banking services, employees’ workstations, external providers’ activity, network devices, etc.) (data provider requirement)
  • R9: Data is owned by the bank and is not publicly available; it can be shared with third parties only once it has been anonymised (data provider requirement)
  • R10: Support the use of techniques related to log analysis, such as process mining algorithms or similar (Big Data analytics provider requirement)
  • R11: Users will be able to download results (in several formats such as .csv, .xls, etc.) in order to analyse them on their own or send them to other employees of the Security Operation Centre (data consumer requirement)
  • R12: Intermediate users will be able to modify parameters of the algorithms and refine the initial results (data consumer requirement)

System requirements

  • R13: The system should enable the generation of anonymised and synthetic data for safe experimentation and testing (functional requirement); see the sketch after this list
  • R14: The system should support diversified analytic processing, machine learning, and decision support techniques across multiple stages of analysis (functional requirement)
  • R15: The system should ensure the security of sensitive data (non-functional requirement)
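
By way of illustration, a minimal sketch of how R9 and R13 might be satisfied is shown below. The keyed-hash pseudonymisation, the field names, and the transaction shape are assumptions made for this example, not the bank’s actual schema or key-management setup.

```python
import datetime
import hashlib
import hmac
import random

# Hypothetical pseudonymisation secret; in production it would come from a managed vault, not source code.
PSEUDONYMISATION_KEY = b"replace-with-a-managed-secret"

def pseudonymise(customer_id: str) -> str:
    """Replace a real customer identifier with a stable, non-reversible token (cf. R9)."""
    return hmac.new(PSEUDONYMISATION_KEY, customer_id.encode(), hashlib.sha256).hexdigest()[:16]

def synthetic_transaction() -> dict:
    """Produce a fully synthetic transaction record for experimentation outside the bank premises (cf. R13)."""
    return {
        "customer": pseudonymise(f"cust-{random.randint(1, 1_000_000)}"),
        "amount": round(random.lognormvariate(3, 1.2), 2),  # skewed, payment-like amounts
        "channel": random.choice(["atm", "online", "pos"]),
        "timestamp": datetime.datetime.utcnow().isoformat(),
    }

if __name__ == "__main__":
    for record in (synthetic_transaction() for _ in range(5)):
        print(record)
```
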
After setting up the project governance structure, the Modsen team conducted an in-depth analysis of the client’s data and business requirements in the discovery phase. This involved collaborating with the bank to develop a custom Big Data solution that met their specific needs. Our business analyst developed use cases and user stories, conducted data profiling and cleansing, and identified key performance indicators for the project. We also prioritised regular updates to keep the client informed and engaged during this crucial phase.

Furthermore, Modsen proposed a scalability plan to the bank stakeholders, which involved designing a system capable of handling a growing volume of data and users as the bank’s business expands. The team worked closely with the client to identify potential growth scenarios and design the system architecture accordingly. By implementing this solution, the Big Data platform could support the bank’s future needs and save costs associated with developing a new system from scratch.

Finally, during the planning phase, the Modsen team developed a detailed project plan that included a roadmap, technical architecture, and risk management plan. They selected appropriate technologies and tools, defined data models and schemas, and created a data integration strategy. By taking these steps, the team was able to deliver a high-quality Big Data solution that met the client’s needs, provided valuable insights, and exceeded expectations.

3. Developing a Big Data solution: An inside look at the Modsen team’s process and methodology

Scaling data processing with Big Data infrastructure

As we were developing the Big Data platform for one of the leading online banks in Eastern Europe, we recognised the need to build a strong and dependable infrastructure capable of handling the vast amount of data that would require processing. To achieve this, the Modsen team used a range of technologies, including Hadoop, MapReduce, YARN, Hive, HBase, Kafka, ClickHouse, ScyllaDB, and cloud services such as AWS, Azure, and GCP. We chose Hadoop as the core processing engine because it distributes storage and processing across multiple servers, allowing for scalability and reliability, and we implemented MapReduce and YARN to handle large data sets, while Hive and HBase served as the data warehousing solutions.

To enable real-time data streaming and processing, Modsen engineers opted for Kafka and ClickHouse, while ScyllaDB was used as the NoSQL database to handle large amounts of data quickly. We leveraged the cloud services of AWS, Azure, and GCP to provide a scalable infrastructure for the Big Data platform, allowing for rapid and efficient resource provisioning as data volumes or processing requirements increased.
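
As an illustration of the streaming path described above, the sketch below consumes transaction events from Kafka and micro-batches them into ClickHouse. The topic name, table schema, and hosts are assumptions for the example rather than the deployed configuration.

```python
import json

from kafka import KafkaConsumer           # kafka-python
from clickhouse_driver import Client      # clickhouse-driver

# Hypothetical topic, hosts, and table; the real deployment details are not part of this case study.
consumer = KafkaConsumer(
    "bank-transactions",
    bootstrap_servers=["kafka:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=False,
)
clickhouse = Client(host="clickhouse")

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 1000:                 # micro-batching keeps ClickHouse inserts efficient
        clickhouse.execute(
            "INSERT INTO transactions (customer, amount, channel, ts) VALUES",
            [(r["customer"], r["amount"], r["channel"], r["timestamp"]) for r in batch],
        )
        consumer.commit()                  # commit offsets only after a successful insert
        batch.clear()
```
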

Building the foundation: designing a robust Big Data platform architecture

Big Data platform architecture

The architecture design played a crucial role in achieving optimal performance in real time. Prior to implementation, the Modsen team conducted extensive research and analysis to determine the best architecture for the platform. We considered various factors such as data volume, processing requirements, and scalability, and ultimately implemented a Big Data platform that was robust, scalable, and highly efficient.

The solution was designed to handle large amounts of data without compromising on speed or accuracy. We ensured that our architecture was scalable, so it could accommodate increasing data volumes as our platform grew. In addition, our engineers optimised the platform for efficient processing, which allowed us to quickly analyse and extract insights from the data.

Agile deployment of a high-performance Big Data platform

High-performance Big Data platform

The development of the Big Data platform for our client followed an Agile approach, involving iterative cycles of code implementation, testing, stabilisation, and demonstration. Every two weeks, we would demo the progress we had made and gather feedback from the client, incorporating their input into the development process. This approach was critical in ensuring that the platform was customised to the client’s needs and requirements, and it enabled the team to make any necessary adjustments early on, minimising the risk of costly rework or delays down the line.

Additionally, our QA engineers conducted extensive performance and stress testing to simulate heavy data volumes and extreme processing conditions, ensuring that the platform could handle unexpected spikes in demand without disruption.
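
A stress test of the ingestion path could be driven by a simple synthetic-load generator along these lines. The target event rate, topic name, and event shape are illustrative assumptions, not figures from the actual test plan.

```python
import json
import random
import time

from kafka import KafkaProducer  # kafka-python

# Illustrative load driver: floods a test topic at a target rate to mimic traffic spikes.
producer = KafkaProducer(
    bootstrap_servers=["kafka-test:9092"],
    value_serializer=lambda obj: json.dumps(obj).encode("utf-8"),
)

TARGET_EVENTS_PER_SECOND = 5000  # assumed figure for the test scenario, not a measured production rate

def fire(duration_seconds: int = 60) -> None:
    """Send synthetic transactions at roughly the target rate for the given duration."""
    deadline = time.time() + duration_seconds
    while time.time() < deadline:
        started = time.time()
        for _ in range(TARGET_EVENTS_PER_SECOND):
            producer.send("bank-transactions-load-test", {
                "amount": round(random.uniform(1, 5000), 2),
                "channel": random.choice(["atm", "online", "pos"]),
            })
        producer.flush()
        time.sleep(max(0.0, 1.0 - (time.time() - started)))  # pace the next one-second burst

if __name__ == "__main__":
    fire()
```
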

Efficient deployment was also a key component of the development process. Following the Agile methodology, we executed the software deployment process swiftly and efficiently, so that the Big Data platform was deployed seamlessly and with minimal disruptions, providing a flawless and hassle-free experience for the end user.

Meeting compliance standards

To meet compliance standards, we engaged third-party auditors and certification bodies to perform regular assessments and audits of the solution’s security, data protection, and regulatory compliance. This helped us ensure that the Big Data platform complied with the General Data Protection Regulation (GDPR), the Payment Card Industry Data Security Standard (PCI DSS), Basel III, etc.
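
One of the underlying data-protection controls, encrypting sensitive fields such as card numbers before they reach the analytics stores, could look roughly like the sketch below. The Fernet-based scheme and inline key generation are simplifying assumptions; an audited implementation would rely on a managed key service.

```python
from cryptography.fernet import Fernet

# For illustration only: in production the key would come from a managed KMS, never be generated inline.
key = Fernet.generate_key()
cipher = Fernet(key)

def protect_card_number(pan: str) -> bytes:
    """Encrypt a primary account number before it is written to any analytics store (PCI DSS scope)."""
    return cipher.encrypt(pan.encode())

def reveal_card_number(token: bytes) -> str:
    """Decrypt only inside an access-controlled service, never in the analytics layer."""
    return cipher.decrypt(token).decode()

encrypted = protect_card_number("4111111111111111")
assert reveal_card_number(encrypted) == "4111111111111111"
```
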

4. Wrapping up the Big Data solution: A glance at the closing phase

Finally, the closing phase involved delivering the solution to the client and providing ongoing support and maintenance. This stage included conducting user training, documenting the solution architecture and processes, and ensuring that the client was satisfied with the results. The team also established a performance monitoring framework to track key metrics and ensure the solution continued to meet the client’s needs over time. The development of the Big Data solution for the client’s online banking operations involved a structured and disciplined approach, combining technical expertise and domain knowledge to overcome complex challenges and deliver tangible business value.
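
A performance monitoring framework of this kind can be sketched with standard instrumentation. The metric names and the scoring stand-in below are assumptions for illustration, not the KPIs agreed with the client.

```python
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

# Hypothetical metric names; the real monitoring framework tracks the KPIs defined with the client.
EVENTS_PROCESSED = Counter("events_processed_total", "Transactions ingested by the platform")
SCORING_LATENCY = Histogram("fraud_scoring_seconds", "Time spent scoring a transaction for fraud")

def score_transaction() -> None:
    with SCORING_LATENCY.time():                   # records how long the scoring step takes
        time.sleep(random.uniform(0.001, 0.01))    # stand-in for the real model call
    EVENTS_PROCESSED.inc()

if __name__ == "__main__":
    start_http_server(9100)                        # exposes metrics for a scraper such as Prometheus
    while True:
        score_transaction()
```
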

Final product: scalable, secure and powerful

The Big Data platform developed by Modsen is a highly scalable and flexible solution designed to support a wide range of use cases. It is built on a microservices-based architecture, which allows for a high degree of modularity and supports seamless integration with a wide range of third-party tools and technologies.

At the core of the platform is a distributed storage system capable of handling large volumes of structured and unstructured data in real time. The system is highly scalable, able to store and process petabytes of data, and can easily be scaled up or down based on demand. The platform also includes a powerful data processing and analytics engine that runs complex analytical queries and machine learning algorithms in real time, enabling the system to detect patterns and anomalies in data, perform predictive analysis, and identify potential risks or opportunities.

To support advanced analytics and data visualisation, the solution includes a range of powerful tools and technologies, including Hadoop, Hive, and PowerBI. These tools allow for easy data aggregation, querying, and analysis, and support real-time data processing and visualisation. In addition, the platform is equipped with advanced security features, such as multi-factor authentication, encryption, and access controls, which help ensure that data is protected at all times and prevent unauthorised access or breaches.

Overall, the Big Data platform developed by Modsen is a highly capable and flexible solution that can support a wide range of use cases, from risk management and fraud detection to customer sentiment analysis and personalised marketing.
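
To give a feel for the kind of aggregation the warehousing layer supports, the sketch below runs a daily fraud-flag roll-up against Hive. The table, columns, and host are hypothetical and stand in for whatever schema the platform actually uses.

```python
from pyhive import hive  # PyHive

# Hypothetical warehouse host and table, shown only to illustrate the style of aggregation the platform supports.
connection = hive.Connection(host="hive-server", port=10000, database="analytics")

QUERY = """
SELECT  to_date(ts)               AS txn_day,
        channel,
        COUNT(*)                  AS transactions,
        SUM(IF(is_flagged, 1, 0)) AS flagged_as_fraud
FROM    transactions
GROUP BY to_date(ts), channel
ORDER BY txn_day
"""

cursor = connection.cursor()
cursor.execute(QUERY)
for row in cursor.fetchall():   # a result set like this would feed dashboards such as PowerBI
    print(row)
```
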

Big Data platform

Results

After the implementation of the Big Data solution, the bank experienced remarkable improvements in their risk management analysis, fraud detection, and customer satisfaction. The use of advanced analytics tools and machine learning algorithms provided real-time insights into customer behavior, risk management, and fraud detection, leading to a 20% reduction in fraudulent activities, a 15% increase in customer satisfaction, and a 35% improvement in risk management analysis. These impressive results had a significant impact on the bank’s profitability, with an estimated increase in revenue of $2 million per year, due to the reduction in fraud and the increase in customer satisfaction. Overall, the implementation of the Big Data solution not only improved the bank’s operational efficiency, but also had a positive impact on their financial performance.

20%

reduction in fraudulent activities

15%

increase in customer satisfaction

35%

improvement in risk management analysis

$2M

estimated increase in revenue per year
