Industry
Banking
Technologies
Oracle Database, ESB, Oracle Forms, Oracle Reports, Oracle BI
Solution and expertise
One of the prominent European banks operates a large portfolio of internal banking software. Each solution performs its own financial function and operates within a specific context to achieve its assigned target financial indicators.
While each application serves its own independent financial area well, each is developed with a particular technology, architecture, data model, and logic. During bank operations, however, it is necessary to cross-check the financial results across the whole bank environment and identify possible discrepancies in each area.
To fulfill the bank's obligation to create and deliver defined financial reports to the regulatory authorities, and to enable financial analytics for C-level management, it is essential to gather all data from the bank's software landscape into a single central location. There the data can be processed and validated using unified rules and logic, resulting in consistent, prepared data in a single point of truth.
Considering the existing source bank systems, the current data landscape, and the anticipated results, the implemented financial analytical system yielded the following outcomes:
the solution architecture allows for scalability and adaptability to evolving business needs, enabling the bank to cope with growing data volumes, changing data sources, and the addition of new data source types
data gathering is orchestrated with regard to source system dependencies, the initial data preparation schedule, and preliminary ingestion results
data preparation involves data cleansing, transformation, and validation, resulting in improved data quality and reliability and, in turn, more efficient bank operations
data consolidated from diverse sources provides a unified view of validated information for better financial analysis and decision-making, covering predefined reports and BI analytics, including self-service analysis
a validated single point of truth facilitates regulatory compliance by providing a centralized, auditable data repository, enabling the bank to meet reporting requirements more efficiently, including the preparation and delivery of regular mandatory reports
the financial analytical system provides valuable insights for marketing campaigns and financial product development through analysis of customer data, market trends, and performance indicators.
In summary, the centralized analytical system implemented at the bank improved data-driven decision-making, operational effectiveness and quality, and regulatory compliance, contributing to the bank's overall competitiveness and long-term success.
To enable a high degree of flexibility in processing any batch of data, within each individual subsidiary and across several of them, for a specified reporting period, and with validation rules of varying severity, a flexible, date-agnostic data architecture was designed and developed for the financial analytical system.
This flexible approach enables the centralized analytical system to apply data quality validation across the entire integrated data landscape, facilitating transparent resolution of data issues in each banking application and ensuring the capability to respond effectively to varying data processing needs.
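As an illustration, the sketch below shows one possible way to model severity-graded validation over a date-agnostic batch keyed by subsidiary and reporting period. All names here (rule names, record fields, severity levels) are hypothetical and stand in for the system's actual configuration.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Callable

class Severity(Enum):
    WARNING = "warning"    # finding is reported, batch is still accepted
    ERROR = "error"        # offending records are rejected, batch continues
    CRITICAL = "critical"  # the whole batch is rejected

@dataclass
class ValidationRule:
    name: str
    severity: Severity
    check: Callable[[dict], bool]  # returns True when a record passes

@dataclass
class DataBatch:
    subsidiary: str        # a batch is keyed by subsidiary and period,
    reporting_period: str  # not by load date: the design is date-agnostic
    records: list

def validate_batch(batch: DataBatch, rules: list[ValidationRule]) -> dict:
    """Apply every rule to every record and group findings by severity."""
    findings = {severity: [] for severity in Severity}
    for record in batch.records:
        for rule in rules:
            if not rule.check(record):
                findings[rule.severity].append((rule.name, record))
    return findings

# Hypothetical rules of differing severity.
rules = [
    ValidationRule("amount_present", Severity.CRITICAL,
                   lambda r: r.get("amount") is not None),
    ValidationRule("currency_known", Severity.ERROR,
                   lambda r: r.get("currency") in {"EUR", "USD"}),
    ValidationRule("counterparty_filled", Severity.WARNING,
                   lambda r: bool(r.get("counterparty"))),
]

batch = DataBatch("subsidiary_A", "2023-Q4",
                  [{"amount": 100.0, "currency": "EUR", "counterparty": ""}])
print(validate_batch(batch, rules)[Severity.WARNING])  # one warning finding
```

Because rules are plain data rather than hard-coded checks, the same engine can run a strict rule set for regulatory batches and a lighter one for exploratory loads.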
As part of the bank's integrated data landscape, the analytical system is connected to each financial application and gathers, validates, and prepares enterprise data to enable the single point of truth.
External orchestration is established through the ESB, which delivers a notification to the analytical system whenever data required for processing becomes ready in any of the banking systems.
Based on the specific notification and the related data flow configuration, the centralized analytical system connects to the defined endpoint and ingests the specified data batch, whether via a database link query, an API extraction, or structured file parsing (XML, JSON, CSV, etc.).
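The sketch below shows one way such configuration-driven ingestion could be dispatched. The flow identifiers, configuration fields, and connector stubs are assumptions for illustration, not the system's actual interfaces.

```python
import csv
import io
import json

# Hypothetical data-flow configuration: one entry per source flow,
# keyed by the flow id carried in the ESB notification.
FLOW_CONFIG = {
    "gl_daily":  {"kind": "db_link", "endpoint": "GL@CORE", "query": "..."},
    "cards_api": {"kind": "api",     "endpoint": "https://cards.internal/export"},
    "fx_file":   {"kind": "file",    "format": "csv"},
}

def parse_file(payload: str, fmt: str) -> list[dict]:
    """Structured-file branch; CSV shown, XML and JSON handled analogously."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "json":
        return json.loads(payload)
    raise ValueError(f"unsupported format: {fmt}")

# Stubs standing in for the real database-link and API connectors.
def run_db_link_query(endpoint: str, query: str) -> list[dict]:
    return []

def call_extraction_api(endpoint: str) -> list[dict]:
    return []

def ingest(notification: dict) -> list[dict]:
    """Pick the ingestion strategy configured for the notified flow."""
    flow = FLOW_CONFIG[notification["flow_id"]]
    if flow["kind"] == "db_link":
        return run_db_link_query(flow["endpoint"], flow["query"])
    if flow["kind"] == "api":
        return call_extraction_api(flow["endpoint"])
    if flow["kind"] == "file":
        return parse_file(notification["payload"], flow["format"])
    raise ValueError(f"unknown source kind: {flow['kind']}")

print(ingest({"flow_id": "fx_file", "payload": "pair,rate\nEURUSD,1.09\n"}))
```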
Ingested data passes through successive processing stages: raw data, work tables, validation, and operation-ready data. Additional processes archive data that falls outside the operational period. All of these steps are implemented and orchestrated by a purpose-built processing engine.
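A rough sketch of how such a stage chain might be composed follows; the stage names and the record shape are illustrative assumptions.

```python
# Hypothetical stage functions; each takes and returns a list of records,
# so stages can be recomposed or extended by the processing engine.
def to_raw(records):         # land data exactly as received
    return [dict(r, stage="raw") for r in records]

def to_worktables(records):  # normalize types, keys, and reference codes
    return [dict(r, stage="work") for r in records]

def validate(records):       # apply validation rules and flag findings
    return [dict(r, stage="validated") for r in records]

def prepare(records):        # shape data for operational consumption
    return [dict(r, stage="prepared") for r in records]

PIPELINE = [to_raw, to_worktables, validate, prepare]

def run_pipeline(records):
    """Push a batch through every stage in order; archiving of data
    outside the operational period would run as a separate process."""
    for stage in PIPELINE:
        records = stage(records)
    return records

print(run_pipeline([{"account": "123", "amount": 10.0}])[0]["stage"])  # prepared
```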
The data processing engine is supported by a flexible, work item-dependent orchestration model, which makes it possible to change the composition and logic of the processing structure dynamically, on the fly or according to a schedule, in line with evolving business requirements and changes in regulatory policy.
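One way to express such a model is a plain dependency graph over work items that the engine resolves at run time; because the graph is data, it can be edited on the fly or replaced per schedule. The item names below are hypothetical.

```python
from graphlib import TopologicalSorter

# Hypothetical work-item registry: each item lists the items it depends on.
work_items = {
    "ingest_gl":      set(),
    "ingest_cards":   set(),
    "validate_gl":    {"ingest_gl"},
    "consolidate":    {"validate_gl", "ingest_cards"},
    "deliver_report": {"consolidate"},
}

def run(items: dict[str, set[str]]) -> None:
    """Execute work items in dependency order."""
    for name in TopologicalSorter(items).static_order():
        print(f"running {name}")  # the real engine would invoke a handler here

run(work_items)

# Reacting to a new regulatory requirement: register an extra validation
# item and rewire one dependency, then simply run the graph again.
work_items["validate_cards"] = {"ingest_cards"}
work_items["consolidate"].add("validate_cards")
run(work_items)
```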
As a result, validated and prepared data is ready for analysis and for regular delivery to external systems, including regulatory bodies.
The layered, flexible approach established here enhances the effectiveness and reliability of the entire data processing framework, including its scalability and manageability.