Archive 2008: BIL with Sybase IQ
January 6, 2008
Fill in the form to download the story
[sdfile url="http://www.sybase.com/files/Success_Stories/Dexia_SS_013008.pdf"]
Founded in 1856, a century and a half ago, Dexia Banque Internationale à Luxembourg (Dexia BIL) is not only one of the genuine pioneers of the financial marketplace in Luxembourg, but also the oldest bank in the Grand Duchy. Dexia BIL has played an active role in every major phase of the Luxembourg economy’s development. As a member of the Dexia banking group, one of the twenty largest financial institutions in the eurozone, Dexia BIL operates in areas such as public finance, local banking, private banking, asset management and the administration of investment funds. This part of the bank’s business also requires a highly active presence on the financial markets.
“In actual fact, a datawarehouse is like an enormous cache, with the ultimate aim of making it easier to operate.”
Pascal Paulin, Vice President of IT Strategy & Architecture, Dexia BIL
Business Advantage
Dexia BIL’s solution now ensures better operating control in terms of the quality, consistency and homogeneity of its operating data.
Key Benefits
Supplies real-time responsiveness
Provides central, consistent manageability
Enables quick response to the constraints of the business
Sybase Technology
Sybase IQ
PowerDesigner
Industry
Financial Services
Download the full success story in PDF format.
Dexia BIL: From ‘Responsive’ Decision-Making to Service Resources
Designed and implemented in the mid-1990s with the main aim of providing a decision-making source for a handful of users with a commercial profile, Dexia BIL’s data warehouse has had to cope both with its own success and with the ravages of time, as its technological cutting edge has been eroded. Its very existence has prompted a growing number of end-users to make requests and enquiries that are more frequent, more numerous and more spontaneous. These were things that the original warehouse, “set up with a specific viewpoint in mind”, as Dexia BIL’s Datawarehouse Manager, Régis Tiberghien, emphasizes, was not able to provide. Nor was the small IT team, whose job it was to transpose enquiries into usable requests and then provide results for the people submitting the queries.
“The volume of information meant we could offer only a low level of responsiveness, although there was sufficient speed to meet the warehouse’s initial objectives,” he says. “We could make figures, statistics and classic results available for things like sales figures or business breakdowns on a daily or monthly basis. And we were able to do so for a clearly defined target audience that was limited to about fifty users working in the teams supporting and managing the data from a range of profession-specific lines at the bank. But as time went by, the type of requests we were getting became more extensive and diversified: working at a slower pace that was essentially monthly was no longer good enough. Not to mention the fact that the existing datawarehouse gave us almost no real control over our operating business. It didn’t, for example, enable us to detect inconsistencies. From 2000 onwards, it became clear that we also had to ensure better operating control in terms of the quality, consistency and homogeneity of our operating data. One of the things we had to be able to identify was securities where a price had not been supplied for a number of days.”
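The stale-price check Mr Tiberghien mentions is a simple operating control. A minimal sketch of the idea, with entirely hypothetical security identifiers and data structures (the story does not describe the actual implementation):

```python
from datetime import date

# Hypothetical price history: security identifier -> dates on which a price arrived.
price_history = {
    "LU0001234567": [date(2008, 1, 2), date(2008, 1, 3), date(2008, 1, 4)],
    "LU0007654321": [date(2008, 1, 2)],  # no price received since 2 January
}

def stale_securities(history, as_of, max_gap_days=3):
    """Return (security, last_priced) pairs whose latest price is older than max_gap_days."""
    stale = []
    for security, dates in history.items():
        last_priced = max(dates)
        if (as_of - last_priced).days > max_gap_days:
            stale.append((security, last_priced))
    return stale

# As of 7 January, the second security has gone five days without a price.
print(stale_securities(price_history, as_of=date(2008, 1, 7)))
```

Run daily against the warehouse’s price feed, a check like this turns the datawarehouse from a passive reporting source into an instrument of operating control.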
The level of responsiveness – at least daily – required for operating purposes, was at odds with the more sedate pace of strategic decision-making. This in turn increased the discrepancy between the existing datawarehouse tool and the requirements of the business. “Having an environment focused on near real-time doesn’t make sense for decision-making insofar as it might be a source of instability,” points out Pascal Paulin, Vice President IT Strategy & Architecture. On the other hand, operating decision-making requires adaptability, flexibility and a high level of responsiveness. As a result, the existing datawarehouse, in its current design form, was likely to end up both schizophrenic and powerless: an undesirable state of affairs that required a new approach.
New Era
With this in mind, the IT department at Dexia BIL took a long, hard look at the very concept of the datawarehouse, taking a long-term view that would enable the future warehouse product to survive and retain its effectiveness and relevance for at least ten years.
Creating a new datawarehouse would also make it possible to end the unbridled proliferation of small operating systems running, for example, in Excel, and to eliminate the datamarts that had sprung up over the years “simply because the people in operations could not find the information they needed from our central warehouse, or because we could not provide them with it quickly enough,” explains Pascal Paulin. “Any form of traceability had become impossible because they were creating what they needed without any coordination. In addition to the loss of control, this situation was also causing problems in terms of the maintenance and management of these ‘little islands of data’, not to mention consistency and redundancy, etc. So it was essential for us to review our approach as a centralized unit and create an environment that made room for operating data and provided a layer dedicated to departmental needs. The warehouse had to become a genuine company datawarehouse where the user was the company itself, and with the additional ability to handle a whole range of different uses and requirements. It also had to be a warehouse that was viable and effective in the long term – and hence capable of incorporating an inherent notion of flexibility and the ability to develop further from the outset.”
“In view of the fact that it was technically and objectively impossible to use the content of the operating systems in real-time, the datawarehouse had to be designed as a gigantic semantic entity that was both uniform and consistent. It had to be able to gather information at source and assemble it according to the expectations and needs of the various target audiences – while at the same time guaranteeing a singleness of definitions and the way of accessing that data. In actual fact, a datawarehouse is like an enormous cache, with the ultimate aim of making it easier to operate.”
It would work as a central resource, acting as one of the services provided by the Dexia BIL IT environment and organized as a service-oriented architecture (SOA).
A further element essential to the required stability and singleness of the system was the datamarts, whose existence was justified by the constraints of the business. These datamarts had to be given ‘official’ status and be managed by the IT department. This would not prevent departments from continuing to create their own datamarts, but only for their own needs. “All data has to be incorporated into the datawarehouse,” stresses Pascal Paulin. “But we cannot strip away a form of departmental autonomy that is linked to the needs of the business. That autonomy must be preserved.”
So as we can see, the whole undertaking was ambitious, as were the objectives.
Rather than adapt the existing Sybase datawarehouse, which would have been far too costly a task, Dexia BIL opted to re-engineer it while imposing a dual constraint: retain total visibility over the existing warehouse and avoid any lack of continuity. And as before, the bank turned to Sybase for an effective response to its requirements.
Triple motivation
Central, consistent manageability by the IT department, the ability to respond to the constraints of the business while taking account of the increasingly numerous and complex regulatory requirements, operations that are more and more virtualized, with one-to-one management of consistent data, etc… The needs that today’s datawarehouse has to meet in the banking industry are light-years away from the almost superficial expression of its ‘ancestors’ from the 1990s. New regulations and legislation have a major influence on the qualities that a worthy datawarehouse has to display these days: auditability, traceability, justification of each process, the demonstrable guarantee of the quality of each step in the process, a full history of the operating data and system information, etc. are all rules that have to be complied with and put into effect.
Virtual real-time responsiveness has also become an integral part of the nature of the datawarehouse, in parallel with its more placid speed for decision-making purposes. In addition to the demands of delivering data, information and statistics on a daily basis (such as the net inventory value of unit trusts), the datawarehouse also has to be capable of providing data that can be supplied and used in the space of just a few minutes, with each department setting its own SLA criteria. This means that the level of responsiveness applies to all links in the chain, from the datawarehouse through to the data mining tool and the calculation of statistics.
Decision-making or operating data, instant or background information (made available immediately, without being slowed down along the way by back-up mechanisms and media) – everything has been designed from the point of view of a single environment. “One and the same tool and one and the same interface provide access to the data, whether it be live or historical,” underlines Pascal Paulin emphatically.
The concept of near real-time also comes with new constraints. “In near real-time,” points out Mr Paulin, “the information is not necessarily managed. It’s a question of fine-tuning blocks of information into data sets and then managing and controlling the flows displayed by each operating system. Added to the concept of data volume are the notion of time and the status of the information. Sometimes, to put together a flow of data that is relevant to the business, you have to combine different types of information and flows. Hence the need for a staging layer where the flows of data that are relevant to the business are built up and managed intelligently, based on their own rules. The same operating system flow can be part of several business flows whose relevance needs to be verified. The same thing applies for the layer-by-layer relay of a flow. Apart from the relevance, it is essential to check the status, quality and completeness of each flow using the associated metadata, which makes it possible to verify the various parameters.” This metadata covers and documents all of the layers that now make up the Dexia BIL datawarehouse infrastructure: operating sources, warehouse, corporate data collector, persistence, datamarts, services, etc. Metadata provides a catalogue that makes it possible to list and document what is available. It also enables the processes and elements available to be identified quickly – something that is very important when it comes to (re)developing a flow, or adapting it, or using the information it contains.
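The staging-layer checks Mr Paulin describes – verifying the status and completeness of each operating-system flow before it enters a business flow – can be sketched as follows. The field names and statuses are assumptions for illustration; the story does not detail the actual metadata schema:

```python
from dataclasses import dataclass

# Hypothetical metadata accompanying each operating-system flow in the staging layer.
@dataclass
class FlowMetadata:
    name: str
    status: str           # e.g. "loaded", "pending", "failed"
    expected_records: int
    actual_records: int

def flow_is_usable(meta: FlowMetadata) -> bool:
    """A flow may feed a business flow only if it is fully loaded and complete."""
    return meta.status == "loaded" and meta.actual_records == meta.expected_records

def build_business_flow(name: str, sources: list) -> bool:
    """A business flow is assembled only when every constituent flow checks out."""
    unusable = [m.name for m in sources if not flow_is_usable(m)]
    if unusable:
        print(f"{name}: blocked, incomplete source flows: {unusable}")
        return False
    print(f"{name}: all source flows verified, flow assembled")
    return True

positions = FlowMetadata("positions", "loaded", 1000, 1000)
prices = FlowMetadata("prices", "loaded", 500, 480)  # 20 records missing
build_business_flow("portfolio_valuation", [positions, prices])
```

The same `positions` flow could equally feed several business flows, each running this verification independently – which is exactly why the metadata travels with the flow rather than with any single consumer.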
In this regard, the next step at Dexia BIL will be to build an ‘industrialized process to automate the production of the flows that pass through the various layers.’ But any form of ‘industrialization’ will necessarily be accompanied by essential controls – which once again would only be put in place to satisfy the requirements of documentation and auditability.
Another aspect that is inherent to the concept of near real-time is the rapid presentation of information “via a display layer specific to the business target”. The working tool in this particular area is the datamart, which is characterized on the one hand by a specific purpose and use, without transverse overlapping through the company’s resources, and very often, on the other, by a limited service life (the time it takes to conduct a project, campaign or market analysis, etc.). But it has to be a datamart that isn’t physical – too cumbersome and too slow to implement – but a virtual and logical one that can be up and running in a minimum amount of time, providing a “simple view of the persistence layer on which it becomes embedded”. “Sybase IQ is the ideal answer for this type of set-up, because it enables data views to be made available quickly, even though there still isn’t a rules engine set in stone (notion of non-predictivity). It’s something that guarantees both performance and responsiveness”.
Nevertheless, Dexia BIL is wary about generalizing the principle of virtual datamarts: “It’s not because it can be done technically that it makes sense from a system point of view to do it. Decisions will always be taken on a case-by-case basis. Where we need high performance and a complex or huge potential to make calculations, we will opt for a physical datamart. But if there is less processing and a need for responsiveness and fast implementation, the logical route is justified.”
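The distinction between a physical and a logical datamart can be illustrated in miniature: a logical datamart is a query over the persistence layer, evaluated on demand, whereas a physical one would copy and store its result. All the names and data below are illustrative, not drawn from Dexia BIL’s actual environment:

```python
# Hypothetical persistence layer: rows shared by the whole company warehouse.
persistence_layer = [
    {"client": "A", "product": "fund", "aum": 120.0},
    {"client": "B", "product": "bond", "aum": 45.0},
    {"client": "C", "product": "fund", "aum": 300.0},
]

def logical_datamart(predicate):
    """A 'virtual' datamart: a view that re-reads the persistence layer on each call."""
    return lambda: [row for row in persistence_layer if predicate(row)]

def physical_datamart(predicate):
    """A 'physical' datamart: the result is copied and stored at build time."""
    return list(row for row in persistence_layer if predicate(row))

# A campaign-specific view of fund clients, up and running in one line:
fund_clients = logical_datamart(lambda r: r["product"] == "fund")
print(fund_clients())  # nothing has been copied or stored
```

The trade-off in the quote falls out directly: the logical version costs nothing to create and always reflects the persistence layer, while the physical version pays a build cost in exchange for faster, repeated heavy processing over its stored copy.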
Basic Criteria
Dexia BIL has tackled the job of designing its new datawarehouse and the procedures that go with it while complying with a number of fundamental constraints and criteria:
Necessary controls over the incompleteness of the data structure
Responsiveness in the face of operating developments and data requirements, with data supplied to functions at the appropriate time
Near real-time for managing basic and derived data, guaranteed up to the stage of publishing the information in datamarts, with departments downstream using the infrastructure, resources and services at their own speed
The singleness of the access layer
A transverse approach (a company datawarehouse) incorporating the notion of creating a history and archiving (initially planned for 10 years)
Management of the notion of flow (thus enabling near real-time)
The versioning of models (hence guaranteeing the potential for the future development of structures)
A layer dedicated to access (physical or logical datamart)
Integration into the company architecture (as a service)
Taking metadata into account (descriptive and operational)