Data Warehousing for Bank Decision-Making
Barzilay, Amos, American Banker
Banks are clamoring to incorporate data warehousing into their information technology operations.
The overriding reason is that the technology offers people at all levels - in the front office and back office, from the teller line to the executive suite - the power to make mission-critical decisions more quickly, more accurately, and more easily.
The 1994 Ernst & Young and American Banker Special Report on Technology in Banking estimated discretionary spending on modernization and new technologies would rise 30%, of which 30% was expected to be directed at improving the quality of decision-making.
Data warehousing directly targets decision-making for virtually all critical paths to profitability, starting with profitability analysis itself and extending to risk management (both credit and interest rate risk), improved customer support service, and target marketing of new products.
The demand for data warehousing stems directly from the increasingly competitive landscape of the financial industry, in which banks are relying on technology advances to differentiate themselves and to improve productivity and profitability.
Also spurring the demand is the onslaught of mergers and acquisitions that is bringing together disparate systems and growing volumes of data to be churned and manipulated for decision analysis. The M&A phenomenon alone has challenged all aspects of the information technology infrastructure, from hardware capacity to software capability.
The promise of data warehousing for most organizations reads as follows: Data warehousing consolidates information into a single view from disparate systems throughout the financial institution for analysis and optimized decision-making.
Though most banks have tried for years to bring together information, an overwhelming obstacle has been technology and the costs involved. Information in disparate systems throughout the institution has remained grouped as "islands" for isolated decision-making.
Accurate and timely decision-making has suffered immeasurably. As mergers and acquisitions have become more commonplace, so has the number of disparate systems within the combined institutions. This compounds the problem of bringing together information.
The costs associated with computing and storing large volumes of data have dropped tremendously, putting data warehousing within reach of more bank budgets. The industry consensus is that hardware prices in the open systems market drop by half every 18 months, and data storage costs fall even faster.
The Gartner Group estimates that 10 years ago it cost about $28 million to store one terabyte of data. By the end of the decade, the same amount of data will cost about $1,000 to store. It is no wonder we are seeing an increase in the integration of advanced open systems platforms with existing mainframes and, in many cases, the adoption of replacements for mainframe systems.
But multiprocessing - computers using several processors for parallel processing - has had the greatest impact on the technological merits of data warehousing.
Data base companies have revolutionized the use of this technology by radically improving the cost-effectiveness and performance of processing information.
Parallel processing allows the breakdown of a single query into components, and each component is assigned to a different processor that can operate on its portion in parallel with the others.
This is revolutionary. Query response time now scales roughly linearly with the number of processors working on the query: in the ideal case, a bank with 20 processors can answer a query up to 20 times faster.
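The divide-and-combine pattern described above can be illustrated with a short sketch. This is not any vendor's actual parallel data base engine; the account balances, partitioning scheme, and aggregate (a simple sum) are hypothetical, chosen only to show how a single query can be split into components that run concurrently and are then combined.

```python
# Illustrative sketch of parallel query decomposition: one aggregate query
# is split into partitions, each handled by a separate worker, and the
# partial results are combined into the final answer.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(partition):
    """Each worker computes the aggregate over its own partition."""
    return sum(partition)

def parallel_query(balances, workers=4):
    # Split the table into roughly equal partitions, one per worker,
    # mirroring how a parallel data base assigns query components
    # to different processors.
    chunk = (len(balances) + workers - 1) // workers
    partitions = [balances[i:i + chunk]
                  for i in range(0, len(balances), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum, partitions)
    # The coordinator merges the partial results.
    return sum(partials)

balances = list(range(1, 101))  # 100 hypothetical account balances
print(parallel_query(balances))  # 5050
```

In a real parallel data base the partitions live on separate disks attached to separate processors, so the speedup comes from genuinely concurrent I/O and computation rather than from threads on one machine; the sketch only shows the shape of the decomposition.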
Combined with improvements in the cost performance of hardware in the last couple of years, the overall cost-effectiveness of querying has improved by a factor of nearly 100.
In essence, data base technology has delivered a way to cut the time it takes to perform a complex query from 1. …