Is There a Case for Replacing the Core? (IT Trends)

With core processing systems, to replace or not to replace... is (and perhaps has nearly always been) the multi-billion dollar question. It is swiftly followed by a series of equally ponderous follow-ups, not the least of which is how best to go about getting the job done.

As far as IT projects go, swapping out aging core technology--including general ledger, accounting, clearing, settlement, and customer information file (CIF) systems--is by all expert accounts about as onerous as it gets. Yet despite the headache of replacement, some experts predict that even the largest banks are forcing themselves to at least seriously consider it. And so they've been issuing requests for proposals to various types of vendors, or questioning analysts and consultants, to get a grip on an IT processing issue of Ph.D.-level complexity.

A recent report issued by Boston-based Celent pointed out that core systems--with an average of 35 years in service--are rapidly reaching the age of liability at many institutions. Celent, for one, predicts a lot of replacement activity by 2005 as a response to that antiquity. As if to keep bankers in need of guidance guessing, other groups offer decidedly different interpretations. A GartnerGroup report issued in November, for instance, stated that "through 2004, enterprises will be driven by economic concerns to extend the life of legacy systems 60% of the time, rather than replace them with packaged solutions or new technologies."

The biggest banks are figuring out what their options are and what the ROI on the cost of replacement is. In the vast middle-tier market, where many outsourcers have long picked up the processing slack once handled by in-house core systems, Y2K-era contracts are up for renewal. This means, of course, that some churn is inevitable.

"We're heading into a period where some community banks will likely put pressure on their vendors to upgrade their data centers to support real-time processing if they don't already," explains Jamie R. Geshke, senior vice president with Metavante Technology Services in Milwaukee. "Certainly if a bank is unhappy with how some aspect of processing or reporting is done, they will be shopping around," says Geshke. "Some may even consider buying newer platforms and taking the job back in-house."

Capability, rather than capacity or cost, tends to be the biggest issue among banks of other sizes.

Mid-tier banks--more options

Gretchen Mohen, vice chair of technology and operations for Bank of Hawaii, Honolulu, is preparing for a conversion to Metavante this July 4. "We wanted a three-day weekend to work on the project because we have a time zone challenge with our operations in Guam," she explains of the date.

Mohen recounts that soul searching related to the Y2K project first prompted the $9.5 billion-assets bank to reconsider its operational scope and cost; a subsequent, less-than-profitable effort to expand quickly in the Asia/Pacific region then put the bank into a mindset of re-evaluation. In looking specifically at technology, Mohen explains: "We began to realize, post-remediation, that we could do more with the same budget if we upgraded our technology. Both staying inside and going the outsourcing route were thoroughly considered before we opted to outsource." When asked why decisions to replace weren't made sooner, say, preceding Y2K, Mohen was candid about the bank's learning curve.
"Going into our Y2K work I don't think there was a full understanding of the costs involved in maintaining our existing infrastructure," Mohen explains. "Since then, we've evaluated the systems and operations ourselves and also went through consultant-led, proof-of-concept exercises to get clear on all the soft costs. We also looked very closely at business process and our product set to see what would most impact customer service and customer need. …