Got Grid? An Old-but-New Way to Manage Processing Power Is Taking Wall Street by Storm. (Tech Initiatives)
If you're the sort to watch the watchwords, then you know that grid computing is pushing the needle on the "hype-meter" off the scale of late. Right up there with web services, application and network security, and disaster recovery, grid's getting more ink, and, it turns out, more "near-future" purchasing commitment among many financial services firms than nearly any other solution category this year. Although statistics are hard to come by, the current value of the market is estimated at $1 billion to $4 billion, according to Gartner Research. Bill Clayburg of Aberdeen Group, Boston, adds that many of the stats you might stumble upon could be misleading. "A lot of high-performance computing--which is a broader category than clusters and grids--gets lumped into grid statistics, so it can be confusing," says Clayburg. "But my sense is that the market for grid is expanding."

After ten years, it's hot

What you may have wondered (besides what a grid is, exactly, which we'll get to) is why a method that's been ten years in the making is on the radar now. The short answer is that grids have finally matured to the point where they are available as products that are "faster, cheaper, and better." Budget interest is going mainstream on Wall Street and is bleeding over into even wider acceptance. Pacific Life Insurance Co., Newport Beach, Calif., for instance, recently announced that it would adopt DCGrid 5.0 from grid vendor Entropia, San Diego, for risk modeling applications.

The reasons for grid's popularity are practical. Peter Lee, CEO of Data Synapse, New York City, points out that most firms use only about 20% to 30% of total enterprise capacity at any given time. This results in costly IT environments that require the purchase of additional large multiprocessor servers every few years. Grid is proving itself a way around this conundrum.

True, grid's high profile takes a peculiar shape, because individually most companies are keeping a low profile about their involvement and are loath to be named or to dish out specifics to the press. (The unwillingness to talk could mean anything from fear of giving away competitive advantage to silence over the inevitable implementation problems faced by early users.) What you will hear, in general terms, is that high-stakes computing problems involving risk calculations and other types of reporting are the purview of this "old-but-new" method. The pioneers who are talking are clearly enthusiastic about how grids have reduced their reliance on expensive alternatives and led to a more rational and affordable use of IT infrastructure.

Early adopters

At the recent "IT on Wall Street" conference in New York City, JP Morgan Chase, Wachovia, and their vendors proclaimed grids "real and able" to tackle the resource capacity issues that many brokerage and investment management firms do battle with daily. Wachovia Securities' director of trading technology, Joseph Belciglio, says the firm first adopted a kind of homebrewed distributed computing method in the late 1980s. Now it uses Data Synapse's commercial grid to do real-time risk management more affordably. (Belciglio did not want to get into specifics on cost savings, however.) Yet another way to frame grid's newfound status is that, when it comes to number crunching, Wall Street's volume is right up there with the biggest and best in academia, the sciences, and industrial design.
As a result, companies on the Street are joining those others in making use of grids instead of relying exclusively on costly clusters (i.e., groups of processors) and supercomputers to keep their calculations rolling on cue. For instance, one of the top four broker-dealers in the world has signed a deal to go from a "farm" of 350 to 400 CPUs spread out across many cities today to an improved 1,000-server farm centralized in fewer locations. …
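For readers who want a concrete picture of what these grid engines actually do, the sketch below shows the basic scatter/gather pattern in plain Python: a large risk calculation is carved into independent work units, the units are farmed out to whatever processors happen to be free, and the results are gathered and combined. This is a minimal illustration only, not the Data Synapse or Entropia API; the toy payoff, the function names (price_portfolio_scenario, run_grid_job), and the use of a local process pool as a stand-in for a firm-wide pool of idle machines are all assumptions made for the example.

# Illustrative sketch of the scatter/gather pattern that commercial grid
# engines automate. ProcessPoolExecutor stands in for the grid scheduler;
# on a real grid the workers would be spare cycles on desktops and servers
# scattered across the firm.

import math
import random
from concurrent.futures import ProcessPoolExecutor


def price_portfolio_scenario(seed: int, n_paths: int = 50_000) -> float:
    """One independent work unit: Monte Carlo value of a toy position
    under one scenario. The payoff is a call-like payoff on a single
    lognormal risk factor -- purely illustrative, not any firm's model."""
    rng = random.Random(seed)
    spot, strike, vol, rate, horizon = 100.0, 105.0, 0.25, 0.03, 1.0
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        terminal = spot * math.exp(
            (rate - 0.5 * vol ** 2) * horizon + vol * math.sqrt(horizon) * z
        )
        total += max(terminal - strike, 0.0)
    return math.exp(-rate * horizon) * total / n_paths


def run_grid_job(n_scenarios: int = 32, max_workers: int = 8) -> float:
    """Scatter the scenarios across workers, then gather and average."""
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        values = list(pool.map(price_portfolio_scenario, range(n_scenarios)))
    return sum(values) / len(values)


if __name__ == "__main__":
    print(f"Average scenario value: {run_grid_job():.4f}")

The reason grids suit this kind of work is visible in the structure: each scenario is independent, so an idle machine can pick one up without coordinating with any other, which is how firms turn the unused 70% to 80% of their capacity into throughput instead of buying another multiprocessor server.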