Utility Computing: Business evolution lagging technological evolution? [Ben Tamblyn, Kingston University, 2004/9/9]
Many leading analyst firms have identified utility computing – creating an IT service that users take for granted, as they do their electricity or water supply – as one of the top ten technology trends. With market penetration so far estimated at 15%, there is certainly plenty of potential for future growth: Gartner values the existing global utility computing market at $8.6B (US) and expects it to double in the next three years. Yet utility computing is unlikely to fundamentally change the composition of the IT services marketplace if its technical deliverables aren't matched with business benefits. And, crucially, there still appears to be a gulf between the idea of creating a technological evolution through grid, virtualisation and autonomic technologies, and the business evolution needed, which is primarily concerned with the movement towards a variable cost model.
Utility computing should transform the manner in which companies invest in technological resources. Current investment structures require customers to pay up front for pools of hardware and software resources, and to assume much of the risk associated with implementation and support. The utility computing model aims to relieve customers of much of this risk by reallocating some operational control to technology vendors and service providers. Instead of offering IT resources at a fixed cost, service providers function like a utility: just as with their local electricity supplies, companies are connected to a grid of computing resources that they can call on to supplement their existing services as needed. IT departments scale their IT investments to meet the average level of demand, and can choose to top up that service as and when required.
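The arithmetic behind this model can be sketched briefly. The figures below are invented for illustration – they do not come from the article – but they show why provisioning for average demand with metered top-up can undercut buying peak capacity outright, even when the metered rate carries a premium:

```python
# Illustrative sketch only: all demand figures and unit prices are
# assumptions invented for this example, not figures from the article.

monthly_demand = [60, 55, 70, 65, 120, 80, 60, 58, 90, 62, 57, 110]  # capacity units

# Fixed model: buy enough owned capacity for the busiest month, all year round.
unit_cost_owned = 100          # cost per unit of owned capacity, per month
peak = max(monthly_demand)
fixed_cost = peak * unit_cost_owned * len(monthly_demand)

# Utility model: own only the average, and pay a premium for metered top-up
# in the months where demand exceeds that baseline.
unit_cost_metered = 150        # per unit, per month, charged only when used
baseline = sum(monthly_demand) / len(monthly_demand)
utility_cost = sum(
    baseline * unit_cost_owned + max(0, demand - baseline) * unit_cost_metered
    for demand in monthly_demand
)

print(f"fixed: {fixed_cost:,.0f}  utility: {utility_cost:,.0f}")
```

With these assumed numbers the utility model comes out cheaper, because the customer no longer pays year-round for capacity it needs only in a few peak months; the comparison obviously reverses if demand is flat or the metered premium is high enough.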
This could act as a catalyst for a wave of right-sizing of computing resources across the enterprise landscape. Some, like Edward Baker, maintain that this "will save so much money that the costs can be paid for out of the operating cash flow." Others, however, are more sceptical. Greg Goldstein, Head of Portfolio Management at CIT Group, believes that "the product risk is different in a usage based program…it's very difficult to come up with a schedule of lease payments to recover costs. And that's not a risk finance companies have traditionally taken." And initial evidence suggests that more than 80% of global organisations, even those with sophisticated accounting tools, will struggle to move from an annual IT budget/project-based model to a cost apportionment model. The fluid, utility-driven environment features irregular cash flow, granular payment schedules, and complex internal cross-charging.
Technology vendors that promote a variable pricing model stress the financial benefits to their customers of paying for what they use; however, the myriad of pricing structures and options has become a growing source of customer confusion. As a result, companies can only be sure of total cost of ownership (TCO) savings if they have a very clear idea of their IT costs. And Alastair McAulay, from PA Consulting, is amazed at "how little organisations know about what they are paying for their IT, especially large organisations." Without this knowledge, he suggests, it is hard to know how much a business can save through an investment in utility computing.
Even if the industry were to embrace measures for pricing a standard unit of capacity in much the same way as electricity or gas providers, Stephen Pritchard from the FT argues that this may not address the “needs of managers, who are more interested in the cost of business processes than of hardware. Many utility computing contracts use transaction measures, such as a database entry or accepting an e-commerce order, rather than capacity,” increasing the complexity of evaluating utility options.
This leaves the CIO with two options: either accurately measure the required capacity, arguably impossible without a clear idea of existing IT costs, or pay for computing on a transaction basis. According to McAulay, "there will be some things you can commoditise, such as storage or perhaps an Oracle database but to bill by the transaction, you have to know what that transaction is." The inability to evaluate utility measurements comparatively may represent a significant barrier to the take-up of utility computing. Whilst the creation of a standardised pricing methodology may help, achieving it will take time and significant vendor cooperation.
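McAulay's point – that transaction billing only works once "transaction" is defined – can be made concrete with a small sketch. The rate card, transaction names and prices below are hypothetical, invented purely to illustrate the mechanics:

```python
# Hypothetical sketch of transaction-based billing. The transaction
# types and per-transaction prices are invented for illustration.

from collections import Counter

rate_card = {                # assumed prices per transaction, in pence
    "db_entry": 0.5,
    "ecommerce_order": 2.0,
}

def monthly_bill(events):
    """Total the charges for a month's metered events.

    An event type missing from the rate card raises KeyError,
    reflecting that an undefined transaction cannot be billed.
    """
    counts = Counter(events)
    return sum(rate_card[kind] * n for kind, n in counts.items())

events = ["db_entry"] * 10_000 + ["ecommerce_order"] * 500
print(monthly_bill(events))  # 10_000 * 0.5 + 500 * 2.0 = 6000.0 pence
```

The design choice to fail on an unpriced event type is the whole difficulty in miniature: before any bill can be computed, vendor and customer must agree on exactly which business events count as billable transactions and at what rate.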
So persuading customers to become accustomed to flexible pricing models looks likely to require major changes to existing technology investment practices and company-wide attitudes. Under a utility computing model the IT department will also have to take on new responsibilities, acting as a change agent and assuming a role with a greater business orientation. Mike Dodd, vice president at the IT research company, Giga Information Group predicts that as demand for utility computing and associated outsourcing services increases most IT departments will shed staff, acquire new management skills and turn into specialist resource brokers. “They will spend their time facilitating the satisfaction of business-driven demands, rather than being direct suppliers of IT services.”
This will have a significant impact upon the composition of the IT department, but also upon other areas of the organisation. According to IT market analyst Gartner Group, by 2005, 40 per cent of Europe's IT and business process expenditure will go on outsourced services. "The monolithic corporation that owns all its own resources, capabilities and channels is becoming a thing of the past. It is being replaced by organisations that can build and manage a spectrum of relationships, from basic procurement to strategic alliances," according to Roger Cox, a Gartner analyst. Phil Morris, Director of Morgan Chambers, a consultancy that specialises in advising companies on outsourcing contracts, believes that "the culture of the whole organisation will need to change to reflect the central role of IT. So often at present when something is vaunted as strategic the IT department is the last to know about it…Companies will be challenged to think about how much more value they could extract from their IT. They will need to raise their game to take advantage of the flexibility of computing on demand."