Introducing Utility Computing

Utility computing enables a service provider to make computing resources and infrastructure management available to customers as needed. This excerpt from Data Lifecycles: Managing Data for Strategic Advantage discusses how to centralize storage management, improving administration efficiencies and allowing best practices to be applied uniformly across all resources. Read this chapter to learn how utility computing can save your customers time and money while providing them with important compliance tools.

In the 1970s and 1980s, mainframe computing meant huge, centralized global systems. It was expensive and had a fairly bleak user interface, but it worked. In the early 1990s, enterprises moved to highly distributed client/server computing, which allowed IT to deploy PC client systems with, on the face of it, lower cost and a much better end-user experience. By the late 1990s, Internet computing allowed systems with the mainframe's centralized deployment and management but with rich, PC-like, browser-based user experiences.

Now the industry is in the age of utility computing. Utility computing is a term the IT community has adopted to represent the future strategy of IT. No vendor is embarking on this approach alone – all the major vendors have their own version of the vision. But whatever it is called, utility computing represents an evolution in the way corporations use IT. So, what's different about utility computing?

Utility computing is the first computing model that is not just technology for technology's sake; it is about aligning IT resources with IT's customer – the business. Shared resources decrease hardware and management costs and, most importantly, enable chargeback to business units. Utility computing also includes autonomic, or self-healing, technologies, which give the CIO key tools for making business units more efficient. It isn't possible to buy utility computing off the shelf, however, because utility computing will evolve over the next five to ten years as technology advances. Organizations can nevertheless help themselves by putting in place the building blocks that position them for that future. Most enterprises already use available products for backup and recovery, and large organizations can also provide numerous IT management functions as a utility to the business.

If parts of a business are charged back for IT services, then the size of that chargeback becomes a key measure of success. Data storage, for example, carries costs in the same way that paper-based filing cabinets, clerks, floor space and heating overheads did 20 years ago. Keep in mind that these solutions must provide a framework across heterogeneous IT infrastructures that gives IT the ability to manage and justify all assets back to the business, as well as provide the business with continuous availability of mission-critical applications and data. Even if the organization decides not to bill back, the insights can prove immensely valuable.

Attempting to make realistic IT investment decisions poses a dilemma for business leaders. On one hand, automating business processes with sophisticated technology can lead to lower operating costs, greater competitive advantage and the flexibility to adjust quickly to new market opportunities. On the other hand, IT spending is often viewed the traditional way: as a mystery, essentially because IT is treated as an operational expense, a variable cost and a diminishing asset on the corporate balance sheet.

By treating IT as an operation, organizations lump the costs together, making it next to impossible to account for individual business usage. From an operational perspective, this means that not only are usage costs hidden in expense line items, but the line of business also has no way of conveying its fluctuating IT requirements back to the IT department. This usually leaves the IT department with little understanding of the business requirements for service levels, performance, availability, cost, resources and so on. Hence, the relationship between IT spending and business success is murky, and often mysterious.

Utility computing attempts to simplify and justify IT costs and service to the business. It effectively makes IT transparent: a business can see where its funds go, who spends the most and where there is waste or redundancy. Utility computing means that lines of business can request technology and service packages that fit individual business requirements and match them against real costs. This model enables a business to better understand its IT purchases, together with the service-level choices that depend on the IT investment. Historically, when making IT purchasing decisions, businesses arbitrarily threw money at the IT department to 'do computing' and make systems more effective. Now, utility computing enables businesses to obtain service-level agreements (SLAs) from IT that suit the business.

Transparency of costs and IT usage also enables organizations to assess the actual costs associated with operational departments. In the past this was not possible because IT was simply seen as a single cost-centre line item. Now IT can show which costs are associated with which department: how much storage and how many applications the department is using, the technology required to ensure server and application availability, and how much computing power it takes to deliver the correct level of service. This visibility allows IT departments to understand storage utilization, application usage and usage trends, and in turn to make intelligent consolidation decisions and move technological resources to where they are actually needed.
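
To make this concrete, here is a minimal sketch of the kind of per-department utilization report such visibility enables. The department names, capacity figures and the 60% utilization threshold are hypothetical assumptions for illustration, not figures from the book:

    # Sketch: roll up per-department storage usage and flag consolidation
    # candidates. All figures and the 60% threshold are assumptions.
    usage_records = [
        # (department, allocated_gb, used_gb)
        ("Finance",   2000, 1500),
        ("Marketing", 1000,  300),
        ("HR",         500,  120),
    ]

    UTILIZATION_FLOOR = 0.60  # assumed consolidation policy threshold

    for dept, allocated, used in usage_records:
        utilization = used / allocated
        status = "OK" if utilization >= UTILIZATION_FLOOR else "consolidation candidate"
        print(f"{dept:10s} {used:>5} / {allocated:>5} GB "
              f"({utilization:.0%}) -> {status}")

A report like this is what turns a single cost-centre line item into per-department figures that can support consolidation decisions.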

Giving IT the ability to provide applications and computing power to the business when and where they are needed is essential to the development and, indeed, survival of IT. Being able to fine-tune IT resources to meet business requirements is essential to reducing overall cost and wasted resources, and it saves time and personnel overheads. Not only is the end-user experience dramatically enhanced, but the way IT provides business benefits also becomes apparent. We may characterize IT as a utility, but what we really mean is providing IT services when and where they are necessary: delivering applications, storage and security; enhancing availability and performance based on the changing demands of the business; and showing costs on the basis of the IT services used.

The utility computing approach benefits not only the business but also the IT department itself. As IT begins to understand the usage of each business unit, it gains the ability to control costs and assets by allocating them to specific business departments, and IT management gains a better understanding of how IT investment relates to the success of business tasks and projects. The utility approach gives IT the ability to build a flexible architecture that scales with the business.

The challenge for many IT departments is deciding how best to migrate current IT assets to a service model that is more centralized, better managed and, most importantly, better aligned with the needs, desires and budgets of departmental users. This means increasing server and storage utilization by eliminating redundancy.

Utility computing methodology can provide significant cost savings. By delivering IT storage infrastructure as a utility, organizations can:

  • reduce hardware capital expenditures;
  • reduce operating costs;
  • allow IT to align its resources with business initiatives;
  • shorten the time to deploy new or additional resources to users.

Provisioning enterprise storage – including storage-related services such as backup, recovery and replication – within a service model delivers benefits for IT and for storage end users. It can maximize the advantages of pooled multi-vendor storage resources, improve capacity utilization and give corporate storage buyers greater leverage when negotiating with individual vendors. This service-based approach also allows storage management to be centralized, improving administration efficiencies, allowing best practices to be applied uniformly across all resources and increasing the scope for automation.

A storage utility delivers storage and data protection services to end users based on the Quality of Storage Service (QOSS) parameters of the service purchased. Delivery is automatic: the end user need not know any nuances of the storage and network infrastructure to use capacity allocations or be assured of data protection. At the end of each month, billing reports detail how much storage each consumer used, the level of data protection chosen and the total cost. This allows each consumer to assess storage resource usage – whether physical disk allocations or the services offered to secure those allocations – and decide how to use the resources in the future.
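
As a rough illustration of how such a monthly billing report might be produced, the sketch below prices each consumer's allocation against a per-gigabyte rate for its chosen protection tier. The tier names, rates and consumers are invented for the example; the QOSS parameters of a real utility would be richer:

    # Sketch: monthly storage-utility billing by protection tier.
    # Tier names and per-GB monthly rates are illustrative assumptions.
    RATES = {
        "basic":      0.10,  # disk only
        "protected":  0.25,  # disk plus nightly backup
        "replicated": 0.40,  # disk, backup and remote replication
    }

    consumers = [
        # (consumer, GB used, tier purchased)
        ("Sales",        800, "protected"),
        ("Engineering", 2500, "replicated"),
        ("Facilities",   150, "basic"),
    ]

    for name, gb_used, tier in consumers:
        cost = gb_used * RATES[tier]
        print(f"{name:12s} {gb_used:>5} GB  {tier:11s} ${cost:8.2f}")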

A storage utility strengthens the IT department's ability to satisfy end-user service-level demands. By clearly stating the expected service levels of each packaged storage product, the IT department helps end users accurately map application needs to storage-product offerings. This gives the IT department a clear understanding of the service-level expectations of business applications, and end users of those applications benefit by knowing that IT can live up to the service levels it has defined.
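
One way to picture the mapping from application needs to packaged storage products is as a catalogue lookup: the application states its required availability and restore window, and the cheapest product meeting both is selected. The product names, availability figures and prices below are hypothetical:

    # Sketch: pick the cheapest storage product whose stated service
    # level meets an application's needs. Catalogue values are assumed.
    catalog = [
        # (product, availability, restore_hours, price_per_gb)
        ("bronze", 0.99,   48, 0.10),
        ("silver", 0.999,  12, 0.25),
        ("gold",   0.9999,  2, 0.50),
    ]

    def select_product(required_availability, max_restore_hours):
        candidates = [p for p in catalog
                      if p[1] >= required_availability
                      and p[2] <= max_restore_hours]
        return min(candidates, key=lambda p: p[3]) if candidates else None

    # An application needing 99.9% availability and a 24-hour restore
    # window maps to the 'silver' product.
    print(select_product(0.999, 24))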

Just as a storage utility can use storage management software and Network Attached Storage (NAS) or Storage Area Network (SAN) technologies, a server utility can similarly pool resources and automate rapid server deployment for specific critical applications to meet specific business requirements.
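
The server-utility idea can be sketched the same way: servers are drawn from a shared pool when an application needs them and returned afterwards. The pool contents and this tiny allocation API are invented for illustration:

    # Sketch: a server utility as a shared pool. Server names and this
    # minimal API are hypothetical.
    free_pool = ["srv01", "srv02", "srv03", "srv04"]
    allocations = {}  # application -> list of servers

    def provision(app, count):
        """Take servers from the shared pool for an application."""
        if count > len(free_pool):
            raise RuntimeError("pool exhausted")
        servers = [free_pool.pop() for _ in range(count)]
        allocations.setdefault(app, []).extend(servers)
        return servers

    def release(app):
        """Return an application's servers to the pool for reuse."""
        free_pool.extend(allocations.pop(app, []))

    provision("month-end-batch", 2)  # burst capacity for a critical job
    release("month-end-batch")       # capacity goes back for reuse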

Automating application, server and storage provisioning, as well as problem management and resolution, through policy-based tools that learn from previously solved problems will play a large part in future advances in deploying utility storage. Predictions of future usage, together with automated discovery of new applications, users, devices and network elements, will further reduce the management burden as the IT utility evolves from storage to other areas.
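
A first step toward such policy-based automation might be a rule like the one below, which grows an allocation automatically once utilization crosses a threshold. The 80% trigger and 25% growth step are assumed policy values; a real tool would add the learning and prediction described above:

    # Sketch: one policy rule for automated storage provisioning.
    # The 80% trigger and 25% growth step are assumed policy values.
    def apply_growth_policy(allocated_gb, used_gb, trigger=0.80, growth=0.25):
        """Return a new allocation if utilization breaches the trigger."""
        if used_gb / allocated_gb >= trigger:
            return int(allocated_gb * (1 + growth))
        return allocated_gb

    print(apply_growth_policy(1000, 850))  # -> 1250, expanded automatically
    print(apply_growth_policy(1000, 400))  # -> 1000, unchanged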

Use the following table of contents to navigate to the chapter excerpts from Introducing Utility Computing.


Data Lifecycles: Managing Data for Strategic Advantage
  Home: Introducing utility computing
  1: Real problems and real solutions: Using ILM to address compliance
  2: New storage management with utility computing
  3: Data lifecycle management: What should organizations consider?
  4: What does data lifecycle management mean?
  5: Why is IT lifecycle management important?
ABOUT THE BOOK:   
Plenty of storage products are now available, but the challenge remains for companies to proactively manage their storage assets and align those resources with the various departments, divisions, geographical locations and business processes to achieve improved efficiency and profitability. Data Lifecycles: Managing Data for Strategic Advantage identifies ways to incorporate an intelligent service platform to manage and map the storage of data. The authors give an overview of the latest trends and technologies in storage networking and cover critical issues such as worldwide compliance. Purchase the book from Wiley Publishing.
ABOUT THE AUTHOR:   
Roger Reid is an enterprise storage architect for Veritas Software Corp. with more than 10 years of combined industry experience supporting various Fortune 500 customers in architecting and implementing a variety of storage solutions, including storage area networks, storage virtualization, active storage resource management, backup and hierarchical storage management products. Gareth Fraser-King is the manager for Product Marketing in the Europe, Middle East and Africa emerging territories, producing high-level messaging, white papers, articles, presentations and marketing deliverables. He has worked as a writer and marketer for more than 20 years, the last 10 within the IT industry, and possesses a wide range of marketing experience, including copywriting, business, technical and service authoring, as well as business development, operational efficiency, strategic planning, affinity marketing, product development and quality management.
