What is IT Lifecycle Management? (Defining DLM/ILM/TLM)
Data Lifecycle Management (DLM) is a policy-based approach to managing the flow of an information system's data throughout its lifecycle – from creation and initial storage to the point at which it becomes obsolete and is deleted, or must be deleted to comply with legislation.
DLM products attempt to automate the processes involved, typically organising data into separate tiers according to specified policies and automating data migration from one tier to another based on those policies. As a rule, DLM stores newer data, and data that must be accessed more frequently, on faster but more expensive storage media; less critical data is stored on cheaper but slower media.
Early types of DLM tools included hierarchical storage management (HSM). The hierarchy represents different types of storage media – such as RAID (redundant array of independent disks) systems, optical storage or tape – each type representing a different level of cost and speed of retrieval when access is needed. Using an HSM product, an administrator can establish policies for how often different kinds of files are to be copied to a backup storage device. Once the policy is established, the HSM software manages everything automatically.
Typically, HSM applications migrate data based on the length of time elapsed since it was last accessed, whereas DLM applications enable policies based on more complex criteria. The terms Data Lifecycle Management (DLM), Information Lifecycle Management (ILM) and Total Lifecycle Management (TLM) are sometimes used interchangeably. However, a distinction can be made between the three.
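The contrast above can be sketched in code. This is a minimal illustration, not a real product's API: the tier names, age thresholds and file attributes are all invented. An HSM-style policy migrates purely on time since last access, while a DLM-style policy layers further criteria (here, file type) on top:

```python
from dataclasses import dataclass

# Hypothetical storage tiers, fastest and most expensive first.
TIERS = ["primary_disk", "nearline_disk", "tape_archive"]

@dataclass
class FileRecord:
    name: str
    days_since_access: int   # HSM-style criterion: elapsed time since last access
    file_type: str           # DLM-style criterion: a general file attribute

def hsm_tier(record: FileRecord) -> str:
    """Classic HSM: choose a tier purely on time since last access."""
    if record.days_since_access < 30:
        return "primary_disk"
    if record.days_since_access < 365:
        return "nearline_disk"
    return "tape_archive"

def dlm_tier(record: FileRecord) -> str:
    """DLM-style policy: combine age with other file attributes."""
    # Illustrative rule: regulated document types never fall to tape,
    # regardless of how long ago they were last accessed.
    if record.file_type in {"invoice", "contract"}:
        return "primary_disk" if record.days_since_access < 30 else "nearline_disk"
    return hsm_tier(record)

print(hsm_tier(FileRecord("old_log.txt", 400, "log")))         # tape_archive
print(dlm_tier(FileRecord("q1_invoice.pdf", 400, "invoice")))  # nearline_disk
```

The point of the sketch is that the HSM policy consults a single attribute, whereas the DLM policy can express richer, business-aware rules over several attributes at once.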
- DLM products deal with general file attributes, such as file type, size, and age.
- ILM products have more complex capabilities. For example, a DLM product allows searching of stored data for a certain file type of a certain age, whereas an ILM product allows searching of various types of stored files for instances of a specific piece of data, such as a customer number.
- TLM products allow formulating complex requests across multiple storage tiers and heterogeneous operating systems, providing a more complete approach to managing all structured and unstructured data.

Data management has become increasingly important as businesses face compliance obligations arising from modern legislation, such as Basel II and the Sarbanes-Oxley Act, which regulate how organisations must deal with particular types of data. Data management experts stress that DLM is not simply a product but a comprehensive approach to managing organisational data, involving procedures and practices as well as applications. Fundamentally, what has happened over the last 15 years, since the advent of 'Open Systems', is that the ability to process information in a coherent, cohesive and consistent manner has been lost – or, at the very least, seriously mislaid.
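The DLM/ILM distinction in the list above can also be sketched in code. In this illustration (the file records, the customer number and the helper names are all invented), a DLM-style query filters on general file attributes such as type and age, while an ILM-style query searches across file types for a specific piece of data inside the contents:

```python
from dataclasses import dataclass

@dataclass
class StoredFile:
    name: str
    file_type: str
    age_days: int
    contents: str

FILES = [
    StoredFile("report.doc", "doc", 800, "Quarterly report for customer 10042."),
    StoredFile("notes.txt", "txt", 90, "Meeting notes, nothing sensitive."),
    StoredFile("mail.eml", "email", 400, "Re: account 10042 billing dispute"),
]

def dlm_search(files, file_type, min_age_days):
    """DLM-style: filter on general file attributes (type, age)."""
    return [f.name for f in files
            if f.file_type == file_type and f.age_days >= min_age_days]

def ilm_search(files, data):
    """ILM-style: search files of any type for a specific piece of
    data, such as a customer number, inside the contents."""
    return [f.name for f in files if data in f.contents]

print(dlm_search(FILES, "doc", 365))   # ['report.doc']
print(ilm_search(FILES, "10042"))      # ['report.doc', 'mail.eml']
```

A TLM product would, by extension, run queries like `ilm_search` across multiple storage tiers and operating systems rather than over a single in-memory list.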
It would be quite a powerful technology that could examine an organisation's data storage and re-file all data consistently and intelligently. It would allow the organisation not just to retrieve information easily (because every file system would be standard), but also to store data logically, in appropriate batches with like times for deletion; to migrate data during storage upgrades while keeping its integrity intact; and to carry a copy of the appropriate version of the software along with it, so the data can still be read in the future.
Suppose it is important to find data in the future, and that it is not necessarily located where one would expect to find it. So what? What drives the need for DLM or ILM products and services?
- Emerging regulatory and compliance issues (Data Protection, HIPAA, International Accounting Standards, Sarbanes-Oxley, Basel II, etc.), which drive:
  a) unbridled data growth (in both structured and unstructured data), which in turn promotes
  b) variability in the value of the data an organisation owns.
- Organisations continue to pressure CIOs to manage more with less and to control costs, so it is becoming increasingly difficult – nay, downright impossible – to manage an organisation's data manually across an increasingly distributed and complex environment with any hope of success.
This has not always been the case. Back in the 1970s and 1980s, mainframes kept all data in logical file systems, as previously mentioned. However, since the arrival of Client Server/Open Systems in the early 1990s, the art of information management has been lost. Basically, personal record keeping has become a chaotic free-for-all: each individual stores and saves his or her data in different ways. According to a leading analyst, 60% of all data is unstructured – our email and our file and print servers (Word docs, XLS spreadsheets, etc.). How, then, does one find a specific piece of data from an employee who worked at the company for four years and left two years ago? With no 'due process' it is not easy – and several organisations have been billed in the six-figure region just to find and retrieve such data.

To make matters worse, offices started to go 'paper-free' around the early 1990s. Prior to this, all organisations had to store their information in hard-copy storage systems, including microfiche. These were fairly sophisticated, with offsite fireproof storage facilities, processes behind filing and record keeping, and audit trails to show due diligence. Since the early 1990s, however, these hard-copy storage warehouses have slowly but surely disappeared, replaced by electronic data warehouses.
Data Lifecycles: Managing Data for Strategic Advantage
ABOUT THE BOOK:
Plenty of storage products are now available, but the challenge remains for companies to proactively manage their storage assets and align those resources to the various departments, divisions, geographical locations and business processes to achieve improved efficiency and profitability. Data Lifecycles: Managing Data for Strategic Advantage identifies ways to incorporate an intelligent service platform to manage and map the storage of data. The authors give an overview of the latest trends and technologies in storage networking and cover critical issues such as worldwide compliance. The book is published by Wiley.
ABOUT THE AUTHORS:
Roger Reid is an enterprise storage architect for Veritas Software Corp. with more than 10 years of combined industry experience supporting various Fortune 500 customers in architecting and implementing a variety of storage solutions, including storage area networks, storage virtualization, active storage resource management, backup and hierarchical storage management products. Gareth Fraser-King is the Manager for Product Marketing in the European, Middle East and African emerging territories, producing high-level messaging, white papers, articles, presentations and marketing deliverables. He has worked as a writer and marketer for over 20 years, the last 10 within the IT industry, and possesses a wide range of marketing experience, including copywriting; business, technical and service authoring; business development; operational efficiency; strategic planning; affinity marketing; product development; and quality management.