It should be obvious that organisations must manage and store data more effectively. The upside is that ILM/DLM makes good business sense: indeed, that is why it exists in the first place. ILM/DLM is a prerequisite for good corporate governance and an integral part of good business conduct. It protects reputations and manages risk, and promotes a safe, secure transaction environment. It supports the safety and stability of global financial markets and helps track suspicious customer activity. It builds customer confidence and, with it, competitive advantage. It helps prevent money laundering and terrorist financing and harmonises international regulatory approaches. Why wouldn't anyone want to know about Data Lifecycle Management?
1.2.4 Goals of data lifecycle management
Data is one of the most important organisational assets.
The above statement must be pure plagiarism. How many books, white papers, web sites or articles have made that statement? How many analysts, journalists, sales managers, business managers, business gurus, marketing managers, operational managers, database administrators and system administrators have been bleating on about the benefits of looking after an organisation's data and information? Surely businesses must have caught on by now? Possibly, but unfortunately, probably not; the scene is changing, though. Instant data gratification is out and data longevity is in. With an increasingly stringent regulatory environment, the pressure to manage data for the long term is only growing.
Data as an asset is important in providing organisations with valuable information. Data becomes information, and information becomes knowledge. This book discusses the differences between Data Lifecycle Management, Information Lifecycle Management and Total Lifecycle Management in detail and examines the distinctions at length. Although the principles behind the three concepts are fundamentally different, it is all still data. Information management suggests that an organisation has done something intelligent with its data, and knowledge suggests that some cognitive process has been applied to that information. From a technological point of view it is easiest simply to refer to DLM, so we first need to describe the fundamental goals of Data Lifecycle Management and its platform.
- To make an organisation's data accessible. All data should be readily available to support the business purposes it serves. Availability requirements should not be restrictive.
- To have an adaptable design and architecture. Data continually changes. Hence, the processes, methodologies and underlying technologies that manage it should adapt to meet growing data demands.
- To provide operational security to the asset. The data management platform coupled with its process and methodology should provide auditing, tracking, and controlling mechanisms to manage the data effectively. Specifically, it must provide a complete management infrastructure that affords greater visibility into its daily use.
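The third goal, auditing and tracking, can be illustrated in miniature. The sketch below is a hypothetical toy, not any vendor's actual product: a store that records every access to a data item so its daily use is visible, roughly the "greater visibility" the goal describes.

```python
import datetime


class AuditedStore:
    """Toy data store that logs every access, illustrating the
    'auditing, tracking and controlling' goal described above."""

    def __init__(self):
        self._data = {}
        self.audit_log = []  # (timestamp, action, key) tuples

    def _record(self, action, key):
        # Timestamp each operation so usage can be audited later.
        self.audit_log.append(
            (datetime.datetime.now(datetime.timezone.utc), action, key)
        )

    def put(self, key, value):
        self._record("put", key)
        self._data[key] = value

    def get(self, key):
        self._record("get", key)
        return self._data[key]


store = AuditedStore()
store.put("customer-42", {"name": "Example Ltd"})
record = store.get("customer-42")
actions = [action for _, action, _ in store.audit_log]
```

A real DLM platform would persist such a trail tamper-evidently and tie it to retention policy; the point here is only that every operation leaves a record.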
1.2.5 What are the technology trends we will see over the next few years?
Here are some of the expected technology trends:
- Hierarchical Storage Management (1997–2005). Vendors are already veering away from the HSM term in favour of both DLM and ILM. Although the term ILM is fraught with confusion and conflicting interpretations from various vendors (depending on what technology they offer), vendors are already introducing numerous technologies that will be the starting point for the development of automated ILM products.
- Data Lifecycle Management (2003–2006). In the last few years, DLM products have emerged – the father of ILM, if you like. With the increase in retrieval requirements driven by compliance and other emerging regulations, organisations have started exploring new ways to back up, store, manage and track their most critical data, starting with virtual tape and disk-based backup. They have also started to implement some basic tiered storage capabilities, such as moving 'stale' data from their high-performance disk arrays to more cost-effective systems, or deleting irrelevant data (MP3 files, for example) altogether.
- Manual Information Lifecycle Management (2004–2007). This technology has the capability to index, migrate and retrieve data, as well as prove its authenticity, on any part of the infrastructure. Although setting policies against business requirements is still a manual process, this is the point where ILM gives organisations the ability and technological intelligence to implement more powerful storage policies. Some organisations will utilise virtualisation applications, which logically group many arrays into a single 'virtual' storage pool and host them on emerging 'smart' storage switches. Manual ILM will provide users with a single logical file system view of data that is, in reality, scattered across multiple media types in multiple locations. ILM will enable companies to move data fluidly within the storage infrastructure as their evolving policies dictate, while shielding administrators and users from the underlying complexity.
- Automated Information Lifecycle Management (2006–2008).
Automated ILM will integrate products that manage storage, virtualisation and the data itself. Numerous aspects of the storage management stack will be embedded into the infrastructure. Compliance capabilities (retention, deletion, etc.) will also be embedded into a number of storage management products and become well established in some vertical markets, especially the financial sector. The transition from manual ILM to automated ILM will require additional technologies in order to manage data as 'information', as opposed to managing it as 'data'.
- Automated Total Lifecycle Management (2007–2010). The total cost and value of a piece or set of data depends on every phase of its lifecycle, as well as on the business and IT environments in which it exists. TLM will automate the way that organisations look at their entire data set. It will give organisations the ability to protect against media obsolescence, legacy data and future hardware changes, deal with all manner of diverse mobile assets, automatically manage storage costs and data movement (to lower-cost storage options) as required, and provide audits where required without the need for manual intervention. In other words, all an organisation needs to do is decide on the data policy and TLM does the rest.
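The policy-driven behaviour running through these stages – migrate stale data to cheaper tiers, delete irrelevant files – can be sketched as a simple rule function. This is a hypothetical illustration of the idea, not any product's policy engine; the thresholds, file names and action labels are all invented for the example.

```python
# Hypothetical lifecycle policy: files untouched for more than
# STALE_DAYS move from primary disk to an archive tier, and file
# types deemed irrelevant (MP3s, as in the DLM example above)
# are deleted outright.
STALE_DAYS = 90
IRRELEVANT_SUFFIXES = {".mp3"}


def placement(name, age_days):
    """Return the action a DLM/ILM engine might take for one file."""
    if any(name.endswith(s) for s in IRRELEVANT_SUFFIXES):
        return "delete"
    if age_days > STALE_DAYS:
        return "migrate-to-archive"
    return "keep-on-primary"


# A tiny mock catalogue of (file name, days since last access).
catalogue = [
    ("q3-report.doc", 30),
    ("old-project.dat", 400),
    ("holiday-mix.mp3", 10),
]
decisions = {name: placement(name, age) for name, age in catalogue}
```

The difference between the stages above is essentially who runs this loop and where the policy lives: in manual ILM an administrator sets the rules and reviews the outcome; in automated ILM and TLM the infrastructure evaluates and executes them continuously.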
ABOUT THE BOOK:
Plenty of storage products are now available, but the challenge remains for companies to proactively manage their storage assets and align those resources with the various departments, divisions, geographical locations and business processes to achieve improved efficiency and profitability. Data Lifecycles: Managing Data for Strategic Advantage identifies ways to incorporate an intelligent service platform to manage and map the storage of data. The authors give an overview of the latest trends and technologies in storage networking and cover critical issues such as worldwide compliance. Purchase the book from Wiley Publishing.
ABOUT THE AUTHOR:
Roger Reid is an enterprise storage architect for Veritas Software Corp. with more than 10 years of combined industry experience supporting various Fortune 500 customers in architecting and implementing a variety of storage solutions, including storage area networks, storage virtualization, active storage resource management, backup and hierarchical storage management products. Gareth Fraser-King is the Manager for Product Marketing in the European, Middle East and African emerging territories, producing high-level messaging, white papers, articles, presentations and marketing deliverables. He has worked as a writer and marketer for over 20 years, the last 10 within the IT industry, and possesses a wide range of marketing experience, including copywriting; business, technical and service authoring; business development; operational efficiency; strategic planning; affinity marketing; product development and quality management.
This was first published in November 2007