
Disk archiving: How do you know it’s not the next ILM?

Disk-based archiving is a popular topic today, one that you might be bringing up with customers, or that they may even be asking you about. As we detailed in our article on archiving basics, archiving is the movement of old data off primary storage and onto a secondary storage tier. Archiving isn't a new practice; what is new is the use of disk, instead of tape or optical media, as the medium for that archive.

A lot has been written lately about archiving or optimizing primary storage. But you need to be able to separate hype from reality. The investment of time and money to purchase lab hardware and learn how to use it can be very costly. Picking a technology that’s going nowhere can be deadly. So how do you know that archiving won’t be the next ILM?

You remember ILM, right? Information lifecycle management. In early 2002, suppliers were touting ILM as the next big thing, perfect for cutting down on the amount of data on primary storage. Seminars were given, solutions were cobbled together, but no one bought. Why? We were trying to answer a question no one was asking! Yes, the customer had lots of old data taking up space on primary storage, but they also had lots of free capacity; selling them more so they could store the old stuff didn’t work.

Now times have changed. Customers are reaching new levels of disk space utilization, and with features like thin provisioning becoming more commonplace, they're buying less excess space. While the price per gigabyte of disk capacity continues to decline overall, for primary storage it has leveled off somewhat; yet the demand for even more capacity continues. Most customers won't be able to justify the cost of adding more primary storage.

More importantly, the cost of the secondary tier can be further reduced through capacity optimization. For instance, some disk-based archiving solutions, like those offered by Permabit or EMC, have a built-in deduplication capability, and companies like Ocarina Networks provide software that can perform a more content-aware deduplication prior to migrating to the secondary tier.
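To see why deduplication cuts the cost of the secondary tier so sharply, consider a minimal sketch of how a content-addressed chunk store avoids writing the same data twice. This is a hypothetical illustration only, not how Permabit, EMC or Ocarina actually implement it; the class and chunk size are invented for the example.

```python
import hashlib


class DedupStore:
    """Minimal content-addressed store illustrating block-level deduplication.

    Hypothetical sketch -- not any vendor's actual implementation. Each
    fixed-size chunk is stored once, keyed by its SHA-256 digest.
    """

    CHUNK_SIZE = 4096  # bytes per chunk (an assumption for this example)

    def __init__(self):
        self.chunks = {}        # digest -> chunk bytes, stored only once
        self.logical_bytes = 0  # total bytes clients have written

    def write(self, data: bytes) -> list[str]:
        """Split data into chunks, store only unseen ones, return the recipe."""
        recipe = []
        for i in range(0, len(data), self.CHUNK_SIZE):
            chunk = data[i:i + self.CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            # Deduplication: keep the first copy of each unique chunk only.
            self.chunks.setdefault(digest, chunk)
            recipe.append(digest)
        self.logical_bytes += len(data)
        return recipe

    def read(self, recipe: list[str]) -> bytes:
        """Reassemble the original data from its list of chunk digests."""
        return b"".join(self.chunks[d] for d in recipe)

    def physical_bytes(self) -> int:
        """Actual capacity consumed after deduplication."""
        return sum(len(c) for c in self.chunks.values())
```

Writing the same file twice doubles the logical byte count but consumes physical capacity only once, which is the effect that widens the cost gap between primary and archive tiers.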

In addition, because these solutions store more data on fewer spindles, they are more power-efficient than continuing to keep that data on primary storage, which typically is not capacity-optimized and relies on a larger number of higher-speed physical drives. Add to this the fact that companies like Nexsan, Xyratex and Copan can provide power management in their disk archive solutions, and the power savings are even more dramatic.

So, the difference between ILM and disk archiving? First, customers need disk archiving; utilization rates are much higher now than they were seven or eight years ago, so they have less excess capacity on primary storage. Second, the cost delta between primary storage and secondary storage, thanks to capacity optimization and power management, is significantly greater. Taken together, this means that disk archiving is a solution set that resellers should be investing in and talking to their customers about, without the fear that they’re heading down the road to another ILM.

George Crump is president and founder of Storage Switzerland, an IT analyst firm focused on the storage and virtualization segments. Prior to founding Storage Switzerland, he was CTO at one of the nation’s largest storage integrators.
