Data deduplication has remained the hottest technology in the data backup and recovery market this year, even in the midst of economic woes. So far this year, six companies -- Acronis Inc., Barracuda Networks Inc., CA, CommVault Systems Inc., IBM Corp. and Symantec Corp. -- have either added or expanded data deduplication in their data backup software.
An emerging trend in data deduplication is a movement toward deduplicating data at the source, or the client server that hosts the application.
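Source-side deduplication generally works by hashing data into chunks on the client and transferring only the chunks the backup target has not already seen. The sketch below is a generic illustration of that idea, not any vendor's implementation; fixed-size chunks and an in-memory chunk index are simplifying assumptions (shipping products typically use variable-size chunking and persistent indexes):

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real products often chunk variably

def dedup_backup(data: bytes, stored: dict) -> int:
    """Back up `data` into `stored` (hash -> chunk), sending only unseen chunks.

    Returns the number of chunks actually transferred over the wire.
    """
    sent = 0
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in stored:  # target has never seen this chunk: transfer it
            stored[digest] = chunk
            sent += 1
    return sent

repo = {}
first = dedup_backup(b"A" * 4096 + b"B" * 4096, repo)   # two new chunks sent
second = dedup_backup(b"A" * 4096 + b"B" * 4096, repo)  # identical data: nothing sent
```

Because the hash comparison happens before transfer, a repeated backup of unchanged data moves no chunks at all, which is why source-side deduplication saves network bandwidth as well as disk capacity.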
Find out the five questions to ask in a data deduplication project.
Disaster recovery site options: Hot, warm and cold sites
Disaster recovery (DR) terminology can be confusing -- terms like hot site, warm site and cold site are common in DR parlance. Each option is a reliable disaster recovery site, but which one should you choose for your company? Here's a look at the differences between hot, warm and cold sites in disaster recovery, and the pros and cons of each.
If the acceptable recovery time objective (RTO) for your company is a few hours instead of minutes, then a hot site is likely appropriate. The biggest difference between a hosted site and a hot site is the use of shared equipment for infrastructure components like servers and peripherals. Storage is dedicated and real-time data replication is used to get data from the production site to the disaster recovery site.
Because equipment in the DR site is shared by multiple customers, hot sites are significantly less expensive than hosted sites. "Hot sites and warm sites can be implemented less expensively through outsourcing than doing them in-house because of shared equipment," said George Ferguson, worldwide service segment manager for Hewlett-Packard (HP) Co.'s business continuity and recovery services.
Read the full story on disaster recovery site options.
Iowa Health System uses 'cloud' for disaster recovery to survive flood
The Iowa Health System didn't set out to design a private cloud, but its unintentional internal storage cloud built with Bycast Inc. software and IBM Corp. and NetApp storage ended up saving its mission-critical data from getting lost in the Iowa flood last year.
The Iowa Health System has three data centers connected by a 3,200-mile fiber optic network, supporting 13 hospitals. Bycast StorageGrid software powers a private cloud on SAN and NAS systems across two data centers, running applications in the hospitals and clinics across the state.
IT director of infrastructure Tony Langenstein said the medical network has about 340 TB on the Bycast cloud, plus another 140 TB of data on IBM TotalStorage DS4100 SAN and IBM-branded NetApp network-attached storage (NAS) -- 70 TB in each data center.
Read the full story about Iowa Health System's private cloud.
Easy ways for SMBs to improve their disaster recovery and pandemic plans
These days, disaster recovery (DR) planning is much like getting a flu shot. Everyone knows they should get one, but most avoid it anyway, ignoring their doctor's advice and complaining about the risk of feeling slightly off for days following the shot. But by the time November hits, they're stuck in bed with a fever and regretting their decision not to protect themselves.
With the flu season in full swing and the rise of the H1N1 virus across the country, it's important for companies to take pandemic planning and disaster recovery planning more seriously. SMBs are especially at risk of serious impact from a pandemic: because they operate with smaller staffs, more can go wrong when employees are out sick.
Read the rest of this story about improving disaster recovery and pandemic plans.
Cloud storage provider Zetta looks to replace production network-attached storage
Zetta Inc. today made its Enterprise Cloud Storage service available, and touts the hosted offering as a superior way to handle proprietary file data compared with traditional network-attached storage (NAS) systems.
Zetta is targeting enterprise data storage shops of 10 TB or larger with the service, which is based on a proprietary file system that performs continual data integrity checks and distributes data over hardware nodes in an N+3 redundant configuration.
Unlike a majority of cloud storage providers so far, which have focused on cloud storage of "cold" or inactive data like archives, backups or disaster recovery copies, Zetta's goal is to be the primary file system for tier-2 applications at large enterprises. That strategy puts it in competition not with other cloud storage services but with storage system vendors such as EMC Corp. and NetApp Inc.
Zetta's infrastructure has three basic layers. The first is a protocol translation layer that provides standard NFS, CIFS, HTTP or FTP access into its cloud for users' on-premises applications. This layer also provides quality-of-service management, volume virtualization and inline data verification using a hashing algorithm, which returns "write receipts" to customers confirming their data has been stored correctly.
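The "write receipt" concept can be illustrated with a toy store that hashes each object on ingest and hands the digest back, so the client can compare it against a locally computed hash of the bytes it sent. This is a generic sketch of hash-based write verification, not Zetta's actual implementation; the `CloudStore` class and its API are invented for illustration:

```python
import hashlib

class CloudStore:
    """Toy write path: persist the object, hash it, and return the digest
    as a "write receipt" the client can verify independently."""

    def __init__(self):
        self.objects = {}

    def write(self, key: str, data: bytes) -> str:
        self.objects[key] = data
        # The receipt is the hash of what was actually stored
        return hashlib.sha256(data).hexdigest()

store = CloudStore()
payload = b"quarterly-report.pdf contents"
receipt = store.write("reports/q3", payload)

# Client-side check: if the receipt matches a locally computed hash,
# the service stored exactly the bytes the client sent.
assert receipt == hashlib.sha256(payload).hexdigest()
```

The value of the receipt is that verification requires no second round trip to read the data back: any corruption or truncation on the write path shows up immediately as a hash mismatch.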
Find out how to become a cloud storage services provider.
Quantum launches midrange data deduplication backup appliances
Quantum Corp. is battling back against EMC Corp.'s Data Domain data deduplication boxes in the midrange with a new network-attached storage (NAS)-interface family of data backup appliances.
The DXi6500 platform consists of five models. The 6510 and 6520 are available now, with the larger 6530, 6540 and 6550 coming in 2010. The 6510 has 8 TB of usable capacity and two Gigabit Ethernet ports. The 6520 scales from 8 TB to 32 TB and has six Gigabit Ethernet ports. The largest member of the family, the 6550, will scale to 56 TB of usable capacity and support 10 Gigabit Ethernet.
Quantum bills the midrange systems as simple to set up and use, and they include data deduplication and replication in the base price, which starts at $64,000 for the 6510.
See our data deduplication cheat sheet.
Riverbed updates RiOS; Steelhead WAFS device now supports Citrix and disaster recovery
Riverbed Technology Inc. today updated the Riverbed Optimization System (RiOS) software that runs its Steelhead Wide Area File Services (WAFS) devices, adding support for accelerating Citrix Systems Inc. virtual desktops over wide-area networks (WANs), and improving data protection and security.
Riverbed's RiOS 6 now allows replication processing for disaster recovery or data backup in the same Steelhead device used for application acceleration. Riverbed also added centralized printing and expanded reporting with the new release.
New application support
Steelhead can now optimize Citrix Systems' ICA protocol, which the Citrix XenApp and XenDesktop products use to deliver either centralized application streams or virtual desktop images over a network. Riverbed claims its optimization can lower response times for those applications by 30% to 50%.
Bob Laliberte, an analyst at Milford, Mass.-based Enterprise Strategy Group (ESG), said this will be the next big trend for enterprises. "As virtual desktop starts gaining popularity and momentum in the enterprises, it will be important to be able to accelerate that traffic," he said. Riverbed already supported VMware virtual desktops.
Read the full story on Riverbed's Steelhead WAFS device.
Unilever maintains 5 PB Fibre Channel SAN storage performance with Virtual Instruments' NetWisdom
Unilever Global, one of the world's largest consumer products companies, is making Virtual Instruments' NetWisdom performance monitoring tool a permanent part of its Fibre Channel storage-area network (FC SAN) infrastructure after it helped the IT staff resolve storage performance issues last year.
Headquartered in London and Rotterdam, The Netherlands, Unilever Global is the parent company of several household name brands, including Bertolli pasta, Lipton Tea, Slim-Fast diet drinks and Dove soap. Its data centers are located in northern England (company officials declined to specify the exact location for security reasons) and host approximately 5 petabytes (PB) of data on a large and complex, multilayered FC SAN fabric.
Unix administrator Paul Faid said the company last year experienced storage performance slowdowns in its SAN infrastructure that were ultimately traced to overloaded ports on some of its Brocade Communications Systems Inc. switches, purchased through Hewlett-Packard (HP) Co.
Use our Storage Capacity Planning and Performance topics page as a resource.
BakBone phasing out virtual tape library, adds data deduplication with NetVault Backup 8.5
BakBone Software Inc. will begin phasing out its software virtual tape library (VTL) in favor of its new SmartDisk plugin, which adds data deduplication to BakBone's NetVault Backup product.
SmartDisk is a software-based network-attached storage (NAS) interface into a disk-based data backup repository that also performs post-process data deduplication. It can be run inside the backup server, on a separate piece of storage hardware, or on a virtual machine supplied by the user.
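Post-process deduplication differs from the source-side approach in that backups land on disk in full first, and a later pass folds duplicate chunks into a shared store, leaving each file as a list of chunk references. The sketch below illustrates that second pass in general terms; fixed-size chunking and in-memory structures are simplifying assumptions, not a description of how SmartDisk is implemented:

```python
import hashlib

CHUNK = 4096  # fixed-size chunks for simplicity

def post_process_dedup(files):
    """Second pass over already-written backups.

    `files` maps name -> raw bytes already on disk. Returns per-file
    manifests (ordered chunk-hash lists) plus one shared chunk store,
    so any chunk appearing in multiple backups is kept only once.
    """
    chunk_store = {}  # hash -> chunk bytes, shared across all files
    manifests = {}    # file name -> ordered list of chunk hashes
    for name, data in files.items():
        refs = []
        for i in range(0, len(data), CHUNK):
            piece = data[i:i + CHUNK]
            h = hashlib.sha256(piece).hexdigest()
            chunk_store.setdefault(h, piece)  # store the chunk only once
            refs.append(h)
        manifests[name] = refs
    return manifests, chunk_store

# Two nightly backups of identical data shrink to a single stored chunk
manifests, store = post_process_dedup(
    {"mon.bak": b"X" * 8192, "tue.bak": b"X" * 8192}
)
```

The trade-off versus inline or source-side deduplication is that the full backup must fit on disk before the reduction pass runs, but the backup window itself is not slowed by hashing.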
The product will eventually replace BakBone's virtual tape library, according to product manager Dawn Renee Campbell. "Over the next couple of releases, we'll be phasing out VTL and replacing it with SmartDisk," she said. Over the next three releases, which could take about two years to complete, customers will first have a choice between the VTL and SmartDisk; the next major release will then ship with SmartDisk by default, with the VTL available as an option.
Learn how to size a virtual tape library for customers.
Dot Hill releases AssuredSAN line to the channel
Dot Hill Systems Corp. this week introduced to the channel the AssuredSAN family of disk-to-disk SAN products, which combines storage capability with enhanced data protection software features. The company said AssuredSAN includes AssuredSnap and AssuredCopy disaster recovery software, creating self-contained backup and recovery appliances that are independent of the application or server operating system environment.
Additional storage news
Check out last week's storage channel news roundup.