
Virtualization Needs a New Backup Strategy

July 28, 2008

MOUNTAIN VIEW, Calif. — As enterprises move more heavily into virtualization, they will have to overhaul their data backup and disaster recovery strategies, because existing approaches don't translate well to the virtualized world.

That’s the case Deepak Mohan, senior vice president of Symantec’s (NASDAQ: SYMC) data protection group, made in a press briefing here at the company’s offices where he discussed its strategies for disaster recovery, high availability and data protection.

There are two major reasons why virtualization requires a new approach to data backup and disaster recovery, Mohan said. One is virtual sprawl, the unchecked proliferation of virtual machines (VMs). “Virtual machines are easy to deploy and propagate like rabbits, and that causes complexity of management from the data perspective,” Mohan explained.

The other reason is the difficulty of protecting and recovering applications in virtual environments. Distributing applications across VMs, or across a mix of VMs and physical servers, further strains backup and recovery systems. In addition, VMs can easily be moved from one physical server to another with tools such as VMware's VMotion, which makes them harder to track and back up.

Mohan recommended that CIOs consider restructuring their data backup and disaster recovery strategies as soon as they begin to virtualize. In the traditional backup approach, where perhaps 20 virtual machines run on one physical server, IT would have to back up each of those VMs individually and also take a snapshot of the entire environment in order to recover one file or a handful of files with a data protection product, Mohan said.

Symantec's flagship enterprise-class NetBackup product offers a new approach: it lets users take just one snapshot of the environment (instead of many per-VM backups) and perform granular recovery of individual files from that single snapshot image.
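To make the contrast concrete, here is a minimal sketch of the two approaches, assuming one physical host running several guest VMs. Every name in it (VirtualMachine, PhysicalHost, backup_per_vm, snapshot_host, restore_file_from_snapshot) is a hypothetical illustration, not Symantec NetBackup's actual API.

```python
# Hypothetical sketch of per-VM backup vs. single-snapshot granular
# recovery. None of these names reflect NetBackup's real interfaces.

from dataclasses import dataclass, field


@dataclass
class VirtualMachine:
    name: str
    files: dict  # path -> file contents at this moment


@dataclass
class PhysicalHost:
    vms: list = field(default_factory=list)


def backup_per_vm(host):
    """Traditional approach: one backup job per guest VM.

    With ~20 VMs on a host, that is ~20 separate jobs, on top of any
    host-level snapshot taken for whole-environment recovery.
    """
    return {vm.name: dict(vm.files) for vm in host.vms}


def snapshot_host(host):
    """Single-snapshot approach: capture the whole host image once."""
    return {vm.name: dict(vm.files) for vm in host.vms}


def restore_file_from_snapshot(snapshot, vm_name, path):
    """Granular recovery: pull one file out of the single host snapshot
    without first restoring an entire VM image."""
    return snapshot[vm_name][path]


if __name__ == "__main__":
    host = PhysicalHost(vms=[
        VirtualMachine("mail-01", {"/var/mail/inbox": "..."}),
        VirtualMachine("web-01", {"/etc/httpd.conf": "Listen 80"}),
    ])
    snap = snapshot_host(host)  # one snapshot instead of 20 per-VM jobs
    print(restore_file_from_snapshot(snap, "web-01", "/etc/httpd.conf"))
```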

This sort of granular recovery capability is getting more important as virtualization moves from development and testing labs to production environments where transaction-intensive applications are being used.

“Before, people were virtualizing print and other servers and testing and development, where losing data wasn’t that important, or consolidating legacy applications into smaller, newer servers,” 451 Group analyst Henry Baltazar told InternetNews.com. “Now, they’re moving into e-mail servers and transaction-oriented applications, where problems get magnified,” he added.

This article was first published on InternetNews.com.
