By Ian Hamilton
In the aftermath of the September 11 terrorist attacks, concern about the security of both business and government computer systems has grown. How safe is sensitive data? And when systems are destroyed in a disaster, how do organizations recover?
Many businesses that require disaster-tolerant systems maintain a duplicate of all primary-facility data at a remote data facility. Typically, the remote copy is synchronized in near real time: disk storage systems map files and databases into tracks and blocks, and each time a track or block is written to the local disk, it is also written to the remote disk.
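As a rough illustration of block-level mirroring, the sketch below writes each block to a local and a remote volume before the write is considered complete. Two local files stand in for the volumes here, and the block size and file names are illustrative assumptions, not any vendor's actual layout.

```python
import os

BLOCK_SIZE = 4096      # bytes per block; illustrative only
VOLUME_BLOCKS = 1024   # size of the toy volumes

# Pre-allocate two small files standing in for the primary and remote volumes.
for path in ("primary.img", "remote.img"):
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.truncate(VOLUME_BLOCKS * BLOCK_SIZE)

def write_block(primary, remote, block_num, data):
    """Write one block to the primary volume, then mirror it to the remote.

    The write counts as complete only when both copies succeed, which is
    what keeps data loss minimal if the primary facility is destroyed.
    """
    for volume in (primary, remote):
        volume.seek(block_num * BLOCK_SIZE)
        volume.write(data)
        volume.flush()

with open("primary.img", "r+b") as primary, open("remote.img", "r+b") as remote:
    write_block(primary, remote, block_num=7, data=b"x" * BLOCK_SIZE)
```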
These storage systems ensure minimal data loss in the event of a disaster, but not without substantial system and telecom costs. When the remote facility is a significant distance from the primary facility, the replication link often requires DS3 connectivity, with costs in the range of $20,000 per month for a cross-country circuit.
In less critical situations, a commonly employed disaster recovery technique is offsite storage of backup media. In the event of a disaster, the backup media are used to load data onto a remote system. Recovery takes substantial time with this approach, and all updates made since the last backup are lost.
Another approach, used where data replication is moderately critical, is scripted FTP. One financial services firm noted that it runs around 100 FTP scripts each night, collecting information from departmental file servers and bringing it back to corporate file servers for backup and disaster recovery. Supporting this system requires a substantial IT staff and considerable network bandwidth, since entire copies of the file systems are transferred.
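To make the maintenance and bandwidth burden concrete, here is a minimal sketch of one such nightly script, with a hypothetical host, credentials, and staging directory. Every file is pulled in full each night, whether or not it changed.

```python
import os
from ftplib import FTP

DEPT_HOST = "dept-fileserver.example.com"  # hypothetical departmental server
LOCAL_DIR = "/backups/dept"                # hypothetical staging directory

ftp = FTP(DEPT_HOST)
ftp.login("backup", "secret")              # hypothetical credentials
os.makedirs(LOCAL_DIR, exist_ok=True)

# Pull every file in the remote directory, in full, every night.
for name in ftp.nlst():
    with open(os.path.join(LOCAL_DIR, name), "wb") as f:
        ftp.retrbinary("RETR " + name, f.write)

ftp.quit()
```

Multiply that by a hundred departmental servers and the staffing and bandwidth costs become clear.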
So, how do mid-sized and smaller companies protect critical data without busting their IT budgets? Using a trusted data transfer service, they can replicate entire file systems with fewer human resources and significantly less bandwidth.
Incremental transfers allow businesses to move only the portions of the file systems that differ between the source and target hosts. Synchronization can be scheduled as frequently as the underlying business requirements dictate, and it can be performed on portions of file systems, between heterogeneous systems, and on any combination of source and target hosts. Public networks can be used to carry the synchronization traffic without fear of data corruption or interception.
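Here is a minimal sketch of the incremental idea, assuming the source and target are locally mounted directory trees (a real transfer service works across a network and can diff within large files rather than copying changed files whole). Only files whose checksums differ are copied, which is where the bandwidth savings come from.

```python
import hashlib
import os
import shutil

def checksum(path):
    """Return the SHA-256 digest of a file's contents."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def sync(source, target):
    """Copy only files that are new or changed on the source side."""
    for root, _dirs, files in os.walk(source):
        for name in files:
            src = os.path.join(root, name)
            dst = os.path.join(target, os.path.relpath(src, source))
            if os.path.exists(dst) and checksum(src) == checksum(dst):
                continue  # unchanged, so no bytes need to move
            os.makedirs(os.path.dirname(dst), exist_ok=True)
            shutil.copy2(src, dst)

sync("/data/source", "/data/target")  # hypothetical paths
```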
Bottom line: All companies need a plan in place to protect organizational data. Disasters occur, and they come in all shapes and sizes. Whatever might happen, no organization can afford to let it take the business down.
Ian Hamilton is vice president of research and development for Signiant, Inc., a provider of trusted data transfer services for businesses.