Best practices for backups and archiving of paper and electronic data: one cannot rely on a weekly backup to a USB drive or tape. It is not a case of if such a system will fail, but when.
According to Jonathan Young, Microsoft product manager at Vox Telecom, best practice in terms of backups and archiving is the ability to restore whatever you need in terms of your business requirements. “However, not all data and documents are equally important, so one needs to ascertain levels of importance in advance.”
JJ Milner, MD of Global Micro, added that companies need to reach agreement on the risks and the cost of instituting a comprehensive backup and archiving system. “Essentially it boils down to a trade-off between data lost and capital expenditure. Some companies say they simply do not have the time to quantify and qualify their risks, and in these instances they have no choice but to back everything up. However, I firmly believe it is more cost-effective to spend time classifying a business’s data according to value and risk of loss, so that a tiered approach to backup can be defined.
“Tiering allows a business to optimise its spend between mission-critical applications with high-speed (and therefore more expensive) recovery time objectives and long-term archiving, which is not as time sensitive. Algorithms can be written to, for example, automatically move data to a lower tier after a predetermined period based on its age, version or current legislation. By prioritising and moving data in this way, one is able to store large quantities of data at significantly lower cost,” Milner explained.
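The age-based tiering rule Milner describes can be sketched as a simple policy function. The tier names and thresholds below are illustrative assumptions, not Global Micro’s actual configuration:

```python
from datetime import date

# Illustrative thresholds -- a real policy would also weigh version,
# access frequency and legislative retention requirements.
TIER_THRESHOLDS_DAYS = {"hot": 30, "warm": 365}  # beyond 365 days -> archive

def assign_tier(age_days: int) -> str:
    """Map a file's age to a storage tier (hypothetical tier names)."""
    if age_days < TIER_THRESHOLDS_DAYS["hot"]:
        return "hot"      # high-speed, expensive storage
    if age_days < TIER_THRESHOLDS_DAYS["warm"]:
        return "warm"     # mid-range storage
    return "archive"      # cheap, slower long-term storage

def tier_for_file(created: date, today: date) -> str:
    """Decide a file's tier from its creation date."""
    return assign_tier((today - created).days)
```

Run periodically over the catalogue, a function like this is what moves data down the tiers automatically as it ages.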
Young said one needs to differentiate between consumer and enterprise backups and provide packages that suit the specific needs of the individual or business. It is also important to note that while anyone can back up data with the multitude of products available, this does not guarantee that the data can be recovered and restored; the trick lies in being able to restore it. To contain costs while supplying customers with a quality service, duplication of data should be avoided and files should be compressed.
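Young’s two cost-containment measures, avoiding duplication and compressing files, can be illustrated with a content-addressed store: identical chunks are detected by hash and stored only once, compressed. This is a minimal sketch of the idea, not any vendor’s actual implementation:

```python
import hashlib
import zlib

def store_chunk(chunk: bytes, store: dict) -> str:
    """Save a chunk under its content hash; duplicates cost nothing extra."""
    digest = hashlib.sha256(chunk).hexdigest()
    if digest not in store:                   # deduplication: skip known content
        store[digest] = zlib.compress(chunk)  # compression before storage
    return digest

def restore_chunk(digest: str, store: dict) -> bytes:
    """Recover the original bytes -- the real test of any backup."""
    return zlib.decompress(store[digest])
```

Backing up the same document twice adds nothing to the store, and every restore is verified against the hash it was stored under.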
Milner pointed out that one should not forget the importance of comprehensive archiving. “Backups are obviously critical, but one needs to be able to actually access the data one requires, quickly. Archiving mitigates risk and allows one to find important documentation even when the employees responsible for its collection leave the company.
“One requires a central, searchable repository which will provide a high degree of control. This will also reduce the cost of having to reproduce data repeatedly when the original cannot be found, and it will ensure compliance with regulatory and legal requirements,” he added.
Neither Milner nor Young is keen on non-IP storage of data. “One simply cannot rely on a weekly backup to a USB or tape. It is not a case of if this system will fail or break, but when. Given that stored data is generally important data, there is no time when failure is opportune,” said Young.
Milner added that theft of media is another overt risk associated with offline physical storage of data. “Backing data up in this manner is also a time-consuming process and results in a period where data updates are unprotected.”
The alternative to physical storage of data on various media is Internet-based or cloud storage. “Although South Africa tends to lag behind the world in terms of technology adoption, once a trend starts it accelerates at a rapid pace,” said Young. “With the cost of bandwidth and hosting backups reducing, offsite cloud-based storage is fast becoming the method of choice. This has many benefits, including a safety net for other modes of storage, speed of backup and shifting of risk from a single physical environment onto multiple servers.”
Young said that by moving data offsite in a corporate environment, one is effectively taking the first step towards disaster recovery. “The level of deduplication and compression achievable over the Internet is highly favourable and allows a greater level of protection.”
Milner detailed the procedure for moving data from an onsite to an offsite storage environment:
* Discovery. Decide what data needs to be backed up. At this stage a statistical or mock backup occurs. This allows companies to ascertain how long the actual backup will take and allows one to predetermine specific storage categories.
* The initial backup takes place.
* Incremental backups of changed data are then sent to the data centre over time.
* Policies are set up to move the data to different tiers or to delete data.
* Certificates are issued to prove that data has been deleted. This is critical as it ensures that no assumptions are made about sensitive data that has not actually been deleted. The recently approved Protection of Personal Information (POPI) Bill takes a very strict view of businesses that fail to implement effective information destruction practices.
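The deletion-certificate step above can be sketched as a tamper-evident record: a hash computed over the deletion details lets an auditor detect any later alteration. The field names and the use of SHA-256 here are illustrative assumptions, not a description of any provider’s certificate format:

```python
import hashlib
import json

def deletion_certificate(path: str, content_sha256: str, deleted_at: str) -> dict:
    """Issue a record asserting a file was destroyed, sealed with a hash."""
    record = {"path": path, "sha256": content_sha256, "deleted_at": deleted_at}
    body = json.dumps(record, sort_keys=True).encode()
    record["certificate"] = hashlib.sha256(body).hexdigest()
    return record

def verify_certificate(cert: dict) -> bool:
    """Recompute the seal; editing any field changes the hash."""
    body = {k: v for k, v in cert.items() if k != "certificate"}
    expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return expected == cert["certificate"]
```

A certificate like this does not prove the bytes are gone by itself, but it commits the provider to a verifiable statement of what was deleted and when.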
Securing your data
So how do you ensure the security of your sensitive data on a cloud-based storage solution? “One has to make the assumption that nobody – your employees and service provider included – can be trusted with the security of your data. The secret is for selected management to hold the encryption key to access data. By ensuring that clients retain their encryption keys, service providers cannot be compelled by the courts to hand over data,” said Milner.
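The principle Milner describes, that the client alone holds the key so the provider stores only ciphertext it cannot read, can be illustrated with a toy stream cipher built from SHA-256. This is a teaching sketch only; a production system would use a vetted cipher such as AES-GCM, not this construction:

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive pseudo-random bytes from key + nonce + counter (toy KDF)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """What the client runs before upload; only ciphertext leaves the site."""
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Without the client's key, the stored blob is unreadable."""
    nonce, ciphertext = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

Because encryption happens before upload and the key never leaves the client, the service provider holds data it cannot decrypt and so cannot be compelled to disclose in readable form.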
He added that in order to ensure your data is truly safe, it is advisable to perform due diligence on the service provider and ask for references and guarantees that the company can restore data.
“Another noteworthy point is that the service provider should not be merely a backup vendor, but rather should be prepared to spend the time and effort necessary to understand what their client is backing up. One standard we apply when storing data for clients is the Federal Information Processing Standard (FIPS). Reputable service providers will be willing to divulge their methodologies and compliance with standards,” said Milner.
With regard to employees, Milner pointed out that hardware failure accounts for only 3% of data loss. “Human error is by far the most common cause of data loss. You need to find a backup platform that does not require a software download or human intervention for updates. Systems should be accessible on the Internet and updates should be programmed to occur automatically, based on predetermined criteria.”
E-mail or Eekmail?
“Over the past two decades e-mail has become a fundamental method of communication. But with the rapid increase in data growth, companies are faced with the challenge of providing their users with long-term compliant storage,” said Christelle Hicklin, customer experience manager at Mimecast.
“Due to the sheer volume of e-mails generated, coupled with the legislation around authenticity and integrity of e-mails, it is essential to have a unified archiving system in place. An overarching perspective is needed to streamline, rather than complicate, the archiving of e-mails. Archiving systems quickly lose value if documents cannot be found effortlessly,” she added.
“This necessitates a refinement of search techniques and is particularly relevant in cases of e-discovery (electronic discovery), which refers to any process in which electronic data is sought, located, secured and searched with the intent of using it as evidence in a civil or criminal legal case,” Hicklin explained.
She believes that the cloud is the perfect solution for offsite storage and archiving of e-mails. “There is no upfront investment and one gains access to the latest technology. The disaster recovery perspective is far superior to onsite storage methods and e-mails can be archived in triplicate, with 32-bit encryption, for full assurance.
“Other advantages include the fact that solutions are scalable, with a pay-per-use option allowing any size of business the affordability of offsite e-mail storage and archiving. Data is always secure, since the encryption key is held by the customer and there is a complete audit trail so companies can track who is trying to access their e-mails,” she added.
Hicklin said that in the past, archiving of data was an end in itself. “New methodologies mean that it is now a means to an end. You can now acquire value from your data as the system consolidates documents across all e-mail platforms into a single feed. This puts all data into one place where you can quickly find the material you need,” she concluded.
Accountability is extremely important in terms of the management of paper documents, according to Guy Kimble, MD of Metrofile Records. “One person in a company should be made responsible for the control and understanding of any paper archiving system. The procedures to follow are: (a) Discovery – what are the records to be archived, what are they used for and how often they are used; (b) Do a classification of the records; (c) Create a filing system.”
Kimble added that initially one should create a written database of the files for archiving and later add them into a software program. “For business success, one needs to identify and determine accessibility rights then allow ease of access for either one or multiple people. In addition, there is a need to understand the legal and accountability factors related to storing and archiving paper documents.
“A records management solution protects the business, its employees, shareholders, customers, suppliers and future stakeholders. It also provides continuity for disaster recovery. By implementing the archiving of documents, companies effectively protect their intellectual property, resulting in sustainability of operations. In addition, the system provides protection in terms of legal procedures, the Labour Relations Act, and occupational health and safety (OHS) regulations, for evidentiary purposes,” Kimble explained.
In terms of the deletion or destruction of documents, Kimble said that only once you have ascertained what records need to be retained, and why, can you make an informed decision on how long to keep them before they are destroyed. The retention period is then linked to a file plan and the records management system.
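Kimble’s link between retention period and file plan can be sketched as a lookup from record class to retention years. The classes and periods below are placeholders; actual periods come from the relevant legislation and the company’s own file plan:

```python
from datetime import date

# Placeholder retention periods -- real values come from legislation
# and the company's file plan, not from this sketch.
RETENTION_YEARS = {"payroll": 3, "tax": 5, "contracts": 7}

def destruction_date(record_class: str, created: date) -> date:
    """Earliest date a record may be confidentially destroyed."""
    years = RETENTION_YEARS[record_class]
    return created.replace(year=created.year + years)

def may_destroy(record_class: str, created: date, today: date) -> bool:
    """An informed destroy/retain decision, driven by the file plan."""
    return today >= destruction_date(record_class, created)
```

Driving destruction decisions from a table like this, rather than from individual judgment, is what makes the retention policy auditable.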
“Records should be destroyed in a confidential manner. This can be conducted onsite, preferably using a cross-cut shredder, which shreds both horizontally and vertically. Another machine rips the paper until it resembles pulp. Both methods result in waste that cannot be pieced back together to form the original document. Offsite destruction follows a similar procedure,” said Kimble.
Companies will need to review their records management systems in order to safeguard individuals and comply with POPI, said Kimble. “Information can only be used for its original intention and complete audit trails need to be in place. My advice would be for companies to leave the setup of archiving systems to companies that specialise in this. The best systems are simple to administer and need not be expensive. A reputable service provider will streamline the process from the outset, thereby eliminating duplication and reducing unnecessary time wastage.”
Tel: +27 11 543 5800
Fax: +27 11 787 8052
© Technews Publishing (Pty) Ltd | All Rights Reserved