Backing up data to the cloud sounds like a great idea, but then what? You’ve decided to take the plunge; now you have to start the journey. I covered other parts of backup and recovery in Part 1 and Part 2 of this series.
There are several factors to consider when making the switch to cloud storage. Cloud storage is ideal for quick recovery of data and long-term archiving, but is your infrastructure capable of handling it? Backing up to the cloud depends directly on the speed and capacity of your internet connection; large amounts of data need to move quickly from the backup servers to the cloud. Internet connectivity also determines how quickly large amounts of data can be restored, and whether that restore time will meet your requirements.
What Data Is Cloud-Worthy?
The first step in evaluating cloud backup is determining which data is a good candidate, that is, which data would actually benefit from being sent to the cloud. Data that is critical to the business and that may need to be restored quickly belongs in the first tier of cloud-worthy data. Then determine how often this data should be sent to the cloud.
Your current offsite tape rotation is a good starting point. In most cases this is a weekly rotation: a Friday full backup that is picked up Monday morning. With cloud backup, the data has the weekend to replicate to the cloud; the time window and the speed of the internet connection determine how much data can be sent offsite. If your internet connection isn’t fast enough, or is saturated, the backup may never finish. That causes problems not just for the backup jobs; it can also impact other services.
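To make that concrete, here is a minimal sketch of the window math. The 100Mbps uplink, 60-hour weekend window and 80% efficiency factor are illustrative assumptions, not measurements:

```python
# Hypothetical sketch: how much data fits through a given uplink
# during a backup window. All inputs are example assumptions.

def transferable_gb(uplink_mbps: float, window_hours: float,
                    efficiency: float = 0.8) -> float:
    """Rough GB that can be uploaded within the window.

    efficiency discounts for protocol overhead and competing traffic.
    """
    seconds = window_hours * 3600
    megabits = uplink_mbps * seconds * efficiency
    return megabits / 8 / 1000  # Mb -> MB -> GB (decimal units)

# A 100 Mbps uplink over a 60-hour weekend window
# (roughly Friday 6pm to Monday 6am):
print(round(transferable_gb(100, 60)))  # 2160 (GB, at 80% efficiency)
```

If the weekly change set exceeds what this returns, either the window, the uplink or the job size has to change.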
Once you have figured out the data to be protected and the frequency of the jobs, then you can start to estimate how much storage will be needed in the cloud. This will also help in estimating the costs. There are two parts in estimating cloud storage with most backup solutions: total storage and number of operations.
Total storage is how much actual room is needed to store data. This depends on how many backup jobs are run and how long they are retained. Most backup rotations run daily incremental or differential backups during the week, Monday to Thursday, which pick up any files, applications, databases or services that have changed. A full backup is done at the end of the week, usually Friday, since data is mostly at rest over the weekend. Full backups can be kept for several weeks, depending on your needs.
On the last day or last Friday of the month, another full backup is run to capture a monthly backup. Monthly backups are usually kept for at least 12 months and possibly longer. Finally, a yearly backup is run on the last day of the calendar or fiscal year; these can be kept for several years. The length of retention is determined by business needs and any compliance or regulatory requirements.
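The retention schedule described above translates into a simple storage estimate. The following is a hedged sketch assuming hypothetical 2TB full backups and placeholder retention counts; substitute your own job sizes and policy:

```python
# Hypothetical sketch of the retention math: total storage held once
# every retention tier is full. All figures are placeholders.

def retained_storage_tb(full_backup_tb: float,
                        weeklies_kept: int,
                        monthlies_kept: int,
                        yearlies_kept: int,
                        daily_job_tb: float = 0.0,
                        dailies_kept: int = 0) -> float:
    """Total TB of backups retained in cloud storage at steady state."""
    fulls = weeklies_kept + monthlies_kept + yearlies_kept
    return fulls * full_backup_tb + dailies_kept * daily_job_tb

# 2 TB fulls: keep 4 weeklies, 12 monthlies, 3 yearlies,
# plus 4 daily differentials of 0.2 TB each.
print(round(retained_storage_tb(2.0, 4, 12, 3, 0.2, 4), 2))  # 38.8 (TB)
```

Note how quickly monthly and yearly retention dominates the total: the 19 retained fulls dwarf the daily jobs.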
Number of Operations
The second component in estimating cloud storage cost is the number of operations, typically counted as reads and writes of data to the cloud blob storage. Operations are executed each time a backup job sends data to the cloud storage account. They can be estimated by running several test backup jobs and recording the operation counts; dividing the number of operations by the amount of storage consumed gives the number of operations per GB (or per TB).
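As a small sketch of that ratio, using the test-job figures from the worked example in this article (8,360,000 operations recorded while backing up 2TB):

```python
# Minimal sketch: derive an operations-per-TB ratio from test backup
# jobs. The figures come from this article's worked example.

def ops_per_tb(total_operations: int, storage_tb: float) -> float:
    """Operations consumed per TB of backup data."""
    return total_operations / storage_tb

print(f"{ops_per_tb(8_360_000, 2.0):,.0f} ops/TB")  # 4,180,000 ops/TB
```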
In this example, we are backing up 2TB of data every week during the weekly backup job:
2TB backup data x (4 weekly + 1 monthly) = 10TB total storage
8,360,000 operations / 2TB backup data = 4,180,000 operations/TB
(10,000GB x $0.0208) + (4,180,000/10,000 x 10 x $0.05) = $208.00 + $209.00 = $417.00
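The same estimate can be scripted so it is easy to rerun as job sizes change. This sketch assumes the $0.0208 per GB-month and $0.05 per 10,000 operations unit prices used in the example are still current; always confirm rates against your vendor's calculator:

```python
# Hedged sketch of the monthly cost estimate: storage charge plus
# operations charge. Unit prices are assumptions from the example.

def monthly_cost(storage_gb: float, ops_per_tb: float,
                 price_per_gb: float = 0.0208,
                 price_per_10k_ops: float = 0.05) -> float:
    """Estimated monthly cost in dollars for blob-backed backups."""
    storage_cost = storage_gb * price_per_gb
    total_ops = ops_per_tb * (storage_gb / 1000)   # GB -> TB
    ops_cost = total_ops / 10_000 * price_per_10k_ops
    return round(storage_cost + ops_cost, 2)

# 10 TB of retained backups at 4,180,000 operations per TB:
print(monthly_cost(10_000, 4_180_000))  # 417.0
```

Notice that operations contribute roughly as much as raw storage here; chatty backup software can make operations the larger line item.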
Microsoft and most other cloud vendors provide a calculator to estimate pricing.
These calculators reflect the latest pricing, which changes frequently, and they also help determine the right type of storage to use. The standard storage type is blob, but there are different regions in which to store the data and different options for redundancy and encryption. Each vendor has its own recommendations on the correct configuration.
There are ways to replicate data to other geographies or regions to protect against the loss of a cloud data center. In most cases this feature doubles the storage cost while keeping the operations cost the same. When setting up cloud storage, pick a region close to the backup server to reduce latency and improve internet performance; this also helps with restores.
Most cloud vendors also offer different tiers of storage, typically hot and cool. Hot storage keeps your data readily available, with no delay when retrieving it. Cool storage is intended for data that is accessed infrequently or archived for long periods. Its performance is similar to hot storage and data remains readily available, but it is less expensive, which makes it the recommended tier for backups. For short retention periods, cool storage offers few advantages over hot storage; the benefit comes when backups are stored for several years.
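To see why the tier choice matters mainly for long retention, here is a hedged comparison. The hot-tier price comes from the worked example earlier in this article; the cool-tier price is a placeholder assumption for illustration only, since real tier pricing varies by vendor and region:

```python
# Hypothetical hot vs cool tier comparison over a long retention
# period. COOL_PER_GB is an assumed placeholder price, not a quote.

HOT_PER_GB = 0.0208    # $/GB-month, from the example in this article
COOL_PER_GB = 0.0152   # $/GB-month, assumed for illustration

def retention_cost(gb: float, months: int, price_per_gb: float) -> float:
    """Cumulative storage cost of keeping gb for the given months."""
    return round(gb * price_per_gb * months, 2)

# A 2 TB yearly backup retained for 5 years (60 months):
gb, months = 2_000, 60
print(retention_cost(gb, months, HOT_PER_GB))   # 2496.0
print(retention_cost(gb, months, COOL_PER_GB))  # 1824.0
```

Over a few weeks the gap is pocket change; over years of retained monthlies and yearlies it compounds into a meaningful saving.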
Is Your Internet up to the Task?
Now that we know what data will be backed up to the cloud and how much, we can figure out whether our internet connection can support a typical backup cycle. For instance, if 1TB of data is sent to the cloud every Friday night and the backup window runs until Sunday, the upload speed will need to be at least 50Mbps. This doesn’t take into account other applications that may also need to upload during this time. This calculator will help determine how much data can be sent to the cloud during your backup window.
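The bandwidth check can be sketched as follows; the 1TB payload and roughly 48-hour Friday-to-Sunday window mirror the example above:

```python
# Sketch of the bandwidth check: minimum sustained upload speed
# needed to move a given amount of data within the backup window.

def required_mbps(data_gb: float, window_hours: float) -> float:
    """Minimum sustained Mbps to upload data_gb within window_hours."""
    megabits = data_gb * 1000 * 8          # GB -> MB -> Mb (decimal)
    return round(megabits / (window_hours * 3600), 1)

# 1 TB uploaded between Friday night and Sunday night (~48 hours):
print(required_mbps(1_000, 48))  # 46.3 -> budget at least 50 Mbps
```

This is a floor, not a target: leave headroom for other traffic sharing the link during the window.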
Depending on the available internet connection, the size of the data being backed up or the frequency of the jobs may need to be adjusted to fit within the backup window, so as not to impact other applications.
During the first few weeks after enabling cloud backup, keep track of usage and operations to determine whether actual costs are staying within budget. Adjustments to retention and to the frequency of backup jobs have a cumulative effect on the cost of cloud storage. By properly classifying data and estimating storage requirements, you can avoid unpleasant surprises.
In the next part of this series, I will explain how to configure Veritas Backup Exec to use Microsoft Azure storage.