Create Content

Description

Describes performance recommendations for Cloud Backup.

Content / Solution:

Many factors contribute to backup performance. Simple things like the following can make a huge difference to how long a scheduled backup takes:

  • Number of CPUs in the machine 
  • Amount of RAM in the machine 
  • Available free disk space 
  • Disk speed and type (SATA, SAS or SSD) 
  • Network card configuration and speed 
  • Whether other applications are running on the server 
  • Whether AntiVirus has been correctly configured

These are some of the considerations to take into account when backing up your machine. When designing the Cloud Backup platform, Dimension Data took some of the above-mentioned factors into consideration; however, as servers are configurable by the customer through our self-service portal, some of these decisions need to be made by the customer.

Here is an example of a performance issue that can be fixed simply by changing the specification of the provisioned server:

  1. Windows 2008 R2 server 
  2. 1 VCPU 
  3. 2 GB of RAM 
  4. 100 GB C: drive 
  5. AntiVirus Agent installed 
  6. File System Backup Agent installed

With this configuration, backups can be expected to run for a long time because the machine has a low specification: a very limited amount of memory (RAM) and CPU. As a result, backups can take twice as long or longer to complete. If the vCPU count were increased to 4 and the RAM to 4 GB, this would improve the overall performance of the server and the backup performance as well.

This example has been seen many times on the Cloud Backup platform, and the recommendation has always been the same: the server resources are too limited and need to be increased.

Another consideration is that, before any data leaves the server, it needs to be compressed/deduplicated and encrypted, all of which takes processing power. If the server is underpowered, backups can take a very long time to complete. This also causes high utilisation on the server, which may impact the user experience.
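To illustrate why this stage is CPU-bound, here is a minimal Python sketch of the kind of per-chunk work a backup client does before data leaves the server. This is illustrative only: real backup agents use their own chunking, formats and real encryption, and `prepare_chunk` is a hypothetical helper, not part of any product.

```python
import hashlib
import zlib

def prepare_chunk(chunk: bytes) -> tuple[str, bytes]:
    """Illustrative only: fingerprint a chunk for dedup and compress it.
    Every byte passes through CPU-bound hashing and compression."""
    digest = hashlib.sha256(chunk).hexdigest()   # dedup fingerprint
    compressed = zlib.compress(chunk, level=6)   # CPU-bound compression
    return digest, compressed

# Highly repetitive sample data compresses well, but each byte still
# costs CPU cycles -- which is why a 1 vCPU server backs up slowly.
data = b"log line repeated many times\n" * 10_000
digest, packed = prepare_chunk(data)
print(len(data), len(packed))
```

On a server with few vCPUs, this work competes with the applications for the same cores, which is why raising the vCPU count helps backup times directly.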

AntiVirus

It is recommended that AntiVirus is enabled for incoming scans only. Disabling outgoing scans is highly recommended, as outgoing scanning has been observed to reduce backup performance by around 50% on average. (This figure reflects past observations and is not a guarantee for all servers.)

Disk space

It is also important to ensure that the server is not running out of disk space, as this can contribute to poor performance. Keep a minimum of 25% free disk space at all times to ensure the best possible performance, not only for backup but for general operating system and application performance as well.
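A quick way to check a volume against the 25% guideline is with Python's standard `shutil.disk_usage`. The helper names below are illustrative, not part of any Cloud Backup tooling:

```python
import shutil

MIN_FREE_PCT = 25  # the guideline from this article

def free_pct(total: int, free: int) -> float:
    """Percentage of the volume that is free."""
    return 100.0 * free / total

def meets_recommendation(path: str = "/") -> bool:
    # On Windows, pass a drive root such as "C:\\" instead of "/".
    usage = shutil.disk_usage(path)
    return free_pct(usage.total, usage.free) >= MIN_FREE_PCT

# Example: a 100 GB volume with only 20 GB free falls below the guideline.
print(free_pct(100, 20))  # → 20.0
```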

Disk Performance

Disk performance also matters when performing backups. If a system runs on the slowest available disk and the file system is quite large, the backup will take a long time to complete. Conversely, if the server runs a database or another application that requires high-performance disk to perform well, it is highly likely that the backup will perform well too. 
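As a rough sense of scale, the time for a full backup pass is bounded below by how fast the disk can be read. A sketch with illustrative throughput figures (not measurements from any particular platform):

```python
def hours_to_read(size_gb: float, throughput_mb_s: float) -> float:
    """Rough lower bound: time to read the whole file system once,
    assuming sustained sequential throughput in MB/s."""
    return (size_gb * 1024) / throughput_mb_s / 3600

# A 1 TB file system: slow SATA-class vs faster SAS/SSD-class throughput
# (both figures are assumptions for illustration).
print(round(hours_to_read(1024, 50), 1))   # → 5.8
print(round(hours_to_read(1024, 400), 1))  # → 0.7
```

Real backups add compression, dedup and per-file overheads on top of this floor, so slow disk is felt even more sharply in practice.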

Small Files

One of the biggest killers of backup performance today is small files. A small file is defined here as anything less than 500 KB. All operating systems and applications can have small files; what matters is how they are handled and whether they change on a regular basis. A good example is the temporary internet files in a user's profile. These files are very small, and there can be thousands of them depending on what function the server performs. If there are hundreds of thousands of these files, the backup system has to read and process each one individually rather than in large chunks of data. This is a good use case for filters: filter them out, as it is more than likely they will never be needed again.
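Before deciding whether filters are worthwhile, it helps to know how many small files a volume actually holds. A minimal sketch using only the Python standard library (the 500 KB threshold comes from the definition above; the demo directory is a throwaway):

```python
import os
import tempfile

SMALL_FILE_LIMIT = 500 * 1024  # 500 KB, per the definition above

def count_small_files(root: str) -> tuple[int, int]:
    """Return (small, total) file counts under root."""
    small = total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            total += 1
            if os.path.getsize(os.path.join(dirpath, name)) < SMALL_FILE_LIMIT:
                small += 1
    return small, total

# Demo on a temporary directory: two tiny files and one 1 MB file.
with tempfile.TemporaryDirectory() as root:
    for i in range(2):
        with open(os.path.join(root, f"tmp{i}.txt"), "wb") as f:
            f.write(b"x" * 100)
    with open(os.path.join(root, "big.bin"), "wb") as f:
        f.write(b"x" * (1024 * 1024))
    result = count_small_files(root)
    print(result)  # → (2, 3)
```

If the small-file count dominates, a filter (or a separate static volume, as described below) is usually the right response.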

If the server has a lot of small files that are needed for a specific reason, such as archiving, it is recommended that they be placed on a separate volume. If the files are static, with no new files being added after the first backup has been taken, the filter option can then be used to stop the backup from scanning that part of the file system on every run looking for changes. This also helps reduce the time taken for backups to complete.
Note: If files on this volume are changing, the filter option is not recommended, as it will stop those files from being backed up and in turn prevent the latest copies from being restored. Filters need to be managed carefully with your server admin.

Network Performance

Sometimes customers have servers at the end of a remote link that need to be backed up. This too can cause performance problems if large amounts of data must be shipped from one site to another. For example: a customer has a server holding 500 GB of data, but the link between the server's site and the location of the backup infrastructure is 10 Mbit/s. At that speed and utilisation, the backup can take weeks to complete. The recommendation would be to increase the link speed and/or back up only certain files; however, this needs to be evaluated on a case-by-case basis, and no two environments will have the same requirements.
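The arithmetic behind that example can be sketched as follows. Note the Mbit-to-MByte conversion, and that the utilisation figure below is an assumption for illustration, not a measured value:

```python
def transfer_days(data_gb: float, link_mbit_s: float,
                  utilisation: float = 1.0) -> float:
    """Best-case days to push data_gb over a link_mbit_s link,
    at the given fraction of usable bandwidth."""
    bits = data_gb * 1e9 * 8                       # GB -> bits
    seconds = bits / (link_mbit_s * 1e6 * utilisation)
    return seconds / 86400

# 500 GB over a fully dedicated 10 Mbit/s link:
print(round(transfer_days(500, 10), 1))        # → 4.6  (days, best case)
# If only 25% of the link is usable for backup (shared with other traffic):
print(round(transfer_days(500, 10, 0.25), 1))  # → 18.5 (days)
```

Even the best case is measured in days, and a shared link stretches the initial backup into weeks, which is why reducing the data set or upgrading the link is usually necessary.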

Enable Jumbo Frames on the Interface

Recommendation: Use the VMXNET3 adapter when you create the VM through the updated UI.

On Windows 2012R2/2008R2:

  • Run the following commands to see all the available interfaces
    • netsh interface ipv6 show interface
    • netsh interface ipv4 show interface
  • Run the following commands to set the MTU to 9000 and enable jumbo frames on the server, assuming the interface index used for data traffic is "10":
    • netsh interface ipv4 set subinterface "10" mtu=9000 store=persistent
    • netsh interface ipv6 set subinterface "10" mtu=9000 store=persistent
  • To verify the new MTU, run: netsh interface ipv4 show subinterfaces