Synology Hyper Backup: Finish large backups despite connection loss
With release 6.0 of Synology DSM there is a new backup tool called “Hyper Backup”. It can be used to schedule backup tasks to different cloud targets. The first, initial backup of a source can take several days to upload, depending on data size and upload rate. Too bad if your connection resets during the upload due to ISP behavior or technical problems: the backup gets stuck. But there is a workaround for that case.
If you try uploading a large set of source files to a cloud backup storage via “Hyper Backup”, you might encounter a problem where your backup gets stuck immediately after a connection loss. Especially when creating the initial backup set, which uploads all files available at the source, the job might need a few days to complete. During this long period it is very likely that a connection loss or a connection reset by your ISP will freeze the procedure. So you might think: “I’ll never get my data into the cloud.”
Don’t give up, there is a workaround.
To understand the workaround you first have to understand how “Hyper Backup” works. After the initial backup set (the first time you upload the source files to the backup destination), subsequent backup tasks only update or add altered or new data. This means the following backups are far smaller, as they only contain data that is not already at the destination. Depending on how your backup task is configured and scheduled, we might speak of incremental or differential backups.
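Hyper Backup’s actual change detection is internal to Synology, but the general idea of an incremental backup can be sketched in a few lines: take a snapshot of file metadata, then on the next run compare against it and pick up only new or altered files. The function names and the size/mtime comparison here are illustrative assumptions, not Hyper Backup’s real mechanism.

```python
import os

def snapshot(src):
    """Map each file path to (size, mtime) so later runs can detect changes."""
    state = {}
    for root, _dirs, files in os.walk(src):
        for name in files:
            path = os.path.join(root, name)
            st = os.stat(path)
            state[path] = (st.st_size, st.st_mtime)
    return state

def changed_files(prev, curr):
    """Return files that are new or altered since the previous snapshot."""
    return [path for path, meta in curr.items() if prev.get(path) != meta]
```

An incremental tool would upload only the result of `changed_files()`, which is why every run after the initial one is so much smaller.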
This behaviour is the key to our workaround. Most people suffering from this problem will not have it with their everyday backup tasks, but with the first, initial backup, as it is much larger, as explained above. So we only need to work around the initial backup task.
In my case, the destination for my backups is the cloud-storage provider “hubiC”, but this should work similarly with other providers.
My backup source is, for example, 250 GB, which cannot be uploaded within one day because of compression and encryption time. My ISP resets the Internet connection at 1 pm and the backup gets stuck.
Let’s say that my backup source, the share to back up, consists of 25 folders. If I try to upload all 25 folders at once, the job will fail, so we configure our backup task to initially back up only folders 1 to 3.
We start the backup task, and folders 1 to 3 are successfully uploaded in less than a day.
Now we reconfigure the backup task, adding folders 4 to 6, and start the backup again (folders 1 through 6 are now selected for upload).
As mentioned above, only new or altered files (in this case folders 4 to 6) will be uploaded. This step will also finish within one day.
Repeating this up to folder 25, the workaround lets us upload the whole source in parts. Once every folder has been uploaded, we go to our backup settings once again and select the whole share at its root level.
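The staging schedule above can be sketched as a growing selection: each run re-selects everything uploaded so far plus one new batch, and Hyper Backup only transfers the batch that is actually new. The folder names and the batch size of 3 are just the example numbers from this article.

```python
def staged_selections(folders, batch_size=3):
    """Yield the folder selection for each backup run: run 1 covers the
    first batch, run 2 the first two batches, and so on until all folders
    are selected. Only the newest batch is actually uploaded each run."""
    for end in range(batch_size, len(folders) + batch_size, batch_size):
        yield folders[:end]

folders = [f"folder{i:02d}" for i in range(1, 26)]  # the 25 example folders
runs = list(staged_selections(folders))
```

With 25 folders and batches of 3, this produces 9 runs; the last run selects the entire share, which matches the final step of switching the task back to the root level.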