Backup Splitting to custom sizes
Does anyone know how to split backup files to custom sizes? In the Backup Options, Backup splitting only allows fixed sizes (100 MB, 250 MB, 700 MB, and 4.7 GB). First of all, are we still in the '90s, talking Zip drives and CDs/DVDs for backups? The amounts of data today are exponentially higher.
Anyway, we back up about 20 VMs running on a three-host Hyper-V cluster, with full backups once per month and differentials daily. Some VMs' full backups are around 600 GB to 900 GB. We have no issues with TIB files that big on our local storage (NAS), as the job runs nice and quick in under 5 hours. The problem is that we then copy those files to Google Cloud Storage, and that is where we run into trouble: it takes days to upload those massive files, not because of our Internet speed (we have 500 Mbps up/down fiber) but because Google Cloud Storage seems to cap the upload speed at about 8 MB/s per file, sometimes dropping to 1 MB/s.
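Since the cap appears to be per file (i.e., per upload stream), the only thing that has helped so far is pushing several files at once. For anyone curious, here is a rough Python sketch of the kind of parallel copy I mean, using the google-cloud-storage client library; the bucket name and NAS path are just placeholders, not our real setup:

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

from google.cloud import storage  # pip install google-cloud-storage

BUCKET = "example-backup-bucket"      # placeholder bucket name
SOURCE = Path("/mnt/nas/backups")     # placeholder NAS mount

def upload_one(path: Path) -> str:
    # One client per worker thread; credentials come from
    # GOOGLE_APPLICATION_CREDENTIALS in the environment.
    client = storage.Client()
    blob = client.bucket(BUCKET).blob(f"monthly/{path.name}")
    blob.upload_from_filename(str(path))
    return path.name

files = sorted(SOURCE.glob("*.tib"))
# Eight parallel streams: each one is still capped individually,
# but the aggregate throughput scales with the number of streams.
with ThreadPoolExecutor(max_workers=8) as pool:
    for name in pool.map(upload_one, files):
        print("uploaded:", name)
```

Of course this only helps once there is more than one file per backup, which is exactly why I want the splitting.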
It is more efficient to upload several smaller files than one massive file. However, I do not want to be dealing with two hundred files for a single full backup, so 4.7 GB is out of the question; I want to split into 50 GB or, say, 100 GB files. Obviously this is not an option in the drop-down in the GUI (it should be!), so is there a way to do it from the command line?
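In the meantime, a manual workaround I am considering is splitting the finished TIB myself before the cloud copy. A minimal Python sketch, assuming a placeholder path and chunk size (and note the pieces would need to be re-joined byte-for-byte before Acronis could read them again):

```python
from pathlib import Path

CHUNK_SIZE = 50 * 1024**3   # 50 GB per piece; adjust as needed
BUFFER = 64 * 1024**2       # stream in 64 MiB buffers to keep RAM flat
SOURCE = Path("/mnt/nas/backups/monthly_full.tib")  # placeholder path

def split_file(src: Path, chunk_size: int) -> None:
    """Write src out as numbered chunk_size-byte pieces next to it."""
    with src.open("rb") as infile:
        index = 0
        while True:
            part = src.with_name(f"{src.name}.part{index:03d}")
            written = 0
            with part.open("wb") as out:
                while written < chunk_size:
                    buf = infile.read(min(BUFFER, chunk_size - written))
                    if not buf:        # hit end of the source file
                        break
                    out.write(buf)
                    written += len(buf)
            if written == 0:
                part.unlink()          # drop the empty trailing piece
                break
            index += 1

split_file(SOURCE, CHUNK_SIZE)
```

Re-assembly is just in-order concatenation (e.g. `cat monthly_full.tib.part* > monthly_full.tib`), so a restore would mean pulling all the pieces back down first. A native option in the product would obviously be much better.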


Thank you, that was useful! That option of entering a custom value really should be in the documentation, because it is not mentioned anywhere. I will try it at our next scheduled full backup.
Thanks again!