Best practice to back up 20 TB with minor daily data changes
Hi all,
I have a Windows server that will soon hold 20 TB of data, spread across 18 million files. Until now it has been copied to a remote location by a robocopy/rsync script. As you can imagine, that script needs far too much time to check every file's timestamp for changes and to find new files stored somewhere on that storage/directory tree.

Would creating not a file-based but a system/full backup with Acronis help here? Does Acronis use a different approach, something block-based, to check much more quickly whether the bulk of the data was touched, rather than checking each and every file on the server and on the remote storage? To make things worse, my iSCSI device can only be accessed over Gigabit Ethernet, so that does not help much either.
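What I have in mind is something like the rough sketch below (a minimal Python illustration of the general changed-block idea, not a claim about how Acronis actually works internally): hash the data in fixed-size blocks, keep the hashes from the previous run, and transfer only the blocks whose hash changed.

```python
import hashlib
from pathlib import Path

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB per block; the size here is an arbitrary choice

def block_hashes(path: Path) -> list[str]:
    """Hash a file (or a raw disk image) in fixed-size blocks."""
    hashes = []
    with path.open("rb") as f:
        # Read block by block so a 20 TB source never has to fit in memory.
        while chunk := f.read(BLOCK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def changed_blocks(old: list[str], new: list[str]) -> list[int]:
    """Return indices of blocks that differ from the previous run."""
    changed = [i for i, (a, b) in enumerate(zip(old, new)) if a != b]
    changed.extend(range(len(old), len(new)))  # blocks appended since the last run
    return changed
```

Of course this sketch still has to read all 20 TB to compute the hashes; as far as I understand, real backup products avoid even that by tracking changed blocks as the writes happen. Is that roughly what Acronis does?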
Thanks for your comments / ideas.
Cheers



Hello Flo,
welcome to the Acronis forums!
We recommend that you enable the Fast incremental/differential backup option and run a disk-level backup. A disk-level backup tracks changes at the block level instead of walking all 18 million files, and with the fast option enabled, changes are detected by file size and modification time rather than by comparing file contents against the backup, which shortens the change scan considerably.