
Best Practice to Back Up 20TB with Minor Data Changes per Day

Thread needs solution

Hi all,

I have a Windows server with soon-to-be 20TB of data, consisting of about 18 million files. In the past it was copied to a remote location by a robocopy/rsync script. As you can imagine, that script needs far too much time, since it has to check every single file to see whether its timestamp changed or whether new files were stored somewhere on that storage/directory. When creating not a file-based but a system/full backup with Acronis, will this help here? Does Acronis use a different approach, something block-based, to check much more quickly whether the bulk of the data was touched, instead of checking each and every file on the server and the remote storage? To make things worse, the target is an iSCSI device that can only be accessed over Gigabit, so that does not help much either.
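For reference, the script is essentially doing the equivalent of this (a simplified Python sketch of the timestamp-scan approach, not the actual script; the paths are placeholders):

    import os
    import shutil

    def sync(src, dst):
        # Walk every file under src and compare modification times.
        # With 18 million files this means millions of stat() calls
        # per run, no matter how little data actually changed.
        for root, _dirs, files in os.walk(src):
            for name in files:
                s = os.path.join(root, name)
                d = os.path.join(dst, os.path.relpath(s, src))
                if (not os.path.exists(d)
                        or os.path.getmtime(s) > os.path.getmtime(d)):
                    os.makedirs(os.path.dirname(d), exist_ok=True)
                    shutil.copy2(s, d)  # copies data and timestamps

    sync(r"D:\data", r"E:\backup")  # placeholder paths

So the cost per run scales with the number of files, not with the amount of data that actually changed.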

Thanks for your comments / ideas.

Cheers


Hello Flo!

Acronis also uses timestamps to figure out which files have changed, but according to the docs it also uses Changed Block Tracking (CBT). Here is the relevant bit:

The CBT technology accelerates the backup process. Changes to the disk or database content are continuously tracked at the block level. When a backup starts, the changes can be immediately saved to the backup.
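To give a rough idea of why that is faster: with block-level tracking the work scales with the number of changed blocks, not with the file count. Here is a minimal Python sketch of the block-diff idea (just an illustration, not Acronis's actual implementation; real CBT records writes at the driver level as they happen, so it never has to rescan untouched data):

    import hashlib

    BLOCK_SIZE = 1 << 20  # 1 MiB blocks, an arbitrary illustrative size

    def changed_blocks(path, known_hashes):
        # Hash the volume/file in fixed-size blocks and return only
        # the blocks whose hash differs from the previous backup's map.
        changed = {}
        with open(path, "rb") as f:
            index = 0
            while True:
                block = f.read(BLOCK_SIZE)
                if not block:
                    break
                digest = hashlib.sha256(block).digest()
                if known_hashes.get(index) != digest:
                    changed[index] = digest
                index += 1
        return changed

    # First run: empty map, so every block is "changed" (full backup).
    # Later runs: only the changed blocks need to be written out.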

-- Peter


Hello Flo,

Welcome to the Acronis forums!

We recommend that you enable Fast Incremental/Differential Backup and run a disk-level backup.