Slow Performance for 2nd Full Backup
Hello. I'm trying to run backups to an external hard drive where I keep at least a couple of backups. My thought process there is if I just have one (Full+Incremental in one file) and the file gets corrupted, my backup isn't good. First I tried the "create full version after every N incremental versions" option. The first one took about 6 hours. No problem. Incrementals were very fast. No problem. Then when it got to creating the next full version, that ran for something like 48 hours! I saw this in the 2020 version and never got a good answer on how to fix this, so I wasn't completely surprised. I had actually switched to a different product for a while, and decided to come back to the 2021 version, hoping it had been resolved.
In my past experiments with 2020 I found that the mere presence of a previous backup caused this, even though it's supposedly doing a new full backup. So, I decided to try to outsmart it. I changed it to the regular Full+Incremental setting, and after a number of weeks I moved the backup to a different folder. I figured this would trigger it to do a new full backup and it would take the normal 6 hours or so. Much to my dismay, it took another 48 hours or so!
Any ideas to get around this? Am I the only person trying to keep two separate backup files? I'm fine with the time it takes initially -- I have a lot of data and I just let it run overnight. But two days is ridiculous. Other than this I really like ATI better than other backup software I have tried, but I'm getting to the point where this is going to drive me to dump it again and go to another solution.
Thanks to anyone who has a solution for this.



Hi, Steve,
Thanks for getting back to me. I, too, used ATI through many versions and only experienced the problem starting with the 2020 version. That was on a different computer. I spent weeks working with technical support, running dozens of experiments, and got nowhere until they said there must be something wrong with my Windows installation. At that point I gave up and switched to another backup product. This is a completely new machine, fresh install, etc., so I thought I'd try again. Same problem. I have two internal hard drives (one SSD, one HDD), and I'm backing up to two external USB 3.0 drives (on different days of the week). The first backup to each external drive took a reasonable number of hours given the size (about 2.8 TB for each backup file), and the incrementals are very fast, but subsequent full backups take more than four times as long as the first one did. That happened on each of the two backup jobs (one job per external hard drive).
As far as the backup task is concerned, it's a scheduled backup of the system (both hard drives), though I've verified that the same thing happens if I run it manually. I have encryption turned on, no backup splitting, and validation off. I have it set up to shut down when the job ends. Otherwise, I think it's pretty much all defaults.
I downloaded the tool you recommended and reviewed the Backup Worker logs. I found the original full backup log (a bit over 7 hours) and the latest one (close to 1-1/2 days). Both report errors initially saying it can't find the archive to open it. That makes sense, because the first one was the first one, and prior to running the latest one I moved the existing backup to another folder. Otherwise, neither reported errors. One difference I see between them is that the latest one has a bunch of progress entries (e.g., 10/17/2020 10:11:23:821 PM -07:00 22828 I00000000: Pid: 18660 id=1; progress=1) and the other doesn't. There's one approximately every 20 minutes, while the older log shows nothing for the 7+ hours it ran.
I compared the logs looking for something that jumped out. Most of the entries don't mean much to me, but I was looking for things that are noticeably different. One I see is umap writes (16933 pgs for the initial run, 65223 pgs for the latest). total req showed some differences in the sync time, but if that's a total it's only 6-7 minutes:
total req: 797132 (rd: 114482 wr: 682560 sync: 90), pgreq: 701260246 (rd: 2362502 wr: 698897744) sync: 1899.9 ms; [first backup]
total req: 814954 (rd: 128345 wr: 686227 sync: 382), pgreq: 705191997 (rd: 2700031 wr: 702491966) sync: 385634.9 ms; [latest backup]
And this jumped out at me because the numbers are quite different, though I don't know if it matters because I'm not sure what these are indicating:
wait stats: wr 12744032 rd 2364704 compr 4962509 decompr 0 (ms); [first backup]
wait stats: wr 113446971 rd 13712285 compr 14825561 decompr 0 (ms); [latest backup]
In general when I compare them, I see a lot of time differences. Just as an example, 396 ms average segment_map merges vs. 1662 ms. But the differences I see in the totals are 5 minutes here, 10 minutes there, etc., and don't come anywhere near explaining more than 24 extra hours to do the backup.
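In case it helps anyone compare these lines, here is a small Python sketch that pulls the wait totals out of the two "wait stats" lines quoted above and converts them to hours. This is only my reading of the counters; the "(ms)" suffix suggests they are cumulative waits in milliseconds, but I don't know exactly what the backup worker counts in them.

```python
import re

# The two "wait stats" lines quoted above, copied verbatim from the logs.
first = "wait stats: wr 12744032 rd 2364704 compr 4962509 decompr 0 (ms);"
latest = "wait stats: wr 113446971 rd 13712285 compr 14825561 decompr 0 (ms);"

def wait_stats_hours(line):
    """Pull the wr/rd/compr wait totals (in ms) out of a 'wait stats' line
    and convert them to hours."""
    m = re.search(r"wait stats: wr (\d+) rd (\d+) compr (\d+)", line)
    ms = dict(zip(("wr", "rd", "compr"), map(int, m.groups())))
    return {k: v / 3_600_000 for k, v in ms.items()}

for label, line in (("first backup", first), ("latest backup", latest)):
    hours = wait_stats_hours(line)
    print(label, {k: f"{v:.1f} h" for k, v in hours.items()})
```

If that reading is right, the write wait alone goes from roughly 3.5 hours in the first run to roughly 31.5 hours in the latest, which is at least in the same ballpark as the extra elapsed time, even though I still don't know what is causing it.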
Any clues or suggestions on things I can try?
Thanks.
Glenn

Glenn, the first recommendation I would make is to split your backup into 2 separate tasks, one for each of your installed drives, i.e. one for the SSD and a separate task for the HDD. This will have the benefit of reducing the overall size of data per task and will also let you see whether the slow performance is focused on the larger drive or affects both.
ATI 2020 and later use a Changed Block Tracker (CBT) method to identify the data for subsequent backups, but that should mainly apply to the Incremental backup slices being created rather than to creating a new Full backup.

Hello Glenn Claudi-Magnussen,
For large amounts of data, a new backup task that uses the .tib format should be created.
https://forum.acronis.com/forum/acronis-true-image-2020-forum/how-creat…

Thanks, Steve and G.
On my old system I also had two drives (one SSD, one HDD) and I used separate backups, and the same issue happened on both. As I recall I even tried just backing up a single folder (large enough to take time, but much less than the whole drive and therefore faster), and saw the same thing. But I can test it.
I'll also try the tip to back up to tib instead of tibx.
One other detail I remembered from all my experiments on the old system with ATI 2020. When it was running a second full backup, I was able to see that it was doing a large amount of reads of the first full backup file. I never understood that, because if it's a full backup it should be independent of previous backups. The tech support folks never had an explanation for that.
Thanks. It'll take me a while to run these experiments and see what happens.
Glenn

Hello, again.
After much experimenting, I'm consistently getting the same thing I did on the old computer that had ATI 2020. It doesn't matter whether I'm doing a full system backup or backing up one drive. The first full backup is a lot faster than the second. I turned off encryption to rule that out. No validation. If I try moving the backup files to trick it into thinking it's starting from scratch, it seems to confuse it and take even longer (I'm totally baffled by this).
One option I see is to set up the backup as a full + incremental with no subsequent full backups, then put a reminder in my calendar to go manually unschedule that job and create a new job to a different folder. Kind of a pain. Or I can give up and switch to TIB instead of TIBX. The Acronis site talks about benefits of the TIBX format over TIB, which is the only reason I am hesitating to set them up as TIB.
Glenn

Cross reference to forum topic: Full backups: First one completes quickly, subsequent backups take 2x to 4x as long

Glenn,
Can you post your backup task configuration information? From your posts we know that you are using an incremental backup method and are currently doing so on individual disks.
It would help to know:
- Is the backup run daily?
- How many incremental files are created before a new full version?
- Is Automatic cleanup enabled, and if so, how many versions is cleanup set for?
By default an incremental method backup creates 5 incremental backup versions and then a full backup version. The default version chain is set for 183 days, at which time a cleanup of the backups created would occur. This means that, assuming a daily backup, a total of 183 files must be created before a cleanup of old backups takes place, of which approximately 37 will be full versions. This obviously equates to a large amount of disk space being used for backup storage, but it also means that each new backup file created will be slower than the previous one by virtue of metadata processing of all files in the backup chain.
In Steve's first post to this thread he stated that he does not experience the slow behavior that you do. He also states that his backup tasks are set to create 5 incremental versions and then a full version, but are limited to a total of 2 chains before cleanup occurs. This means that his total file count is 13 prior to cleanup. At that point his task configuration starts over, all previous backup files are removed, and the task starts anew. With far less data involved, this task configuration completes in acceptable time.
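Just to sanity-check that arithmetic, here is a tiny Python sketch. It is my own reading of the scheme settings (one full plus five incrementals per chain, and one backup per day for the 183-day default), so treat the exact counts as approximate:

```python
# Back-of-envelope check of the version-chain arithmetic above.
incrementals_per_chain = 5                      # "5 incremental versions then a full"
files_per_chain = incrementals_per_chain + 1    # full + incrementals

# Default scheme: chain kept for 183 days, assuming one backup per day.
default_days = 183
print("files before default cleanup:", default_days)
print("full versions in that window: roughly",
      default_days // files_per_chain, "to", default_days // incrementals_per_chain)

# Steve's scheme: only 2 chains kept before cleanup.
chains_kept = 2
print("files in Steve's scheme:", chains_kept * files_per_chain,
      "(plus the new full that triggers cleanup, i.e. 13)")
```

Running this gives roughly 30 to 36 full versions for the default 183-day chain (depending on how you count the full in each cycle) and 13 files for Steve's two-chain configuration, which matches the figures above.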
Have you tried setting up your backup task in the same manner as Steve's? If not, you should try it and see if things improve for you.

Hello.
On this computer I initially set up the backups exactly the same as Steve described -- full, followed by 5 incrementals, then another full, etc., with two sets kept (for space reasons). This is set up to run weekly. When it got to the second full backup after 6 weeks, that took 3-4 times as long as the initial full backup (1-1/2 days vs. about 7 hours). I have two different jobs set up exactly the same except pointing to two different USB drives, and scheduled on different days. When the second job came to the point of doing its second full backup, the same thing happened. I haven't actually gotten to the 13th week of this, so there hasn't been any automatic cleanup yet.
As part of trying to troubleshoot and test this, I have been running with no schedule. I just set up a backup task with only full backups and no automatic cleanup. Manually run it, then manually run it again. Same thing. The second run takes 3-4 times as long. I've tried a different external drive on a different USB port -- same thing. I've tried setting it to back up just my C: drive, not the whole system -- of course that's faster, but the second run is 3-4 times as long as the first.
Verification is off. I've tried with and without encryption. I've tried with and without compression. It doesn't matter.
Oddly, when I move, rename, or manually delete (via Windows Explorer) the backup files and run again, it seems to create a fresh backup but most of the time that also takes 3-4 times as long. I was hoping I could just move or rename the backups every 6 weeks as a workaround.
The only thing I have found that works is to delete the task and set up a new one.
This is the second computer I had this same thing happen on. I was running ATI 2019 on my previous computer and this started happening when I upgraded to ATI 2020. After weeks of similar experiments, the tech support people told me it was a problem with my windows files and gave up. I switched to a different backup software at that point. When I got this new computer (which was set up completely fresh, so nothing but data moved from the old one), I decided to try ATI again and even upgraded to ATI 2021, hoping the problem had been solved.
At this point I've run dozens of full backups to try to troubleshoot this, all to no avail. Two different computers, at least four different external drives, and every option that seems to have any hope of helping. It always does the same thing. I'm quite puzzled that everyone doesn't see the same effect. I guess I can switch backup software again (I like ATI better, but can't have backups taking 1-1/2 days) or I can manually create new jobs every 6 weeks so it effectively never creates a second full backup on the same job. Or I can follow the process that was posted in this thread and force it to do the backups as TIB files instead of TIBX. My hesitation there is that I've seen a page from Acronis that describes TIBX as more reliable, which is obviously a good thing for backups.
Thanks.
Glenn

Glenn, the one factor that seems to be involved in this issue is the much larger size of the source data being backed up.
I am not seeing any obvious issues for my own disk backups using .tibx files but these are all under 100GB in size, usually around 50 - 80 GB and take minutes rather than hours to run.

Hi, Steve. As part of my testing last weekend I set up a disk that only had 100 GB of data and used that for testing, but it was harder to see a consistent difference. Even two fresh backups of this took somewhat different amounts of time. However, when I run with my SSD that has 1.2 TB used, it's quite apparent. Perhaps the size is a factor.
I don't think I mentioned this before, but when I did lots of testing on my previous computer with ATI 2020, I looked closely at reads and writes on the destination disk. What I found was that when it ran the second full backup there was a lot more disk read activity on the destination disk than when it ran the first full backup. We're talking many times the number of reads from the destination vs. the first backup. I think even though it is doing a full backup, it must be reading the earlier full backup for some reason. I don't see why it would, but that could explain the performance difference.
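For anyone who wants to reproduce that observation, a rough way is to poll the per-disk I/O counters while the backup runs. This is just a sketch: "PhysicalDrive2" is a placeholder, so check psutil.disk_io_counters(perdisk=True).keys() first to find the name of your external USB disk, and stop the loop with Ctrl+C.

```python
import time
import psutil  # third-party: pip install psutil

DEST_DISK = "PhysicalDrive2"  # placeholder: the destination (external) disk

# Print how much was read from and written to the destination disk each minute.
prev = psutil.disk_io_counters(perdisk=True)[DEST_DISK]
while True:
    time.sleep(60)
    cur = psutil.disk_io_counters(perdisk=True)[DEST_DISK]
    read_mib = (cur.read_bytes - prev.read_bytes) / 1_048_576
    write_mib = (cur.write_bytes - prev.write_bytes) / 1_048_576
    print(f"last minute: read {read_mib:.0f} MiB, wrote {write_mib:.0f} MiB")
    prev = cur
```

If the second full backup really is re-reading the first full backup file, the read column for the destination drive should be much larger during that run than during the first one.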
Thanks.
Glenn

Glenn,
In addition to what Steve mentions about the size of the backup, the fact that your backup chain is so long (13 weeks) with large amounts of data compounds the size issue.
Cleanup needs to happen much more frequently to lessen the amount of processing that occurs during a backup. The more data involved in processing, the slower the application works.
I recommend that you set up new tasks that perform cleanup after, say, 4 weeks rather than 13 and see what your results are.

Glenn, I suspect that the Acronis Changed Block Tracker (CBT) mechanism is involved in this issue and would trigger the additional disk activity.
I posted some of my recent testing in forum topic: Full backups: First one completes quickly, subsequent backups take 2x to 4x as long where I used a Post Command to force Full backups to be created each time the task was run, and in that case my 2nd full backup was actually faster than the first one! The approach with the Post Command only works for Full backups, not for where Incremental or Differential backups are used.
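Purely as an illustration of the idea (not necessarily the exact Post Command used in that topic), a post-backup script along the following lines could sweep the finished .tibx files out of the task's destination folder so the next run has nothing to chain against. The paths are placeholders, it would need to be wrapped in something ATI can call (for example a .bat that invokes Python), and note that Glenn reports above that simply moving files did not always avoid the slowdown for him, so this is a sketch of the mechanism rather than a guaranteed fix.

```python
import shutil
import time
from pathlib import Path

# Placeholder locations: the task's destination folder and a dated archive folder.
BACKUP_DIR = Path(r"E:\ATI Backups")
ARCHIVE_DIR = BACKUP_DIR / "archive" / time.strftime("%Y-%m-%d")

# Move every finished .tibx out of the destination folder after the backup runs.
ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
for tibx in BACKUP_DIR.glob("*.tibx"):
    shutil.move(str(tibx), str(ARCHIVE_DIR / tibx.name))
    print("moved", tibx.name)
```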

Hi, Enchantech,
Just to be clear, I see the same thing if I set up a task that only does full backups, no incrementals. I set that up to run manually. I run the first backup and as soon as it's done I click Back up now and run a second backup. The second one takes 3-4 times as long as the first.
Glenn

Glenn,
The issue is that in your scenario in post #14 the backup task, even though you have set it up as a manually triggered event, is creating a single backup chain. A backup chain holds backup files that depend on one another: each backup file in the chain is dependent on all the other backup files of that chain. Therefore each backup file created in the backup chain is processed during creation of each subsequent backup file until cleanup occurs. This is true no matter what type of backup method is being used: full, incremental, and differential are all processed the same way.
The above holds true when performing a recovery as well. You must have all backup files of a backup chain in the same disk location to perform a recovery. If you do not, the recovery will fail with missing-file errors. The larger the backup chain, the longer the recovery takes.
So the size of a backup chain is key to performance. If you have large amounts of data to back up, as you do, then to increase performance (backup speed) you need to minimize, as much as possible, the amount of data a task creates before cleanup occurs. When the task performs cleanup, the cleanup process will also take a longer period of time. Once cleanup runs, the application will report how much data was freed up. At that point you're starting fresh again.

Bob, while I agree that dependencies exist between the files within a version chain, the change with .tibx files is that this now extends to dependencies between different chains for the same task. That is not good, and it is causing or contributing to these performance problems when it should never be necessary to have such dependencies.
This is clearly demonstrated when the dependencies are removed by renaming or moving files so that the task acts as if it is creating the first full backup version again!
This did not happen with .tib backup chains, and there were no dependencies between those chains!

Steve,
In my testing I have found that cleanup of a backup chain restores performance; are you saying you do not see the same behavior?

Bob, I have not encountered this particular performance problem for my own chains but my volumes of source data are significantly smaller than those used by the users reporting the issue.
I always keep 2 version chains, so I am normally working on a subsequent new full backup as the task continues with the scheme settings of 5 incrementals then a new full.
Perhaps trying the Clean up versions tool is an option that the other users hitting the performance issue can test and report back on? However, it defeats the object of having more than a single backup version chain.

Steve Smith said:
Perhaps trying the Clean up versions tool is an option that the other users hitting the performance issue can test and report back on? However, it defeats the object of having more than a single backup version chain.
Yes, I believe users need to run automatic cleanup to keep performance (speed) at acceptable levels.
Users with these huge amounts of data (multiple TB) really need to pare down the amount of data defined by a single task. I would think that 1 TB chunks of data per task would be a point at which acceptable performance can be maintained.
I agree that the dependencies existing in the .tibx format have increased the occurrence of slower performance. Having said that, backup is a resource-intensive task which, when run for long periods, will result in thermal throttling of the CPU and RAM. This further decreases backup speed because these components have no time to recover during a prolonged backup process.
If Acronis tightens throttle limits on resource usage of the product, then users will complain about that too. A catch-22 of sorts.

I had a few spare minutes so I've been reading through this thread for the first time. I think what Glenn spotted in post #11 about all the extra reads on the destination drive could be indicative of a problem related not to the source but to the destination.
I understand the issue has been seen with multiple computers and external drives, but what kinds of computers and drives? Different USB drives have very different specs, and maybe the drives don't handle the kind of activity you are seeing very efficiently.
What kinds of external drives are you using? What about the ports? Cables?

Bruno,
You bring up a valid point. These extra-large external HDDs (above 4 TB) may well not perform like lesser-capacity drives, further exacerbating the problem.
The link below has some good info on price/performance.

I've been working all day and see lots of responses come in, so I will try to respond to some of the points made.
First of all, on the external drives, they are WD Elements drives, connected with USB 3.0. They are pretty new and use the cables that came with them. I don't think the problem is with the drive or how full they are. Here's why. If I have one full backup on the drive and ATI creates a second full backup under the same task, it takes 3-4 times as long as the first one. If I have one full backup on the drive and I set up a new task and create a second full backup to the same drive, it takes about as long as the first one. In both cases I have a disk with about the same amount of space available and I'm adding a backup of the same size, but one takes 3-4 times longer. Also, everything was fine under ATI 2019, and the problem started once I upgraded to ATI 2020 and started doing backups in tibx format. That was on another computer with different hard drives, but I was doing the exact same thing under ATI 2019, keeping two full backups with automated cleanup, and it worked consistently. When I upgraded to ATI 2020, I started seeing backups take 3-4 times longer.
Regarding splitting the backup into 1 TB chunks per task: I tested it with just one of my drives, which has 1.2 TB of data. The first full backup took 2 hours. The next (on the same task) took 4 hours. If I break up my task into portions of my drive, then I think it'll use TIB format instead of TIBX, which I suspect won't have this issue. But then I'm managing more tasks, more backups, etc.
The comments about two full backups in the same task being dependent on one another are rather concerning. It seems to me the only reason for keeping multiple full backups from a task is so that if one of the files gets corrupted, I still have the other. Otherwise I might as well just keep doing incrementals. What I think you're saying is that now if either file gets corrupted, I have no backup. That's not good at all. I didn't think it was that way in the past (ATI 2019 and earlier), and it definitely wasn't that way with other backup programs I've used in the past. I back up my computer because I know hard drives can fail, files can get corrupted, etc. The point of multiple backups is in case one of the backups is corrupted. In fact, it seems that by retaining two full backups I'm actually increasing my chances of losing my backup due to a hard drive sector failure.

Glenn, take a look at forum topic: Full Backups no independent entity from August 2019 where the dependencies between full .tibx backup files has been discussed in great detail.

Glenn, first I want to say that I agree about the dependency between full backups. There should be none; a full should stand on its own.
I did some looking at benchmarks on WD Elements drives (https://usb.userbenchmark.com/SpeedTest/6601/WD-Elements-1048) and it might be worth running some benchmark tests.
What I was suggesting earlier is not that the drive itself is faulty, but rather that the constant reading and writing imposed by the new backup format may be causing the drive to operate much less efficiently, especially if it introduces a whole lot more head movement. In other words, ATI's method may be playing right into some inefficiencies of the drive's design.
On another point, what is the size of the drive and the size of the backups? Here is an interesting read about experiences with a drive that may not have a lot of free space (https://forums.tomshardware.com/threads/my-wd-elements-2tb-external-hdd-is-losing-its-transfer-speed-from-day-by-day-why-is-that.3249063/). One thing to try might be to defrag the drive after a first backup and see if that makes any improvement to the second one.
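If you want a quick sanity check of raw sequential speed on the destination drive before or after a defrag, something as crude as the following sketch would do. The test path is a placeholder on the external drive, and keep in mind the read figure can be inflated by the Windows file cache, so it is only a rough indicator, not a proper benchmark.

```python
import os
import time

TEST_FILE = r"E:\ati_speed_test.bin"   # placeholder: a path on the external drive
CHUNK = b"\0" * (4 * 1024 * 1024)      # 4 MiB per write
TOTAL_MB = 2048                        # write/read 2 GiB in total

# Sequential write test.
start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(TOTAL_MB // 4):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())               # make sure data actually hit the disk
print(f"write: {TOTAL_MB / (time.time() - start):.0f} MB/s")

# Sequential read test (may be partly served from the OS cache).
start = time.time()
with open(TEST_FILE, "rb") as f:
    while f.read(4 * 1024 * 1024):
        pass
print(f"read:  {TOTAL_MB / (time.time() - start):.0f} MB/s")

os.remove(TEST_FILE)                   # clean up the test file
```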

Thanks, Bruno and Steve,
The drives are 10 TB, and each full backup is about 2.8 TB. I'm sure they aren't the fastest, but when a full backup takes 7 hours I just leave my computer on, the scheduled backup runs while I'm sleeping, and by the time I want to get back on the computer, it's done. When it takes something like 32 hours, that's a serious inconvenience.
It seems like the interdependency of the full backups is the culprit. I wish someone from Acronis tech support had told me that when I was working with them in December-March.
At this point, between the performance issue and the concerns about the interdependency of full backups, I think it's time for me to cut my losses and switch to another backup solution.
Thanks.
Glenn

Glenn, if you still feel up to it, then I would suggest one last test of forcing your backups to use .tib files instead of .tibx, which would help prove that this issue is purely down to the changes brought in with .tibx and not the size or performance of your 10 TB drives!
See forum topic: How to create a Disk backup as .tib (not .tibx) which will create a new backup task using the older .tib format in the Windows ATI 2020 GUI.
Note: this will only work for new backup tasks, you cannot change an existing .tibx task.

Hi, Steve,
I did this test. I ran two backups of my C: drive on the same task. The first took 1 hr 58 min, the second took 1 hr 54 min. When I did the same with tibx, the first one took a few minutes over 2 hours, and the second took about 4 hours. So, I think that's pretty conclusive that the issue is specifically related to tibx.
A few notes on the process:
The "turn off protection" button was disabled on the Protection tab of ATI 2021, so I had to go into Settings and disable it from there. I then tried editing it, but couldn't save the change until I went into Task Manager and end the task for Acronis Active Protection Service. As a further test I rebooted and tried again, and still couldn't save without ending the task for Acronis Active Protection Service. So, it looks like disabling the protection in the application doesn't actually disable the protection.
Glenn

Glenn, good that your tests with .tib files definitely show that this issue is being caused by .tibx files.
The process was originally devised when ATI 2020 first came in and the Protection feature was much simpler, with just the anti-ransomware elements. ATI 2021 brings in the new Cyber Protect feature and changes the method of turning off protection, but the principle of modifying the script .tib.tis XML file remains the same!
There are whole other discussions about the new Cyber Protect feature that we don't need to get into with this topic!


Wow, poor Glenn did the testing that Acronis should have done. Good work, Glenn! But I'm disappointed in Acronis. I'm still using 2020; has this been fixed in any later version?

Dear Stan Kobylanski, Glenn Claudi-Magnussen,
There are multiple reasons for slow performance; frequently it is a matter of the particular environment in which you use the product.
With every new release we work on performance improvements.
So we would recommend installing the trial version of the latest update of the product and checking it out: Acronis Cyber Protect Home Office
If there are any problems with it, please contact our support: Customer Service and Support (acronis.com)