Why isn't the "GB processed" rate per minute consistent?

I've just run my first incremental files and folders backup using ATI 2015 (latest build) on my HP desktop running Windows 7 Home Premium. I stored it on my Toshiba USB ext drive, plugged into a rear USB port.

ATI processed almost as many GB of data (51 GB) as my first "entire PC" backup (56 GB), which I ran on Feb. 14, even though between that backup and now all my wife and I did was read and send email, use Word or Excel, and update my stock portfolio. I've posted another message asking why the incremental files-and-folders backup was so large compared with the entire-PC backup.

My question now has to do with the rate at which ATI processed the files it was backing up. I watched it from start to finish; it took almost two full hours to process about 51 GB of files and folders in the incremental backup. I made a little X-Y plot in Excel, with time on the X axis and total GB processed on the Y axis. I was quite surprised that there were long periods in which nothing was being transferred to the USB drive, even though the hard drive activity light on my desktop blinked continuously. Then there were a couple of periods with a burst of activity, where the GB processed per minute shot up and real progress was made!
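In case it helps anyone repeat this, here is a rough sketch of the rate calculation I did in Excel, as a few lines of Python. The sample numbers below are made up for illustration, not my actual backup log:

```python
# Hypothetical samples: (minutes elapsed, cumulative GB processed).
# These numbers are illustrative only, not from the real backup.
samples = [(0, 0.0), (10, 2.0), (20, 2.5), (30, 18.0), (40, 19.0),
           (50, 38.0), (60, 38.5), (90, 40.0), (120, 51.0)]

# Per-interval rate in GB/min: change in GB divided by change in minutes.
rates = []
for (t0, g0), (t1, g1) in zip(samples, samples[1:]):
    rates.append((t1, (g1 - g0) / (t1 - t0)))

for minute, rate in rates:
    print(f"by minute {minute:3d}: {rate:.2f} GB/min")
```

With bursty behavior like mine, the per-interval rates swing between near zero and several GB per minute even though the cumulative curve only ever goes up.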

Why wouldn't the rate be consistent the whole time ATI was processing my hard drive? If I were backing up to the cloud, I could maybe understand periods of "congestion" in the transfer from my desktop to the backup destination. But when everything is on my home computer, why the inconsistent rates?

I've tried to attach my Excel file, but I don't know if that worked.

Attachment Size
ati_backup_rate.xlsx 12.16 KB
I would say you are seeing expected behavior. If you are using compression, that process eats time as well. If your machine has a good amount of RAM installed, the app takes advantage of it: reading, processing, and staging data in RAM before writing it out to disk. And if you run defragmentation regularly, or have done so since your last backup, an incremental can come out large, at times even exceeding the original backup size, because defragmenting moves data around and makes many files look changed.

Thanks. This makes sense, especially the part about compression.