
Daily incremental backups do not show as separate individual files.


I recently upgraded from TIB 2015 to 2021 and my daily incremental backups do not show up as separate files like they did in 2015. It looks like they are merged into the first full backup. Is there an option that I can change so that it goes back to separating the daily incremental backups into separate files like it did in 2015?


This is very unfortunate when doing custom online backups. After each increment, the whole file, including all previous increments, must be uploaded again.

This leaves only differential backups for online use, which is still less efficient than incremental backups were.
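To make the cost difference concrete, here is a rough back-of-the-envelope comparison of total upload volume over a week for the three schemes, assuming a third-party sync tool that re-uploads any file that changed. The sizes (100 GB full, 1 GB of daily change) are illustrative assumptions, not measurements of any particular system:

```python
# Rough upload-cost comparison for syncing local backups to a third-party
# cloud service. All sizes are illustrative assumptions.

FULL = 100.0          # GB, initial full backup
DAILY_CHANGE = 1.0    # GB of changed data per day
DAYS = 7

# Separate incremental files (.tib style): only each day's new file uploads.
separate_upload = FULL + DAILY_CHANGE * DAYS

# Single growing archive (.tibx style): the sync tool sees one changed file
# and re-uploads the entire archive after every increment.
single_file_upload = FULL  # initial upload
size = FULL
for day in range(1, DAYS + 1):
    size += DAILY_CHANGE
    single_file_upload += size  # whole archive re-uploaded each day

# Differential files: each day's file holds all changes since the full.
diff_upload = FULL + sum(DAILY_CHANGE * d for d in range(1, DAYS + 1))

print(f"separate incrementals: {separate_upload:.0f} GB uploaded")
print(f"single .tibx archive:  {single_file_upload:.0f} GB uploaded")
print(f"differentials:         {diff_upload:.0f} GB uploaded")
```

Under these assumptions the single-archive scheme uploads several times more data than either alternative, which is exactly the complaint about the monolithic .tibx format for third-party cloud storage.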


Hello Peter,

When backing up to Acronis Cloud, the backup scheme is always incremental. This did not change with the move of local backups to the .tibx format, since Cloud backups have used .tibx all along. However, if you are using a third-party cloud service to upload local backups, then yes, the new .tibx archive is less flexible for such scenarios.

Hi there,

PLEASE re-instate the option for incremental backup files to be in separate files. Due to the nature of the data I am backing up, I will NEVER use Acronis cloud for my backups. Having to re-backup the whole file again is also unsuitable for my scenario.

If this option is not re-instated, I will have to reconsider my backup software choice.

 

Acronis user for 15 years.

David, using a files + folders backup will use the "old" *.tib architecture, where incremental backups are still in separate files. @Seve Smith has devised a method for creating disk + partitions backups using the *.tib architecture.

You say that "Having to re-backup the whole file again is also unsuitable for my scenario." This suggests that you are using a backup scheme with a large number of incremental backups. While I have successfully done this (with ATI 2014) with well over 100 incremental backups, it is (in my opinion) asking for trouble: if one of the incremental backups becomes damaged or is accidentally deleted, the backup is of limited use.

Could you explain how your data is organized? It may be that multiple files + folders backups would achieve what you want.

Ian

See forum topic: How to create a Disk backup as .tib (not .tibx) which will create a new backup task using the older .tib format in the Windows ATI 2020 or 2021 GUI.

I'm facing the same issue with my ATI 2020 for incremental backup of disk and partitions and found this article.
What I noticed during the incremental backup process with the new .tibx format is the following:
1) I have a single file "C.tibx" on my external HD, initially about 20 GB in size.
2) When I do an incremental backup to it, it says it gets to 99% and then spends several minutes doing transfers between my original C disk and my external HD.
3) This means that apparently ATI is copying C.tibx to my C:, doing all the incremental backup stuff there, and then copying everything back to my external hard drive.

This is absurdly inefficient: in addition to taking up a lot of space on my C:, it forces a huge transfer to my external HD every time I make a new backup.

I don't understand why they created this inefficiency, when the simplest thing would be to simply create a SEPARATE incremental file directly on the external HD.

I don't believe what you describe in point 3) above is actually what is happening. Not sure what you are seeing.

BrunoC wrote:

I don't believe what you describe in point 3) above is actually what is happening. Not sure what you are seeing.

For example, let's say that:

  1. Both my external HD and my SSD (C:) have only 10 GB of free space;
  2. My external hard drive has an original 490 GB monolithic .tibx file;
  3. I will make a new backup, which will now be incremental;
  4. The incremental changes made to my C: amount to only 5 MB;

The logic would be for ATI to create a second file (incremental) on my external HD, with 5 MB (or less if compressed).
How does ATI proceed in this case?
Is this operation even possible, given that neither of my drives has enough free space to copy the original 490 GB monolithic file from one to the other in order to merge in the incremental changes?


Roger, I agree with Bruno with regard to your point 3 above. I have never seen such behaviour, as far as I remember.

Going with your example: first, if any of your drives has only 10 GB of free space remaining, there will be a significant performance impact, especially on an SSD, where the recommendation is to keep around 20% free space for best performance.

When creating an incremental backup, Acronis does a significant amount of I/O to identify the data that has changed since the previous backup was created. Some of this is reduced by the Changed Block Tracking used by the application, whose data normally resides on the source drive within the application's folder structure. The data being gathered would also normally reside on the source drive, in the storage area Microsoft VSS uses to create a snapshot, before being transferred and written to the destination drive. Snapshot data may appear to reduce the free space on the drive, but that space is released as the data is written out and new data is collected.
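For readers unfamiliar with the idea, changed-block tracking boils down to hashing fixed-size blocks of the source and comparing them against the hashes recorded at the previous backup, so only differing blocks need to be written to the destination. The sketch below is a generic illustration of that principle, not Acronis's actual implementation; the 4 MiB block size is an arbitrary assumption:

```python
import hashlib

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks (illustrative, not Acronis's size)

def block_hashes(data: bytes) -> list[bytes]:
    """Hash each fixed-size block of the source image."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old_hashes: list[bytes], new_data: bytes):
    """Return (block_index, block_bytes) pairs that differ from the last backup.

    Only these blocks would need to be written to the incremental backup.
    """
    changed = []
    for i in range(0, len(new_data), BLOCK_SIZE):
        block = new_data[i:i + BLOCK_SIZE]
        idx = i // BLOCK_SIZE
        digest = hashlib.sha256(block).digest()
        # A block is "changed" if it is new or its hash no longer matches.
        if idx >= len(old_hashes) or old_hashes[idx] != digest:
            changed.append((idx, block))
    return changed
```

The point of keeping the tracking data on the source side is exactly what this sketch shows: with the old hashes at hand, the unchanged bulk of the disk never needs to be read back from, or copied to, the destination.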

If there is insufficient free space for the backup operation to complete successfully then this would normally be flagged up via an error message early on in the process.