Incremental settings

Try it with a minimum file size of 100 GB, or with a directory containing 100 GB files, and you will see what happens.

I'm using this command to flush variables

Remove-Variable * -ErrorAction SilentlyContinue

before the exit command.

But... I have (I hope!!!) fixed the problem by formatting the HDD beforehand, so the process is very quick and no variables are used. I also have no backup problems, since I have disabled error reporting.

I'll run some tests, and when I'm sure about the fix I'll post the correct (for me) version of the script.

All the best

Mattia, thanks for the tip about remove-variable!

You're welcome, it's a pleasure.

Small update: with the remove-item instruction I have a RAM issue (fixed at the end of the script with the Remove-Variable command), but if I format instead, the situation is strange (I'm trying format because format has no problem with RAM). Follow me:

  1. I run the first complete backup = no problem
  2. I start the backup again and the space is not enough (just for the test), so the device is formatted and a new complete backup starts = no problem
  3. I start the backup again, the space is enough, so the new backup file will be incremental = error that the device (not the file, the device) is not present, but the backup device is plugged in as always!

This is very strange: if there were going to be problems, they should have happened at step 2, not at step 3!

What do you think?

Mattia, one of the issues with formatting the drive is that it can give a different partition ID, which is what ATI uses to track the drive. I suspect that when the format was followed by the backup in step 2, ATI was still using the cached partition ID, but when the task ran again it found the new partition ID and gave the error!

See the images for before and after formatting my L: external drive, where the Volume Serial Number changes.

Looking in the associated Scripts .tib.tis file for this backup task shows that the original (pre-format) Volume Serial Number is recorded in the script as shown in the snippet of the XML below:

<volumes_locations>
        <volume_location partition_id="\local\hd_sign(54414448)\part_sn(C16AB76001D4C183)start(4194304)" 
        uri="L:\Test\WINPE_full_b1_s1_v1.tib" volume_id="3653658102" />
        <volume_location partition_id="\local\hd_sign(54414448)\part_sn(C16AB76001D4C183)start(4194304)"
        uri="L:\Test\WINPE_inc_b1_s2_v1.tib" volume_id="3442753097" />

This shows that the new Volume Serial Number for drive L: no longer matches the identifier in the script file!
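The mismatch can also be checked programmatically. The sketch below (in Python purely for illustration) pulls the part_sn value out of a partition_id attribute like the one in the snippet above and compares it with the drive's current serial; current_vsn here is a hypothetical value you would read from the drive itself (e.g. with vol L:).

```python
import re

# One volume_location entry from the script's XML, as quoted in the post above
script_xml = (
    '<volume_location partition_id='
    '"\\local\\hd_sign(54414448)\\part_sn(C16AB76001D4C183)start(4194304)" '
    'uri="L:\\Test\\WINPE_full_b1_s1_v1.tib" volume_id="3653658102" />'
)

def recorded_serial(xml_text):
    """Extract the part_sn(...) value from a partition_id attribute."""
    match = re.search(r"part_sn\(([0-9A-F]+)\)", xml_text)
    return match.group(1) if match else None

current_vsn = "1234ABCD5678EF90"  # hypothetical: the new serial after reformatting

old = recorded_serial(script_xml)
print(old)                 # the serial recorded before the format
print(old == current_vsn)  # False -> the script no longer matches the drive
```

If the comparison is False, the task's script still points at the pre-format partition, which matches the "device not present" error in step 3.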

OK, that's clear, but is it possible to set the Volume Serial Number manually during formatting?

Mattia, I haven't found a way of manually setting the VSN during formatting - it is derived from the date / time stamp of when the format is done.

Mattia, something else to try on your system with the script: try adding the following line in the functions doing the file deletions, before the remove-item line...

clear-content $taskfiles

I have been testing this on my own computer and it has cleared the .tib files to 0 bytes before they were deleted.
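The clear-then-delete idea - truncate each .tib to 0 bytes so its contents are released first, then remove the file - can be sketched like this (Python used here only for illustration; in the script itself it is the clear-content / remove-item pair):

```python
import os
import tempfile

def clear_then_delete(path):
    """Truncate the file to 0 bytes first (as clear-content does), then delete it."""
    with open(path, "w"):
        pass  # opening in "w" mode truncates the file to 0 bytes
    assert os.path.getsize(path) == 0
    os.remove(path)

# Demo with a hypothetical stand-in for a .tib backup file
with tempfile.NamedTemporaryFile(suffix=".tib", delete=False) as f:
    f.write(b"x" * 1024)
    name = f.name

clear_then_delete(name)
print(os.path.exists(name))  # False
```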

function ClearFiles {
    TimeStamp "Checking for .tib files to be deleted"
    if (Test-Path $taskfiles) {
        TimeStamp "$taskfiles found"
        # Note: plain "sc" is a PowerShell alias for Set-Content, so call sc.exe
        # explicitly to reach the Windows service controller
        sc.exe stop "AcronisActiveProtectionService"
        Clear-Content $taskfiles
        TimeStamp "$taskfiles cleared"
        Remove-Item $taskfiles
        TimeStamp "$taskfiles deleted"
        sc.exe start "AcronisActiveProtectionService"
    }
}
function CheckSpace {
    if ((Get-WmiObject Win32_LogicalDisk -Filter $filter).FreeSpace -gt $testspace) {
        TimeStamp "Free space is greater than $testspace"
    } else {
        TimeStamp "Drive space is less than $testspace"
        Write-Host "Drive space is less than $testspace"
        sc.exe stop "AcronisActiveProtectionService"
        Clear-Content $taskfiles
        TimeStamp "$taskfiles cleared"
        Remove-Item $taskfiles
        TimeStamp "$taskfiles deleted"
        sc.exe start "AcronisActiveProtectionService"
    }
}
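The CheckSpace logic - compare the backup drive's free space against a threshold and only clean up when it falls short - can be sketched as follows (Python for illustration; shutil.disk_usage plays the role of Get-WMIObject Win32_LogicalDisk, and the threshold and path are hypothetical examples):

```python
import shutil

def space_ok(drive_path, threshold_bytes):
    """Return True when free space on the drive exceeds the threshold."""
    free = shutil.disk_usage(drive_path).free
    return free > threshold_bytes

# Hypothetical example: require 100 GB free before an incremental backup
testspace = 100 * 1024**3
if space_ok("/", testspace):
    print("Free space is greater than", testspace)
else:
    # not enough room: this is where the script clears and deletes the old .tib files
    print("Drive space is less than", testspace)
```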

Hello Steve,

I have tried it, but it's the same. The only way to flush the RAM is adding

Remove-Variable * -ErrorAction SilentlyContinue

at the end, but... during deletion RAM use is near 99%...

Thanks anyway!