Large FTP backups to NAS - backup succeeds, deletion fails
I've had at least two problem tickets open for this over the past several years, but the support team has not been able to collect sufficient diagnostic data (and I have not been willing to let them have control of my PC for the multiple hours required to collect it). Lately the Acronis support team seems somewhat unresponsive, so I'm hoping that posting here can magically get this information to the developers through a back channel.
When Acronis takes a backup via FTP it breaks the backup into 2GB chunks, so a disk/partition backup can produce a very large number of .tib files. (The last full backup I took resulted in 1106 .tib files.) That's no problem during backup, but ATI apparently opens multiple FTP connections - one per file? - during a delete. With my old FTP server on a Western Digital MyBookLive, either the NAS or the Windows TCP stack would exceed its capacity and hang - at best, the NAS would lock up and have to be rebooted; at worst, Windows could not open any new TCP/IP connections (though already-open connections would continue to work).
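As a rough sanity check on the numbers (my arithmetic, not anything from Acronis), the file count follows directly from the 2GB split, so any multi-terabyte backup is guaranteed to land well above a 200-connection cap if the client opens one connection per file:

```python
import math

def tib_chunk_count(backup_bytes: int, chunk_bytes: int = 2 * 1024**3) -> int:
    """Number of .tib files a backup is split into at 2GiB per chunk."""
    return math.ceil(backup_bytes / chunk_bytes)

# My 1106-file full backup works out to roughly 2212 GiB of data.
print(tib_chunk_count(2212 * 1024**3))
```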
I am now using the FTP server on a Synology DS218. It limits FTP connections to 200, so both the NAS and Windows continue working, but the delete function fails with:
| trace level: warning
| line: 0x73512af0a5645a4
| file: c:\bs_hudson\workspace\245\products\imager\archive\impl\operations\archive_operation_executor_impl.cpp:535
| function: TrueImage::Archive::EraseDataStream
| line: 0x73512af0a5645a4, c:\bs_hudson\workspace\245\products\imager\archive\impl\operations\archive_operation_executor_impl.cpp:535, TrueImage::Archive::EraseDataStream
| $module: ti_demon_vs_10640
|
| error 0xb00eb
| line: 0xd460020904af2957
| file: c:\bs_hudson\workspace\245\products\imager\archive\impl\volume_location.cpp:190
| function: TrueImage::Archive::VolumeLocation::OpenDirGently
| line: 0xd460020904af2957, c:\bs_hudson\workspace\245\products\imager\archive\impl\volume_location.cpp:190, TrueImage::Archive::VolumeLocation::OpenDirGently
| $module: ti_demon_vs_10640
|
| error 0x40011: The specified file does not exist.
| line: 0xd460020904af28c5
| file: c:\bs_hudson\workspace\245\products\imager\archive\impl\volume_location.cpp:44
| function: `anonymous-namespace'::OpenDir
| line: 0xd460020904af28c5, c:\bs_hudson\workspace\245\products\imager\archive\impl\volume_location.cpp:44, `anonymous-namespace'::OpenDir
| $module: ti_demon_vs_10640
|
| error 0x40000e: Failed to open the backup location.
| line: 0xbdf3287805c62c2c
| file: c:\bs_hudson\workspace\245\products\imager\archive\impl\uridir.cpp:847
| function: TrueImage::Archive::UriDir::OpenUriDir::FtpPlace::Open
| line: 0xbdf3287805c62c2c, c:\bs_hudson\workspace\245\products\imager\archive\impl\uridir.cpp:847, TrueImage::Archive::UriDir::OpenUriDir::FtpPlace::Open
| path: ftp://ds218_1:21/Backups/Puget-116877/
| $module: ti_demon_vs_10640
|
| error 0x40022: FTP connection has failed.
| line: 0x31813f5926d10e3b
| file: c:\bs_hudson\workspace\245\core\network\ftp\ftp_api.cpp:303
| function: OpenFTP
| line: 0x31813f5926d10e3b, c:\bs_hudson\workspace\245\core\network\ftp\ftp_api.cpp:303, OpenFTP
| Reply: Too many connections (200/200). Please try later...
| $module: ti_demon_vs_10640
I'm pretty sure that "Reply: Too many connections (200/200). Please try later..." comes from the FTP server but I don't know for sure.
The problem with collecting diagnostic data is that it takes over an hour to create 200 .tib files on the NAS. And if ATI sends multiple delete commands on each connect then I need to have many more than 200 files to reach the 200 connection limit.
At any rate, the multiple connections used during the delete process cause problems.



I kind of doubt that is the problem. The FTP servers - both the old one I no longer use and the new one - have no trouble handling connections during the backup. In fact, ATI opens 4 control connections (for no obvious reason) that stay open all the time, plus another control connection and a unique data connection for each 2GB data transfer during an FTP backup. The control connections require no port negotiation, of course - they are all to port 21 on the server.
I haven't looked at the FTP RFC(s) for a long time but I believe the delete function uses only the control connection so no port negotiation should be required. (I could be wrong.)
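For what it's worth, that matches my reading of RFC 959: DELE is a control-connection command, so a well-behaved client could remove every chunk over a single connection with no PORT/PASV exchange at all. A hypothetical sketch of what that command sequence would look like (the filenames are made up):

```python
def delete_command_sequence(user: str, files: list[str]) -> list[str]:
    """FTP control-connection commands to delete files over ONE connection.

    Note that no PORT or PASV appears anywhere: DELE needs no data
    connection, so no port negotiation is required.
    """
    commands = [f"USER {user}", "PASS ********"]
    commands += [f"DELE {name}" for name in files]
    commands.append("QUIT")
    return commands

seq = delete_command_sequence("backup", ["full_b1_s1_v1.tib", "full_b1_s2_v1.tib"])
print("\n".join(seq))
```

Python's standard `ftplib`, for instance, issues exactly this `DELE` verb via `FTP.delete()` on the already-open control connection, so reusing one connection for the whole cleanup is clearly possible.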
Since one of these backups is currently running and getting near the delete stage, maybe I'll start a Wireshark trace and see what is happening.
Update:
My, oh my! I don't know what ATI is doing with the 3 control connections it keeps open, but it starts a new control connection, complete with an FTP login, for each 2GB transfer. Then it negotiates a data connection, sends the 2GB block, and resets the connection. And while I was confirming that odd behavior, I see that I captured the failure of the delete. I'll look at it tomorrow.

I've given up trying to understand the packet trace I collected. ATI was apparently in the midst of preparing for the deletion when I started the trace. That prep seems to consist of starting to retrieve every record to be deleted - each retrieval issued on a new control connection - but aborting the retrieval soon after it begins. At some point it then starts issuing deletes - each on a new control connection - while it's still doing retrievals. I don't know which set of connections is left open, but I think it's the deletion connections. And leaving any connection open while creating a bunch of new ones is guaranteed to cause problems.
I think that Acronis did not put a lot of thought into its FTP backup process. This is not ready for prime time. I will find another solution for my non-SMB backups. (Of course, if Acronis finds a way to get NAS-based .tib files protected by AAP I can give up on this quest for non-SMB backups.)

Here, the same issue: we have 20 Synology NAS units with Acronis True Image 2018, 2019, and 2020, all with the same problem.
The files are not deleted, and the NAS fills up, so every month I have to delete the files manually.
Recreating the profile doesn't help.
Some NAS units do the cleanup and some don't, even with the same configuration.

This problem still exists on ATI 2020. The FTP support in ATI is not very robust. I now take only small FTP backups with ATI.

This is my main problem with using ATI 2019.
The FTP delete never worked with my QNAP NAS. This is very sad because the FTP feature was my main reason for choosing the ATI product.
I use FTP instead of SMB so trojans have no rights to change files on my NAS.

J. Astheimer wrote: The FTP delete never worked with my QNAP NAS. This is very sad because the FTP feature was my main reason for choosing the ATI product.
I use FTP instead of SMB so trojans have no rights to change files on my NAS.
The problem with deletion is due to a confluence of several issues. ATI restricts an FTP file to 2GB (an old, obsolete FTP restriction), which causes a large backup to consist of many files. The ATI delete process for FTP backups then opens a new FTP control connection for each file to delete and does not close any of them until all the files are deleted. If the NAS imposes a maximum connection limit lower than the number of files, the delete process will hang. If the NAS does not impose a limit and there is a large number of files, NAS (or Windows) resources can become depleted, requiring a reboot of the NAS, Windows, or both.
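A toy model of the failure mode described above (the numbers are just illustrative of my setup, not anything measured inside ATI): if the client opens a fresh connection per file and closes none of them until the end, any connection cap below the file count is hit before a single connection is released.

```python
def simulate_delete(num_files: int, connection_limit: int,
                    close_after_each: bool) -> str:
    """Model a delete pass that opens one FTP connection per file.

    close_after_each=False mimics the observed behavior: every
    connection stays open until all files have been deleted.
    """
    open_connections = 0
    for i in range(1, num_files + 1):
        if open_connections >= connection_limit:
            return f"failed at file {i}: {connection_limit} connections in use"
        open_connections += 1      # new control connection for this file
        if close_after_each:
            open_connections -= 1  # a well-behaved client releases it here
    return "all files deleted"

print(simulate_delete(1106, 200, close_after_each=False))
print(simulate_delete(1106, 200, close_after_each=True))
```

With 1106 files and a 200-connection cap, the leaky variant fails on file 201, while closing each connection after its DELE never uses more than one connection at a time.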
I've twice opened problem cases with Acronis and provided them with packet traces demonstrating the problem, but I never got past the first level of support, who followed their scripts and ignored the evidence provided.
The problem still exists in ATI 2020.