Passing Acronis data to pre/post commands?
The ATI 2016 doc is a bit sparse when it comes to pre/post command options for backups. The "Specify and configure user command" panel has an "Arguments" field. Are there any symbolic parameters set by Acronis that can be used? I'm particularly interested in getting the name of the backup file just created passed to the post-backup command.



Sorry for the delay in responding. I don't seem to get notified when there is a response to the initial post of a thread.
Bobbo_3C0X1 wrote:The pre/post commands are really for those with some scripting knowledge (at least basic command line and .bat scripting) - very similar to what you would need to know to add certain parameters to Windows scheduled tasks if you wanted to launch applications that way. If you can create a .bat file, you can add it to the front or end of your backup job and that's what it will run.
I have little experience with Windows scripting but am starting to learn. (I have a lot of now nearly obsolete experience of script-writing on IBM mainframes. :-) ) I had hoped that ATI would provide the name of the backup file to be created (pre environment) or just created (post environment), but I gather from your response that is not the case.
Bobbo_3C0X1 wrote:Do you just need the name? What would you do with the name?
This is my attempt to circumvent the problem I'm having getting the ATI FTP client to talk with the WD MyCloud Gen2 FTP server. I want to create a post-processing script to FTP the just-created backup file to the MyCloud server (since other clients like WinSCP have no trouble talking with the MyCloud server).
Bobbo_3C0X1 wrote:You could write a post command to append the contents of a folder to a text file with the newest files listed first if you just want to know what files were just created. Not sure if this is useful for what you're looking for though.
I could do something like that to extract the name of the file just created, but it's a lot more complicated than I would like. And I would have to write a unique script for each of my laptop and 2 PCs since they have different destinations for their backups. And I would have to remember to change the script if I change the destination in Acronis.
It just occurred to me that I might be able to extract something from an ATI log to get that file name.
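For what it's worth, the "newest file in the destination folder" idea only takes a few lines. A Python sketch (assuming a Python install is available; the folder and pattern are whatever your backup destination actually uses):

```python
from pathlib import Path

def newest_backup(folder, pattern="*.tib"):
    # Sort the matching files by modification time; the newest one is
    # presumably the backup Acronis just finished writing
    files = sorted(Path(folder).glob(pattern), key=lambda p: p.stat().st_mtime)
    return files[-1] if files else None
```

The destination folder would still have to be hard-coded per machine, which is exactly the maintenance problem described above.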

Easy peasy - no scripting really needed (well, not much).
Put a robocopy command in a text file and save as .bat and then put that in your post command. robocopy is built into Windows already and is a great command line copy tool. So let's assume Acronis creates files in D:\Backups\backup1
You would robocopy it to your FTP site like this:
robocopy.exe "D:\Backups\backup1" "\\ftp:myftpshare\backup1" /mir /r:0 /W:0 /COPY:DT /FFT
/MIR will mirror everything in the source to the destination exactly (be careful with mirror, as it can delete files in the destination that don't exist in the source). An alternative to /MIR is /S or /E, which will append only and not delete. I like mirror, though, if the idea is to replicate exactly what's in the source.
/R:0 retries 0 times and /W:0 waits 0 seconds between retries. /COPY:DT copies only data and timestamps, leaving out the NTFS permissions the NAS will not like, and /FFT uses FAT file times so timestamp comparisons against the NAS work.
Play around with it outside of Acronis, and once you have your robocopy command working, just paste it into Notepad, save it, change the extension from .txt to .bat, and have Acronis call that as a post command.
Alternatively, don't even use Acronis to run robocopy. You can just use a Windows scheduled task to do this at certain times of the day or week instead.
Please keep in mind that FTP has a 2 GB file limitation for uploads, so you may need to set the backup .tib size to 2 GB chunks if the plan is to copy them to an FTP share.
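If you'd rather call something smarter than a bare .bat from the post command, here's a hypothetical Python wrapper around that same robocopy line that also interprets the exit code (robocopy returns a bitmask, and anything below 8 means success):

```python
import subprocess

def robocopy_ok(code):
    # robocopy's exit code is a bitmask: 1 = files copied, 2 = extra files,
    # 4 = mismatches; 8 and above indicate failures, so below 8 is success
    return code < 8

def mirror(src, dst):
    # Same switches as the robocopy line above (paths are placeholders)
    cmd = ["robocopy", src, dst, "/MIR", "/R:0", "/W:0", "/COPY:DT", "/FFT"]
    return robocopy_ok(subprocess.call(cmd))
```

The exit-code check matters because a scheduled task or post command treating any nonzero code as failure would flag perfectly good robocopy runs.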

Hmmm, maybe not easy peasy :)
robocopy doesn't seem to like FTP locations. Trying to figure out a workaround natively in Windows. I'm sure it would be easier using an FTP client like FireFTP, but I still want to see if I can get it to work directly in Windows.

Bobbo_3C0X1 wrote:Easy peasy - no scripting really needed (well, not much). Put a robocopy command in a text file and save as .bat and then put that in your post command. robocopy is built into Windows already and is a great command line copy tool. So let's assume Acronis creates files in D:\Backups\backup1
I had never heard of robocopy, but I see it is xcopy brought up to date and given a ridiculous name.
Bobbo_3C0X1 wrote:You would robocopy it to your FTP site like this:robocopy.exe "D:\Backups\backup1" "\\ftp:myftpshare\backup1" /mir /r:0 /W:0 /COPY:DT /FFT
I think I don't want to copy my whole backup directory because it's pretty big: over 700 GB right now for the PC I use all the time. That's 2 version chains of full + 6 incremental backups of my C: drive (about 95 GB used) and my D: drive (about 360 GB). I use the backups more for data recovery if I mess something up than for system recovery. Copying the whole 700+ GB each time I do a backup is not reasonable.
However, I could create another backup task - 1 version of one full backup, excluding files that are completely static and could be recovered elsewhere. That's still around 300 GB, but by excluding the completely static data I could probably get it down to less than 150 GB. And I could FTP that directory.
Bobbo_3C0X1 wrote:...
Please keep in mind that FTP has a 2GB file limitation for uploads so you may need to set the backup .tib size to 2gb chunks if the plan is to copy them to an FTP share.
I looked into this in the past (and again now just to be sure). I'm still not absolutely sure, but I believe the 2 GB limit is an old one. The Acronis FTP client still imposes that limit and breaks an FTP backup up into 2 GB chunks, but it is up to the server whether or not that restriction actually exists. If I'm doing the copy outside of ATI, I might not have that restriction (and it should be fairly easy to test). Of course, that means I would have to retrieve the .tib file with FTP before Acronis would be able to use it; the Acronis media would not be able to use the file directly from the FTP server. (But I can't get the Acronis recovery medium to work with my Gen2 WD MyCloud FTP server anyway, or I wouldn't have to go through this exercise.)
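If a server really did enforce a 2 GB cap, the splitting could also be done outside of ATI before upload. A toy Python sketch of the splitting step (in real use chunk_size would be something like 2 * 1024**3; everything here is illustrative):

```python
def split_file(path, chunk_size):
    # Write path's contents into numbered .000, .001, ... part files,
    # each no larger than chunk_size bytes; return the part names
    parts = []
    with open(path, "rb") as src:
        index = 0
        while True:
            data = src.read(chunk_size)
            if not data:
                break
            part = f"{path}.{index:03d}"
            with open(part, "wb") as out:
                out.write(data)
            parts.append(part)
            index += 1
    return parts
```

The catch, as noted above, is that the parts would have to be fetched and rejoined before Acronis could use the .tib again.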
Bobbo_3C0X1 wrote:Hmmm, maybe not easy peasy :)robocopy doesn't seem to like FTP locations. Trying to figure out a work-a-round natively in Windows. I'm sure it would be easier using an FTP client like FireFTP though, but still want to see if I can get it to work directly in Windows.
I'm not surprised that robocopy doesn't support FTP. The Windows native support of FTP isn't so great ... at least on the client side. For instance, the Windows native FTP client doesn't support passive mode which is pretty much necessary in the modern FTP world. I've got WinSCP which seems pretty flexible. I'll see how it does.

Yeah, I don't use FTP very much, but have downloaded FireFTP and WinSCP. Both seem to work great, but I can't quite figure out how to make them automatically send the updates to the remote FTP share unless I have the application open and manually sync them after that. One time, FireFTP seemed to do it automatically, but that was it. I know it's possible to automate the entire process, but I'm just not familiar enough with them. If you have a link that shows how, or can explain easily, feel free to post. Good info to know for sure.

I don't know how to get either FireFTP or WinSCP to automatically send a file, but I figured out how to get WinSCP to do it from a command line. You have to use Winscp.com rather than WinSCP.exe:
winscp.com /command "open ftp://<userid>:<password>@WDMyCloud:21/" "put <file>" "exit" /log=<logfile>
The angle brackets are part of my notation, not WinSCP's syntax. That is, replace <userid> with the user ID, and so on. I think the log file is optional, but I wanted it for debugging.
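If winscp.com ever proves awkward to script, the same one-shot upload can be done with Python's standard ftplib, which uses passive mode by default (host, credentials, and paths below are placeholders):

```python
from ftplib import FTP
from pathlib import PureWindowsPath

def stor_command(local_path, remote_name=None):
    # FTP's STOR verb plus the target name; default to the local file name
    return "STOR " + (remote_name or PureWindowsPath(local_path).name)

def upload(host, user, password, local_path, remote_name=None):
    # One-shot upload; ftplib defaults to passive mode, which the
    # built-in Windows command-line ftp client does not support
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(local_path, "rb") as f:
            ftp.storbinary(stor_command(local_path, remote_name), f)
```

ftplib streams the file as-is, so there is no client-side splitting; whether a 2 GB cap applies is then purely up to the server.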
WinSCP had no trouble transferring a full .tib file to my MyCloud server with no file splitting. And I can hopefully get one of the Acronis recovery systems - either the Linux-based one or WinPE - to process it. After all, it's just a .tib file in a NAS share. It can use SMB rather than FTP to access the file. At least that's my theory. :-) I'll test it later on today.
--------------
Later today:
No problem accessing the file using the Linux-based recovery. Validation took a couple hours, but the recovery system didn't care that the backup's original destination was a local USB-attached drive and the .tib file - all 59 GB of it - had been FTPed in one chunk to a MyCloud private share.
On the other hand, the WinPE-based recovery system could not access the share at all. From the command prompt window I was able to ping the NAS so I know it was accessible. Seems like this has been discussed before. I'd better do some searching.

Good to know.
In WinPE, you have to use "net use" commands to map network drives first as a drive letter. I know it works for SMB shares, but had never tried with FTP.
For me, I thought it would be the same using command prompt and FTP, but after I authenticated that way, I couldn't change directories or even show the contents of the directory.
I then just noticed the FTP option in the WinPE GUI. I could not authenticate when attempting to go directly to a share. Instead, I had to authenticate to the root first (just using the IP address of the NAS). That worked and then it showed the other directories in the explorer and I could navigate them after that.
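If a scripting runtime happens to be in the PE image, that net use line can be built programmatically too. A small sketch that just assembles the command list (drive letter, share, and credentials here are made up):

```python
def net_use_command(letter, unc, user=None, password=None):
    # Builds e.g. ["net", "use", "Z:", r"\\WDMyCloud\Backups", "secret",
    # "/user:admin"]; pass the list to subprocess.call inside WinPE.
    # "net use" syntax: net use <letter> <unc> [password] [/user:name]
    cmd = ["net", "use", letter, unc]
    if user:
        cmd += [password or "*", f"/user:{user}"]
    return cmd
```

Using "*" as the password makes net use prompt interactively, which avoids putting the password in the script.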
Attachment | Size
---|---
395479-134314.jpg | 101.44 KB

Bobbo_3C0X1 wrote:In WinPE, you have to use "net use" commands to map network drives first as a drive letter. I know it works for SMB shares, but had never tried with FTP.
I'll give that a try. I want/need FTP for backup but I'm perfectly willing to use SMB for recovery so "net use" may solve this problem.
Bobbo_3C0X1 wrote:For me, I thought it would be the same using command prompt and FTP, but after I authenticated that way, I couldn't change directories or even show the contents of the directory. I then just noticed the FTP option in the WinPE GUI. I could not authenticate when attempting to go directly to a share. Instead, I had to authenticate to the root first (just using the IP address of the NAS). That worked and then it showed the other directories in the explorer and I could navigate them after that.
I couldn't get the WinPE GUI FTP option to work, but I didn't try too hard.
All of this is pretty far away from my original question, though. I now know I can restore using a .tib file FTPed outside of ATI. I still have to figure out how to get that FTP invoked from an ATI post-processing command. No problem if I don't need to pass the command the name of the .tib file. I'll get back to that soon.

Yeah, sorry, we got off track from the original question, but learned some good things in the process. Ultimately, I don't really have a good way of passing just the newly created files off to your command, short of identifying the names of the files using a directory-command output to a .txt file. I'm guessing there are applications that could do this, but that probably complicates things more. That was the hope of using robocopy originally, since it doesn't care what the file names are but will just copy the newly changed or created files to sync them to the remote share - something you could also do with Allway Sync or GoodSync, or possibly WinSCP or FireFTP as well.
My only other suggestion is to play around with another embedded Windows tool: forfiles. I have it running on a server to automatically clean up files/folders recursively based on creation date. Not sure if it can be harnessed to identify all files based on new creation dates and pass those on to your post command or not:
https://technet.microsoft.com/en-us/library/cc753551(v=ws.11).aspx
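For comparison, here's a rough Python equivalent of forfiles' /d date filter, selecting files by last-modified time (folder and pattern are placeholders):

```python
import time
from pathlib import Path

def files_newer_than(folder, days, pattern="*.tib"):
    # Keep only files whose last-modified time falls within the last
    # `days` days - roughly what "forfiles /d" selects on
    cutoff = time.time() - days * 86400
    return [p for p in Path(folder).glob(pattern) if p.stat().st_mtime >= cutoff]
```

The resulting list could then be fed to whatever upload step the post command runs, which sidesteps forfiles' awkward nested quoting.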

I'm not at all sorry we got off track. This has been useful and (to me) interesting. I'll take a look at the forfiles thing.
Thanks.