
Writing Large Files to Drive During Image


While creating an image I'm often doing VERY drive-intensive activity -- creating huge files, etc. Is it recommended that I not do this while imaging my drives?

Thank you,

Chuck C.

{{ Secondarily, is my assumption correct that the files I'm
    creating during imaging are omitted from the image? }}


I have never seen a recommendation like that. I don't believe the "home" versions are capable of handling large databases though.

Have a look at this link for more details:

http://www.wilderssecurity.com/showpost.php?p=754303

I would say that files created after the process has started will not be included in the image.

That said, there are very, very few problems reported from imaging the disk with Windows running, as far as disk corruption or lost files go. Problems with imaging in Windows are typically related to the system not being happy with the TI driver (snapapi.dll), or the odd software conflict with another application -- not the actual file locking and writing.

I normally fire up TI then go for a coffee or get rid of the coffee and let TI do its thing.

Thank you, Seekforever!  (And nice Username, btw).

So, it appears wise to avoid intensive activity for the first few seconds of the process.

Another reason I guessed this would be OK is that the image-verify seems to provide a very conclusive assessment of the integrity of the final image.

Chuck C.

Chuck Colsch wrote:

... Another reason I guessed this would be OK is that the image-verify seems to provide a very conclusive assessment of the integrity of the final image.

Chuck C.

While the Validation certainly ensures the data can be read and reconstructed as it was written to the archive, I'm not sure it would really indicate that all is well in the realm of your question. TI writes a checksum for every 256K bytes of data, but if the data is not correct -- i.e., TI missed something because of other disk activity (or some other reason) -- then the checksum mechanism would still indicate all is well, because the checksums were created from the incorrect data. This is a weakness of this method compared to the bit-for-bit comparison used in CD/DVD burning programs. OTOH, bit-for-bit comparison is not possible for a live imaging program, because the source drive is dynamic and there is no stable reference to compare the archive data against.
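To make the distinction concrete, here is a minimal sketch (hypothetical, not TI's actual code) of why per-block checksum validation passes even when the captured data itself was wrong. The block size follows the 256K figure mentioned above; the hash choice is an assumption for illustration only.

```python
import hashlib

BLOCK_SIZE = 256 * 1024  # a checksum is stored per 256 KB of archive data


def write_archive(captured: bytes) -> list[tuple[bytes, str]]:
    """Store each block alongside a checksum of whatever was captured."""
    blocks = []
    for i in range(0, len(captured), BLOCK_SIZE):
        block = captured[i:i + BLOCK_SIZE]
        blocks.append((block, hashlib.md5(block).hexdigest()))
    return blocks


def validate(blocks: list[tuple[bytes, str]]) -> bool:
    """Validation only re-reads the archive and re-checks the checksums."""
    return all(hashlib.md5(b).hexdigest() == c for b, c in blocks)


# Suppose the imaging program captured stale or wrong data from the live disk:
captured = b"stale" * 100_000
archive = write_archive(captured)

# Validation still passes: the checksums match what was written, with no
# reference to what the disk "should" have contained at that moment.
print(validate(archive))
```

The point of the sketch: `validate` can only confirm the archive is internally consistent, not that the capture was faithful.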

ATI takes a "picture" of the drive when it starts and copies only those sectors in use. It then filters disk writes, and for any writes to the sectors in the picture, it uses the old data -- it doesn't use any "new" data in the backup. So, no file changes (creation, deletion, edits, etc.) made after a backup has started will be in the backup.
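A rough sketch of that copy-on-write snapshot idea (purely illustrative; the class and sector model are my own, not TI's actual driver) might look like this:

```python
class Snapshot:
    """Minimal copy-on-write snapshot over a sector -> data mapping."""

    def __init__(self, disk: dict[int, bytes]):
        self.disk = disk                 # the live, changing disk
        self.in_use = set(disk)          # sectors in use at snapshot time
        self.preserved = {}              # old contents saved before overwrite

    def write(self, sector: int, data: bytes) -> None:
        # Filter writes: the first time a snapshot sector is overwritten,
        # preserve its original contents for the backup to use.
        if sector in self.in_use and sector not in self.preserved:
            self.preserved[sector] = self.disk[sector]
        self.disk[sector] = data

    def read_for_backup(self, sector: int) -> bytes:
        # The backup always sees the data as of snapshot time.
        return self.preserved.get(sector, self.disk[sector])


disk = {0: b"boot", 1: b"old"}
snap = Snapshot(disk)
snap.write(1, b"new")       # file edited while imaging is running
snap.write(2, b"created")   # new file created mid-backup

backup = {s: snap.read_for_backup(s) for s in snap.in_use}
# The edit is rolled back and the new sector never enters the backup:
print(backup)  # {0: b'boot', 1: b'old'}
```

This is why files created after the backup starts are omitted: sector 2 was never in the snapshot's picture, and sector 1's backup copy is the pre-edit data.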

Chuck Colsch wrote:

. . .

{{ Secondarily, is my assumption correct that the files I'm
    creating during imaging are omitted from the image? }}