
Cloning to RAID 0 (Solved)


I have a Gigabyte H370 HD3 motherboard with 2 NVMe SSDs in a RAID0 array.

When I clone my boot drive to the RAID0 everything seems to be OK.

When I boot from the RAID0 the Windows logo appears, after a moment the spinning loading indicator, and it stays that way for a long time until the PC reboots.

I can boot into Safe Mode, but after restarting the same thing happens.

I have tried different UEFI settings (CSM disabled/enabled), but always with the same result.

What else can I do?


Reinhard, welcome to these public User Forums.

What are you cloning from and how is that source drive connected?

What version of Windows OS is being used?

How is your RAID 0 array implemented, and importantly, how does this appear to Acronis True Image, i.e. does the array show as if it was a single disk drive?
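If it helps to confirm that last point from within Windows, something like the sketch below will list what the OS sees as physical disks - a single entry for the array would suggest the Intel RAID is being presented as one disk. This is only a rough sketch and assumes the third-party Python wmi package (pip install wmi), which has nothing to do with Acronis itself.

```python
# Rough sketch (not an Acronis tool): list the physical disks Windows reports,
# to check whether the Intel RAID 0 array shows up as a single disk.
# Assumes the third-party "wmi" package is installed (pip install wmi).
import wmi

c = wmi.WMI()
for disk in c.Win32_DiskDrive():
    size_gb = int(disk.Size) / 1024**3 if disk.Size else 0
    print(f"Disk {disk.Index}: {disk.Model} ({disk.InterfaceType}, {size_gb:.0f} GB)")
```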

When NVMe drives are involved, then UEFI is normally required as there is no support for these drives with Legacy / CSM BIOS mode.

This sounds to me like Universal Restore is being applied during the clone and the drivers have been reset, so Windows is going through the process of automatically searching for and installing them (which could take some time initially).  With that in mind, give it a good 20 minutes and see if it finally gets through and boots.

We've recently found from other user posts that in some instances when cloning, UR is being applied automatically directly from within the True Image application - and this is not ideal in most cases unless you are specifically cloning in order to migrate to a completely new computer.

If you're up for it, I would suggest re-cloning (from WinPE/WinRE rescue media) and saving the log to a USB drive or somewhere accessible, to see if UR is indeed being applied or not.  If it is, Acronis is asking people to submit their logs for review, along with a system report as well.
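If you do save a log, a quick way to scan it for signs of UR is a few lines like the sketch below - the file path and the exact phrases to search for are assumptions on my part, so adjust them to whatever the saved log actually contains.

```python
# Rough sketch: search a saved clone log for any mention of Universal Restore.
# The path and the search phrases are assumptions - adjust to the file you saved.
from pathlib import Path

LOG_FILE = Path("D:/clone_log.txt")                    # hypothetical path on the USB stick
PHRASES = ("universal restore", "universal_restore")   # assumed wording in the log

text = LOG_FILE.read_text(errors="ignore").lower()
for phrase in PHRASES:
    print(f"'{phrase}': {text.count(phrase)} occurrence(s)")
```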

For reference, here is the post from Ekaterina and the existing thread about UR being applied in some clone instances.  I haven't been able to reproduce it in Win 10, only Win 7.  But I also don't have a physical RAID setup of NVMe drives, which might produce similar behavior based on the drivers of the storage controller possibly changing (or Acronis considering them changed when cloning from a single drive to RAID or vice versa).

https://forum.acronis.com/forum/acronis-true-image-2019-forum/restore-workstation

I recently upgraded my NVMe RAID 0 setup to new drives.  Rather than clone, I used WinPE recovery media to make a backup of the original RAID 0 array to a secondary external drive.  I then removed the old drives and installed the new ones.  Then I booted the computer into the BIOS and set up the new drives as a RAID 0 array.  Once that was done I booted the WinPE media and recovered the backup I had created to the new array.  It worked perfectly.


I am cloning from an SSD connected via SATA.

The OS is Windows 10 Pro x64 version 1809.

The RAID0 is created in UEFI and appears to ATI as a single drive (Intel RAID0).


The described problem first appeared about 3 weeks ago, when I tried to clone a backup copy to the RAID 0, without success.

Now I remembered that this attempt was made after the update to Windows 10 version 1809.

Today I visited the Gigabyte website and saw that there is a driver update for Intel's Rapid Storage Technology (IRST), which handles RAID in the BIOS. The new driver has support for Win 10 1809.

I downloaded and installed the driver, created a new RAID 0 array, and now I am up and running.
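For anyone wanting to double-check this, a small query like the sketch below reports which RAID/IRST driver Windows is actually using; the wmi package and the name filter are assumptions on my part, not anything from Acronis or Gigabyte.

```python
# Rough sketch: show the storage/RAID controller drivers Windows has loaded,
# so you can confirm the updated IRST driver is really in use.
# Assumes the third-party "wmi" package (pip install wmi); the name filter is a
# guess at how the controller is labelled on this board.
import wmi

c = wmi.WMI()
for drv in c.Win32_PnPSignedDriver():
    name = drv.DeviceName or ""
    if "RAID" in name or "Rapid Storage" in name:
        print(f"{name}: driver version {drv.DriverVersion}")
```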

Thanks for your help.

Thanks for the feedback and glad it's all working now. Odd that the IRST driver itself was the issue - so to be clear, was it just a matter of updating the IRST driver in Windows before the clone that resolved the behavior, not the rescue media, or did you update both (in the OS and in the rescue media)?

I wouldn't think the driver in rescue media would change the boot behavior after cloning.  Also strange that the OS worked with 1809 on the original setup, but not after the clone until the driver was updated. So, still a bit odd, but if the latest driver helps, then good to know!

Bobbo,

I believe the driver in question here is that of the motherboard PCH option ROM (OROM).  Since Intel has gone to the 16 series driver, it makes sense that OEMs are updating their BIOS firmware to that series.  They may not all do that, however, so this is something that may be problematic for some users.

I have 2 NVMe drives set up in RAID 0. In Windows they show up as one drive and work just fine, but when I try to use my Acronis restore the drive doesn't show up as an option to restore to. I installed the drivers and it showed up for the Windows install and everything is functioning properly. What am I missing to get Acronis to recognize the drive?

 

Bryan, welcome to these public User Forums.

The key point for getting Acronis rescue media to work with your RAID drives is to ensure that all required drivers are included in the media when it is built.

This assumes that you either have RAID setup at the BIOS level or else you inject all necessary RAID drivers in the rescue media.

You haven't said what version of Acronis software you are using here.  If you have ATI 2019 as per this forum, then you can use the MVP Assistant - New 2.0 with Rescue Media Builder (New Version 2.2.1), which can help you to manage your drivers.
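As a side note, before building the media it can be worth confirming that the driver package you plan to inject actually contains .inf files for the builder to pick up. A rough sketch (the folder path is just a placeholder for wherever you extracted the IRST package):

```python
# Rough sketch: list the .inf files in an extracted driver package before
# injecting them into the rescue media.  The folder path is a placeholder.
from pathlib import Path

DRIVER_DIR = Path(r"C:\Drivers\IRST")   # hypothetical extraction folder

inf_files = sorted(DRIVER_DIR.rglob("*.inf"))
if not inf_files:
    print(f"No .inf files found under {DRIVER_DIR} - nothing to inject.")
for inf in inf_files:
    print(inf)
```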