OK, so I installed the OS on a USB stick and use the space on the 3Ware 9650SE as storage
only. But with this setup I get a system freeze.
The system (ESXi 5.0) boots normally, but after a few minutes it hangs.
I tried the latest BIOS versions (archived and current) for the controller, but nothing changed.
Is it possible that the board's BIOS has a bug? Has anybody tested a RAID controller with two
160 GB 2.5" SATA HDDs on this Intel DN2800MT board?
OK, I tried to boot from a plain Silicon Image 3132 controller, but I got the same
error. I think the board doesn't boot from a controller on the PCIe port?
So what can I do? Please help!
I have basically the same setup as you. Specifically I am using the following hard-/software:
Mainboard: Intel DN2800MT, BIOS version 0165 (MTCDT10N.86A.0165.EB.EXE) released on 14.01.2013.
HDD Controller: LSI 3ware 9650SE-2LP, configured as RAID 1.
DVD Drive: Connected via USB port.
Software Platform: VMware ESXi 5.1.0 Release 799733 (with driver modification).
1.) Remove any additional drives from the onboard SATA ports.
2.) Do a normal VMware ESXi installation onto the RAID 1 volume.
3.) Leave the CD drive connected to the system.
4.) Put the VMware ESXi installation CD into the drive.
5.) Start the system and wait until the first (blue) screen of the ESXi installer appears.
6.) Manually select "Boot from local disk" within the given timeframe.
7.) The system will boot into the completed ESXi installation. Enjoy.
Sadly this selection must be made manually on every boot. Maybe these steps can be automated. See below.
Explanation and trials I did:
First I made a VMware ESXi installer ISO which includes the LSI 3ware 9650SE driver. Then I started the regular installation from a CD. The RAID 1 was recognised successfully and the basic installation completed. After the first reboot I got a BIOS error saying there is no bootable device. I checked the boot settings in the BIOS and noticed that the "boot device priority" listing order never changes, even if you select "Hard Disk Drives" or something else. It seems that this is a BIOS bug: it always follows the same order, which is Optical device > Removable device > Hard disk device > Network.
The same problem is recognisable with the hard drive order when additional drives are connected to the onboard SATA ports. The BIOS always tries to boot from the onboard ports first, which actually works properly.
But ... it seems that the BIOS never tries to boot from the 3ware RAID controller (which is correctly listed in the BIOS).
I additionally tested a random non-RAID SATA controller which I had nearby, with the same result.
Just for additional testing, I started to install Windows 7. Interestingly, the installation was properly found again after the first reboot and was continued. The reason for this interesting behaviour was that I had not removed the Windows DVD from the drive. As we know, the Windows installer waits a few seconds for a key press to start the installation process. If no key is pressed, it proceeds to boot from the HDD by default. After the DVD was removed, the system was not able to boot from the RAID, as known from the ESXi trial - until the DVD was put back in. But this shows that it is basically possible to boot into a successfully finished installation, which gave me hope.
ESXi installed again ... Windows DVD placed in the drive ... But, not as expected, Windows started an installation without asking me to press any key. Apparently Windows starts the installation automatically (without a choice) if it detects something that is not Windows on the HDD.
Then I put the ESXi CD back into the drive ... As expected, the system booted from the CD. The first (blue) screen of the ESXi installer allows you to select "Boot from local disk", which I chose. The system then boots into the completed ESXi installation. Enjoy :-). This will not work if additional drives are connected to the onboard SATA ports, because of the unchangeable HDD order.
Sadly the selection must be made manually on every boot, which is not the kind of solution I am looking for.
On one hand, I don't like having a CD drive connected all the time just for boot-up; a USB flash drive would be better. On the other hand, the boot process should work unattended; it is a server.
Booting from a USB flash drive should work, because removable devices are tried directly after the CD drive.
The unattended process should also be possible. The boot loader looks very similar to GRUB, and the ESXi-Customizer (http://www.v-front.de/p/esxi-customizer.html), which I used to integrate the 3ware driver into the VMware ISO, has supplemented the boot menu with some additional notes about the ESXi-Customizer. If it is, as I think, similar to GRUB, it should be possible to change the order of the entries and set "Boot from local disk" as the default at the top.
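If the installer media really uses a Syslinux-style menu (which the GRUB-like look suggests), the change might look like the sketch below in the config file on the media. This is only an illustration of the idea; the label names and file layout are assumptions, not taken from the actual ESXi media:

```
# isolinux.cfg on the installer media (sketch; label names are assumed)
DEFAULT hddboot        # make "Boot from local disk" the default entry
TIMEOUT 50             # wait 5 seconds before booting the default

LABEL hddboot
  MENU LABEL Boot from local disk
  LOCALBOOT 0x80       # hand off to the first hard disk's boot sector

LABEL install
  MENU LABEL ESXi Installer
  KERNEL mboot.c32
  APPEND -c boot.cfg
```

With a default like this, an unattended reboot would fall through to the local disk after the timeout instead of waiting for a manual selection.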
Maybe Intel is also able to provide a fix for this problem, which would definitely be the best solution.
Many thanks for your answer. I managed it with a USB stick.
So I installed ESXi 5.1 on this stick and it boots from the stick.
It works, but not very well.
I can't understand why the board doesn't boot from external PCIe cards.
One last question: I have the problem that my ESXi server restarts
sometimes. I have 3 VMs, and with 2 VMs (with Linux) running I have no problems,
but if I start a third VM with Windows, the server restarts after a few hours.
How many VMs run on your ESXi server?
It was my primary target to have everything on the RAID 1, so having the ESXi installation on a USB stick or on a dedicated HDD was never an option for me.
I actually work with a USB CD drive which I connect temporarily for the boot process. Because ESXi doesn't need to reboot very often, I guess this will also be my long-term solution.
Until now I have no long-term experience, because the system was still in "experimental" operation, which means it was always shut down over night. But now everything is set up, and next week I will start testing with 24/7 running.
I have planned to run at least 2x WinXP and 1x Debian. I really hope this will not be a problem. The system I will replace with this setup is a now 3-year-old Intel Atom N330 platform with Debian as the host system and VMware Server 2.0.2. I had the same 3 VMs running. The system never had very high performance but was very stable.
I will leave a feedback here in a few days.
After a long, long trip through the internet, perhaps this is the answer:
With vSphere 5.0, VMware changed the partitioning layout of the system disk from Master Boot Record (MBR) to GPT. Does the Intel DN2800MT motherboard have a UEFI option? If not, booting from GPT isn't going to work.
Here’s how to enable ESXi to format the installation disk with MBR:
1.) Boot server from ESXi installation media.
2.) Wait for prompt in lower left-hand corner that says <SHIFT+O: Edit boot options> . Press SHIFT+O (that’s the letter O, not a zero).
3.) In the lower-left hand corner, after the word 'runweasel', insert a space and then type formatwithmbr. Press Enter to continue the installation process.
ESXi should now install as normal.
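Whether the installer actually produced an MBR or a GPT layout can be checked from the first two sectors of the disk: an MBR carries the 0x55AA signature at the end of sector 0, while a GPT additionally has the "EFI PART" signature at the start of sector 1 (LBA 1, assuming 512-byte sectors). A minimal sketch in Python, run against synthetic images rather than a real disk:

```python
# Detect whether a raw disk image uses MBR or GPT partitioning.
# MBR: bytes 510-511 of sector 0 are 0x55 0xAA.
# GPT: sector 1 starts with b"EFI PART" (a protective MBR is still
#      present in sector 0, so check for GPT first).

SECTOR = 512

def partition_scheme(image: bytes) -> str:
    has_mbr_sig = image[510:512] == b"\x55\xaa"
    has_gpt_sig = image[SECTOR:SECTOR + 8] == b"EFI PART"
    if has_gpt_sig:
        return "gpt"
    if has_mbr_sig:
        return "mbr"
    return "unknown"

# Two tiny synthetic "disks" to demonstrate:
mbr_disk = bytearray(2 * SECTOR)
mbr_disk[510:512] = b"\x55\xaa"

gpt_disk = bytearray(2 * SECTOR)
gpt_disk[510:512] = b"\x55\xaa"            # protective MBR
gpt_disk[SECTOR:SECTOR + 8] = b"EFI PART"  # GPT header signature at LBA 1

print(partition_scheme(bytes(mbr_disk)))   # mbr
print(partition_scheme(bytes(gpt_disk)))   # gpt
```

On a live ESXi host the same check would have to be done against the raw device, which is out of scope here; the point is only that the two layouts are easy to tell apart.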
Many Thanks to JOSHUA TOWNSEND
I didn't test it yet, but I think I'll test it soon.
First of all, sorry for my late reply; I had a very busy time during the last few weeks.
Anyway, in the meantime, I had my system continuously running 24/7 for over two weeks.
The Guest systems are, as originally planned:
1x Debian Linux. This system primarily works as a storage pool and runs some small services.
1x Windows XP. This system is a "workspace" for overnight jobs like transferring backups etc.
1x Windows XP. Testing environment.
During this time period I had only one guest OS crash. It was my Win XP testing environment, during an unsuccessful software installation, and it was reproducible. After I went back to the last snapshot, everything was OK again. The other two systems had no reboots during this time.
About your GPT/MBR approach ...
This is a very interesting hint. The BIOS has a "UEFI Boot" option in the "Boot" category. Description: "If Enabled, BIOS will attempt to boot via UEFI before using the legacy boot sequence. If Disabled, BIOS will use the legacy boot sequence.
UEFI Boot must be enabled to boot to a drive larger than 2 TB (terabytes)".
I am using 2x 2.0 TB (1.81 TB net) drives on my RAID controller. I have already tested both settings without any effect on the situation.
During my very first trials with this board, I connected only one 2.0 TB drive directly to the onboard SATA and installed VMware ESXi 5.1, just to play around with it and discover the options. That setup booted directly, without any manual interaction or the described workaround with the CD drive. The problem is only present together with the RAID controller.
Could you do further testing in the meantime?
I have just spent several hours to test the "formatwithmbr" approach...
First, I did the format-with-MBR installation, which was successful from the MBR point of view. Then, once again, I tested several different BIOS boot-setting combinations to boot the VMware installation.
Sadly I had no luck with any trial...
Just to test it once again, and to prove that the RAID controller and VMware are not the problem, I moved the controller with the disks to another PC. Plugged it in, set the BIOS boot order correctly, started, and got a successful VMware boot from the RAID array.
Additionally, the board has started to no longer recognise any of my USB CD drives correctly - for no reason, of course...
In fact, I am currently not able to do an installation from the CD drive. Interestingly, USB flash drives still work ...
This board or at least the current BIOS version is crap ...
Next steps ...
I will install VMware on an mSATA (mini PCIe) SSD drive, and only the VMs themselves will be on the RAID 1 array.
Sadly this is absolutely against my initial target of having everything under complete redundancy... But, as it seems, this is the only solution that currently works on this platform.
We would like to inform you that the DN2800MT was not intended or designed to support Linux or any other server environment applications and operating systems, which is why the assistance we offer for this issue is limited. You can try any of the workarounds suggested by previous forum members, or contact your Linux distribution support group for further assistance on this matter. Thank you for your understanding.
Thank you for your answer.
But the problem we are facing has absolutely nothing to do with what we are using the board for ... So blaming our "server usage" or the Linux OS is (I am very sorry) a bit unqualified.
In fact, you cannot change the BIOS boot device order in any way with the current BIOS release... (you can choose the device order, but the choice will not be saved and therefore always stays the same ...).
I would have this problem even if I used the board for a very "normal" application, like a media PC or a low-end office box - I hope this is an "intended" enough use..
Anyway ... An example - and in fact this is something I have really tried, because I wanted to know what the Windows 7 system benchmark results are...
Setup: Intel DN2800MT, 4 GB RAM, 64 GB SATA SSD, SATA DVD-ROM (no PCIe extension card) - everything you normally have connected to a standard PC...
I installed Win7 Home Premium x64.
Basically, in such a setup you change the boot device order to HDD > CD drive > USB > Network (maybe you even disable the CD drive, USB and network) to reduce the boot time and prevent possible (user-caused) system damage from bootable media connected to the system.
But, as already written (and this is in fact the core problem), it is not possible to change the boot device order...
I agree with you that for this maybe "intended" use it is not a tragedy, but it is still something that does not work correctly.
So it would be great if Intel would be able to fix this BIOS bug.