Why AutoLab makes building your home lab simple

Some of you may have seen a couple of Twitter posts of mine about AutoLab lately; that's because I am in love with this awesome automation tool that simplifies setting up a home lab. Whether it is for study, training, testing, or whatever else you use your vSphere lab for, AutoLab is something you will want, especially if you find yourself rebuilding your lab over and over again.

AutoLab was created by the extremely talented Alastair Cooke with the help of many others, including Nick Marshall, Grant Orchard, Ariel Antigua, Damian Karlson, James Bowling and the vBrownBag crew.

First of all, head over to the LabGuides website, where you can download AutoLab 1.5 (for vSphere 5.5) or 1.1a (for vSphere 5.1). There are a few builds of each version: you can get AutoLab for ESXi if you have a server running the Type 1 hypervisor, or there is the Workstation 8 version that you can nest on a desktop/laptop running VMware Workstation/Fusion. (I can confirm that 1.1a works well on Workstation 10.)
You can find the system requirements here.

Once the ZIP is downloaded, extract it to a location with at least 100 GB of free space. Keep in mind that the VMDKs are thin provisioned and will grow. The overall download is only ~300 MB, but this does not include the ESXi (VMvisor) and vCenter (VIMsetup) ISOs, which you will need to download from your my.vmware.com account. If you do not have a license, VMware offers a free 60-day evaluation license for your convenience. While you are grabbing the required vSphere ISOs from VMware, you can also grab the PowerCLI installer.

The last lot of ISOs you will need are Windows Server 2003, Windows XP and Windows Server 2008.

Once you have downloaded or created the installation ISOs, you can start adding the VMs to VMware Workstation. (I'll just take you through the steps for Windows.)
Navigate to the location you extracted the ZIP file to, open each folder and double-click the .vmx file inside – this registers the virtual machine with Workstation.
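If you prefer doing this from a prompt, a minimal PowerShell sketch like the one below can find and open each .vmx for you. The path is a placeholder, and the behaviour assumes .vmx files are associated with Workstation on your machine.

```powershell
# Locate every .vmx file under the folder you extracted the AutoLab ZIP to.
# 'D:\AutoLab' is a placeholder path - change it to your extraction location.
$labPath = 'D:\AutoLab'

Get-ChildItem -Path $labPath -Recurse -Filter *.vmx | ForEach-Object {
    Write-Host "Opening $($_.FullName)"
    # Assuming .vmx files are associated with Workstation, opening one has the
    # same effect as double-clicking it: the VM appears in the library without
    # being powered on.
    Start-Process -FilePath $_.FullName
}
```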

At this point, **DO NOT** power on the VMs – there is a little configuration to do in Workstation first. Not much, but it needs to be done.

First, you need to set the total amount of host memory the VMs can use: select Edit > Preferences > Memory. A recommendation for the amount of RAM is presented, but you can choose your preferred amount.


The next step is to open the Virtual Network Editor (under the Edit menu). All you need to do is create a new virtual network on VMnet3 and give it the subnet address 192.168.199.0.

You can now start powering on your VMs – but wait! Before you get too excited, only the Router and NAS should be powered on first, leaving all other VMs powered off. Once the NAS has booted, you can browse to the share at \\192.168.199.7\build. Here you will need to place the Windows ISOs and the files extracted from the ESXi ISOs.
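If you would rather script the copy than drag and drop, something along these lines works. The source paths and the ESXi sub-folder name are placeholders – the AutoLab guide spells out exactly which files go where on the share.

```powershell
# Copy the Windows ISOs and the extracted ESXi installer files to the AutoLab
# build share. Source paths and the target sub-folder are placeholders only;
# the AutoLab guide describes the exact layout expected on the share.
$buildShare = '\\192.168.199.7\build'

# Windows ISOs (file names will vary with the media you downloaded)
Copy-Item -Path 'D:\ISOs\Windows*.iso' -Destination $buildShare

# Contents of the ESXi (VMvisor) ISO, extracted locally first
Copy-Item -Path 'D:\ISOs\ESXi-extracted\*' -Destination "$buildShare\ESXi" -Recurse
```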
Once you have gone through the guide and placed the mandatory files on the share, you can go ahead and power on your DC. Provided the floppy disk is connected, the system will go through a sysprepped stage, set itself up with the required files and build. The build process can take up to an hour; on my system it was done in 25 minutes, so this can vary.
As soon as the DC build completes you can continue with your vCenter Server build, which works the same way as the DC, running through sysprep. During both builds, several PowerShell scripts run to set up each component of the server. There is also a validation script that you can use to confirm whether each component has been installed.
You can now see the lab starting to take shape: you can start powering on your hosts, and these should build via Auto Deploy – the Auto Deploy and TFTP servers were configured during the DC build.
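Once the hosts are up, a quick PowerCLI check from your management machine will confirm they have registered in vCenter. The connection details below are placeholders for whatever your own lab uses.

```powershell
# Requires PowerCLI. Recent releases ship as a module; older releases use
# Add-PSSnapin VMware.VimAutomation.Core instead of Import-Module.
Import-Module VMware.PowerCLI

# Placeholder connection details - use the values from your own lab/guide.
$vc = Read-Host 'vCenter address'
Connect-VIServer -Server $vc -Credential (Get-Credential)

# Hosts deployed through Auto Deploy should show up here once they have booted
Get-VMHost | Select-Object Name, ConnectionState, Version, MemoryTotalGB
```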

(To find the full procedure for setting up AutoLab, or to set it up in a nested ESXi environment or on Fusion, go through the guide on the LabGuides website – I recommend reading the guide 🙂 )
All in all, it took me around two hours to have a fully fledged lab up and running. I am planning on pulling it down and doing the AutoLab 1.5 build shortly, as I have seen there are even more adjustments and extra features. I highly recommend this route if you need a lab in a short amount of time, or would like an automated process because you find yourself rebuilding your lab over and over again.
If you get the chance, please thank Alastair for all his hard work and dedication that he puts in each release. 
Thank you.
Keiran.
*Post was originally written in April. 


VDI 2 Week Challenge – Day 2

When starting your VDI design you need to understand your client's needs, and they might not be as simple as you think. Many things can turn a simple VDI infrastructure into a complex one just through the mixed needs of the users. The company you are building the environment for may have several departments, with each department requiring a different experience.

Take an engineering company, for example. If you are familiar with engineers, you will probably jump straight to the conclusion that you need an infrastructure capable of heavy graphics rendering – that's a good start, but what about the other departments? There may be receptionists, the CEO, contractors, project managers and many more, all requiring different access to software and processing power.

Understanding your users is paramount – oversubscribing or undersubscribing their resources can have a detrimental effect on that user's experience or on another's.

There are certain components that you need to pay close attention to, and they can be rated in a different order than for server virtualization:

Disk is very important. When you think about it, a standard desktop used by your average user runs on a single hard drive; when you add several more users accessing that drive at once and reading the same data source, things become slow and frustrating. The way to combat low IOPS and disk contention is to understand how many IOPS a single 7.2k hard drive delivers and then multiply that by the number of users – the figure could end up being quite high, but it gives you an idea of what your users are consuming (a rough worked example is sketched below).
There are several options for how you deploy your disks: you can run them from a SAN using Fibre Channel, Fibre Channel over Ethernet or iSCSI, or you can now leverage vSAN with Horizon 6, which not only gives you a simpler design but may also increase performance while driving down infrastructure costs.
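To make the arithmetic concrete, here is a back-of-envelope sketch. The per-drive figure and user count are assumptions chosen for illustration, not measured values.

```powershell
# Back-of-envelope IOPS estimate. Both numbers below are illustrative
# assumptions - substitute figures from your own desktop assessment.
$iopsPerDesktopDrive = 75     # ballpark for a single 7.2k SATA drive
$userCount           = 200

$estimatedPeakIops = $iopsPerDesktopDrive * $userCount
Write-Host "Worst-case aggregate demand: $estimatedPeakIops IOPS for $userCount users"
```

In practice a desktop assessment will usually show each user driving far fewer IOPS than a dedicated drive can deliver, so treat a figure like this as an upper bound rather than a target.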

IOPS is not the only aspect of disk that needs attention; size is also critical, particularly if you have users who like to store their items on the "local" disk rather than on any of the mapped storage drives. As part of the configuration control you have as an administrator, you can use profile redirection to help prevent users from storing their data on what is potentially the local disk – both Windows Server and Horizon have this capability.
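As a simple illustration of the idea only – at scale this is done centrally with Group Policy Folder Redirection or Horizon's persona features, not per user – redirecting a user's Documents folder off the local disk looks something like this; the file share path is made up for the example.

```powershell
# Illustration only: point the current user's Documents folder at a network
# share instead of the local disk. In production this is configured centrally
# (Group Policy Folder Redirection / Horizon persona management), not per user.
$shellFolders = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders'

# '\\fileserver\profiles' is a placeholder share for the example.
Set-ItemProperty -Path $shellFolders -Name 'Personal' `
    -Value "\\fileserver\profiles\$env:USERNAME\Documents"
```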

CPU is one area where you will need prior knowledge to get it right. Unlike server virtualization, where you can be more sure of the VM requirements, VDI can be a much larger kettle of fish. It's best to research the OS, application and user requirements before you over- or under-subscribe CPU resources.
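As a sketch of the kind of sizing you end up doing once that research is done – all three numbers are assumptions chosen purely for the example:

```powershell
# Illustrative CPU sizing only. The consolidation ratio is an assumption for
# the example; your real figure should come from testing the actual workload.
$physicalCoresPerHost = 16
$vcpuPerCoreTarget    = 8      # assumed vCPU-to-core ratio
$vcpuPerDesktop       = 2

$desktopsPerHost = [math]::Floor(($physicalCoresPerHost * $vcpuPerCoreTarget) / $vcpuPerDesktop)
Write-Host "Roughly $desktopsPerHost desktops per host under these assumptions"
```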

Memory is quite simple to work with in a VDI design: oversubscribing is not a good idea, as you do not want your users to feel the effects of the swap file. A good way to avoid any swap file usage is to make sure your hosts are sitting comfortably with a good amount of memory.
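A matching memory sketch, again with example figures, shows why avoiding overcommitment keeps things simple: the desktop count is bounded by physical RAM rather than by swap behaviour.

```powershell
# Simple memory sizing sketch with example figures. Keeping total desktop
# memory inside physical host RAM (minus some headroom) means guests never
# have to touch the swap file.
$hostRamGB            = 256
$hypervisorHeadroomGB = 16
$ramPerDesktopGB      = 4

$desktopsPerHost = [math]::Floor(($hostRamGB - $hypervisorHeadroomGB) / $ramPerDesktopGB)
Write-Host "Without memory overcommitment, roughly $desktopsPerHost desktops fit on this host"
```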

There are three layers that make up the virtual desktop experience for the user: Persona, Application and OS. By separating these, you reduce the administrative support required.

Figure 1.1

As Figure 1.1 shows, you can create a master image that holds the configuration set for all users; next you have your ThinApp, which is again a master application; followed by each user's persona. Any settings or changes the user makes are stored as part of their persona. This allows the administrator to focus on providing only an operating system and applications per department's needs, if not for the entire company.

Hopefully this has given some insight into the components and design aspects that need to be understood before implementing a solution.

The key is to know what your users require, where your resources will be used, and where they will sit idle if you provide too many.

If you have any comments or questions, please feel free to contact me.

Thanks for viewing.

-Keiran.