AutoIt Consulting – Windows Deployment and Scripting Specialists

Automating Office 365 Click-to-Run First Use Without Group Policy
Wed, 04 Feb 2015


This post shows how to automate the numerous “first-run” dialogs that are shown when a user runs an Office 2013 or Office 365 application for the first time.

With a standard local installation of Office 2013 this can be done in two ways. Firstly, by using Group Policy (the recommended way). Secondly, by using the Office Customization Tool (OCT) to create an .msp file that can be used during setup to apply custom user settings. However, when using the “Click-to-Run” version of Office 365 there is no way to use the OCT so user settings can only be configured using Group Policy.

In a recent assignment I needed to create an Office 365 package that would install silently in various scenarios:

  • Domain joined / Group Policy
  • Domain joined / No Group Policy
  • Home machine

The customer was happy with the default settings that Office comes with, but when a user first runs Office there are numerous screens that pop up which they wanted to suppress (with the exception of the Office 365 sign-in screen). These are:

“First things first” / License Agreement

The usual annoying license agreement and questions about auto updates and product improvement that no user will care about or fully understand.

Office First Things First

 

Default File Types

There isn’t a non-IT user alive that has any idea which of these options is correct. I believe this screen only appears on EU machines. Thanks EU!

Office Default File Types

 

“Welcome to your new Office”

A useless tutorial about OneDrive and forcing the user to set critical options like “ribbon background”.

Office Welcome

 

To disable all these screens we need to configure the following registry entries:

Registry Key                                                           Value
HKCU\Software\Microsoft\Office\15.0\FirstRun\BootedRTM                 1 (DWORD)
HKCU\Software\Microsoft\Office\15.0\FirstRun\disablemovie              1 (DWORD)
HKCU\Software\Microsoft\Office\15.0\Common\General\shownfirstrunoptin  1 (DWORD)
HKCU\Software\Microsoft\Office\15.0\Common\General\ShownFileFmtPrompt  1 (DWORD)
HKCU\Software\Microsoft\Office\15.0\Common\PTWatson\PTWOptIn           1 (DWORD)
HKCU\Software\Microsoft\Office\15.0\Common\qmenable                    1 (DWORD)

 

The problem is that these are all per-user settings. If you try to set them at the end of your installation they usually won’t work, as the installation is likely to run under an admin account. Unless you set them from a user-side login script or Group Policy, they won’t apply to each user who logs onto the machine.

To get this to work we can use a little-known feature of Office that allows you to specify HKEY_LOCAL_MACHINE keys that are automatically migrated into HKEY_CURRENT_USER when an Office application is first run by each user. There is no worthwhile documentation of this process except on this Deployment Guys blog post.

In summary, you create keys under HKLM in the following locations (depends on the OS and version of Office):

OS Version  Office Version  Key
32-bit      32-bit          HKLM\SOFTWARE\Microsoft\Office\15.0\User Settings\MyCustomSettings
64-bit      32-bit          HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomSettings
64-bit      64-bit          HKLM\SOFTWARE\Microsoft\Office\15.0\User Settings\MyCustomSettings

The MyCustomSettings part of the key can be anything you like; you can even have multiple names for different groups of settings. Under this key create a DWORD value called Count (set to 1 in the script below) – Office records the Count it last processed for each user, so incrementing it later causes the settings to be applied again. Then make another key called Create, and under it build the tree of registry settings that you want created in HKEY_CURRENT_USER. The 15.0 part refers to Office 2013 / Office 365.

When a user runs an Office application it checks whether these settings have already been migrated, and if not it creates the relevant keys in HKEY_CURRENT_USER. This happens before any user interface is shown, so it can successfully be used to set the options that hide the first-run dialogs.

This can be a little tricky to understand but if you look at the example script below you should get the idea. Something interesting to note is that this will work for any registry key – it doesn’t have to be Office related!

 

Example Script to Automate First Run

Here is a small batch script that can be run at the end of the Office installation to configure all the keys so that when any new user runs Office for the first time they don’t see the first-run dialogs. For this example I assume a 64-bit OS with the 32-bit version of Office 365 (by far the most common configuration in enterprises).

reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings" /v Count /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\FirstRun" /v BootedRTM /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\FirstRun" /v disablemovie /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common\General" /v shownfirstrunoptin /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common\General" /v ShownFileFmtPrompt /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common\PTWatson" /v PTWOptIn /t REG_DWORD /d 1 /f >nul
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common" /v qmenable /t REG_DWORD /d 1 /f >nul
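For reference, the same settings can be expressed as a .reg file and applied with reg import instead of individual reg add commands. This is an equivalent sketch of the batch script above, not a separately tested artefact; as before, the MyCustomUserSettings name is arbitrary:

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings]
"Count"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\FirstRun]
"BootedRTM"=dword:00000001
"disablemovie"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common\General]
"shownfirstrunoptin"=dword:00000001
"ShownFileFmtPrompt"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common\PTWatson]
"PTWOptIn"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\15.0\User Settings\MyCustomUserSettings\Create\Software\Microsoft\Office\15.0\Common]
"qmenable"=dword:00000001
```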

 

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.

 

Get the Current Script Directory in PowerShell, VBScript and Batch
Thu, 23 Oct 2014


This post shows how to quickly get the current script directory using PowerShell, VBScript and Batch – the most commonly used scripting languages for Windows.

The scripts I write usually read in other files or call other scripts. In order for these scripts to run from any location – such as a UNC path – without hard coding paths they need to use relative paths. Depending on how the script is called the working directory may not be the same as the script file. For example, if your current directory is C:\Windows and you run the script \\server\share\somepath\myscript.cmd then any relative paths in the script file won’t work correctly.

One way around this is to make the script change its working directory right at the start and then use relative paths after that. But in some situations – such as batch files on UNC paths – this won’t always work. The best way around it is to determine the directory that the script resides in at the start of the script and then make all other paths reference that.

Because I jump around various scripting languages all the time, I tend to forget the best way to do this and have to hunt for examples in old scripts. As a reference for myself this post gives the template for getting the current script directory in the languages I tend to use: PowerShell, VBScript and batch.

 

Windows Batch

Windows batch is the trickiest in some ways – it is also the only one of the three that cannot support UNC working directories. There is a built-in variable %~dp0 which expands to the path that the script is located in, including the trailing slash. This can make for messy-looking scripts, because to run setup.exe from the current script directory you would use %~dp0setup.exe. This works great but can be a little confusing for others to read because it looks like a typo.

My preferred method is to create a new variable at the top of the script using %~dp0 and then stripping the trailing backslash. Here is the script:

@ECHO OFF
REM Determine script location for Windows Batch File

REM Get current folder with no trailing slash
SET ScriptDir=%~dp0
SET ScriptDir=%ScriptDir:~0,-1%

ECHO Current script directory is %ScriptDir%

 

VBScript

VBScript is fairly straightforward: the full path of the running script is available in WScript.ScriptFullName, and you can then use the FileSystemObject class to get the parent folder name. Here is the script:

' Determine script location for VBScript
Dim oFSO
Set oFSO = CreateObject("Scripting.FileSystemObject")
sScriptDir = oFSO.GetParentFolderName(WScript.ScriptFullName)

Wscript.Echo "Current script directory is " & sScriptDir

 

PowerShell

PowerShell users have long used a snippet from this post that gets the script folder. However, it doesn’t always work as expected, depending on how the script was loaded. This altered version should work in all cases (on PowerShell 3.0 and later you can also use the automatic variable $PSScriptRoot):

# Determine script location for PowerShell
$ScriptDir = Split-Path $script:MyInvocation.MyCommand.Path

Write-Host "Current script directory is $ScriptDir"

 

Hopefully this post will be a useful reference to you when trying to remember how to get the current script directory. I know that I’ll end up referencing it myself in the future!

 

UTF-8 and UTF-16 Text Encoding Detection Library
Sat, 23 Aug 2014


This post shows how to detect UTF-8 and UTF-16 text and presents a fully functional C++ and C# library that can be used to help with the detection.

I recently had to upgrade the text file handling feature of AutoIt to better handle text files where no byte order mark (BOM) was present. The older code worked fine for UTF-8 files (with or without a BOM) but it wasn’t able to detect UTF-16 files without a BOM. I tried the IsTextUnicode Win32 API function, but it seemed extremely unreliable and wouldn’t detect UTF-16 big-endian text in my tests.

Note that, especially for UTF-16 detection, there is always an element of ambiguity. This post by Raymond Chen shows that however you try to detect encoding there will always be some sequence of bytes that will make your guesses look stupid.

Here are the detection methods I’m currently using for the various types of text file. The order of the checks I perform is:

  • BOM
  • UTF-8
  • UTF-16 (newline)
  • UTF-16 (null distribution)

The C# and C++ library can be downloaded here: TextEncodingDetect.zip

 

BOM Detection

I assume that a BOM found at the start of the file is valid. Although it’s possible that the bytes could just be ANSI text, it’s highly unlikely. The BOMs are as follows:

Encoding               BOM
UTF-8                  0xEF, 0xBB, 0xBF
UTF-16 Little Endian   0xFF, 0xFE
UTF-16 Big Endian      0xFE, 0xFF
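As an illustrative sketch (in Python rather than the C++/C# of the downloadable library), the BOM table above maps directly onto a simple prefix check:

```python
def check_bom(data):
    """Return the encoding implied by a leading byte order mark, or None.

    Illustrative only - this mirrors the BOM table above, not the
    library's actual CheckBOM implementation.
    """
    if data.startswith(b'\xef\xbb\xbf'):
        return 'utf-8'
    if data.startswith(b'\xff\xfe'):
        return 'utf-16-le'
    if data.startswith(b'\xfe\xff'):
        return 'utf-16-be'
    return None
```

A real implementation would fall through to the content-based checks described below when no BOM is found.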

 

UTF-8 Detection

UTF-8 checking is reliable with a very low chance of false positives, so this is done first. If the text is valid UTF-8 but all the characters are in the range 0-127 then this is essentially ASCII text and can be treated as such – in this case I don’t continue to check for UTF-16.

If a byte is in the range 0-127 then it is a single-byte character and nothing more needs to be done. Lead bytes above 127 indicate a multibyte sequence that uses the next 1, 2 or 3 bytes.

First byte   Number of bytes in sequence
0-127        1 byte
194-223      2 bytes
224-239      3 bytes
240-244      4 bytes

These additional bytes are in the range 128-191. This scheme means that if we decode the text stream based on this method and no unexpected sequences occur then this is almost certainly UTF-8 text.
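The decoding scheme just described can be sketched as follows. This is an illustrative Python version of the check, not the downloadable library's code; the lead-byte ranges come straight from the table above, and nulls are not given special treatment here (the library handles them separately, as described in the Nulls and Binary section):

```python
def detect_utf8(data):
    """Classify a byte buffer as 'ascii', 'utf-8', or None (not valid UTF-8),
    walking the stream using the lead-byte ranges from the table above."""
    i, multibyte_seen = 0, False
    while i < len(data):
        b = data[i]
        if b <= 0x7F:                      # single-byte (ASCII) character
            i += 1
            continue
        if 0xC2 <= b <= 0xDF:              # 194-223: two-byte sequence
            extra = 1
        elif 0xE0 <= b <= 0xEF:            # 224-239: three-byte sequence
            extra = 2
        elif 0xF0 <= b <= 0xF4:            # 240-244: four-byte sequence
            extra = 3
        else:
            return None                    # invalid lead byte
        if i + extra >= len(data):
            return None                    # truncated sequence
        for j in range(1, extra + 1):
            # every continuation byte must be in 128-191 (0x80-0xBF)
            if not 0x80 <= data[i + j] <= 0xBF:
                return None
        multibyte_seen = True
        i += extra + 1
    return 'utf-8' if multibyte_seen else 'ascii'
```

Returning 'ascii' when no multibyte sequence was seen mirrors the shortcut mentioned earlier: pure 0-127 text needs no further UTF-16 checking.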

 

UTF-16 Detection

UTF-16 text is generally made up of 2-byte sequences (technically, there can also be 4-byte sequences with surrogate pairs). Depending on the endianness of the file, the Unicode character 0x1234 could be represented in the byte stream as "0x12 0x34" or "0x34 0x12". The BOM is usually used to easily determine whether the file is big or little endian. Without a BOM this is a little trickier to determine.

I use two methods to try to determine whether the text is UTF-16 and its endianness. The first uses the newline characters 0x0a and 0x0d. Depending on the endianness they will be sequenced as "0x0a 0x00" or "0x00 0x0a". If every instance of these characters in a text file is encoded the same way then that is a good sign that the text is UTF-16, and whether it is big or little endian. The drawback of this method is that it won’t work for very small amounts of text, or for files that don’t contain newlines.

The second method relies on the fact that many files contain large amounts of pure ASCII text in the range 0-127. This applies especially to files commonly used in IT, like scripts and logs. When encoded in UTF-16 these are represented as the ASCII character plus a null byte. For example, space (0x20) would be encoded as "0x00 0x20" or "0x20 0x00". Depending on the endianness this results in a large number of nulls in the odd or even byte positions. We just need to scan the file for these odd and even nulls, and if a significant percentage appears in the expected position then we can assume the text is UTF-16.
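The null-distribution method can be sketched like this (illustrative Python, not the library source; the 50% and 20% thresholds are assumed values chosen for the example, and the real library's thresholds may differ):

```python
def guess_utf16(data, threshold=0.5):
    """Guess UTF-16 endianness from the distribution of null bytes at even
    and odd offsets. Returns 'utf-16-le', 'utf-16-be', or None."""
    pairs = len(data) // 2
    if pairs == 0:
        return None
    even_nulls = sum(1 for i in range(0, pairs * 2, 2) if data[i] == 0)
    odd_nulls = sum(1 for i in range(1, pairs * 2, 2) if data[i] == 0)
    # ASCII text in UTF-16 LE looks like "0x41 0x00": nulls at odd offsets
    if odd_nulls / pairs >= threshold and even_nulls / pairs < 0.2:
        return 'utf-16-le'
    # ...and in UTF-16 BE like "0x00 0x41": nulls at even offsets
    if even_nulls / pairs >= threshold and odd_nulls / pairs < 0.2:
        return 'utf-16-be'
    return None
```

Requiring the opposite position to be nearly null-free guards against binary data that happens to be full of zeros in both positions.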

 

The Library

The C# and C++ library can be downloaded here: TextEncodingDetect.zip

Using C# as the example, the two main public functions are:

public Encoding CheckBOM(byte[] buffer, int size)
public Encoding DetectEncoding(byte[] buffer, int size)

These functions return the Encoding which is the following enum:

public enum Encoding
{
    None,               // Unknown or binary
    ANSI,               // 0-255
    ASCII,              // 0-127
    UTF8_BOM,           // UTF8 with BOM
    UTF8_NOBOM,         // UTF8 without BOM
    UTF16_LE_BOM,       // UTF16 LE with BOM
    UTF16_LE_NOBOM,     // UTF16 LE without BOM
    UTF16_BE_BOM,       // UTF16-BE with BOM
    UTF16_BE_NOBOM      // UTF16-BE without BOM
}

The DetectEncoding function takes a byte buffer and a size parameter. The larger the buffer that is used, the more accurate the result will be. I’d recommend at least 4KB.

Here is an example of passing a buffer to the DetectEncoding function:

// Detect encoding
var textDetect = new TextEncodingDetect();
TextEncodingDetect.Encoding encoding = textDetect.DetectEncoding(buffer, buffer.Length);

Console.Write("Encoding: ");
if (encoding == TextEncodingDetect.Encoding.None)
{
    Console.WriteLine("Binary");
}
else if (encoding == TextEncodingDetect.Encoding.ASCII)
{
    Console.WriteLine("ASCII (chars in the 0-127 range)");
}
else if (encoding == TextEncodingDetect.Encoding.ANSI)
{
    Console.WriteLine("ANSI (chars in the 0-255 range)");
}
else if (encoding == TextEncodingDetect.Encoding.UTF8_BOM || encoding == TextEncodingDetect.Encoding.UTF8_NOBOM)
{
    Console.WriteLine("UTF-8");
}
else if (encoding == TextEncodingDetect.Encoding.UTF16_LE_BOM || encoding == TextEncodingDetect.Encoding.UTF16_LE_NOBOM)
{
    Console.WriteLine("UTF-16 Little Endian");
}
else if (encoding == TextEncodingDetect.Encoding.UTF16_BE_BOM || encoding == TextEncodingDetect.Encoding.UTF16_BE_NOBOM)
{
    Console.WriteLine("UTF-16 Big Endian");
}

 

Nulls and Binary

One quirk of the library is how I chose to handle nulls (0x00). These are technically valid in UTF-8 sequences, but I’ve assumed that any file containing a null is not ANSI/ASCII/UTF-8. Allowing nulls for UTF-8 can cause a false result where UTF-16 text containing just ASCII appears to be valid UTF-8. To disable this behaviour, set the NullSuggestsBinary property on the library to false before calling DetectEncoding. In practice most text files don’t contain nulls, so the defaults are sensible.

 

 

Mass Redistribute Packages in ConfigMgr 2012
Fri, 10 Jan 2014


Recently I came across an issue with System Center Configuration Manager (ConfigMgr) 2012 where a large number of packages and applications had failed to distribute to a new Distribution Point (DP). There were many thousands of packages in the environment and a few hundred were showing in the error state in the console. Unfortunately, once packages get into this state they don’t fix themselves: the ConfigMgr validation schedule appears only to report that there was a problem distributing – it doesn’t actually trigger a redistribution.

To fix the problem you have to identify each package that has a problem and go into its properties. Then you need to select the DPs you want to redistribute to and click Redistribute.

CM2012 Console Redistribute

You have to do this for each package that has a problem, taking care to redistribute only to the DPs that have failures. In an environment with a few hundred problem packages and thousands of DPs this is completely impractical. What we need is a script that can do the following:

  • Identify packages that were unsuccessfully distributed.
  • Mass redistribute packages that are in the error state.
  • Only redistribute packages to DPs that need it to avoid needless processing and network traffic.
  • Optionally, limit the process to a specific DP. For example, a newly built DP was incorrectly configured, has subsequently been fixed but all the packages assigned to it have entered an error state.

The VBScript is listed below and can also be downloaded here: CM2012_DP_Redist.zip

' ****************************************************************************
'
' Copyright (c) 2013, Jonathan Bennett / AutoIt Consulting Ltd
' All rights reserved.
' http://www.autoitconsulting.com
'
' THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
' ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
' WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
' DISCLAIMED. IN NO EVENT SHALL AUTOIT CONSULTING LTD BE LIABLE FOR ANY
' DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
' (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
' LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
' ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
' (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
' SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
'
' ****************************************************************************
'
' Purpose:   Checks all packages assigned to a DP and redistributes any with errors.
' 
' Usage:     cscript.exe CM2012_DP_Redist.vbs
' 
' Version:   1.0.0
' 
' History:
' 1.0.0	20-Nov-2013 - Jonathan Bennett 
'       First version.
'
' ****************************************************************************

' ****************************************************************************
' Global constant and variable declarations
' ****************************************************************************

Option Explicit

' CHANGEABLE VARIABLES

' The name of the CAS/Primary site server
Public Const CASServerName = "CASSERVERNAME"

' Which DP to refresh packages for - leave this blank to check ALL DPs
Public Const DPServerName = "DPSERVERNAME"

' END OF CHANGABLE VARIABLES


Public Const wbemFlagReturnImmediately = 16
Public Const wbemFlagForwardOnly = 32

Dim oSWbemServices

' ****************************************************************************
' End declarations
' ****************************************************************************

' ****************************************************************************
' Main routine
' ****************************************************************************

' Connect to CM 2012 (CAS)
If ConnectServer(CASServerName) = False Then
	WScript.Echo "Unable to connect to server."
	WScript.Quit 1
End If


' Find all packages with a bad status
Dim queryString
Dim dpStatuses, dpStatus, serverName, packageID, packageDPs, packageDP, nalPath

If DPServerName = "" Then
	queryString = "SELECT Name,PackageID,MessageState FROM SMS_DistributionDPStatus WHERE MessageState > 2"
Else
	queryString = "SELECT Name,PackageID,MessageState FROM SMS_DistributionDPStatus WHERE Name LIKE '%" & DPServerName & "%' AND MessageState > 2"
End If

Set dpStatuses = oSWbemServices.ExecQuery(queryString,, wbemFlagForwardOnly+wbemFlagReturnImmediately)
For Each dpStatus in dpStatuses
	serverName = UCase(dpStatus.Name)
	packageID = dpStatus.PackageID

	queryString = "SELECT * FROM SMS_DistributionPoint WHERE PackageID = '" & packageID & "'"
	Set packageDPs = oSWbemServices.ExecQuery(queryString,, wbemFlagForwardOnly+wbemFlagReturnImmediately)
	For Each packageDP in packageDPs
		nalPath = UCase(packageDP.ServerNalPath)

		If InStr(1, nalPath, serverName) > 0 Then
			WScript.Echo "Redistributing " & packageID & " on " & serverName & "..."
			packageDP.RefreshNow = True
			On Error Resume Next
			packageDP.Put_
			On Error Goto 0
		End If

	Next
Next


WScript.Quit 0


' ****************************************************************************
' Functions
' ****************************************************************************

Function ConnectServer(ByVal serverName)
	If serverName = "" Then serverName = "."
	
	Dim oSWbemLocator : Set oSWbemLocator = CreateObject("WbemScripting.SWbemLocator")
	
	On Error Resume Next
	Set oSWbemServices = oSWbemLocator.ConnectServer(serverName, "root\sms")
	If Err.Number <> 0 Then
		ConnectServer = False
		Exit Function
	End If
	On Error Goto 0
	
	Dim ProviderLoc : Set ProviderLoc = oSWbemServices.ExecQuery("Select * FROM SMS_ProviderLocation WHERE ProviderForLocalSite='True'")

	Dim Location
	For Each Location In ProviderLoc
		Set oSWbemServices = oSWbemLocator.ConnectServer(Location.Machine, "root\sms\site_" + Location.SiteCode)
		Exit For
	Next
	ConnectServer = True
End Function

At the top of the script you will want to change the CASServerName and DPServerName values to match your environment. If DPServerName is left blank then all DPs are in scope; if a server is specified then redistribution is limited to just that DP. For example, my CAS is called server04 and I want to redistribute packages on all DPs, so I would use:

Public Const CASServerName = "server04"
Public Const DPServerName = ""

How does the script work?

First you need to understand the two main WMI classes that we work with:

SMS_DistributionDPStatus

This class details the status of a package on a DP. The main properties we are interested in are Name (the DP server name), PackageID and MessageState. The MSDN article for this class states that a MessageState of 3 indicates an error, but in testing this turned out not to be correct – the value observed was 4. Just in case, our script assumes anything greater than 2 indicates an error.

SMS_DistributionPoint 

This class represents an instance of a package assigned to a DP. It provides the RefreshNow method which triggers the redistribution of the specific package on the specific DP.

It might be surprising to learn that there is an instance of each of these classes for each package for each DP. So if you have 100 packages and 50 DPs then there are 5000 SMS_DistributionDPStatus and 5000 SMS_DistributionPoint objects. This is useful to understand if you are using a WMI browser when troubleshooting.

The script follows this process:

  1. Gets a collection of all SMS_DistributionDPStatus objects where the MessageState is greater than 2.
  2. Extracts the PackageID and DP server name for each of these objects.
  3. Gets a collection of all SMS_DistributionPoint objects where the PackageID and DP match from our SMS_DistributionDPStatus collection.
  4. Triggers the SMS_DistributionPoint.RefreshNow method for each member of the collection.

After running the script any packages that have failed will be redistributed and the progress can be seen in the normal way in the ConfigMgr console.

As the script only redistributes failed packages it is safe to rerun as often as required.

 

Detect an SSD Disk Using a Script
Tue, 07 Jan 2014


A common request when creating automated desktop builds or custom maintenance tools is the ability to detect whether the main system drive is an SSD. This information can be used to configure a system in a particular way – for example, to run (or not run) certain maintenance tasks, or to configure system settings and services to take advantage of the way SSD disks perform.

Recent versions of Windows, like Windows 8, automatically optimise certain tasks if an SSD is detected, but earlier versions of Windows commonly used in corporate environments lack this functionality, so it must be done manually. This kind of process is error-prone as it relies on an engineer recording whether the disk is an SSD as some sort of build parameter. In other cases I’ve seen people maintain lists of drive vendors and models in .txt files that scripts then match against WMI queries.

I recently released an updated version of the AutoIt scripting language, and as part of the update I extended the DriveGetType function to provide a solution to this problem by adding the ability to detect the bus and SSD status of a drive. The basis of the method is described in this TechNet blog post about how the Windows defragmenter identifies SSD drives. The main tests it performs are:

  • Disks whose driver reports “no seek penalty”.
  • Disks that report a nominal media rotation rate of 1.

Almost all modern SSD drives populate these values, so this is a better detection method than checking WMI for vendor strings.
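The decision logic reduces to a simple OR of the two tests. The sketch below (Python, for illustration only) takes the two query results as plain parameters; in a real implementation they would come from the Windows storage queries described in the TechNet post (a seek-penalty property query and the ATA nominal media rotation rate):

```python
def is_probably_ssd(seek_penalty, nominal_rotation_rate):
    """Combine the two defragmenter-style tests described above.

    seek_penalty: True/False as reported by the disk driver, or None if
    the query failed (False means "no seek penalty").
    nominal_rotation_rate: the ATA value, where 1 means non-rotating media.
    """
    if seek_penalty is False:          # explicit "no seek penalty" report
        return True
    if nominal_rotation_rate == 1:     # ATA: 1 indicates non-rotating media
        return True
    return False                       # unknown or rotating: assume not SSD
```

Treating an unknown seek-penalty result (None) as "not an SSD" matches the conservative behaviour of returning 0 when detection fails.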

Here is a basic AutoIt script that will output a simple onscreen message showing the type, bus and SSD status of a given drive letter.  Remember to download AutoIt to run it.

; Some constants
Const $DT_DRIVETYPE = 1
Const $DT_SSDSTATUS = 2
Const $DT_BUSTYPE = 3

; Drive letter to check
Const $DriveLetter = "C:"

; Get drive type and verify it exists
$type = DriveGetType($DriveLetter, $DT_DRIVETYPE)
If @error Then
	MsgBox(4096, "Error", "Invalid drive requested")
	Exit
EndIf

; Get SSD status (blank return is non-SSD)
$ssd = DriveGetType($DriveLetter, $DT_SSDSTATUS)
If $ssd = "" Then $ssd = "Non SSD"

; Get Bus type
$bus = DriveGetType($DriveLetter, $DT_BUSTYPE)

; Create output message
$output = "Type: " & $type & @CRLF
$output &= "SSD: " & $ssd & @CRLF
$output &= "Bus: " & $bus
MsgBox(4096, "Drive Info", $output)

The above script just outputs a message with some info – which is great for testing the functionality. Here is how you can take that script and turn it into something useful in a scripting environment or from a ConfigMgr or MDT Task Sequence.

Using AutoIt compile the following script as IsSSD.exe.

; Some constants
Const $DT_DRIVETYPE = 1
Const $DT_SSDSTATUS = 2
Const $DT_BUSTYPE = 3

; Check command line params
If $CmdLine[0] <> 1 Then
	MsgBox(4096, "Usage", "Usage:" & @CRLF & "IsSSD.exe <drive>")
	Exit
EndIf

; Drive letter to check is the first parameter
$DriveLetter = $CmdLine[1]

; Get SSD info
$ssd = DriveGetType($DriveLetter, $DT_SSDSTATUS)

; Return 1 if it is an SSD, otherwise 0
If $ssd = "SSD" Then
	Exit 1
Else
	Exit 0
EndIf

When compiled, the above script can be run along with the required drive letter as follows:

IsSSD.exe C:

It will return 1 if the specified drive was SSD, or 0 if it is not SSD or it cannot be determined. This return value can be accessed using the standard ERRORLEVEL variable from a batch file or used by a Task Sequence in ConfigMgr or MDT.

If you don’t want to compile the script, you could run it directly with the AutoIt3.exe interpreter as well:

AutoIt3.exe IsSSD.au3 C:

I’ve created a .zip file with these scripts and the final IsSSD.exe file which can be downloaded here: IsSSD.zip.

GImageX v2.1.0 Released for Windows 8.1
Thu, 21 Nov 2013


GImageX v2.1.0 has been released.

GImageX is a graphical user interface for the ImageX tool from the Windows Assessment and Deployment Kit (Windows ADK). ImageX is used to capture and apply WIM images for Windows deployments. GImageX uses the supported Microsoft WIMGAPI API for working with WIM files.

Previous versions of GImageX were compiled for the Windows Automated Installation Kit (WAIK). The old version actually worked fine for Windows 8, but I’ve updated it to make sure it works correctly with the latest Windows ADK, which was released with Windows 8.1 (this also works with older operating systems such as Windows XP, Vista, 7 and Server 2008).

GImageX allows you to perform the most common WIM operations from an easy-to-use interface, including:

  • Image capture
  • Image deployment
  • Retrieve image information and XML information
  • Mount an image
  • Export images

To install GImageX you need a few files from the Windows ADK (wimgapi.dll, etc.). The Windows ADK is a large download by default, but you can get the required files by only choosing the Deployment Tools option during setup. Instructions for which files you need to extract are given in the GImageX help file.

Download GImageX here.

Download the Windows ADK here.

In summary the changes to GImageX are:

  • Compiled against the latest WIMGAPI libraries supplied in the Windows ADK for Windows 8.1.
  • Added missing SKU flags (Professional, ServerStandardCore, Embedded, etc.) – although these can actually be left blank so that GImageX automatically determines them (even the old version of GImageX did this correctly for new operating systems).

GImageX works on Windows XP and later and also works in WinPE.

Here is a screenshot:

GImageX GUI

Download GImageX here.

Download the Windows ADK here.


The post GImageX v2.1.0 Released for Windows 8.1 appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/image-engineering/gimagex-v2-1-0-released-windows-8-1/feed/ 0
Configuring an AirPort Extreme for NAT Only Modehttps://www.autoitconsulting.com/site/networking/configuring-an-airport-extreme-for-nat-only-mode/ https://www.autoitconsulting.com/site/networking/configuring-an-airport-extreme-for-nat-only-mode/#comments Fri, 08 Feb 2013 15:28:45 +0000 http://www.autoitconsulting.com/site/?p=1025 This post will show how you can use an configure an Air […]

The post Configuring an AirPort Extreme for NAT Only Mode appeared first on AutoIt Consulting.

]]>

This post will show how you can configure an AirPort Extreme for NAT-only mode so that an additional DHCP server on your network can handle IP address allocation. The instructions are for the AirPort Extreme built into the Time Capsule, but I believe they should be the same for a standalone AirPort Extreme as well.

I recently got an Apple Time Capsule to replace my old Linksys cable router. It’s a great little unit, but one thing was causing me an issue with my home setup and I couldn’t initially get it to play nicely with my home network.

Due to the nature of my work I have quite a few machines on my home network. These are used for testing out various bits of Microsoft software such as System Center Configuration Manager and MDT. This involves a number of virtual and physical hosts and a full Microsoft Active Directory, DHCP and DNS setup. I also have a number of virtual client machines which are constantly rebuilt for testing. The way I have it set up, the Microsoft DHCP server is responsible for allocating IP addresses; it directs clients to the Microsoft DNS (along with the Dynamic DNS registrations) and applies some other specific DHCP scope options. The Microsoft DNS is set up to forward external DNS requests to the router. This ensures that all the Microsoft clients can correctly register themselves in the Microsoft DNS but can still access the internet directly. All my other non-Microsoft devices (laptop, iPad, TV, etc.) work normally as well.

For this setup to work, all I need to do is turn off the DHCP server on the router so that the Microsoft DHCP server can take over. This is where the problems started, because that option doesn’t exist in the AirPort interface for the Router Mode. You only get these options:

  • DHCP and NAT – This is the default mode and it runs a DHCP server and lets clients access the internet.
  • DHCP Only – This runs a DHCP server but doesn’t function as a router.
  • Off (Bridge Mode) – This is just used for acting as a wifi extender.

None of these modes work for me. What I actually need is an AirPort Extreme “NAT Only” router mode that doesn’t exist.

If you have more than one DHCP server on a LAN then both will try to hand out IP addresses, and a client will accept the offer from the first server that responds. My solution was to configure the Time Capsule to run in DHCP and NAT mode, so that it could still be used as an internet gateway, but to arrange things so that it had no free IP addresses to hand out. This means that clients can only successfully request an IP address from my Microsoft DHCP server.

My solution is:

  • Set the Router Mode to “DHCP and NAT”.
  • Create the smallest possible DHCP range (2 IP addresses in the AirPort software).
  • Create “dummy” reservations for the DHCP range so that the addresses can’t actually be used.

Here is how I configured it. I’ll be using the IP range of 192.168.0.x

Open the AirPort utility and go to the Network tab. Set the Router Mode to “DHCP and NAT” as shown in the screenshot above.

Click the Network Options… button, set up DHCP for the 192.168.0.x network with a range from 253 to 254, then click Save.

This means the AirPort Extreme will have the address 192.168.0.1 and will hand out the 192.168.0.253 and 192.168.0.254 addresses to clients. But we don’t want it to hand out any addresses! We get around this by creating a couple of dummy reservations. On the Network tab, in the DHCP Reservations section, click the + symbol.

Now enter a new reservation with the name DummyReservation1 and a MAC address of 00:00:00:00:00:00 and click Save.

Add a second reservation with the name DummyReservation2 and a MAC address of 00:00:00:00:00:01 and click Save. (Note: the two reservations must have different MAC addresses or they will vanish when you save the configuration).

The DHCP Reservation list should now look like this:

Finally click Update to store and activate the new configuration. Remember that your other DHCP server is now in charge of handing out IP addresses in that range – in this case that is 192.168.0.2 to 192.168.0.252.


The post Configuring an AirPort Extreme for NAT Only Mode appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/networking/configuring-an-airport-extreme-for-nat-only-mode/feed/ 3
Website Server Rebuild Diaryhttps://www.autoitconsulting.com/site/hosting/website-server-rebuild-diary/ https://www.autoitconsulting.com/site/hosting/website-server-rebuild-diary/#comments Mon, 19 Nov 2012 14:57:25 +0000 http://www.autoitconsulting.com/site/?p=954 Not a usual scripting or deployment post this time – th […]

The post Website Server Rebuild Diary appeared first on AutoIt Consulting.

]]>

Not a usual scripting or deployment post this time – this is a website server rebuild diary. I wanted to make a quick post to document a server rebuild and to log the sort of work that goes into maintaining this site and www.autoitscript.com. Even small(ish) sites can take quite a bit of effort.

Current environment is CentOS 5.8 (upgraded many times from earlier versions of 5.x) running on a dedicated server (Quad core, 8GB RAM, mirrored disks).

CentOS packages aren’t upgraded throughout the lifetime of a major release (all 5.x versions) aside from security fixes. This means that the OS is very stable and great for hosting but it can require some custom packages of newer software to cater for the websites that I want to run. Significant packages include:

  • Apache 2.2.22 (Jason Litka Repo)
  • MySQL 5.1.48 (Jason Litka Repo)
  • PHP 5.2.17 (Jason Litka Repo)
  • Subversion 1.7 (self build)
  • Python 2.4

As you can see, the only “standard” CentOS package is Python 2.4 as many of the CentOS internals rely on it and upgrading it can cause major problems.

The main web applications we use are:

We currently handle around 40GB of traffic per day and the forum frequently has close to 1000 active users at once. I have recently started to use Amazon CloudFront to serve images so that the site is quite fast no matter where it is accessed from.

Tuesday 13th November, 2012

Attempted to upgrade MediaWiki to version 1.20 and found that it now has a minimum requirement of PHP 5.3. I found that CentOS had unusually released a special set of PHP 5.3 RPMs, so I installed those. PHP immediately stopped working; this was traced to my APC (PHP accelerator) extension, which needs to be compiled against the correct version of PHP. I reinstalled APC using “pecl” and everything worked again. I was pretty pleased to have PHP 5.3 running, as there are more and more applications (particularly WordPress extensions) requiring 5.3 that I could now use.

I looked to upgrade Trac to v1.0, which has always been fairly straightforward, but found that it now has Python 2.5 as a minimum requirement. I could have built a custom version of Python just for use in Trac, but as my hosting provider could supply a CentOS 6 image I decided to use the opportunity to rebuild the server entirely (with so many upgrades and custom packages it would be nice to get back to a fresh install).

Unfortunately, the ISP imaging process will wipe the disk entirely, making the upgrade “fun”…

Wednesday 14th November, 2012

I spent a couple of hours double checking my backup processes. These are performed daily using a combination of:

  • Plesk Backup Manager – Backs up hosting settings, email, DNS and MySQL databases.
  • Custom shell scripts – Backs up anything not handled by Plesk – there’s a lot of tweaks to config files that happen over the years that I want to preserve. I also manually backup the MySQL databases in addition to Plesk as I’m a little paranoid…

The custom shell scripts create a rolling 30-day set of backups locally on the server. They also store the backups on a special FTP server provided by the ISP for off-server backups. Once a month I take a copy of the latest backup to my home machine. This is not a quick operation, as the backup files are 4 GB… zipped!
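The actual scripts aren’t published, but the rolling local-backup part can be sketched along these lines (paths, names and the exact pruning logic here are illustrative assumptions; the FTP upload step is omitted):

```shell
#!/bin/sh
# Sketch of a rolling 30-day local backup (illustrative, not the actual scripts).
backup_dir() {
    src=$1     # directory to back up
    dest=$2    # local backup store
    stamp=$(date +%Y-%m-%d)
    mkdir -p "$dest"
    # Archive the source directory into a date-stamped tarball
    tar -czf "$dest/backup-$stamp.tar.gz" -C "$(dirname "$src")" "$(basename "$src")"
    # Keep a rolling window by deleting archives older than 30 days
    find "$dest" -name 'backup-*.tar.gz' -mtime +30 -delete
}

# Example: backup_dir /var/www /backups/www
```

A cron entry would then run this daily for each directory that Plesk doesn’t already cover.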

After checking the backup scripts I notice that the local FTP upload has not been working correctly due to an ISP infrastructure change. I also find that the newer version of Plesk that I’m using no longer backs up some of the customisations to web hosting config files that I’d made.

Time to triple check the scripts!

Thursday 15th November, 2012

During the rebuild the site is likely to be offline for a number of hours. At some point (after Plesk initially restores the hosting and database settings) the websites will likely be “up” but “broken” as I install the various software packages that the web applications depend on. I don’t want my Google rankings to be hit during this time and I want to minimise the problems that users get trying to use a half-installed site. This Google Webmaster post recommends using the 503 HTTP result code to indicate that the site is temporarily down so that search results don’t get mangled. I added the following code to my hosting configuration and will be enabling it just before I start the full site backup and restore.

ErrorDocument 503 "Our website is temporarily closed for maintenance. Please check back later."
RewriteEngine On
# TO ALLOW YOURSELF TO VISIT THE SITE, CHANGE 111 222 333 444 TO YOUR IP ADDRESS.
RewriteCond %{REMOTE_ADDR} !^111\.222\.333\.444$
RewriteRule .* - [R=503,L]

This will allow me to access the web sites normally with my IP address, but anyone else will get the 503 message.

Friday 16th November, 2012

I took the sites offline with the 503 message above and started the backup to the ISP local FTP site. I also took a copy to my home machine, where I manually extracted the files to make sure they looked OK. I then started the ISP server image process for their CentOS 6.3 image. A nice feature of the 1&1 hosting package I use is that you can connect to your server via an SSH serial console – this means that even if the server is completely broken and not on the network you can still interact with it. When doing a server rebuild this helps reassure me that something is actually happening.

The imaging process took about an hour, and then it took another hour to download and extract the backup from FTP. Unlike my backup scripts, I don’t have scripts to automate the restore process – usually I’m rebuilding to a different platform, so it would be tricky to automate properly. However, I have notes for every piece of software I need which document how I set it up previously. I also keep a master list of yum packages in a script so that I can reinstall every yum package quickly.

A summary of the restore procedure is:

  • Run “yum update” to ensure that all components are current
  • Reinstall Yum packages from my master list
  • Run some custom scripts that recreate the users and permissions I need
  • Extract backup files from FTP
  • Use the Plesk Backup Manager to restore the website files, basic hosting settings, DNS, email and MySQL databases

At this point the main website functionality was restored and the website and forums could be brought online again which I did by removing my 503 message. The site was usable again for the majority of users and I could restore the rest of the site services over the next few days (Subversion, Trac, etc).


The post Website Server Rebuild Diary appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/hosting/website-server-rebuild-diary/feed/ 0
Using Custom XML and Config Files with USMT and ConfigMgrhttps://www.autoitconsulting.com/site/deployment/using-custom-xml-and-config-files-with-usmt-and-configmgr/ https://www.autoitconsulting.com/site/deployment/using-custom-xml-and-config-files-with-usmt-and-configmgr/#comments Thu, 02 Feb 2012 18:00:45 +0000 http://www.autoitconsulting.com/site/?p=840 Using USMT In System Center Configuration Manager (SCCM […]

The post Using Custom XML and Config Files with USMT and ConfigMgr appeared first on AutoIt Consulting.

]]>

Using USMT in System Center Configuration Manager (SCCM / ConfigMgr) when performing an Operating System Deployment (OSD) can be a bit of a chore. The built-in Task Sequence actions available for capturing and restoring user state only allow you to perform a very basic migration. Most of the time USMT requires quite a bit of tweaking to get right, and once you’ve got everything working in a standalone script it can seem like a step backwards when you see the default ConfigMgr capture and restore actions. Using USMT in anger usually involves:

  • Creating a custom XML file for migrating customer-specific application settings
  • Creating a custom XML file for migrating user data as per the customer requirements
  • Creating a custom config.xml file to fine tune built-in migration settings

Unfortunately, the USMT options in the ConfigMgr interface are basic, and you have to set special Task Sequence variables to get the ‘clever’ stuff to work. This post details how to do that.

These instructions are for ConfigMgr 2012 but they are pretty much identical for ConfigMgr 2007 as well. A state capture and restore is usually done as part of an Operating System Deployment Task Sequence. For the purposes of this walkthrough I’ll be showing a cut-down Task Sequence that just captures user state using a ConfigMgr State Migration Point (SMP). Note: I’m also assuming that any custom XML files and a custom config.xml file are copied into the same folder as the USMT package (for example, USMT\amd64\config.xml and USMT\amd64\CustomData.xml). To see how to create your USMT package see this previous post.

The Task Sequence actions we are interested in are:

  • Request State Store – This allows the client to access the ConfigMgr State Migration Point
  • Set OSDMigrateAdditionalCaptureOptions – This sets a special Task Sequence variable to allow the use of custom USMT command line options
  • Capture User State – This performs the state capture
  • Release State Store – This releases access to the ConfigMgr State Migration point

When creating an OSD Task Sequence with User State Migration the Task Sequence automatically includes Request State Store, Capture User State and Release State Store. The Set OSDMigrateAdditionalCaptureOptions is our custom action. The default actions can also be manually created from the User State Task Sequence menu.

Let’s look at the Set OSDMigrateAdditionalCaptureOptions step first. This is just a standard action that sets a Task Sequence variable.

The OSDMigrateAdditionalCaptureOptions variable allows us to add our own parameters to the scanstate.exe command line when executed. In this example we set the value to the following:

/uel:30 /ue:* /config:"%_SMSTSMDataPath%\Packages\%_OSDMigrateUsmtPackageID%\%PROCESSOR_ARCHITECTURE%\Config.xml"

Those options in more detail:

  • /uel:30 /ue:* – Standard scanstate.exe options that mean we exclude local accounts, and we exclude accounts not used in the last 30 days
  • /config:”…” – This allows us to specify our custom config.xml file. Unfortunately we must use a full path to the USMT package and config.xml file. The variables in this value allow us to specify the location of the USMT package after it is downloaded to the client, so that we can provide a full path to the config.xml file.
  • %_SMSTSMDataPath% – Resolves to the root location of the cached package on the client, e.g. C:\_SMSTaskSequence
  • %_OSDMigrateUsmtPackageID% – Resolves to the package ID, e.g. AUT00002
  • %PROCESSOR_ARCHITECTURE% – Resolves to the build architecture, e.g. amd64 or x86
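Putting it together with the example values above, the extra options appended to the scanstate.exe command line would expand to something like this (the package ID and architecture are just the examples given above):

/uel:30 /ue:* /config:"C:\_SMSTaskSequence\Packages\AUT00002\amd64\Config.xml"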

The next step is to use the Capture User State task to specify which migration XML files we want to use. We use the Customize how user profiles are captured option and add the filenames of the migration XML files. In this example I want to use a couple of default USMT files (MigApp.xml and MigUser.xml) along with my own custom XML file (CustomData.xml). These files must be stored in the USMT\amd64 or USMT\x86 folder of the USMT package as appropriate.

And that’s it. This example USMT capture will now use our custom command line switches, custom config.xml, and custom migration files along with the ConfigMgr State Migration Point. I’d also recommend using the verbose logging option in the Capture User State action. This means that a log is produced on the client at C:\Windows\CCM\Logs\scanstate.log. This is very handy when trying to get this procedure to work as one of the first log entries is a summary of the command line options that were used.

The procedure for creating custom Restore User State options is similar except the custom Task Sequence variable name is OSDMigrateAdditionalRestoreOptions and the variable used in this would be %_OSDMigrateUsmtRestorePackageID%.

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.

The post Using Custom XML and Config Files with USMT and ConfigMgr appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/deployment/using-custom-xml-and-config-files-with-usmt-and-configmgr/feed/ 3
Create a USMT Package in ConfigMgrhttps://www.autoitconsulting.com/site/deployment/create-a-usmt-package-in-configmgr/ https://www.autoitconsulting.com/site/deployment/create-a-usmt-package-in-configmgr/#comments Wed, 01 Feb 2012 11:26:30 +0000 http://www.autoitconsulting.com/site/?p=818 In System Center Configuration Manager (SCCM / ConfigMg […]

The post Create a USMT Package in ConfigMgr appeared first on AutoIt Consulting.

]]>

In System Center Configuration Manager (SCCM / ConfigMgr)  when performing an Operating System Deployment (OSD) there is usually some form of user data and settings migration. This is often performed using the Microsoft User State Migration Toolkit (USMT). The first step in using USMT is to create the USMT tool ConfigMgr package. This post details how to create such a package for USMT 4.

 

These instructions are for ConfigMgr 2012 but they are pretty much identical for ConfigMgr 2007 as well.

 

First, get hold of the USMT 4 files. These can be found on the ConfigMgr server in the folder C:\Program Files\Windows AIK\Tools\USMT.

 

Copy these to your ConfigMgr software library share. In this example we will copy them to \\SERVER\ConfigMgr_SWStore$\OSD\USMT.

Note: In order to properly support migrations to/from Office 2010 you must also download the updated USMT components from this knowledge base article: http://support.microsoft.com/kb/2023591.

Now create a standard ConfigMgr package:

 

On the program type screen select Do not create a program – the OSD task sequences will handle this for us.

 

The USMT package is now ready for use in an OSD Task Sequence using the Capture User State and Restore User State actions.

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.


The post Create a USMT Package in ConfigMgr appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/deployment/create-a-usmt-package-in-configmgr/feed/ 0
Create a Windows 7 BitLocker Partition in ConfigMgrhttps://www.autoitconsulting.com/site/deployment/create-a-windows-7-bitlocker-partition-in-configmgr/ https://www.autoitconsulting.com/site/deployment/create-a-windows-7-bitlocker-partition-in-configmgr/#comments Mon, 30 Jan 2012 21:47:25 +0000 http://www.autoitconsulting.com/site/?p=784 In System Center Configuration Manager (SCCM / ConfigMg […]

The post Create a Windows 7 BitLocker Partition in ConfigMgr appeared first on AutoIt Consulting.

]]>

In System Center Configuration Manager (SCCM / ConfigMgr)  something I’ve done a few times is to create a BitLocker partition for Windows 7 during an Operating System Deployment (OSD) Task Sequence. I’ve seen the method used here a few times before but I wanted to document it for myself so that I can use it for an upcoming article on configuring BitLocker with TPM+PIN from ConfigMgr.

 

These instructions are the same for both ConfigMgr 2007 and 2012.

Edit your deployment Task Sequence and select the Partition Disk 0 node. By default there is a single partition; we will create an additional partition so that we end up with two partitions named:

  • System Reserved
  • OS

 

Call the first partition System Reserved and configure it with the following options:

  • Partition type: Primary, Use a specific size (300 MB)
  • Make this a boot partition: Checked
  • File system: NTFS, Quick format

 

For Windows 7 you only need a 100 MB partition for BitLocker, but I prefer to use 300 MB to leave room in case you want to use the Windows Recovery Environment.

Call the second partition OS and configure it with the following options:

  • Partition type: Primary, Use a percentage of remaining free space (100%)
  • File system: NTFS, Quick format
  • Variable: OSPART

 

The variable OSPART can now be used to correctly identify the partition to be used in the Apply Operating System step of the Task Sequence. Configure as follows:

  • Destination: Logical drive letter stored in a variable
  • Variable name: OSPART

 

Now when this Task Sequence runs, the disk will be partitioned correctly for future use of BitLocker. Alternatively, you can use a single partition and run the BitLocker preparation utility to shrink and partition the drive. This is sometimes useful when using USMT with hard-linking to preserve disk contents. Personally, I’m a little paranoid and generally prefer to use USMT to back up to the network and then properly clean and partition the disk as above.
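For reference, the layout the Task Sequence steps above create is roughly equivalent to this diskpart script (a sketch for illustration only – the Partition Disk 0 action does all of this for you):

```
select disk 0
clean
create partition primary size=300
active
format fs=ntfs label="System Reserved" quick
create partition primary
format fs=ntfs label="OS" quick
assign
```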

 

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.

The post Create a Windows 7 BitLocker Partition in ConfigMgr appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/deployment/create-a-windows-7-bitlocker-partition-in-configmgr/feed/ 0
ConfigMgr 2012 SQL Database Collationhttps://www.autoitconsulting.com/site/deployment/configmgr-2012-sql-database-collation/ https://www.autoitconsulting.com/site/deployment/configmgr-2012-sql-database-collation/#comments Thu, 26 Jan 2012 11:39:44 +0000 http://www.autoitconsulting.com/site/?p=741 I just tried to update my ‘default’ installation of Sys […]

The post ConfigMgr 2012 SQL Database Collation appeared first on AutoIt Consulting.

]]>

I just tried to update my ‘default’ installation of System Center Configuration Manager (SCCM / ConfigMgr) 2012 RC1 to RC2. I got a failure at the pre-req check stage stating that my SQL 2008 R2 database was using the wrong collation type and that installation couldn’t continue. It turns out that the default collation I had used when I installed SQL 2008 R2 (a next…next…next…finish install) was no good. I’m not sure why ConfigMgr 2012 RC1 was happy with that configuration, but RC2 certainly wasn’t!

This article states the supported database configuration for ConfigMgr 2012 – it doesn’t appear to mention collation type but apparently this will be updated in the future: http://technet.microsoft.com/en-us/library/gg682077.aspx#BKMK_SupConfigSQLDBconfig

The supported collation type is: SQL_Latin1_General_CP1_CI_AS
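If you want to check an existing SQL Server instance before running the ConfigMgr pre-req check, you can query the server-level collation directly:

```
-- Returns the server-level collation, e.g. SQL_Latin1_General_CP1_CI_AS
SELECT SERVERPROPERTY('Collation');
```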

For reference, here are some screenshots on how to set this collation type during the installation of SQL Server 2008 R2.

 Collation 1

Collation 2

Collation 3

 

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.

The post ConfigMgr 2012 SQL Database Collation appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/deployment/configmgr-2012-sql-database-collation/feed/ 0
MDT and ConfigMgr Build Drivers for Windows 7https://www.autoitconsulting.com/site/deployment/mdt-and-configmgr-build-drivers-for-windows-7/ https://www.autoitconsulting.com/site/deployment/mdt-and-configmgr-build-drivers-for-windows-7/#comments Sun, 03 Jul 2011 09:27:25 +0000 http://www.autoitconsulting.com/site/?p=685 When doing Windows 7 builds in either MDT 2010 or Confi […]

The post MDT and ConfigMgr Build Drivers for Windows 7 appeared first on AutoIt Consulting.

]]>

When doing Windows 7 builds in either MDT 2010 or ConfigMgr 2007 you have to download and install a selection of drivers to be included in your images. Some drivers are used in the WinPE boot image so that it can access the local hard disk (boot-critical drivers), along with drivers that allow the machine to connect to the network so that it can continue the MDT/ConfigMgr build process. I tend to use the latest versions of certain boot-critical drivers direct from the main vendor (i.e. Intel and Broadcom) for my boot images and reference machines. Then, at deployment time, I’ll inject the vendor-specific drivers on a model-by-model basis. This also helps keep the reference machine ‘clean’ and free from all those bits of add-on software and control panel applets that some vendor drivers inflict on us. It also helps keep the number of drivers injected into the boot image to a minimum; all too often I’ve seen hundreds of vendor versions of the exact same driver injected into a boot image, and some of them can be really buggy or bloat the image for no reason.

As these drivers can be a little difficult to find, this post details the download locations of MDT and ConfigMgr build drivers for Windows 7 that will cater for 99% of the machines that you will ever likely encounter. I’ll try to keep it updated whenever the download locations change (I’ll be using this page as a reference point for myself too!).

Intel® Rapid Storage Technology – Boot Critical Driver

This driver will handle Intel RAID and SATA for the vast majority of Intel desktop and laptop machines. Without this driver Windows will be unable to access the local hard disk.

Download Intel® Rapid Storage Technology Driver (select your required OS version in the dropdown list)

To extract the driver files run:

iata_enu.exe -e -p C:\Intel_RST_Driver

Intel® PROSet Wired Network Drivers – Network Driver

These PROSet drivers will handle Intel wired Ethernet on the majority of desktop and laptop machines.

Download Intel® PROSet Wired Network Drivers (select your required OS version in the dropdown list)

To extract the driver files run:

PROWin.exe /f C:\Intel_PROSet_Driver /s /e

Broadcom NetXtreme Desktop Drivers – Network Driver

If a machine doesn’t use the Intel PROSet drivers then it’s almost certainly going to be the Broadcom NetXtreme variety.

Download Broadcom NetXtreme Network Drivers

Intel® 5, 4, 3, 900 Series Chipsets – System Drivers

These Intel chipset drivers aren’t required for boot or reference images but they handle a lot of the ‘unknown’ devices on Intel-chipset machines. I tend to include these as a generic set of drivers for most machines rather than using the vendor versions that are often outdated.

Download the Intel® 5, 4, 3, 900 Series Chipsets Drivers (select your required OS version in the dropdown list and download the .ZIP version)

 

Summary

Armed with the latest versions of these drivers you should be able to get your MDT or ConfigMgr build process working on most of the machines in common use, saving you hours of hunting around the baffling array of drivers on Intel’s website!

 

If your enterprise needs help with anything mentioned in this blog post then you can hire us for consulting work.

The post MDT and ConfigMgr Build Drivers for Windows 7 appeared first on AutoIt Consulting.

]]>
https://www.autoitconsulting.com/site/deployment/mdt-and-configmgr-build-drivers-for-windows-7/feed/ 1
Setting a UK Keyboard in ConfigMgr 2007 OSDhttps://www.autoitconsulting.com/site/deployment/setting-a-uk-keyboard-in-configmgr-2007-osd/ https://www.autoitconsulting.com/site/deployment/setting-a-uk-keyboard-in-configmgr-2007-osd/#comments Tue, 28 Jun 2011 20:46:54 +0000 http://www.autoitconsulting.com/site/?p=659 In the UK, we actually get the US version of Windows 7 […]

The post Setting a UK Keyboard in ConfigMgr 2007 OSD appeared first on AutoIt Consulting.

]]>

In the UK, we actually get the US version of Windows 7 and Office 2010, so whenever I do any build engineering in either MDT or ConfigMgr one of the first things I need to customize is the regional settings – in particular, the keyboard layout and location. There are many ways to achieve this, ranging from manually making the change in a reference image to creating an unattend.xml file. In this post I will show how you can set regional settings for Windows 7 at deploy-time using ConfigMgr 2007 OSD and collection variables. This allows you to automatically apply different settings depending on which ConfigMgr collection a machine belongs to. The process shown here is for UK settings, but the method will work equally well for any other regional requirements.

First, on a collection used for OSD deployment, select Modify Collection Settings and go to the Collection Variables tab. Enter the following variables:

  • OSDSystemLocale: en-GB
  • OSDInputLocale: en-GB
  • OSDUserLocale: en-GB
  • OSDUILanguage: en-US
  • OSDUILanguageFallback: en-US

Note: A common error is trying to use a value of en-GB for the UILanguage - there is no such UI language; Brits have to make do with American spellings in our Microsoft products… The collection variables should now look like this:

Now create a simple unattend.xml file using the Windows Automated Installation Kit and the Windows System Image Manager tool. You need to:

  • Add the component amd64_Microsoft-Windows-International-Core_neutral to the specialize section (use the x86 component if you are using a 32-bit version of Windows 7).
  • Enter the ‘OSD…’ variables we created above in the relevant sections wrapping them with % signs, for example %OSDSystemLocale% for the SystemLocale entry.

When finished it should look something like this:
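As a sketch, the specialize section of the answer file ends up looking something like the fragment below. The setting names and the %OSD…% variables are as described above; other attributes that Windows System Image Manager generates for the component (such as the version number) are omitted here and will vary:

```xml
<settings pass="specialize">
  <component name="Microsoft-Windows-International-Core"
             processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35"
             language="neutral"
             versionScope="nonSxS">
    <SystemLocale>%OSDSystemLocale%</SystemLocale>
    <InputLocale>%OSDInputLocale%</InputLocale>
    <UserLocale>%OSDUserLocale%</UserLocale>
    <UILanguage>%OSDUILanguage%</UILanguage>
    <UILanguageFallback>%OSDUILanguageFallback%</UILanguageFallback>
  </component>
</settings>
```

At deploy time, the task sequence expands each %OSD…% variable with the value set on the collection before the answer file is applied.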

Save the unattend file in a suitable folder (I’m saving it in \\server\ConfigMgr_SWStore$\OSD\Unattend\Win7SP1x64_Unattend.xml – using the folder structure suggested in this post). I tend to put various unattend.xml files for different uses in the same folder then I create a single package for them all for simplified use – in this example I’ll call it OSD Unattend Files.

The final step is to attach this unattend.xml file to your OSD Task Sequence. Select the Apply Operating System task and enter the package and unattend file name in the Use an unattended or sysprep answer file… boxes:

Now when you use this task sequence the collection variables will be expanded inside the unattend file and voila – per collection regional settings! You can also set these same variables directly on computer objects if you need to specialise for an individual computer.

 



]]>
https://www.autoitconsulting.com/site/deployment/setting-a-uk-keyboard-in-configmgr-2007-osd/feed/ 0
ConfigMgr 2007 OSD Folder Structurehttps://www.autoitconsulting.com/site/deployment/configmgr-2007-osd-folder-structure/ https://www.autoitconsulting.com/site/deployment/configmgr-2007-osd-folder-structure/#comments Sun, 26 Jun 2011 10:38:55 +0000 http://www.autoitconsulting.com/site/?p=624 When configuring ConfigMgr 2007 Operating System Deploy […]

The post ConfigMgr 2007 OSD Folder Structure appeared first on AutoIt Consulting.

]]>

When configuring ConfigMgr 2007 Operating System Deployment (OSD) there are an endless number of files to manage: applications, operating system images, operating system install files, drivers, source files, package files and temp files. Every time I set it up I end up with a different structure, so this blog post presents an example ConfigMgr 2007 OSD folder structure that can be used for general application and operating system deployment files. This post will be updated as I get better ideas and suggestions on how to organise things – the main issue is that there is no ‘right’ way of organising files that everyone can agree on, so this is as good a place to start as any!

First I create a single share with the following properties:

  • Share Name: ConfigMgr_SWStore$ (I use $ just to make the share hidden from casual browsing).
  • Share Security: Everyone: Full Control.
  • NTFS Security: SYSTEM: Full Control; Administrators: Full Control; ConfigMgr Site Servers: Modify; ConfigMgr Network Access Account: Modify. I usually create a group containing the computer accounts of the ConfigMgr site servers, which makes applying permissions easier. Also, the Network Access Account (configured in the Computer Client Agent properties) will require access during OSD. Depending on your security policies you may also wish to remove users from accessing this share.

Then create a folder structure as shown in this image:

The top level folders are split into Apps, OSD and Updates.

The Apps folder:

  • Apps \ App-V: The main folder for App-V related files.
  • Apps \ App-V \ Source: App-V source packages used when first importing the application into ConfigMgr.
  • Apps \ App-V \ Packages: The resulting App-V package Data Source.
  • Apps \ Native: Native packages (Data Source for MSIs, .exes, batch files, etc.).
  • Apps \ Native \ Microsoft \ Office 2010 Pro: Example application folder.
  • Apps \ Native \ Adobe \ Reader X: Example application folder.

The OSD folder:

  • OSD \ Install: Operating System Install Packages (full set of source files with setup.exe, used for creating the reference image).
  • OSD \ Install \ Windows 7 Enterprise x64 SP1: Example install package folder.
  • OSD \ Install \ Windows 7 Enterprise x86 SP1: Example install package folder.
  • OSD \ Image: Operating System Image (.wim) files – the reference images – used for deployment.
  • OSD \ Unattend: Unattend.xml files and packages used for customizing deployment.
  • OSD \ Capture: Temporary location used for storing the reference image during capture. After capture, move the resulting image file into the Image folder.
  • OSD \ Drivers: Device drivers.
  • OSD \ Drivers \ Source: Source files for device drivers, used when importing drivers into the ConfigMgr interface.
  • OSD \ Drivers \ Packages: Resulting driver packages (Data Source).
  • OSD \ Tools: Source files for tool packages such as MDT or USMT.
  • OSD \ Tools \ USMT: Source files for the User State Migration Toolkit (USMT).
  • OSD \ Tools \ MDT_Toolkit: Source files for the Microsoft Deployment Toolkit (MDT).
  • OSD \ Tools \ MDT_CustomSettings: MDT CustomSettings.ini files.

The Updates folder:

  • Updates: Folder for storing Software Updates packages.
  • Updates \ All Updates 2010: Example package.
  • Updates \ All Updates 2011: Example package.

Here is a zip file of the entire folder structure you can use as a template: ConfigMgr_SWStore.zip
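If you would rather script the structure than unpack the zip, here is a minimal sketch in Python. The folder names are taken from the tables above; the root path is a placeholder, so point it at your own share (e.g. \\server\ConfigMgr_SWStore$):

```python
import os

# Root of the software store - adjust to your environment,
# e.g. r"\\server\ConfigMgr_SWStore$" (placeholder, not a real path).
root = "ConfigMgr_SWStore"

# Folder tree as described in the tables above.
folders = [
    r"Apps\App-V\Source",
    r"Apps\App-V\Packages",
    r"Apps\Native\Microsoft\Office 2010 Pro",
    r"Apps\Native\Adobe\Reader X",
    r"OSD\Install\Windows 7 Enterprise x64 SP1",
    r"OSD\Install\Windows 7 Enterprise x86 SP1",
    r"OSD\Image",
    r"OSD\Unattend",
    r"OSD\Capture",
    r"OSD\Drivers\Source",
    r"OSD\Drivers\Packages",
    r"OSD\Tools\USMT",
    r"OSD\Tools\MDT_Toolkit",
    r"OSD\Tools\MDT_CustomSettings",
    r"Updates\All Updates 2010",
    r"Updates\All Updates 2011",
]

for folder in folders:
    # Split on backslashes and rejoin so the script also runs on
    # non-Windows machines for testing purposes.
    os.makedirs(os.path.join(root, *folder.split("\\")), exist_ok=True)
```

Using exist_ok=True makes the script safe to re-run if you later add folders to the list.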



]]>
https://www.autoitconsulting.com/site/deployment/configmgr-2007-osd-folder-structure/feed/ 0
Get Selected Microsoft TechNet Library Content Offlinehttps://www.autoitconsulting.com/site/deployment/get-selected-microsoft-technet-library-content-offline/ https://www.autoitconsulting.com/site/deployment/get-selected-microsoft-technet-library-content-offline/#comments Wed, 15 Jun 2011 12:42:31 +0000 http://www.autoitconsulting.com/site/?p=574 No-one can ever know everything they need to know about […]

The post Get Selected Microsoft TechNet Library Content Offline appeared first on AutoIt Consulting.

]]>

No-one can ever know everything they need to know about Microsoft products and techies will freely admit that they spend half their time checking resources such as Microsoft TechNet while working on a problem or design. Sometimes I need to go on to a site where I won’t have internet access. As anyone who relies on access to online resources will know, this can really slow things down and small questions (such as “is it port 60 or 67?”) can be frustrating.

There is a free tool called Package This that can turn selections from the online TechNet library into a compiled help file (.chm) that you can then read and search offline! Yes, it has a few bugs, but if you keep the selections small it works very well indeed.

As an example I will demonstrate downloading the Operating System Deployment section of the Config Mgr 2007 documentation. Here are the steps to create a .chm from selected content:

  1. Download and install HTML Help Workshop. (If running Windows Vista or later you will get a message after installation saying that you already have a later version installed – that’s fine, but Package This doesn’t seem to be able to use it until you’ve attempted to install this older version.)
  2. Download the latest version of PackageThis.exe.
  3. Run PackageThis.exe.
  4. Select the TechNet Library from the Library menu.
  5. Find the content you want – you can either choose single pages or get a whole section by using Select This Node and All Children. Note: it’s best to keep the selections small as the program has a few bugs and will sometimes not work correctly. 
  6. The selected articles will be downloaded – this can take some time depending on the number selected.
  7. Once all the content has been downloaded, select Export to Chm File… from the File menu.
  8. Enter the filename and title for the .chm file and confirm to create it.

 

If all goes well you should now have a .chm file with the selected content that you can access when away from the Internet! As a bonus it’s even searchable.

 



]]>
https://www.autoitconsulting.com/site/deployment/get-selected-microsoft-technet-library-content-offline/feed/ 0
App-V Recipe: AutoIt v3https://www.autoitconsulting.com/site/deployment/app-v-recipe-autoit-v3/ https://www.autoitconsulting.com/site/deployment/app-v-recipe-autoit-v3/#comments Fri, 20 May 2011 11:39:30 +0000 http://www.autoitconsulting.com/site/?p=538 This blog post will show you how to sequence AutoIt usi […]

The post App-V Recipe: AutoIt v3 appeared first on AutoIt Consulting.

]]>

This blog post will show you how to sequence AutoIt using App-V 4.6 SP1 (the latest version at the time of writing). Now, AutoIt is probably the easiest application to sequence in the world and doesn’t really need a blog post describing how to do it. However, I intend to write some more posts around using App-V in System Center Configuration Manager (ConfigMgr) 2007 for deployment and upgrades, so a nice, easy-to-follow recipe for a free application should be useful.

 

For sequencing I’ll be using Windows XP SP3 and creating a package that can be used on all later operating systems. I’ve set up the Windows XP machine as per the recommendations in App-V 4.6 SP1 Sequencing Guide – essentially all AV has been turned off, Windows update is off, and the App-V 4.6 SP1 sequencer has been installed. In my case the App-V sequencer machine is a Hyper-V virtual machine so that I can use snapshots to quickly get it back to a clean state.

Onto the recipe:

  1. Download the AutoIt full installer from this page.
  2. Run the Microsoft Application Virtualization Sequencer.
  3. Select Create a New Virtual Application Package.
  4. Select Create Package (default).
  5. On the Prepare Computer screen check that there are no warnings (Windows Defender is running, etc.) and then click Next.
  6. Select Standard Application (default) and click Next.
  7. At the Select Installer screen browse to the AutoIt installer (it should be something like ‘autoit-v3.3.6.1-setup.exe’) and then click Next.
  8. In the Package Name screen enter the name AutoIt v3 as the package name. Note that this automatically generates the ‘Primary Virtual Application Directory’ of Q:\AutoIt v3. (in App-V 4.6 you no longer have to use 8.3 filenames so this automatically generated name is OK). Click Next.
  9. The AutoIt installer will start. Use the defaults for all installation questions except for the installation folder which must be changed to match the Primary Virtual Application Directory of Q:\AutoIt v3.
  10. At the end of installation deselect Show release notes and then click Finish.
  11. Back in the App-V sequencer, select I am finished installing and then click Next.
  12. Select the following tasks to run: AutoIt Window Info, Compile Script to .exe, Run Script, SciTE Script Editor. Then click Run Selected.
  13. Close down all the launched applications then back in the App-V Sequencer click Next.
  14. Review the Installation Report and then click Next.
  15. We need to customize the package a little so select Customize and then click Next.
  16. Remove shortcuts as required for your corporate environment and then click Next. I would recommend removing ‘AutoIt v3 Website’ and ‘Check for Updates’.
  17. At the Prepare for Streaming page run the same applications as shown in step 12, close them after launch, and then click Next.
  18. Select Allow this package to run on any operating system and then click Next. (In theory, you could also create a specific 64-bit package as the AutoIt installer only installs 64-bit components when installed on a 64-bit machine, but the 32-bit version is fine for 99% of cases).
  19. Select Save the package now and Compress Package and click Create. Optionally enter the version name in the comments field.
  20. Click Close and you’re done!

 



]]>
https://www.autoitconsulting.com/site/deployment/app-v-recipe-autoit-v3/feed/ 0
Windows 7 Aero Theme Not Enabled After Deploymenthttps://www.autoitconsulting.com/site/deployment/windows-7-aero-theme-not-enabled-after-deployment/ https://www.autoitconsulting.com/site/deployment/windows-7-aero-theme-not-enabled-after-deployment/#comments Fri, 13 May 2011 15:02:52 +0000 http://www.autoitconsulting.com/site/?p=488 An issue I saw a number of times at customers was that […]

The post Windows 7 Aero Theme Not Enabled After Deployment appeared first on AutoIt Consulting.

]]>

An issue I saw a number of times at customers was that the Windows 7 Aero theme was not enabled after deployment. I saw this in early workshops and pilots and then completely forgot about it until the question was asked again a few weeks ago. So here is a post describing the sort of symptoms you might see and how to work around it.

 

 

There are two main tools that people turn to when troubleshooting Aero:

  • Running the ‘Find and fix problems with transparency and other visual effects‘ troubleshooter that attempts to automatically find and fix issues.
  • Re-running the Desktop Window Manager (DWM) WinSAT test (either through the standard interface or by running ‘winsat dwm’ from the command line).

In the case of this particular issue, neither of these methods will solve the problem; they simply report ‘DWM not running‘ or ‘Desktop Window Manager is disabled‘ and the system will still not enable Aero. The strange thing is that you can get Aero back by doing the following:

  • Right-click the desktop, select Personalize.
  • Re-select the standard Windows 7 theme (or any other Aero theme).
  • Aero magically starts working again.

Obviously, manually selecting the Windows 7 theme on each machine isn’t going to work when you are deploying more than a couple of machines, but there are two simple solutions.

Solution 1

The easiest solution for an enterprise customer is probably going to be by using Group Policy:

  1. Find the path of a theme we want to use – aero.theme is the standard Windows 7 theme and can be found in C:\Windows\Resources\Themes\aero.theme
  2. Open the Group Policy console (gpmc or local group policy as required).
  3. Find the policy User Configuration \ Administrative Templates \ Personalization \ Load a specific theme
  4. Enter the path of the theme, we will use an environment variable so that it works on all machines: %windir%\Resources\Themes\aero.theme
  5. Save the policy and apply it to the relevant Organizational Units (OUs).

Note: This only changes the theme for a user’s first-time logon which makes it quite a neat solution.

Solution 2

The second solution can be used when deploying Windows 7 using an unattend.xml answer file. To make the required changes:

  1. Open the unattend.xml answer file in the Windows System Image Manager which is installed with the Windows Automated Installation Kit (WAIK).
  2. Add the component amd64_Microsoft-Windows-Shell-Setup_neutral to the Specialize pass (use the x86 version if not using Windows 7 64-bit).
  3. Find the Themes \ ThemeName entry.
  4. Add the Aero theme name you wish to use – use just the main part of the filename (see the Windows Theme folder screenshot above). For the standard Windows 7 theme the value is ‘aero‘.
  5. Save the unattend.xml file and deploy!
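As a sketch, the resulting specialize entry looks something like the fragment below. The Themes \ ThemeName setting is as described above; other attributes that Windows System Image Manager generates (such as the component version) are omitted and will vary:

```xml
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup"
             processorArchitecture="amd64"
             publicKeyToken="31bf3856ad364e35"
             language="neutral"
             versionScope="nonSxS">
    <Themes>
      <!-- Main part of the theme filename only, e.g. "aero"
           for C:\Windows\Resources\Themes\aero.theme -->
      <ThemeName>aero</ThemeName>
    </Themes>
  </component>
</settings>
```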

 



]]>
https://www.autoitconsulting.com/site/deployment/windows-7-aero-theme-not-enabled-after-deployment/feed/ 1
Windows 7 Self Optimizing Boot Processhttps://www.autoitconsulting.com/site/performance/windows-7-self-optimizing-boot-process/ https://www.autoitconsulting.com/site/performance/windows-7-self-optimizing-boot-process/#comments Fri, 06 May 2011 09:50:34 +0000 http://www.autoitconsulting.com/site/?p=422 You may have heard the about the fact that Windows 7 (a […]

The post Windows 7 Self Optimizing Boot Process appeared first on AutoIt Consulting.

]]>

You may have heard about the fact that Windows 7 (and actually, Windows Vista too) has improved boot-up times over Windows XP and dismissed it as “that there marketing speak”. Surprisingly, it’s actually true! On top of that, the boot process is also optimized over time to make things even faster. This blog post gives a high-level overview of how this works and also provides some actual measurements.

The descriptions given in this article are fairly high-level for a number of reasons. I wanted it to be readable for the general IT admin, and hard documentation on the exact workings of this stuff is virtually non-existent so it contains quite a lot of educated guesses. The epic Windows Internals book discusses both the logical prefetching and ReadyBoot mechanics in a lot of detail, but many of the services and registry keys mentioned in that book reference Windows Vista and they no longer apply to Windows 7 (although the functionality is still there and has been improved, it’s just less obvious which services and keys now control the process).

Logical Prefetcher

Analysis of boot traces has shown that one of the main factors slowing the boot process down is disk seek time. As the various boot files, DLLs and drivers are loaded there are lots of page faults and disk seek requests as different parts of files and directories are accessed. Windows 7 keeps track of which files were accessed and where they are located on the disk, and this tracing continues for up to 120 seconds after boot, or 30 seconds after the user’s shell (explorer.exe) starts, whichever comes first. These traces are stored in C:\Windows\PreFetch:

Logical Prefetcher Trace Folder

 

Each trace contains a list of the files and directories accessed when a given application starts (or during boot) and this information is used to give the prefetcher a chance to prefetch all the data required in one go, so that the loading of the application can be optimized.

In addition, any files referenced by these boot applications (DLLs, SYS files, etc) are also tracked in C:\Windows\PreFetch\Layout.ini. Every few days, when the system is idle, defrag.exe is called with a command-line parameter that causes all these referenced files to be moved to a contiguous area of the disk. This means that the prefetching of these files is much more efficient and further improves the boot time. You can manually invoke defrag to optimize boot files by running the following command (Windows 7 only, the parameters are different on Windows Vista):

defrag.exe c: /b

(This will only work after the machine has been booted around 6 times, otherwise you will get an error about the lack of boot optimization files)

Note: When fully optimized this defrag command should complete quickly (a couple of minutes or so) as the boot files will already be in a contiguous area of the disk. However, I’ve seen machines many months old that have taken up to an hour for this defrag command to complete which leads me to believe that the automatic idle processing may not actually work correctly in all situations. Therefore, it’s a good idea to run the command manually.

You can see the last time the automatic boot defrag occurred by checking the value of the registry key HKEY_LOCAL_MACHINE\Software\Microsoft\Windows NT\CurrentVersion\Prefetcher\LastDiskLayoutTimeString. Unfortunately, this value doesn’t appear to be changed when you run the defrag command manually.

ReadyBoot

According to the Windows Internals book, the logical prefetching described above is used when the system has less than 512MB of memory. If the system has 700MB or more then an in-RAM cache is used to further optimize the boot process (it’s not clear from the book whether or not this ReadyBoot cache completely replaces the logical prefetching approach or just builds on it, my assumption is that both work together). After each boot the system generates a boot caching plan for the next boot using file trace information from up to the five previous boots which contains details of which files were accessed and where on the disk they were located. These traces are stored as .fx files in the C:\Windows\PreFetch\ReadyBoot folder.

ReadyBoot Trace Folder

 

Services

Under Windows 7, the service that handles ReadyBoot optimization is part of the Superfetch service.

Under Windows Vista, ReadyBoot optimization was part of the ReadyBoost service (ironically, when Windows Vista came out the advice on “tweaking” sites was to disable ReadyBoost even if you weren’t going to use USB/ReadyBoost to improve performance – errrm, no…)

There are some semi-undocumented registry keys that control the prefetch and Superfetch operations, but in all but the most exceptional cases they should not be touched. They are only documented on the Windows 7 Embedded sites for Disabling Superfetch and Disabling Prefetch. The default value for both of these keys is 3, which enables both boot and application prefetching.

An Example

So, does this prefetching and boot optimizing actually work? The following table shows the boot times on an ancient Toshiba Tecra M5 (circa 2004) through its first boot, the subsequent five boots and finally after the disk is defragged with the boot optimization switch. The machine is using a vanilla version of Windows 7 Ultimate SP1 x86 – no applications are installed. The boot time is measured using the bootDoneViaPostBoot metric generated from the Windows Performance Toolkit (WPT) (The boot logging process is detailed in this previous blog post)

Boot Cycle                          Boot Time (bootDoneViaPostBoot)   Boot Time Improvement
1 (first boot after installation)   85 seconds                        N/A
2 (first boot training)             73 seconds                        14%  (+14%)
3                                   31 seconds                        63%  (+49%)
4                                   31 seconds                        63%  (+0%)
5                                   28 seconds                        67%  (+4%)
6 (last boot training)              26 seconds                        69%  (+2%)
7 (after boot optimizing defrag)    24 seconds                        72%  (+3%)

You can see the massive improvement that occurs in the first few boots and then some smaller but significant gains in the later boots and the final defrag optimization. This example used a vanilla machine with very few drivers installed and no additional applications. The more drivers and software installed the more files must be accessed during boot which means these optimizations are likely to have an even more pronounced effect.
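The improvement column is simply each boot time measured against the 85-second first boot. As a quick sketch of the arithmetic, using the values from the table (printed here to one decimal place, where the table rounds to whole percentages):

```python
# Boot times in seconds from the table above, in boot-cycle order.
boot_times = [85, 73, 31, 31, 28, 26, 24]

first = boot_times[0]
for cycle, t in enumerate(boot_times, start=1):
    # Improvement relative to the very first post-installation boot.
    improvement = (first - t) / first * 100
    print(f"Boot {cycle}: {t}s ({improvement:.1f}% faster than boot 1)")
```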

As this optimization is automatic it’s not something that most people will need to worry about. But if you are building a machine that you would like to boot as quickly as possible from the moment it is deployed (appliance PCs, laptops, etc.) then it may be worthwhile adding a “boot training” and defrag task to your deployment process.

 



]]>
https://www.autoitconsulting.com/site/performance/windows-7-self-optimizing-boot-process/feed/ 1
Windows Performance Toolkit: Simple Boot Logginghttps://www.autoitconsulting.com/site/performance/windows-performance-toolkit-simple-boot-logging/ https://www.autoitconsulting.com/site/performance/windows-performance-toolkit-simple-boot-logging/#comments Thu, 28 Apr 2011 22:47:01 +0000 http://www.autoitconsulting.com/site/?p=276 Troubleshooting slow boots and logons are a common requ […]

The post Windows Performance Toolkit: Simple Boot Logging appeared first on AutoIt Consulting.

]]>

Troubleshooting slow boots and logons is a common request. In this post I will show you how to perform boot logging using the Windows Performance Toolkit (WPT) on a Windows 7 machine and perform some basic analysis of the results.

Preparation

First, you need to install the WPT on the machine you wish to examine.

Sysinternals Autologon

Secondly, we will be tracing the boot process all the way until the user has logged in and the desktop is shown. If we rely on quickly and manually logging in we introduce inconsistencies to any timings we do. The simplest solution is to use the Sysinternals Autologon tool available from the Sysinternals site and to configure it with the local or domain user we will be using for testing.

Performing the Boot Trace

  1. Logon to the machine as an administrative user.
  2. Use AutoLogon to setup the test user that will be used to automatically login during the trace. The test user need not be an administrator, but if not you will need to respond to any UAC prompts during the process to allow the tools to elevate to complete the trace.
  3. Create a local folder, for example C:\PerfTrace, to store the boot trace.
  4. Open an Administrator command prompt and change to the trace folder created above (cd C:\PerfTrace).
  5. Run the command: xbootmgr -trace boot
  6. The machine will automatically shut down, reboot and finally log in.
  7. A “Delaying for boot trace” message will appear and the system will pause for 120 seconds to capture post-logon events.
  8. The tool will then elevate and a UAC consent box or prompt for credentials will appear.
  9. The trace will be completed and the trace file will be written to C:\PerfTrace\boot_BASE+CSWITCH_1.etl.

Analysing the Boot Trace

You can look at the boot trace in two main ways. The first way is to export the trace into XML which allows you to see the main timing points and the second is using the xperfview GUI.

Analysing using the XML Summary

To export the XML summary run the following command with the trace captured in the previous section:

xperf -i boot_BASE+CSWITCH_1.etl -o summary.xml -a boot

The resulting XML file can be opened in Internet Explorer (or your favourite XML editor). In order to expand and collapse the individual nodes in IE you will need to allow active content by clicking on the yellow information warning bar at the top of the screen. Collapse all nodes apart from those in the “timing” node to show the following view:

The two most immediately useful metrics are:

  • bootDoneViaExplorer – Duration of the boot (in milliseconds) until the start of Explorer.exe
  • bootDoneViaPostBoot – Length of the boot transition including PostBoot. This metric represents the total time of a boot transition.

In this example, bootDoneViaPostBoot would seem to indicate that the total boot time was 50 seconds (50094 milliseconds). However, a boot trace waits for 10 seconds (postBootRequiredIdleTime) at the end of a boot until the system reaches an idle state. Therefore, to get the actual total boot time we must subtract 10 seconds; in this example the adjusted boot time was 40 seconds.
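Using the figures from this example trace, the adjustment works out as follows:

```python
# Metrics from the XML boot summary, in milliseconds.
boot_done_via_post_boot = 50094   # bootDoneViaPostBoot
post_boot_required_idle = 10000   # postBootRequiredIdleTime (10 second idle wait)

# The idle wait is part of the trace, not of the boot itself,
# so subtract it to get the real end-to-end boot time.
actual_boot_ms = boot_done_via_post_boot - post_boot_required_idle
print(f"Adjusted boot time: {actual_boot_ms / 1000:.0f} seconds")  # prints 40 seconds
```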

Analysing using the xperfview GUI

To use a GUI to examine the boot trace open the trace in xperfview with the following command:

xperfview boot_BASE+CSWITCH_1.etl

There are many different views to look at in the xperfview GUI, but for this post we will concentrate on the main boot and logon processes (similar to the XML summary). Scroll down to the Winlogon section:

There are many different checkpoints here but some useful ones are:

  • GP Client – This checkpoint occurs at a number of different points. Before the user logs in (Computer Group Policy) and after logon (User Group Policy). It is very useful to identify any GPO related problems.
  • CreateSession Notification – This checkpoint occurs when the user enters their credentials and starts the logon process.
  • Profiles – This checkpoint occurs when the user’s profile is being loaded.
  • StartShell Notification – This is the last checkpoint when the shell is ready to load and explorer.exe is about to be launched. It corresponds to the WinlogonInit endTime entry from the XML summary.

Summary

This post showed how to perform boot logging using WPT at the most basic level. This can be a very complicated process and far too much to cover in a single post, future articles will go into more detail in individual areas.

 



]]>
https://www.autoitconsulting.com/site/performance/windows-performance-toolkit-simple-boot-logging/feed/ 0
Windows Performance Toolkit Installationhttps://www.autoitconsulting.com/site/performance/windows-performance-toolkit-installation/ https://www.autoitconsulting.com/site/performance/windows-performance-toolkit-installation/#comments Fri, 22 Apr 2011 22:30:33 +0000 http://www.autoitconsulting.com/site/?p=229 The Windows Performance Toolkit (WPT) is a suite of too […]

The post Windows Performance Toolkit Installation appeared first on AutoIt Consulting.

]]>

The Windows Performance Toolkit (WPT) is a suite of tools designed for measuring and analysing system and application performance on Windows XP, Windows Vista, Windows 7 and Windows Server 2008. They can be used by enterprises to log and analyse clients in order to detect and optimise performance problems. I intend to use the WPT in a number of upcoming articles so this post will cover how to obtain and install it.

Unfortunately, the WPT is included as part of the Windows 7 SDK and cannot be downloaded separately, and the SDK download is a whopping 2.5GB! However, by following these steps it is possible to download only the minimal files required to gain access to the WPT MSI files. These MSI files can then be internally redistributed for easy installation of WPT. Here are the steps:

  1. Download the Microsoft Windows SDK for Windows 7 and .NET Framework 4 (it’s a bootstrap setup and is only around 500KB).
  2. Run the file winsdk_web.exe.
  3. Accept all defaults until the Installation Options screen is reached.
  4. Deselect all components except Redistributable Packages \ Windows Performance Toolkit. (Depending on the client, you may not have the option to deselect the .NET 4 tools)

    Windows 7 SDK Options for WPT

    Windows 7 SDK Options for WPT

  5. Complete the installation.
  6. Browse to C:\Program Files\Microsoft SDKs\Windows\v7.1\Redist\Windows Performance Toolkit. Copy the wpt_x64.msi and wpt_x86.msi files.

    WPT Redist Files

    WPT Redist Files

These wpt_*.msi files can now be used to install the WPT on any client machine and should be kept in a safe place so that you don’t need to download the SDK each time.

 



]]>
https://www.autoitconsulting.com/site/performance/windows-performance-toolkit-installation/feed/ 0
Welcome to the AutoIt Consulting Blog!https://www.autoitconsulting.com/site/site-news/welcome-to-the-autoit-consulting-blog/ https://www.autoitconsulting.com/site/site-news/welcome-to-the-autoit-consulting-blog/#comments Sun, 10 Apr 2011 10:12:51 +0000 http://www.autoitconsulting.com/site/?p=128 Welcome to the AutoIt Consulting Blog! This blog will s […]

The post Welcome to the AutoIt Consulting Blog! appeared first on AutoIt Consulting.

]]>

Welcome to the AutoIt Consulting Blog!

This blog will share tips, tricks and scripts related to Microsoft Windows deployment and related technologies.

Until we start blogging in earnest, here are some links to articles written by me (Jonathan Bennett) for my former employer, Microsoft MCS, on the excellent Deployment Guys blog. Send us an email if you have any requests for the sort of content you would like to see.

Previous Posts on the Deployment Guys

Windows 7 VDI Optimization

GImageX

Dealing With Duplicate User Profile Links in Windows Vista

Working with crashdumps – debugger 101



]]>
https://www.autoitconsulting.com/site/site-news/welcome-to-the-autoit-consulting-blog/feed/ 0