SCCM Data Warehouse – SQL SSL certificates on AlwaysOn cluster

I was setting up the new Data Warehouse feature of SCCM on our corporate SQL 2016 AlwaysOn cluster.

After installing the DW, you may notice this error when you try to run a report:

An error has occurred during report processing. (rsProcessingAborted) Cannot create a connection to data source ‘AutoGen__39B693BB_524B_47DF_9FDB_9000C3118E82_’. (rsErrorOpeningConnection) A connection was successfully established with the server, but then an error occurred during the pre-login handshake. (provider: SSL Provider, error: 0 – The certificate chain was issued by an authority that is not trusted.)

The Data Warehouse reporting data source has this parameter: TrustServerCertificate=false

You can change this to true – but it will get set back to false by SCCM 🙂

The correct way forward, in my opinion, is to install trusted certificates from your root CA on the SQL instances hosting the DW database. There are other posts on installing the self-signed certificate as a trusted root CA onto the reporting server, but I don’t like using self-signed certs as they are difficult to track and revoke.

This solution was tested on a 3 node AlwaysOn SQL 2016 cluster running on Server 2016 VMs in Azure.

The certificates are requested from the internal publishing CAs using PowerShell.

The SQL instance where the Data Warehouse database exists will likely have more than one Availability Group and Availability Group Listener (AGL), so when requesting the certificate, ALL the AGLs’ NetBIOS names and FQDNs should be included in the Subject Alternative Name (SAN) properties, as well as the hostname.

I ran this PowerShell on each of the 3 SQL nodes to request a unique certificate.

# Create and submit a request
$Hostname = [System.Net.Dns]::GetHostByName(($env:computerName)).Hostname
$template = "WebServer2008"
$SAN = $Hostname,"AGL1-INSTANCE1","AGL1-INSTANCE1.domain.local","AGL2-INSTANCE1","AGL2-INSTANCE1.domain.local","AGL3-INSTANCE1","AGL3-INSTANCE1.domain.local","AGL4-INSTANCE1","AGL4-INSTANCE1.domain.local"
Get-Certificate -Template $template -DnsName $SAN -CertStoreLocation cert:\LocalMachine\My

Now, hop onto your issuing CA and approve the pending request for the certificate.

Back on the SQL nodes, retrieve the requested certificate from the CA. It will be installed into the machine’s Personal store.

# Now the certificate is approved, you can retrieve the request:
Get-Certificate -Request (Get-ChildItem -Path cert:\LocalMachine\Request)

Open an MMC and add the Certificates snap-in targeted at the local computer store. Find the newly installed certificate (the one with the above SANs in it) and retrieve the certificate thumbprint.

On each of the SQL Nodes, change the following registry value so that it matches the thumbprint of the newly approved certificate:

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL13.INSTANCE1\MSSQLServer\SuperSocketNetLib]
"Certificate"="<thumbprint of the new certificate, lower case, no spaces>"

The SQL Server service for the instance will need restarting following this Registry update.
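To script that across the nodes, something like the following works. The thumbprint and instance name below are placeholders, substitute your own values (SQL expects the thumbprint in lower case with no spaces):

```powershell
# Placeholder thumbprint and instance name - substitute your own.
$Thumbprint = "0123456789abcdef0123456789abcdef01234567"
$RegPath = "HKLM:\SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL13.INSTANCE1\MSSQLServer\SuperSocketNetLib"
Set-ItemProperty -Path $RegPath -Name Certificate -Value $Thumbprint.ToLower()

# Restart the instance service so SQL Server picks up the new certificate
Restart-Service -Name 'MSSQL$INSTANCE1' -Force
```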

Before testing the SCCM Data Warehouse reports, you can test the certificate in SQL Server Management Studio: in the Connect to Server dialog, click Options and tick ‘Encrypt connection’ (leave ‘Trust server certificate’ unticked), then connect to each AGL.
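Alternatively, here is a quick PowerShell check you can run from the reporting server. The listener name is an assumption, use one of your own AGL FQDNs:

```powershell
# The AGL name is a placeholder - substitute one of your own listener FQDNs.
$connString = "Server=AGL1-INSTANCE1.domain.local;Database=master;" +
              "Integrated Security=True;Encrypt=True;TrustServerCertificate=False"
$conn = New-Object System.Data.SqlClient.SqlConnection $connString
try {
    $conn.Open()
    "Encrypted connection established and certificate chain trusted"
} catch {
    # An untrusted chain surfaces the same pre-login handshake error as SSRS
    $_.Exception.Message
} finally {
    $conn.Dispose()
}
```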

SCCM Compliance – enforce Unified Write Filter on Windows 10

I am in the final stages of deploying a new solution to libraries across a borough in London. The solution involved replacing Windows 7 physical desktops running the Deep Freeze product with Windows 10.

I didn’t want to use Deep Freeze as the console is clunky and it’s yet another agent to manage on the client OS. Windows 10 has the Unified Write Filter (UWF) built-in and this would do the job perfectly.

All the desktops are managed using SCCM and SCCM fully supports the Windows write filters.

I started off looking at basic batch files calling uwfmgr.exe, but things quickly got messy, so I switched to trusty old WMI, but managed using PowerShell and SCCM of course 🙂

I’ve tried to break it down into sections.
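For context, the SCCM side of this is a Configuration Item with a discovery script and a remediation script (the scripts below form the remediation side). A minimal discovery sketch might look like this; the "Compliant"/"NonCompliant" strings are assumptions, match whatever your CI is configured to expect:

```powershell
# Discovery sketch for an SCCM Configuration Item.
# "Compliant"/"NonCompliant" are assumed strings - match your CI settings.
$ns  = "root\standardcimv2\embedded"
$uwf = $null
if (Get-Command Get-WmiObject -ErrorAction SilentlyContinue) {
    $uwf = Get-WmiObject -Namespace $ns -Class UWF_Filter -ErrorAction SilentlyContinue
}
# Compliant if UWF is active now, or will be after the pending restart
$state = if ($uwf -and ($uwf.CurrentEnabled -or $uwf.NextEnabled)) { "Compliant" } else { "NonCompliant" }
$state
```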

It is best practice to have the pagefile on a separate disk when using UWF, as the pagefile cannot be added to the exclusions list. So here is a function we can pass the new volume to, removing NTFS permissions to stop normal users from fiddling:

function Remove-NTFSPermissions($folderPath, $accountToRemove, $permissionToRemove) {

    $fileSystemRights = [System.Security.AccessControl.FileSystemRights]$permissionToRemove
    $inheritanceFlag = [System.Security.AccessControl.InheritanceFlags]"ContainerInherit, ObjectInherit"
    $propagationFlag = [System.Security.AccessControl.PropagationFlags]"None"
    $accessControlType = [System.Security.AccessControl.AccessControlType]::Allow
    $ntAccount = New-Object System.Security.Principal.NTAccount($accountToRemove)

    if($ntAccount.IsValidTargetType([Security.Principal.SecurityIdentifier])) {

        $FileSystemAccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule($ntAccount, $fileSystemRights, $inheritanceFlag, $propagationFlag, $accessControlType)

        $oFS = New-Object IO.DirectoryInfo($folderPath)
        $DirectorySecurity = $oFS.GetAccessControl([System.Security.AccessControl.AccessControlSections]::Access)

        # Remove the matching Allow rule and write the ACL back to the folder
        $DirectorySecurity.RemoveAccessRule($FileSystemAccessRule) | Out-Null
        Set-Acl -Path $folderPath -AclObject $DirectorySecurity

        return "Permissions " + $permissionToRemove + " Removed on " + $folderPath + " folder"
    }
    return 0
}

Next is the main function, which enables the Unified Write Filter and configures the overlay, exclusions and pagefile.

The first part of the function checks whether UWF is currently enabled. If we need to enable it, a restart is required before the write filter becomes active.

function Enable-UWF {
    [CmdletBinding()]
    param()

    # WMI namespace for the UWF providers
    $NAMESPACE = "root\standardcimv2\embedded"

    # Get the current state of UWF
    $objUWFFilter = Get-WmiObject -Namespace $NAMESPACE -Class UWF_Filter
    if(!$objUWFFilter) {
        write-output "`nUnable to retrieve Unified Write Filter settings from $NAMESPACE" | out-file -FilePath $env:temp\UWF.log -Append
        return
    }

    # Check if UWF is enabled
    if(($objUWFFilter.CurrentEnabled) -or ($objUWFFilter.NextEnabled)) {
        write-output "`nUWF Filter is enabled" | out-file -FilePath $env:temp\UWF.log -Append
    } else {
        write-output "`nUWF Filter is NOT enabled, enabling now..." | out-file -FilePath $env:temp\UWF.log -Append

        # Call the method to enable UWF after the next restart. This sets the NextEnabled property to true.
        $retval = $objUWFFilter.Enable()
        # Check the return value to verify that the enable was successful
        if ($retval.ReturnValue -eq 0) {
            write-output "Unified Write Filter will be enabled after the next system restart." | out-file -FilePath $env:temp\UWF.log -Append
        } else {
            "Unknown Error: " + "{0:x0}" -f $retval.ReturnValue
        }
    }


Once the computer has restarted, SCCM compliance will run again at your specified evaluation schedule, at which point the following (also still part of the Enable-UWF function) will run to protect the C drive:

    # Only perform config if UWF will be enabled after the next restart
    $objUWFFilter = Get-WmiObject -Namespace $NAMESPACE -Class UWF_Filter
    if($objUWFFilter.NextEnabled) {
        write-output "UWF is set to enabled after next restart, continue to check config" | out-file -FilePath $env:temp\UWF.log -Append

        # Get the volume protect state for the next session
        $objUWFVolumeC = Get-WmiObject -Namespace $NAMESPACE -Class UWF_Volume -Filter "CurrentSession = false" | ? {(get-volume -DriveLetter C).UniqueId -like "*$($_.VolumeName)*"}

        # Check if C is protected
        if(!$objUWFVolumeC.Protected) {
            write-output "C Drive not protected, will enable protection now.." | out-file -FilePath $env:temp\UWF.log -Append
            # Enable protection. The WMI Protect() method also works, but uwfmgr.exe
            # proved more reliable here.
            uwfmgr.exe volume protect c:
        }

Set the overlay size based on the % of free space remaining on the disk:

        # Overlay size and type
        $objUWFOverlayConfig = Get-WmiObject -Namespace $NAMESPACE -Class UWF_OverlayConfig -Filter "CurrentSession = false"
        If($objUWFOverlayConfig.MaximumSize -le 1024){
            # Set the maximum overlay size to half of the free space on C: (in MB)
            $OverlaySize = (Get-Volume -DriveLetter C).SizeRemaining/2 | % {[math]::truncate($_ /1MB)}
            write-output "`nTry to set overlay max size to $OverlaySize MB." | out-file -FilePath $env:temp\UWF.log -Append
            uwfmgr.exe overlay set-size $OverlaySize
            # Warn at 80% and go critical at 90% of the overlay size
            $WarningSize = [math]::Round($OverlaySize/10*8)
            $CriticalSize = [math]::Round($OverlaySize/10*9)
            uwfmgr.exe overlay set-warningthreshold $WarningSize
            uwfmgr.exe overlay set-criticalthreshold $CriticalSize
        }

        If($objUWFOverlayConfig.Type -ne 1){
            # Set overlay type to disk-based
            write-host "`nTry to set overlay type to Disk based" -ForegroundColor Yellow
            uwfmgr.exe overlay set-type disk
        }

Set all the file and registry exclusions, based on best practice and Microsoft’s recommendations for write filters:

        # File exclusions
        $objUWFVolumeC = Get-WmiObject -Namespace $NAMESPACE -Class UWF_Volume -Filter "CurrentSession = false" | ? {(get-volume -DriveLetter C).UniqueId -like "*$($_.VolumeName)*"}
        $FileExclusionList = @(
            # Exclusions for Defender and SCEP
            "\ProgramData\Microsoft\Microsoft Security Client",
            "\ProgramData\Microsoft\Windows Defender",
            "\Program Files\Windows Defender",
            "\Program Files (x86)\Windows Defender",
            "\Users\All Users\Microsoft\Microsoft Security Client",
            "\Windows\WindowsUpdate.log",
            "\Windows\Temp\MpCmdRun.log",
            "\Users\All Users\Microsoft\Network\Downloader",
            # Exclusions for wireless/wired network profiles
            "\Windows\wlansvc\Policies",
            "\Windows\dot2svc\Policies",
            "\ProgramData\Microsoft\wlansvc\Profiles\Interfaces"
        )

        ForEach($File in $FileExclusionList){
            If(($objUWFVolumeC.FindExclusion($File)).bFound -ne $true){
                write-host "$File needs to be added to exclusions"
                $objUWFVolumeC.AddExclusion($File) | Out-Null
            }
        }

        # Registry exclusions
        $objUWFRegFilter = Get-WmiObject -Namespace $NAMESPACE -Class UWF_RegistryFilter -Filter "CurrentSession = false"
        $RegExclusionList = @(
            # Exclusions for Defender and SCEP
            "HKLM\SOFTWARE\Microsoft\Windows Defender",
            "HKLM\SOFTWARE\Microsoft\Microsoft Antimalware",
            # Exclusions for BITS, wireless/wired GPO state and time zone info
            "HKLM\Software\Microsoft\Windows\CurrentVersion\BITS\StateIndex",
            "HKLM\SOFTWARE\Policies\Microsoft\Windows\Wireless\GPTWirelessPolicy",
            "HKLM\SOFTWARE\Policies\Microsoft\Windows\WiredL2\GP_Policy",
            "HKLM\SYSTEM\CurrentControlSet\services\Wlansvc",
            "HKLM\SYSTEM\CurrentControlSet\services\WwanSvc",
            "HKLM\SYSTEM\CurrentControlSet\services\dot3svc",
            "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones"
        )

        ForEach($Reg in $RegExclusionList){
            If(($objUWFRegFilter.FindExclusion($Reg)).bFound -ne $true){
                write-host "$Reg needs to be added to exclusions"
                $objUWFRegFilter.AddExclusion($Reg) | Out-Null
            }
        }

    } else {

        write-host "UWF is not set to enabled on the next restart, will not check config"
    }


Create a new volume from the free space on the disk, which will be used to host the pagefile. Once the volume is created, set the page file settings to use this new drive:

    # Pagefile creation on a separate volume
    $PageFileDriveLetter = "P"
    $PageFileDriveSizeGB = 5

    # Check whether the pagefile is missing or still sitting on C:
    $PFUsage = Get-WmiObject -Class Win32_PageFileUsage -Property Name
    If(!($PFUsage) -or ($($PFUsage.Name) -eq "C:\pagefile.sys")){
        Write-Warning "Pagefile does not exist, will create one on a $PageFileDriveLetter drive"

        # Create the pagefile drive if it does not exist
        $PageFileDrive = Get-CimInstance -ClassName CIM_StorageVolume -Filter "Name='$($PageFileDriveLetter):\\'"
        If(!$PageFileDrive){
            Write-Warning -Message "Failed to find the DriveLetter $PageFileDriveLetter specified, creating new volume now...."
            $CVol = get-volume -DriveLetter C
            $VolSizeRemaining = [math]::Truncate($CVol.SizeRemaining /1GB)
            If($VolSizeRemaining -lt $PageFileDriveSizeGB){
                Write-Error "Not enough free space on the C drive to create a new volume for the page file"
            } else {
                write-host "Enough free space on the C drive is available to create the new $PageFileDriveLetter drive"
                # Enable the Optimise Drives service (defrag), otherwise the resize fails
                Set-Service -Name defragsvc -StartupType Manual -ErrorAction SilentlyContinue

                $NewCDriveSize = $CVol.Size - ($PageFileDriveSizeGB * 1GB)
                write-host "Resizing C: from $($CVol.Size) to $NewCDriveSize"
                Get-Partition -DriveLetter C | Resize-Partition -Size $NewCDriveSize -ErrorAction Stop
                write-host "Resized C to $NewCDriveSize. Now creating new $PageFileDriveLetter drive from the free space..."
                # Create the new partition from the freed space
                Get-Volume -DriveLetter C | Get-Partition | Get-Disk | New-Partition -UseMaximumSize -DriveLetter $PageFileDriveLetter | Format-Volume
            }
        } else {
            write-host "$PageFileDriveLetter already exists"
        }

        write-host "Creating page file on $PageFileDriveLetter drive"
        New-CimInstance -ClassName Win32_PageFileSetting -Property @{Name = "$($PageFileDriveLetter):\pagefile.sys"} -ErrorAction Stop | Out-Null
        # Size the pagefile at 90% of the free space on the new drive (in MB)
        $InitialSize = [math]::Round((get-volume -DriveLetter $PageFileDriveLetter).SizeRemaining /1MB /10 *9)
        $MaximumSize = $InitialSize
        Get-CimInstance -ClassName Win32_PageFileSetting -Filter "SettingID='pagefile.sys @ $($PageFileDriveLetter):'" -ErrorAction Stop | Set-CimInstance -Property @{
            InitialSize = $InitialSize ;
            MaximumSize = $MaximumSize ;
        } -ErrorAction Stop
        Write-Verbose -Message "Successfully configured the pagefile on drive letter $PageFileDriveLetter"

    } else {
        write-host "Pagefile already exists: $($PFUsage.Name)"
    }
}


Bringing it all together: calling the functions above, and installing the UWF feature if it’s not currently installed:

$UWF_Feature = (Get-WindowsOptionalFeature -Online -FeatureName Client-UnifiedWriteFilter -ErrorAction SilentlyContinue).State

If($UWF_Feature -eq "Disabled"){

    write-host "Not installed"
    Enable-WindowsOptionalFeature -Online -FeatureName Client-UnifiedWriteFilter -All -NoRestart -ErrorAction SilentlyContinue

    write-host "Please run this script again after a restart, to enable UWF filter" -ForegroundColor Yellow

    # 3010 tells SCCM a soft reboot is required
    exit 3010

} else {

    write-host "`nClient-UnifiedWriteFilter WindowsOptionalFeature installed, now configure UWF" -ForegroundColor Green
    Enable-UWF -ErrorAction SilentlyContinue

    # Lock down the new pagefile drive so normal users cannot tamper with it
    If(test-path p:){
        $folder = "P:\"
        Remove-NTFSPermissions $folder "Authenticated Users" "Modify"
        Remove-NTFSPermissions $folder "Users" "ReadAndExecute"
    }
}


Azure Automation Account Runbook – Update VM tags from SQL Server table

As you can probably see from previous posts, I spend a lot more time these days automating tasks with PowerShell using Azure Automation hybrid workers.

In this post, I wanted to show you a way of reading from a SQL Server database the opening hours of a remote site/office and inputting these times into Azure VMs tags.

This ensures that Azure VMs are available during office hours and also helps to keep Azure running costs down.

# Get office opening times from SQL database
param(
    [Parameter(Mandatory = $false)]
    [string] $SQLServer = "SQL1.test.local",

    [Parameter(Mandatory = $false)]
    [string] $Database = "Sites",

    [Parameter(Mandatory = $false)]
    [string] $RGbaseName = "RG-WIN10-",

    [Parameter(Mandatory = $false)]
    [string] $AzureSubscriptionName = "azuresub",

    [Parameter(Mandatory = $false)]
    [string] $Filter = "*WIN7*"
)

$ErrorActionPreference = "Stop"

Next is a simple function that calls a separate runbook that sends an email report on all the output of the last job run of the specified runbook (let me know if you want more details on this):

Function Send-EmailReport {
    # Send an email report of this runbook job
    $params = @{AutomationRunBookName="OpeningTimesToTags";SendReportByEmail=$true;SendToAddress="",""}
    Start-AzureRmAutomationRunbook -AutomationAccountName "AA-WE-SUB-01" -Name "Report-AutomationJobOutput" -ResourceGroupName "RG-AA-01" -Parameters $params -RunOn "AA-HybridWorker" | Out-Null
}

Grab the credentials needed to run the remainder of the script from the AA credentials vault:

# Azure Global Admin
$AzureOrgIdCredential = Get-AutomationPSCredential -Name 'sv-azure-automation'

# Database Read Only account
$DB_RO = Get-AutomationPSCredential -Name 'DB Read Only'

# Login to Azure
Login-AzureRmAccount -Credential $AzureOrgIdCredential >$null 2>&1 | out-null

#Select the specified subscription now
Select-AzureRMSubscription -SubscriptionName $AzureSubscriptionName >$null 2>&1 | out-null

Load the SqlServer PowerShell module, or install it if it’s not already present. This module is great as we no longer need SSMS:

# SqlServer PoSh module
If(-not(Get-Module -ListAvailable -Name SqlServer)){
    write-output "Could not find module 'SqlServer', installing now..."
    Find-Module SqlServer | Install-Module
}

Import-Module -Name SqlServer -ErrorAction SilentlyContinue

If(Get-Module -Name SqlServer){
    write-output "Module 'SqlServer' loaded, continue"
} else {
    write-error "Could not load or install 'SqlServer' PoSh Module"
}

Get all the VMs from Azure where the Resource Group name is like $RGbaseName:

write-output "`nLooping through all RGs like $RGbaseName to find specific VMs"
$RGs = Get-AzureRmResourceGroup | ? ResourceGroupName -like "$RGbaseName*"
If(!$RGs){
    write-error "Could not find any Resource Groups with a name like $RGbaseName*"
    return
} else {
    write-output "Retrieved $($RGs.count) RGs"
}

Filter the VMs so we are only working with the ones we want, based on $Filter:

write-output "`nGetting filtered VMs from $($RGs.count) RGs using filter $Filter"
$FilteredVMs = $RGs | Get-AzureRmVM | ? Name -like $Filter
If(!$FilteredVMs){
    write-error "Could not find any VMs with a name like $Filter"
    return
} else {
    write-output "Retrieved $($FilteredVMs.count) filtered VMs"
}

If the days of the week in the database use a custom format, or the week starts on a different day, we can convert them with this simple function:

# 1 = "Sunday"
# 2 = "Monday"
# 3 = "Tuesday"
# 4 = "Wednesday"
# 5 = "Thursday"
# 6 = "Friday"
# 7 = "Saturday"
Function Convert-DBWeekday {
	param(
		[Parameter(Mandatory = $true)]
		[string] $DBweekday
	)
	Switch ($DBweekday) {
		1 { $OutputDay = "Sunday" }
		2 { $OutputDay = "Monday" }
		3 { $OutputDay = "Tuesday" }
		4 { $OutputDay = "Wednesday" }
		5 { $OutputDay = "Thursday" }
		6 { $OutputDay = "Friday" }
		7 { $OutputDay = "Saturday" }
	}
	return $OutputDay
}


Finally, we loop through all the filtered VMs to extract a site-specific code from each VM name – this is how we match the opening times to the office/site location stored in the SQL database.

Once we have the site and opening times, we subtract 60 mins from the opening time and add 30 mins to the close time. This provides a small maintenance window for the VM to complete any updates or SCCM software installs.
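As a quick worked example of that arithmetic (the opening hours are illustrative):

```powershell
# Illustrative opening hours: 09:00-17:30 becomes an 08:00-18:00 VM schedule
$StartTime = get-date (Get-Date "09:00").AddMinutes(-60) -Format HH:mm
$EndTime   = get-date (Get-Date "17:30").AddMinutes(30) -Format HH:mm
"$StartTime->$EndTime"   # 08:00->18:00
```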

Then we apply the VM tags, one per weekday, named Schedule_<Weekday>.
Obviously a separate Automation runbook is required to read the tags and shut down/start up the VMs at the times specified in the tags.

ForEach($FilteredVM in $FilteredVMs){
	write-output "`nLooping through each filtered VM. Now using: $($FilteredVM.Name)"

	write-output "`nDetermine site code of VM $($FilteredVM.Name)"
	$SiteCode = ($($FilteredVM.Name) -split "-")[2] -replace '[a-zA-Z]',''
	If(($SiteCode.length -eq 3) -and ($SiteCode -match "\d{3}")){
		write-output "Site code determined as $SiteCode"
	} else {
		write-error "Site code determined as $SiteCode which does not match 3 digits, quitting"
		return
	}

	# Connect to the SQL database and get the site info
	Write-Output "Getting site ID from SQL database for site $SiteCode"
	$Site = Read-SqlTableData -ServerInstance $SQLServer -DatabaseName $Database -Credential $DB_RO -TableName site -SchemaName dbo | ? sitename -like "*$SiteCode"
	If($Site){
		Write-Output "Site ID identified as: $($Site.id)"
	} else {
		write-error "Could not retrieve $SiteCode from the $Database database."
		return
	}

	Write-Output "`nGetting opening hours for site $($Site.sitename)...."
	$DefaultHours = Read-SqlTableData -ServerInstance $SQLServer -DatabaseName $Database -Credential $DB_RO -TableName tblopenhours -SchemaName dbo | ? {$_.site_id -eq $($Site.id) -and $_.usertypecode -eq "Default"}

	write-output "`nGetting VMTags from $($FilteredVM.Name)"
	$VMTags = (Get-AzureRmResource -ResourceId ($FilteredVM).Id -ErrorAction Stop).Tags
	If(!$VMTags){ $VMTags = @{} }

	ForEach($Day in $DefaultHours){

		$StartDate = [datetime]$($Day.starttime)
		$EndDate = [datetime]$($Day.endtime)
		$StartTime = get-date $StartDate -Format HH:mm
		$EndTime = get-date $EndDate -Format HH:mm
		$WeekDay = Convert-DBWeekday -DBweekday $($Day.weekday)
		write-output "`nOpening hours for $WeekDay are $StartTime - $EndTime"

		# Open an hour early and close half an hour late for the maintenance window
		write-output "-60 mins from $StartTime and +30 mins to $EndTime"
		$StartTime = get-date (Get-Date $StartTime).AddMinutes(-60) -Format HH:mm
		$EndTime = get-date (Get-Date $EndTime).AddMinutes(30) -Format HH:mm
		$SchedWeekday = "Schedule_$WeekDay"
		write-output "Setting tag $SchedWeekday to value: $StartTime->$EndTime"
		$VMTags.$SchedWeekday = "$StartTime->$EndTime"
	}

	write-output "`nWriting VM Tags back to Azure..."
	Set-AzureRmResource -ResourceGroupName $($FilteredVM.ResourceGroupName) -Name $($FilteredVM.Name) -ResourceType "Microsoft.Compute/VirtualMachines" -Tag $VMTags -Force -Confirm:$false
}


# Send report of job output by email
Send-EmailReport

Modifying Group Policy Preferences Registry collections with PowerShell

Ok guys, loosely sticking with the VDI-infrastructure-in-Azure topic, I needed some automation around Group Policy – seeing as VDI is all about automation and scalability, I didn’t want the solution to be hindered by too many manual processes (like maintaining GPOs for new sites/VMs etc.).

The VDI solution is for libraries in a London borough and is accessible to members of the public using their library card to authenticate on Insight Media’s iCAM platform. The thin clients need to start up and automatically log into an assigned desktop VM in Azure. The VMs are Windows 10 1607 CBB 64-bit.

As we all know, first-time login to Windows 10/8 is quite slow as it creates the appx packages for the Start Menu, but the main reason for the auto-login is iCAM.

iCAM locks out the Windows desktop until a user enters their library card number, but it first needs the explorer shell to load so that the iCAM app can be launched. It’s not the greatest app, but that’s for later.

As the Azure VMs are deployed using Azure Automation, I need to create the appropriate Group Policy so that each VM automatically logs into Windows with a specific AD user account.
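For reference, the username/password registry entries in the GPP collection map (in my setup) to the classic Winlogon auto-logon values. Outside of Group Policy you could set the same thing locally like this; the account details are placeholders, and storing DefaultPassword in plain text is the known trade-off of this mechanism:

```powershell
# Standard Winlogon auto-logon values (account details are placeholders)
$winlogon = "HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"
Set-ItemProperty $winlogon -Name AutoAdminLogon    -Value "1"
Set-ItemProperty $winlogon -Name DefaultDomainName -Value "DOMAIN"
Set-ItemProperty $winlogon -Name DefaultUserName   -Value "svc-vdi-270-002"
Set-ItemProperty $winlogon -Name DefaultPassword   -Value "PlaceholderPassword"
```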

I have written a PowerShell function that updates Group Policy Preferences XML with Registry preferences.

The function clones an existing Registry collection and edits the properties with username/password/VM name etc. and saves it back to the domain.

Here is a screenshot of the Reg collection I’m cloning and modifying (VM-L270-001):

Reg GPO Prefs collection

param(
    [Parameter(Mandatory = $false)]
    [string] $GPOName = "Win10 VDI-Computers",

    [Parameter(Mandatory = $false)]
    [string] $VMName = "VM-270-002",

    [Parameter(Mandatory = $false)]
    [string] $UName,

    [Parameter(Mandatory = $false)]
    [string] $PWord,

    [Parameter(Mandatory = $false)]
    [string] $TemplateVMName = "VM-270-001"
)

If(-not(Get-Module -Name GroupPolicy)){
    write-host "Importing GroupPolicy Module"
    Import-Module GroupPolicy
}

$GPO = Get-GPO -Name $GPOName
$GPOGuid = ($GPO.Id).Guid
$Domain = $GPO.DomainName
$GPOPath = "\\$Domain\SYSVOL\$Domain\Policies\{$GPOGuid}\Machine\Preferences\Registry\Registry.xml"
$Site = ($VMName -split "-")[2] -replace '[a-zA-Z]',''
$TemplateSite = ($TemplateVMName -split "-")[2] -replace '[a-zA-Z]',''

# Retrieve the XML from SYSVOL
$xml = [xml](Get-Content $GPOPath)
If($xml){
    write-output "XML retrieved from $Domain"
} else {
    Write-Error "XML could not be retrieved from $Domain"
    return
}

    # Check if a 'User Auto Login' collection for this VM already exists under the site
    $SiteColUserAutoLoginVM = ((((($xml.ChildNodes).Collection | ? Name -EQ "iCAM Client VDI").Collection | ? Name -like "$Site*").Collection | ? Name -Like "User Auto Login").Collection | ? Name -Like $VMName)
    If($SiteColUserAutoLoginVM){
        Write-Output "Site collection $Site exists and contains 'User Auto Login' with a sub collection $VMName already"
    } else {
        Write-Output "No 'User Auto Login' collection with name $VMName for $Site found. Copying template $TemplateVMName Site collection...."
        # Many arrays in the GPO as it has a lot of collections (folders) for GPPref filtering
        # Get index of the iCAM client collection
        $ClientCollections = [Collections.Generic.List[Object]]($xml.DocumentElement.Collection)
        $indexclient = $ClientCollections.FindIndex( {$args[0].Name -eq "iCAM Client VDI"} )
        # Get indexes of the target and template site collections
        $SiteCollections = [Collections.Generic.List[Object]]($xml.DocumentElement.Collection[$indexclient].Collection)
        $indexsite = $SiteCollections.FindIndex( {($args[0].Name -like "$site*")} )
        $indexsiteTemplate = $SiteCollections.FindIndex( {($args[0].Name -like "$TemplateSite*")} )

        # Get the template VM collection
        $TemplateSiteVMCols = [Collections.Generic.List[Object]]($xml.DocumentElement.Collection[$indexclient].Collection[$indexsiteTemplate].Collection.Collection)
        $indexVMTemplate = $TemplateSiteVMCols.FindIndex( {($args[0].Name -like "$TemplateVMName")} )

        # Clone the template site node
        $NodeToClone = $xml.RegistrySettings.Collection[$indexclient].Collection[$indexsiteTemplate].Collection.Collection[$indexVMTemplate].Clone()

        # Modify the clone before committing it - new name, fresh uids, and the
        # target VM's credentials (Registry[0] = password, Registry[1] = username)
        $NodeToClone.name = $VMName
        $NodeToClone.uid = [string]"{$((New-Guid).Guid)}".ToUpper()
        $NodeToClone.Registry[0].uid = [string]"{$((New-Guid).Guid)}".ToUpper()
        $NodeToClone.Registry[0].Properties.value = $PWord
        $NodeToClone.Registry[1].uid = [string]"{$((New-Guid).Guid)}".ToUpper()
        $NodeToClone.Registry[1].Properties.value = $UName

        # Add the clone under the site's 'User Auto Login' collection and save back to SYSVOL
        $TargetCollection = $xml.DocumentElement.Collection[$indexclient].Collection[$indexsite].Collection | ? Name -Like "User Auto Login"
        If($TargetCollection){
            write-output "Writing cloned template to new site..."
            $TargetCollection.AppendChild($NodeToClone) | Out-Null
            $xml.Save($GPOPath)
        } else {
            write-error "Please create 'User Auto Login' registry collection under site $site in GPO: $GPOName"
        }
    }


Download all Office 2016 updates with PowerShell and SCCM

Hi Guys,

Sorry for the delay, been working on a big project to rollout thin clients across libraries in a London borough. I will be writing up a collection of posts covering the automation of a cloud-based VDI solution in Azure, starting today with keeping the Office 2016 MSI/Volume License edition app up to date in SCCM.

When deploying any modern Office version with SCCM, you want to keep it up to date by adding all the latest .msp files to the /updates folder.

I adapted some popular scripts found online and added what I thought were cool additions.

This one: maintaining-your-office-2016-installation-source involved creating a new SUG and SU Package in SCCM, a bit yucky.

This one: download-office-2016-proplus-x86-updates had a static list that has to be maintained by the script owner, and used custom GUIDs to name the files.

Both these issues are ‘resolved’ using the following code.

Prerequisites:

• Access to the Configuration Manager PowerShell module. You can load the module in an admin PowerShell prompt like this:

        $ConfigMgr_psd1_path = $Env:SMS_ADMIN_UI_PATH.Substring(0,$Env:SMS_ADMIN_UI_PATH.Length-5) + '\ConfigurationManager.psd1'
        Import-Module $ConfigMgr_psd1_path

• SCCM Software Updates sync configured to download Office 2016 metadata
• Full Administrator access to the SCCM Primary Site Server
$siteserver = "srv-sccm.domain.local"
$sitecode = "AB0"
$class = "SMS_SoftwareUpdate"
$NameSpace = "root\SMS\Site_$sitecode"
$StagingLocation = "T:\O2016_Updates"
$MSPsFolder = "$StagingLocation\ready"

Function ConvertFrom-Cab {
    param(
        [string] $cab,
        [string] $destination,
        [string] $filename
    )
    $comObject = "Shell.Application"
    Write-Verbose "Creating $comObject"
    $shell = New-Object -ComObject $comObject
    if(!$?) { $(Throw "unable to create $comObject object") }
    Write-Verbose "Creating source cab object for $cab"
    $sourceCab = $shell.Namespace("$cab").Items()
    if(-not (Test-Path $destination)){
        Write-Verbose "Creating destination folder object for $destination"
        new-item $destination -ItemType Directory | Out-Null
    }
    $DestinationFolder = $shell.Namespace($destination)
    Write-Verbose "Expanding $cab to $destination"
    $DestinationFolder.CopyHere($sourceCab)
    # Rename the extracted .msp to the friendly <cab>_KB<article> name
    ForEach($CabItem in $sourceCab){
        If($CabItem.Name -like "*.msp"){
            Rename-Item -Path "$destination\$($CabItem.Name)" -NewName ($filename)
        }
    }
}

# Clean the staging area, then get all current (not expired, not superseded) Office 2016 32-bit updates.
# The CM cmdlets need to run from the site's PSDrive.
Set-Location "$($sitecode):"
Remove-Item $StagingLocation -Force -Confirm:$False -Recurse -ErrorAction SilentlyContinue
$Office2016Updates = Get-CMSoftwareUpdate -Name "*Microsoft*2016*32-Bit*" -Fast | ? {($_.IsExpired -eq $false) -and ($_.IsSuperseded -eq $false)}
md $StagingLocation | Out-Null

ForEach($Update in $Office2016Updates){
    $CI_ID = $Update.CI_ID
    $ContentID = (get-wmiobject -ComputerName $siteserver -Query "select * from SMS_CItoContent where ci_id=$CI_ID" -Namespace $NameSpace).ContentID
    $objContent = Get-WmiObject -ComputerName $siteserver -Namespace $NameSpace -Class SMS_CIContentFiles -Filter "ContentID = $ContentID"
    # Build a friendly name: <cab name>_KB<article number from the KB URL>
    $Filename1 = "KB$((([uri]$Update.LocalizedInformativeURL).AbsolutePath -split "/")[2])"
    $CabFileName = $objContent.Filename
    $FileName2 = ($CabFileName -split ".cab")[0]
    $FileNameWithoutExtension = $FileName2 + "_" + $Filename1
    write-host "FileNameWithoutExtension will be: $FileNameWithoutExtension"
    $URL = $objContent.SourceURL
    write-host "$URL" -ForegroundColor Green
    $FileNamePath = "$StagingLocation\$CabFileName"
    write-host "FileNamePath = $FileNamePath"
    Start-BitsTransfer -Source $URL -Destination $FileNamePath
    If(Test-Path $FileNamePath){
        Write-Output "$FileNamePath found, extracting files from cab..."
        ConvertFrom-Cab -cab $FileNamePath -destination $MSPsFolder -filename $($FileNameWithoutExtension+".msp")
        write-host "Deleting cabs from $StagingLocation"
        Get-ChildItem -Path $StagingLocation -File -Filter *.cab | Remove-Item -Force -Confirm:$false
        write-host "Deleting xmls from $MSPsFolder"
        Get-ChildItem -Path $MSPsFolder -File -Filter *.xml | Remove-Item -Force -Confirm:$false
    } else {
        write-host "stopping here"
        break
    }
}

In your output folder $MSPsFolder you will have all the MSPs you need to copy into the /updates folder in the Office 2016 app/package source.
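As a worked example of the file naming used in the loop above (the URL and cab name here are illustrative, the real values come from SMS_SoftwareUpdate and SMS_CIContentFiles):

```powershell
# Illustrative values only - real ones come from the SCCM WMI classes
$url  = "https://support.microsoft.com/kb/4011052"
$kb   = "KB$((([uri]$url).AbsolutePath -split "/")[2])"
$cab  = "mso2016-kb4011052-fullfile-x86-glb.cab"
$name = ($cab -split ".cab")[0] + "_" + $kb
$name   # mso2016-kb4011052-fullfile-x86-glb_KB4011052
```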