Cloud, Scripting

Microsoft Teams and PowerShell

I just started playing around with the MicrosoftTeams PowerShell module (available in the PowerShell Gallery, use Find-Module MicrosoftTeams for more information). Here’s a quick sample of how you can get started using it…
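If you haven’t installed it yet, that part is quick (a typical approach, assuming PowerShell 5.x or later with the PowerShellGet module):

Install-Module MicrosoftTeams -Scope CurrentUser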

$conn = Connect-MicrosoftTeams

# list all Teams
Get-Team

# get a specific Team
$team = Get-Team -DisplayName "Benefits"

# create a new Team
$team = New-Team -DisplayName "TechSupport" -Description "Technical Support" -Owner "dave@contoso.com"

# add a few channels to the new Team
New-TeamChannel -GroupId $team.GroupId -DisplayName "Forms Library" -Description "Forms and Templates"
New-TeamChannel -GroupId $team.GroupId -DisplayName "Customers" -Description "Information for customers"
New-TeamChannel -GroupId $team.GroupId -DisplayName "Development" -Description "Applications and DevOps teams"

# get the Id for one Team channel
$channelId = Get-TeamChannel -GroupId $team.GroupId |
    Where-Object {$_.DisplayName -eq 'Development'} |
    Select-Object -ExpandProperty Id

# add a user to a Team
Add-TeamUser -GroupId $team.GroupId -User "dory@contoso.com" -Role Member

Here’s a splatted form of the above example, in case it renders better on some displays…

$conn = Connect-MicrosoftTeams

# list all Teams
Get-Team

# get a specific Team
$team = Get-Team -DisplayName "Benefits"

# create a new Team
$params = @{
    DisplayName = "TechSupport"
    Description = "Technical Support"
    Owner       = "dave@contoso.com"
}
$team = New-Team @params

# add a few channels to the new Team
# NOTE: You could form an array to iterate more efficiently (see the sketch below)
$params = @{
    GroupId     = $team.GroupId
    DisplayName = "Forms Library"
    Description = "Forms and Templates"
}
New-TeamChannel @params

$params = @{
    GroupId     = $team.GroupId
    DisplayName = "Customers"
    Description = "Information for customers"
}
New-TeamChannel @params

$params = @{
    GroupId     = $team.GroupId
    DisplayName = "Development"
    Description = "Applications and DevOps teams"
}
New-TeamChannel @params
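
As noted above, the three channel blocks could collapse into a loop over an array of hashtables (a sketch, reusing the same channel names):

$channels = @(
    @{ DisplayName = 'Forms Library'; Description = 'Forms and Templates' },
    @{ DisplayName = 'Customers';     Description = 'Information for customers' },
    @{ DisplayName = 'Development';   Description = 'Applications and DevOps teams' }
)
foreach ($channel in $channels) {
    # splat the per-channel values; -GroupId is common to all three
    New-TeamChannel -GroupId $team.GroupId @channel
}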

# get the Id for one Team channel
$channelId = Get-TeamChannel -GroupId $team.GroupId |
    Where-Object {$_.DisplayName -eq 'Development'} |
    Select-Object -ExpandProperty Id

# add a user to a Team
$params = @{
    GroupId = $team.GroupId
    User    = "dory@contoso.com"
    Role    = 'Member'
}
Add-TeamUser @params

Cloud, Projects, Scripting, Technology

Part 2 – Copy Azure Blob Containers between Storage Accounts using Azure Automation with Fries and a Drink

So, my previous post was about using PowerShell from a compute host (physical/virtual computer) to connect to an Azure subscription and copy containers between storage accounts.  I call that “part 1”.  This will be “part 2”, as it takes that smelly pile of compost and shovels it into Azure Automation.

In short, the basic changes from the previous example:

  • Not nearly as much fuss with credentials within the PowerShell code
  • Configuration settings are stored in Azure as Variables, rather than a .json file
  • Less code!

The previous article refers to the diagram on the left.  This one refers to the one not on the left.

(diagram omitted)

Assumptions

  • You have access to an Azure subscription
  • In Azure, you have at least one (1) Resource Group with two (2) Storage Accounts (one for “source”, the other for “destination”; the “backup” this performs copies from “source” to “destination”)
  • You somehow believe I know what I’m talking about
  • You stopped laughing and thought “Shit. Maybe this idiot doesn’t know what he’s talking about?”
  • After a few more minutes you thought “Why am I reading what I’m actually thinking right now?  How does he know what I’m thinking?  It’s like he’s an idiot savant!  Maybe he counts toothpicks on the floor while brushing his teeth…”
  • You consider that this was written in November 2018, and Azure could have changed by the time you’re reading this.

Basic Outline

The basic goals of this ridiculous exercise in futility are (still):

  • Copy all (or selected) containers from Storage Account 1 to Storage Account 2 using an Azure Automation “runbook”, once per day.
  • The copy process will append “yyMMdd” datestamps to each container copied to Storage Account 2
  • The copy process will place the destination containers under a container named “backups”.  For example, “SA1/container1” will be copied to “SA2/backups/container1-181117”
  • Both storage accounts should be within the same Azure Resource Group, and in the same Region
  • New Goal: Eliminate the dedicated host machine for running the script, in favor of an Azure Automation Runbook.

Important!

This is a demo exercise only.  DO NOT perform this on a production Azure tenant without testing the absolute living shit out of it until your fingers are sore, your eyes are bloodshot and you’ve emptied every liquor bottle and tube of model glue in your house/apartment.

The author assumes NO responsibility or liability for any incidental, accidental, intentional or alleged bad shit that happens resulting from the direct or indirect use of this example.  Batteries and model glue not included.

Preparation

First, we need to set up the automation account and some associated goodies.  Some of the steps below can be performed using Azure Storage Explorer, or PowerShell, but I’m using the Azure portal (web interface) for this exercise.  Then we’ll create the Runbook and configure it, and run a test.

  1. From the Azure portal, click “All Services” and type “Automation” in the search box.
  2. Click the little star icon next to it.  This adds it to your sidebar menu (along the left)
  3. Click on “Automation Accounts”
  4. Click “Add” near the top-left, fill in the Name, select the Resource Group, Location and click Create
  5. From the Automation Accounts blade (I hate the term “blade”, in fact I hate the Azure UI in general, but that’s for another paint-fume-sniffing article), click on the new Automation Account.

Credentials and Variables

  1. Scroll down the center menu panel under “Shared Resources” and click on “Credentials”, and then click “Add a credential” at the top. Fill in the information and click Create.  This needs to be an account which has access to both of the storage accounts, so you can enter your credentials here if you like, since this is only a demo exercise.
  2. Go back to “Automation Accounts” (the breadcrumb menu along the top is quickest)
  3. Go back to the Automation Account again, and scroll down to “Variables”
  4. Add the variables as shown in the example below.  All of the variables for this exercise are “String” type and Encrypted = “No”.  This part is a bit tedious, so you should consume all of your illicit substances before doing this step.

The Runbook

  1. Go back to the Automation Account again and click on “Runbooks”
  2. Click “Add a runbook” from the menu at top, then click “Quick Create / Create a new runbook” from the middle menu pane.
  3. Enter a name and select “PowerShell” from the Runbook type list.  Enter a Description if you like, and click Create.

When the new Runbook is created, it will (should) open the Runbook editor view.  This will have “> Edit PowerShell Runbook” in the heading, with CMDLETS, RUNBOOKS, and ASSETS along the left, and line 1 of the editor form in the top-middle.

  1. Copy/Paste the code from here into the empty space next to line 1 in the code editor.
  2. Make sure the variable names at lines 10-16 match up with those you entered in the Variables step above.  If not, then for each variable that needs to be corrected: delete the code to the right of the equals sign (“= Get-AutomationVariable -Name …”), place the cursor after the “=”, then click Assets > Variables > the “…” next to the variable you want, and select “Add ‘Get Variable’ to canvas”.  (see the sketch below)
  3. After entering the code and confirming the variable assignments, click Save.  Don’t forget to click Save!
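
For reference, the finished assignments will look something like this (a sketch; the asset names here are examples, so match them to whatever you actually created under Variables and Credentials):

$ResourceGroupName         = Get-AutomationVariable -Name 'ResourceGroupName'
$SourceStorageAccount      = Get-AutomationVariable -Name 'SourceStorageAccount'
$DestinationStorageAccount = Get-AutomationVariable -Name 'DestinationStorageAccount'
$DestinationContainer      = Get-AutomationVariable -Name 'DestinationContainer'
$BackupDateFormat          = Get-AutomationVariable -Name 'BackupDateFormat'
# and the credential created earlier under Shared Resources > Credentials
$azCred = Get-AutomationPSCredential -Name 'BackupCredential'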

Testing

  1. Click “Test pane” to open the test pane (I’m shocked they didn’t call it the “test blade”) – Tip: If you don’t see “Test pane” go back to the Runbook editor, it’s at the top (select the Runbook, click Edit).
  2. Click “Start” and wait for the execution to finish.  (Note: Unlike running PowerShell on a hosted computer, Azure Automation doesn’t show the output until the entire script is finished running)

Code Note: You may notice that line 97 ($copyJob = Start-AzureStorageBlobCopy…) is commented out.  This is intentional so as to mitigate the chances of you accidentally copying an insane amount of garbage and running your Azure bill into the millions of dollars.

Testing Note: Since line 97 is commented out, the test should simply show what was found, but no copies are actually processed.  In the last image example (below) you will still see “copy completed” for each container set, but that’s just more glue-sniffing imaginary hallucination stuff for now.  Once you remove the comment, that becomes very real.  As real as Kentucky Fried Movie 3D punching scenes.

When you’ve tested this to your satisfaction, simply uncomment that line (or better yet, add $WhatIfPreference = $True at the top of the script, just below the $VerbosePreference line).
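
While you’re still in testing mode, the top of the script would then look something like this (assuming the preference lines sit at the very top, as described above):

$VerbosePreference = 'Continue'
$WhatIfPreference  = $True   # flip to $False (or remove) when you're ready to copy for real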

(Screenshots omitted. There was a sale on red arrows and I couldn’t say no.)

Cloud, Projects, Scripting, Technology

Backup Azure Blob Containers Between Storage Accounts

This is NOT an article which is intended to say “this is how you do it”. This is an article that says “this is ONE way you COULD do it, if you’re dealing with the same conditions”. Okay. Step away from the ledge and sit down. You’re blocking my spot on the ledge.

The Challenge

Customer asked for the following:

  • Copy blob containers from Storage Account 1 (SA1) to Storage Account 2 (SA2)
  • Source containers should be copied to folders on SA2 such that they’re renamed to append a date stamp (e.g. “Container1-181114”)
  • Should be able to explicitly control includes and excludes for groups of source containers
  • Should support job scheduling
  • Should support using external configuration files (Azure subscription, Resource Group, Storage Accounts, etc.)
  • Should support running from a Windows Server VM running in Azure (IaaS)
  • You have 2 hours to accomplish this

Options

  • Third-party backup products
  • Azure Function App
  • PowerShell script

Rationale

Third-party backup products cost money and require installation, learning curves, etc.  Azure Function App is arguably the ideal option, but I haven’t worked with it/them enough to meet the requirements in the allotted time.  I chose PowerShell because it requires the least effort and impact for on-boarding (installation, configuration, learning curve, cost, etc.).

Approach

  • A single script (portable)
  • A set of .JSON configuration files
  • AzureRM module and Azure credentials

I had some code lying around from a past project that involved uploading and downloading content to Azure RM storage.  This time it was copying between two Azure RM storage accounts using a VM host residing in Azure.  So I decided to map out the inputs/outputs and draft the configuration data file.

Sample Configuration File

{
    "ResourceGroupName": "Toilet",
    "SourceStorageAccount": "turdstorage1",
    "DestinationStorageAccount": "turdstorage2",
    "StorageAccountKey1": "<insert really long key here>",
    "StorageAccountKey2": "<insert another really long key here>",
    "CustomerName": "StinkTech",
    "AzureUserID": "you@yourcompany.com",
    "SubscriptionName": "YourSubscriptionName",
    "DestinationContainer": "backups",
    "BackupDateFormat": "yyMMdd",
    "IncludeContainers": "",
    "ExcludedContainers": ["azure-webjobs-hosts","junk"]
}

So, what does this “say”?  Besides mentioning fecal matter a few times, it describes the following:

  • The Azure RM resource group name
  • The two (2) storage account names
  • The two (2) storage account access (primary) keys
  • I’ll explain “CustomerName” later
  • The Azure Subscription name
  • The Azure Subscription user ID
  • The Destination container name (beneath DestinationStorageAccount)
  • The date-stamp format for appending to the destination folders
  • Lists of (source) container names to constrain the total list and/or exclude from the total list

Sample

  • Container1
  • Container2
  • Container3

If "IncludeContainers" = ["Container1","Container3"] then only those two will be copied/backed-up.

If "ExcludedContainers" = ["Container3"] then only Container1 and Container2 will be copied/backed-up.

If "IncludeContainers" = ["Container1","Container3"] and "ExcludedContainers" = ["Container2","Container3"] then the final (result) list of containers that will be copied is "Container1".  And if "IncludeContainers" = "ExcludedContainers" then you’re just fucked.  I’m sorry, but life can be hard sometimes.

“CustomerName” is used for a rather handy little feature buried inside this mess: Azure login credential storage and recall.  Basically, the first time you run the script, you will be prompted to enter credentials to connect to the specified Azure tenant.  This will use the “SubscriptionName” and “AzureUserID” values from the .JSON file to process the login request.  If the request is successful, the password is encrypted (DPAPI, so it’s only readable by the same user on the same machine) and stored in a separate .JSON file, which uses “CustomerName” as the base name.  So if “CustomerName”: “StinkTech” then the credentials file will be named “StinkTech.json”
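
If you’re curious, the save/recall mechanics are just the standard secure-string round trip (a minimal sketch using the same cmdlets the script uses; file names follow the “CustomerName” convention described above):

# save: prompt once, store the encrypted password (file basename = CustomerName)
$cred = Get-Credential -UserName $AzureUserID -Message "Azure Credentials"
$cred.Password | ConvertFrom-SecureString | Set-Content ".\$CustomerName.json"

# recall: rebuild the PSCredential from the stored file on later runs
$secPwd = Get-Content ".\$CustomerName.json" | ConvertTo-SecureString
$azCred = New-Object System.Management.Automation.PSCredential($AzureUserID, $secPwd)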

WARNING: Be careful not to assign “CustomerName” to the same name as your configuration .JSON file!  This will cause the credential storage to overwrite the configuration file.  I could add some exception-avoidance code for that, but that wouldn’t be any fun.  People often go to NASCAR races with hopes of seeing a good crash at some point (as long as nobody gets hurt).

Using the example above (Container names), along with the sample JSON configuration file, the containers would be copied over to destination Container “backups” and be named [ContainerName]+[YYMMDD].  So “Container1” will be copied to “backups/Container1-181114” and so on.  Why not just copy like-for-like and not shove them under a mid-level container?  Because, that’s what the customer insisted.

The Code

#requires -version 5.0
#requires -modules AzureRM.Storage
<#
.DESCRIPTION
    Backup Azure Blob Containers from one Storage Account to Another.
    Copies each source container and blob to a date-stamped name in the second account.
    Example: SA1\container1\folder\133.txt --> SA2\backups\container1-181114\folder\133.txt
.PARAMETER ConfigFile
    Path to JSON configuration file
.PARAMETER Force
    Force overwrite of existing targets (destinations) if they already exist
.EXAMPLE 
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json
.EXAMPLE
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json -Force -Verbose -WhatIf
.EXAMPLE
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json -Force -ResetCredentials
.NOTES
    1.0.0 - 2018/11/14 - First release (skatterbrainz)
#>
[CmdletBinding(SupportsShouldProcess=$True)]
param (
    [parameter(Mandatory=$True, HelpMessage="Path to configuration file")]
    [ValidateNotNullOrEmpty()]
    [string] $ConfigFile,
    [parameter(Mandatory=$False, HelpMessage="Force backups even when targets already exist")]
    [switch] $Force,
    [parameter(Mandatory=$False, HelpMessage="Force credential reset")]
    [switch] $ResetCredentials
)
$time1 = Get-Date
$Script:countall = 0
$Script:ccount = 0
function Invoke-AzureBlobBackup {
    [CmdletBinding(SupportsShouldProcess=$True)]
    param (
        [parameter(Mandatory=$False, HelpMessage="List of source containers to exclude from backups")]
            [string[]] $ExcludeContainers
    )
    Write-Verbose "connecting to storage accounts"
    $context1 = New-AzureStorageContext -StorageAccountName $SourceStorageAccount -StorageAccountKey $StorageAccountKey1
    $context2 = New-AzureStorageContext -StorageAccountName $DestinationStorageAccount -StorageAccountKey $StorageAccountKey2
    Write-Verbose "getting storage containers"
    $sc1 = Get-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $SourceStorageAccount
    if ($IncludeContainers.Count -gt 0) {
        Write-Verbose "filtering list of source containers"
        $sc1 = $sc1 | ?{$IncludeContainers -contains $_.Name}
        Write-Verbose "containers: $($($sc1).Name -join ',')"
    }
    if ($ExcludeContainers.Count -gt 0) {
        Write-Verbose "removing excluded containers from source list"
        $sc1 = $sc1 | ?{$ExcludeContainers -notcontains $_.Name}
        Write-Verbose "containers: $($($sc1).Name -join ',')"
    }
    Write-Verbose "validating destination container [$DestinationContainer]"
    try {
        $sc2 = Get-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $DestinationStorageAccount -Name $DestinationContainer -ErrorAction Stop
        Write-Verbose "container [$DestinationContainer] exists in destination"
        $destBlobs = (Get-AzureStorageBlob -Container $DestinationContainer -Context $Context2).Name
        Write-Verbose "$($destBlobs.count) destination blobs found in [$DestinationContainer]"
        Write-Verbose $($destBlobs -join ',')
    }
    catch {
        Write-Verbose "container [$DestinationContainer] not found in destination, creating it now"
        try {
            $c2 = New-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $DestinationStorageAccount -Name $DestinationContainer
        }
        catch {
            $stopEverything = $True
            Write-Error $Error[0].Exception.Message
            break
        }
    }
    Write-Verbose "enumerating source containers"
    $Script:countall = 0
    $Script:ccount = 0
    foreach($sc in $sc1) {
        $sourceContainer = $sc.Name
        Write-Verbose "source container: $sourceContainer"
        $srcBlobs  = Get-AzureStorageBlob -Container $sourceContainer -Context $context1
        Write-Verbose "------------------------- $sourceContainer ---------------------------------"
        Write-Verbose "$($srcBlobs.count) source blobs found in [$sourceContainer]"
        #$srcBlobs
        Write-Verbose "copying blobs to [$DestinationContainer]..."
        foreach ($blob in $srcBlobs) {
            $Script:countall++
            $srcBlob = $blob.Name
            $destPrefix = $sourceContainer+'-'+(Get-Date -f $BackupDateFormat)
            $destBlob = "$destPrefix`/$srcBlob"
            if ($Force -or ($destBlobs -notcontains $destBlob)) {
                Write-Verbose "[$sourceContainer] copying [$srcBlob] to [$destBlob]"
                try {
                    $copyjob = Start-AzureStorageBlobCopy -Context $context1 -SrcContainer $sourceContainer -SrcBlob $srcBlob -DestContainer $DestinationContainer -DestBlob "$destBlob" -DestContext $context2 -Force -Confirm:$False
                    Write-Verbose "copy successful"
                    $Script:ccount++
                }
                catch {
                    Write-Error $Error[0].Exception.Message
                }
            }
            else {
                Write-Verbose "blob [$destBlob] already backed up"
            }
        }
    }
}
function Get-AzureCredentials {
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$True, HelpMessage="Azure Subscription UserName")]
        [ValidateNotNullOrEmpty()]
        [string] $AzureUserID,
        [parameter(Mandatory=$True, HelpMessage="Azure Subscription Name")]
        [ValidateNotNullOrEmpty()]
        [string] $SubscriptionName,
        [parameter(Mandatory=$False, HelpMessage="Credential file basename")]
        [ValidateNotNullOrEmpty()]
        [string] $CredentialName = "cred",
        [parameter(Mandatory=$False, HelpMessage="Force credentials reset")]
        [switch] $ForceUpdate
    )
    $ProfilePath = ".\$CredentialName.json"
    Write-Verbose "searching for $ProfilePath"
    if (Test-Path $ProfilePath) {
        if ($ForceUpdate) {
            Write-Verbose "deleting credential storage file: $ProfilePath"
            try {
                Get-Item -Path $ProfilePath -ErrorAction SilentlyContinue | Remove-Item -Force -WhatIf:$False
            }
            catch {}
            Write-Verbose "stored credential removed. prompt for credentials to create new file"
            try {
                $cred = Get-Credential -UserName $AzureUserID -Message "Azure Credentials" -ErrorAction Stop
                $cred.Password | ConvertFrom-SecureString | Set-Content $ProfilePath -WhatIf:$False -ErrorAction Stop
                Write-Verbose "$ProfilePath has been updated"
            }
            catch {
                Write-Warning "$ProfilePath was NOT updated!"
            }
            try {
                $secPwd = Get-Content $ProfilePath | ConvertTo-SecureString
                $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $secPwd
            }
            catch {
                Write-Error $Error[0].Exception.Message
                break
            }
        }
        else {
            Write-Verbose "$ProfilePath was found. importing contents"
            try {
                $secPwd = Get-Content $ProfilePath | ConvertTo-SecureString
                $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $secPwd
            }
            catch {
                Write-Error $Error[0].Exception.Message
                break
            }
        }
    }
    else {
        Write-Verbose "$ProfilePath not found. prompt for credentials to create new file"
        try {
            $cred = Get-Credential -UserName $AzureUserID -Message "Azure Credentials" -ErrorAction Stop
            $cred.Password | ConvertFrom-SecureString | Set-Content $ProfilePath -WhatIf:$False -ErrorAction Stop
            Write-Verbose "$ProfilePath has been updated"
        }
        catch {
            Write-Warning "$ProfilePath was NOT updated!!"
        }
        try {
            $secPwd = Get-Content $ProfilePath | ConvertTo-SecureString
            $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $secPwd
        }
        catch {
            Write-Error $Error[0].Exception.Message
            break
        }
    }
    try {
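        # NOTE: $EnvironmentName is not defined in the sample config above; add an "EnvironmentName" value (e.g. "AzureCloud") to the JSON or this parameter gets $null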
        $azLogin = Connect-AzureRmAccount -Subscription $SubscriptionName -Credential $azCred -Environment $EnvironmentName -WhatIf:$False
        Write-Verbose "azure credentials verified"
    }
    catch {
        Write-Warning "azure credentials have expired. Prompt for new credentials"
        $azLogin = Connect-AzureRmAccount -Subscription $SubscriptionName -Environment $EnvironmentName -WhatIf:$False
    }
    $azLogin
}
function Get-AzureBackupConfig {
    param(
        [parameter(Mandatory=$True, HelpMessage="Path to configuration JSON file")]
        [ValidateNotNullOrEmpty()]
        [string] $FilePath
    )
    if (!(Test-Path $FilePath)) {
        Write-Warning "$FilePath not found!!"
        break
    }
    Get-Content -Raw -Path $FilePath | ConvertFrom-Json
}
if ($config = Get-AzureBackupConfig -FilePath $ConfigFile) {
    Write-Verbose "reading configuration data from file $ConfigFile"
    $config.psobject.properties | ForEach-Object{
        Set-Variable -Name $_.Name -Value $_.Value -Scope Script -WhatIf:$False
    }
    if ($ResetCredentials) {
        Get-AzureCredentials -AzureUserID $AzureUserID -SubscriptionName $SubscriptionName -CredentialName $CustomerName -ForceUpdate
    }
    if (Get-AzureCredentials -AzureUserID $AzureUserID -SubscriptionName $SubscriptionName -CredentialName $CustomerName) {
        Invoke-AzureBlobBackup -ExcludeContainers $ExcludedContainers
    }
    else {
        Write-Warning "run Set-AzureCredentials to update credential store and try running again"
    }
}
$time2 = Get-Date
Write-Verbose "completed. $($Script:countall) total objects processed. $($Script:ccount) were copied"
Write-Verbose "total runtime $($(New-TimeSpan -Start $time1 -End $time2).TotalSeconds) seconds"

The source is available on GitHub here.

Sample Usage

.\Copy-AzureBlobs.ps1 -ConfigFile ".\toilets.json" -Verbose -WhatIf

This will run the script with verbose output enabled (lots of detailed output) and using -WhatIf shows what it *would* do, without actually doing anything.  Well, actually, it will import, process and update the credentials file, but no blob copying will be processed.

The nice thing about the external configuration file approach is that you can prepare and use multiple configurations by simply calling different configuration files, each using its own credentials file, so they don’t need to have anything in common such as Azure tenant, resource group, storage accounts, or Azure user credentials.
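
For example (the file names here are hypothetical), backing up two different customers is just two invocations:

.\Copy-AzureBlobs.ps1 -ConfigFile ".\contoso.json" -Verbose
.\Copy-AzureBlobs.ps1 -ConfigFile ".\fabrikam.json" -Verbose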

What If?

What if you’ve run the script a few times and it’s conveniently reusing the credentials without prompting, but then someone changes the password on the account in Azure?  You can do one of two things:

  • Delete (or rename) the credentials .json file
  • Use the -ResetCredentials switch parameter

Either of these will force a login prompt to update the credentials, and then save a new file.

Other Notes

  • The -Verbose option also displays individual blob names as they’re being copied (or would be copied, if -WhatIf is also used)
  • The -Verbose option also displays total counts of blob objects, how many were copied and total runtime in seconds.
  • You can wrap the execution (with -Verbose) within Start-Transcript to capture all the details for analysis, logging, auditing, and troubleshooting
  • If there’s any measurable interest/support for this, I may post it to PowerShell Gallery for easier use (let me know?)

Thank you!

Cloud, System Center, Technology

Deploy Office 365 ProPlus with Visio and Project using Configuration Manager 1807 with fries and a soft drink

(image omitted)

Update: 2018-08-27

I meant to post this a few weeks ago, but anyhow… Microsoft released an updated ODT which removes the “Match Current OS” option from the language options list.  That seems to work fine.  However, the Project Online client has an issue with the Detection Rule being the same as Office 365 ProPlus.  So the install (deployment) on a machine with O365 Pro Plus (latest/same version) causes the Project deployment to think it’s already installed.  Just change the detection rule and it works.  The Visio Pro deployment uses a different detection rule and seems to work fine.

As for Shared Computer Activation deployments, there are at least two (2) ways to go.  One is supported, the other is unknown (at this point).  The first is to simply build a new deployment, which sounds like a waste of storage space (compared with traditional O365 ProPlus deployment builds using ODT and XML files) but if you have deduplication turned on for the content source location it shouldn’t be a concern.  The other (semi/un-supported) way is to manually copy the “configuration.xml” and make a Shared Activation flavor of it, then add a new deployment type, set some sort of condition to control the scope of that deployment type, and go that route.  A little more convoluted, but possible.

Update: 2018-08-08

While working with a customer to deploy Office 365 ProPlus, with SCCM 1806, we discovered a bug in the OCT that has been confirmed by Microsoft and will hopefully be fixed soon.  The bug is related to the language selection “Match the Current OS” with respect to (English-US) platforms.  That selection, for some reason, does not download the required language pack files for the deployment source, and causes the deployments to fail with an error that mentions “missing files”.

The catch here is it won’t return an error when it fails via SCCM.  The deployment fails but still returns “success” (exit code 0), and writes the registry key which is shown in the Detection Method configuration.  To see the error, we had to execute the setup.exe with /configure and the relevant .xml file from the client cache folder.  This is actually two (2) “bugs” as far as I can tell.
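
If you want to reproduce that troubleshooting step yourself, it goes roughly like this (the ccmcache subfolder name is random per machine, so treat the path as a placeholder):

# run from an elevated PowerShell prompt on the affected client
Set-Location 'C:\Windows\ccmcache\<subfolder>'   # the folder holding setup.exe and the .xml
& .\setup.exe /configure .\configuration.xml     # surfaces the "missing files" error directly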

The fix/workaround is to simply select the actual language (e.g. “English (United States)”) rather than use the default “Match the Current OS”.

Introduction

Someone asked if that surgeon on the left is Johan.  I cannot confirm, but it wouldn’t surprise me.

Ingredients

  • System Center Configuration Manager current branch 1807+
  • Basic knowledge about how to use ConfigMgr
  • Office 365 / ProPlus licensing
  • Coffee
  • A really bad attitude

Process Overview

  1. Create an Application and Deployment in ConfigMgr
  2. Target a Collection (devices or users)
  3. Drink coffee
  4. Punch someone in the face and go home (no, don’t do that)

Quick Notes

  • During the process of building and deploying the Office configuration, the ConfigMgr console will be locked out from making changes. You can still make it visible, but no scrolling/selecting is allowed.  Therefore, if you intend to deploy the configuration at the end of the procedure below, you should prepare the target collection in advance.
  • This will require access to the Internet in order to download the content for building the O365 deployment source.  If you don’t have access to the internet, you may want to look for a new job.
  • I posted a summary version of this using ConfigMgr 1806 on another blog here.  But this one has french fries.

Procedural Stuff

  1. Open the ConfigMgr admin console.  This is important.
  2. Navigate to Software Library > Office 365 Client Management
  3. On the Office 365 Client Management dashboard, scroll over to the far right until you see that high-quality, ugly-AF icon with the “Office 365 Client Installer” caption.  Click on it like you mean it.  (Just kidding about the icon, it really is nice)
  4. Give the Application a name (e.g. “Office 365 ProPlus with Visio and Project and Fries – 64-bit”)
  5. Enter a Description (optional)
  6. Enter the UNC source location path (the folder must exist, but the content will be populated at the end of this exercise).  It must be a UNC path. Drive letters are for losers.
  7. On the “Office Settings” page, click “Go to the Office Customization Tool” (or “Office Customisation Tool” for you non-American folks).  NOTE: If you do not already have the latest version of OCT installed, it will prompt you to download and extract it somewhere.  Then it will continue on.  Otherwise, it will just continue on.
  8. The OCT home page uses a layout designed by Stevie Wonder.  It’s very spread out, so, on a low-res display, expect some finger exercise workouts on your mouse or trackpad.  Anyhow, look for “To create or update a configuration file, click Next” and click on (you guessed it:) Next.
  9. The Software and Languages page will open first.
    1. Enter the Organization Name and select the Version (32 or 64 bit), then click Add.  IMPORTANT: Pay attention to the ADD and UPDATE buttons throughout this exciting journey, there is a reward at the end.  I’m just kidding, there is no reward, and no Santa Claus either.  Note also that while you’re making selections and changes, the information is being updated along the right-most column of the OCT form.
    2. Select the Software Suite or Product “Office 365 ProPlus” from the drop-down menu, and click Add
    3. Select the drop-down again, and choose “Visio Pro for Office” and click Add again.
    4. Select the drop-down again, and choose “Project Online Desktop Client” and click Add one more time.
    5. The Software section on the right-hand settings column should show all three selections.
    6. Scroll down to Languages.  You HAVE to select an option here.  It is not optional.  The default choice for most situations will be “Match Operating System”, however, you can add more languages if you like, or just to have some fun with users by dropping unfamiliar languages on them.
    7. Scroll back up so you can view the navigation menu at top-left again.  Then select “Licensing and display settings”
      1. For most situations, the KMS or MAK options will be disabled, with KMS automatically selected.  If yours is different, who cares, I’m writing this crappy blog post, not you.  So there.
      2. Under “Additional Properties”, you can select options to enable Shared Computer Activation, Automatically accept the EULA, and Pin Icons to Taskbar.  It’s worth noting that there is no longer a warning label about the taskbar icons, so it would appear to work on Windows 10.
      3. There is no “Add” or “Update” button to click for this part, so calm down, we’re almost there.
    8. Scroll up to the navigation menu again and select “Preferences”. This is where you may spend the rest of your life clicking on things, because there’s a lot to click on.  Or you may choose to ignore all of it and instead blame configuration issues on whoever handles GPO and MDM settings.  If that’s you, well, it sucks to be you.  Choose wisely.
    9. Take a moment to review your settings along the right-hand “Configured Settings” column.  Take a moment also to reflect on your poor choices in life, those missed opportunities, that last vacation trip, and how dysfunctional your family is (or could be).  Now, when you’re done with that, and put the loaded gun and liquor bottle back in the bottom drawer, and…
    10. Click “Submit” at the very top right.
    11. After you click Submit, you will be returned to the Application wizard.  Click Next.
    12. On the Deployment page, it will ask if you want to deploy the application now.  If you have a target collection ready to go, you can go for it. YOLO.
      If you choose Yes, you will be prompted for typical deployment configuration settings, otherwise, you’ll click Next two more times and then…
    13. Wait for the content to download and the deployment source to be prepared.
    14. Don’t forget to distribute the content to your DPs.
    15. Don’t forget to populate the target collection.
    16. Don’t forget to allow time for policy updates, etc.
    17. You can also modify Office 365 Client Installations by using the “Import” feature at the top-right of the OCT form.

I’d ask for feedback/comments on this, but nobody ever posts feedback or comments.

Cheers!

Cloud, System Center, Technology

ConfigMgr – 2 Minute Microwave Style

Genesis – I posted a tweet about someone I know getting stressed at learning Configuration Manager in order to manage 50 Windows devices.  All desktops.  The background is basically that his company had planned on 1000 devices running Windows.  But the end-users, who wield more purchasing power, opted to buy mostly Macbooks.  So the total Windows device count was capped at 50, BUT…. they already approved the purchase of ConfigMgr.  It’s worth noting that the end-users also purchased JAMF (formerly Casper) and set it up in their own secret underground lab, complete with a diabolical German scientist in a white lab coat.  Ok.  That last part isn’t really true, but the JAMF part is true.

So, the “discussion” slid into “okay mr. smarty-pants skatter-turd-brainz, what would you want in a ‘perfect’ ConfigMgr world to address such a scenario?” (again, I’m paraphrasing a bit here)

MC DJam, aka DJammer, aka David the Master ConfigMaster Meister of ConfigMgr, popped some thermal verbals in front of the house and the room went Helen Keller (that means quiet and dark, but please don’t be offended, just stay with me I promise this will make sense soon…)

Yes, I’ve had a few beers.  Full disclosure.  I had to switch to water and allow time for the electric shock paddles to bring my puny brain back online.  That was followed by a brief gasp, “oh shit?! what have I started now?”  Then some breathing exercises and knuckle cracking and now, back to the program…

So, Ryan Ephgrave (aka @EphingPosh) stepped in and dropped some mic bombs of his own.

And just like having kids, the whole thing got out ahead of me way too quick.

So, I agree with Ryan, who also added a few other suggestions like IIS logs, Chocolatey package deployments (dammit – I was hoping to beat him to that one).

So the main thing about this was that this person (no names) is entirely new to ConfigMgr.  Never seen it before, and only gets to spend a small portion of their daily/weekly time with it, due to concurrent job functions.  This is becoming more and more common everywhere I go, and I’ve blogged ad nauseam about it many times (e.g. “role compression”)

What do most small shop admins complain about?

  1. Inventory reporting
  2. Remote management tools
  3. Deploy applications
  4. Deploy updates
  5. Imaging
  6. Customizable / Extendable

These are the top (6) regardless of being ConfigMgr, LANdesk, Kace, Altiris, Solarwinds, or any other product.  All of them seem to handle most of the first 4 pretty well, with varying levels of learning and effort.  But Imaging is entirely more flexible and capable with ConfigMgr (or MDT) than any of the others I’ve seen (Acronis, Ghost, etc. etc. etc.)

ConfigMgr does an outstanding job of all 6 (even though I might bitch about number 6 in private sometimes, it is improving).  ConfigMgr is also old as dirt and battle-tested.  It scales to very large demands, and has a strong community base to back it up in all kinds of ways.  In some respects it reminds me of the years I spent with AutoCAD and Autodesk communities and the ecosystems that developed around that, but that’s another story for another time.

The challenge tends to come from just a few areas:

  1. Cost and Licensing – ConfigMgr is still aimed at medium-to-large scale customers.  The EA folks with Software Assurance, are most often interested and courted into buying it.  Some would disagree, but I set my beer mug down and calmly say “Walk into any major corporate IT office and ask who knows about ConfigMgr.  Then walk into a dentist office, car dealership, or small school system and ask that same question.”  I bet you get a different response.
  2. Complexity – ConfigMgr makes no bones about what it aims to do.  The product sprung from years of “Microsoft never lets me do what I want to manage my devices” (say that with a nasally whiny tone for optimum effect).  Microsoft responded “Here you go bitch.  A million miles of rope to hang yourself.  Enjoy!”  It’s an adjustable wrench filled with adjustable wrenches, because it was designed to be the go-to toolset for almost any environment.  And it’s still evolving today (faster than ever by the way)
  3. Administration – Anyone who’s worked with ConfigMgr knows it’s not really a “part-time” job.  But that’s okay.  It’s part of the “complexity” side-effect.  And rarely are two environments identical enough to make it cookie cutter.  That’s okay too.  Microsoft didn’t try to shoehorn us into “one way”, but said “here’s fifty ways, you choose”.  The more devices you manage with it, the more time and staff it often demands in order to do it justice.  I know plenty of environments that have scaled to the point of having dedicated staff for parts of it like App deployments, Patch Management, Imaging and even Reporting.

None of these are noted with the intention of being negative.  They are realities.  It’s like saying an NHRA dragster is loud and fast.  It’s supposed to be.

Now, add those three areas up and it makes that small office budget person lose control of their bowels and start munching bottles of Xanax.  So they start searching Google for “deploy apps to small office computers” or “patching small office computers cheap as hell” and things like that.

So, ConfigMgr already does the top 6 functions pretty darn well.  So what could be done to spin off a new sitcom version of this hit TV show for the younger generation?

  1. Simpler – It needs to be stupid-simple to install/deploy and manage.  This reaches into the UI as well.  Let’s face it, as much as I love the product, the console needs a makeover.  Simplify age-old cumbersome tasks like making queries and Collections, ADRs and so on.
  2. Lightweight – Less on-prem infrastructure requirements: DPs, MPs, SUPs, RPs, etc.  Move that into cloud roles if possible.
  3. Integrate/Refactor – Move anything which is mature (and I mean really mature) in Intune, out of ConfigMgr.  Get rid of Packages AND Applications, make a hybrid out of both.  Consider splitting some features off as premium add-ons or extensions, like Compliance Rules (or move that to Intune), OSD, Custom Reporting, Endpoint Protection, Metering, etc.
  4. Cheaper – Offer a per-node pricing model that scales down as well as up.  Users should be able to get onboard within the cost range of Office 365 models, or lower.

Basically, this sounds like Intune 3.0, which I’ve also blabbered about like some Kevin Kelly wanna-be futurist guy, but without the real ability to predict anything.

Some of the other responses on Twitter focused on ways to streamline the current “enterprise” realm, with things like automating many of the (currently) manual tasks involved with installation and initial configuration (SQL, AD, service accounts, IIS, WSUS, dependencies, etc. etc.), all of which are extremely valid points.  I’m still trying to focus on this “small shop” challenge though.

It’s really easy to stare at the ConfigMgr console and start extrapolating “what would the most basic features I could live with really come down to?” and end up picking the entire feature set in the end.  But pragmatically, it’s built to go 500 mph and slow down to push a baby stroller.  That’s a lot of range for a small shop to deal with, and they really shouldn’t.  That would be like complaining that the Gravedigger 4×4 monster truck makes for a terrible family vehicle, but it’s not supposed to be that.  And ConfigMgr really isn’t supposed to be the go-to solution for a group of 10-20 machines on a small budget.  Intune COULD be, but it’s still not there yet.  And even it is already wandering off the mud trail of simplicity.  It needs to be designed with a different mindset, but borrowing from the engine parts under the ConfigMgr hood.

Maybe, like how App-V was boiled down and strained into a bowl of Windows 10 component insertions for Office 365 enablement, and dayam that was a weird string of nouns and verbs, they could do something similar with a baked-in “device management client” in a future build of Windows 10.  Why not?  Why have to deploy anything?  They have the target product AND the management tool under the same umbrella (sort of, but I heard someone unnamed recently moved from the MDT world into the Windows 10 dev world, so I’m not that far off).

Does any of this make sense?  Let me know.

 

Cloud, Personal, Projects, Scripting, System Center, Technology

Random Stuff, Part 42

Between work, studying, tinkering and trying to have something close to being considered “a life”, I haven’t been blogging much lately.  And every time I get close to having that magical, mythical thing called “a life”, I have to travel.  I can’t complain, since it gives me new perspectives on “life”, which help me to feel like I have “a life”.

And speaking of travel, here’s a cheap diagrammatic view of how I roll (literally, since my suitcase does in fact have wheels)…

(diagram omitted)

This is just the backpack.  I also didn’t include tampons, whips, chains, hand grenades, latex gloves, surgical masks, or bags of unmarked pills.  Those tend to slow me down with TSA, and I’d rather they spend most of their time with their hands around my privates.  If I touch myself in public it looks unsettling, but when they do it for me, it’s professionalism at its best, and they love it when I smile during the procedure.

Speaking of TSA, I’ve found that the passive aggressive score follows the scale of the airport, at least in the U.S.  Meaning, the bigger the airport, the less humor they tolerate.  The friendliest bunch I’ve encountered would be Medford, Oregon (MFR), and the other end of the scale would be Boston (BOS).  I love Boston.  The TSA have a consistent and warm way of welcoming travelers to bean town with that glaring “I’ll stomp your face in if you make eye contact for more than 5 seconds!”

I’ve also been updating some PowerShell-related projects.  I have always maintained personal project time to keep my sanity.  It also makes my dog want my attention more.  She leaves me little gifts to express how much she misses my attention.  And at 95 lbs, the size of those gifts can almost clog the toilet.

Here’s a few examples of what too much caffeine, too much vlog watching, and access to PowerPoint will do to someone like me, a latent marketing student.  I’m just kidding, I would’ve gone into statistics as a “statistician” but it’s too difficult to pronounce after 3 or 4 beers, and the pay doesn’t come close to most IT related jobs.

FudgePop

CMHealthCheck

GPODoc

CMBuild

They almost look professional.  And almost as if I know what I’m doing.  Cooked up with only a frying pan, a little butter, some chunks of PowerPoint and sprinkled with Paint.Net.  All four took a whopping hour to create.  The pencil was the most fun.  I highly recommend the shape tools (Boolean stuff, like Union, Subtract, etc.), you can spend hours immersed in that strange world, forgetting to shave and bathe too.

You can find the rest of this exciting stuff at https://www.powershellgallery.com/profiles/skatterbrainz/ – where I publish things I almost know how to do.  CMBuild is still in beta, so if you get really, really, reeeeeeally bored, and you have a lab environment in which to try things like this – feel free to post angry, hurtful, mocking and demoralizing comments and bug reports.  The more condescending the better. My doctor enjoys this too.  The visits for medication help his kids through another semester at medical school, and I don’t want to let him down.

Travel

I forgot to mention that MFR, while being a very small airport, also has some really nice artwork on the walls around baggage claim…


Approaching Norfolk (ORF), the most dynamic and interesting place for underpaid IT professionals…


Leaving San Fran (SFO).  The most dynamic and interesting place for well-paid IT professionals who can’t afford to live there…


Getting ready to board my next flight.  I have the window seat just behind the wing…


Back in my office…


Technical Stuff

In the past month, I’ve been dunked into projects involving a variety of different beatings, I mean challenges.

  • 2 involving MDT+Windows 10 with distributed/replicated MDT deployment shares.  One using DFS and the other using Nasuni, for the replication service.  Both worked out very well.
  • 2 involving Office 365 ProPlus.  One mixing C2R Office with MSI Visio and Project.  The other mixing C2R Office using O365/AzureAD licensing, with C2R Visio/Project using KMS licensing.  Neither was that difficult, but I did come away with a continued wonder and amazement at how something so simple (C2R deployments) could be left half-baked by Microsoft and nobody seems to care.
  • 3 involving Configuration Manager.  1 focused on SUP strategies for servers.  1 focused on being a crying shoulder for an overloaded admin and under-give-a-shit managers.  1 focused on replacing some horrific mess some other (independent) consultant attempted while in between binges of drinking and glue sniffing.

The rest of the time has been Azure, Intune, O365, PowerShell, PowerShell with Azure AD, PowerShell with Intune, PowerShell with System Center, System Center with PowerShell, PowerShell with PowerShell, and a little bit of PowerShell. I’d think by now I’d know something about PowerShell, but I’m not going to pat myself on the back just yet.

User Groups

Our geographic region seems to have very few IT-related user groups relative to its population of professionals.  We do have a few, such as groups for Docker, SQL Server, .NET, Machine Learning/AI, and a few others.  So, I’ve been trying once again (third time) to get a Microsoft-related group off the ground.  And I’m happy to say it’s actually starting to happen!  It’s called Hampton Roads Cloud Users Group: “HRCloudGroup” on Slack and Facebook.

For those not familiar with this interesting little area, it’s officially comprised of 7 cities in the southeastern corner of Virginia, at the North Carolina border.  Mouth of the Chesapeake Bay.  But the actual list of surrounding municipalities includes Norfolk, Virginia Beach, Portsmouth, Chesapeake, Hampton, Newport News, Williamsburg, Yorktown, Suffolk, Surry, and Smithfield.  There’s also a large number of people who commute from North Carolina to jobs in this area, so it extends beyond Virginia.

Some call it “Tidewater”, which is a stupid name.  Some call it “Hampton Roads”, which is a less stupid name.  Some call it “that shitty place I hated being stationed at while in the Navy/Marines/Air Force/Army/Coast Guard/CIA/FBI/NSA/DEA/NATO…” eh, you get the idea.  I would venture to say it is the most militarized area of land in the United States, maybe in the world.  Every branch of military, intelligence, logistics, special operations, and tactical operations is located within a small enough radius to be a ridiculously appealing target for Russian satellites.  My house is under the flight path between Little Creek JEB (SEAL Team 6 or DEVGRU), Fort Story and Oceana NAS.  I can name the fighter jet, cargo plane, or helicopter models by sound alone.  I just haven’t found a way to earn a living doing that yet.

Enough Rambo talk.  Our group is still very small, at about a dozen members, with about 4 or 5 people attending the monthly meet-ups so far.  We’ve been fortunate to get some very skilled, very creative members, so I couldn’t be happier.  I feel like my role is more of a facilitator than a leader.  The others have way more experience than I at this point, so I’m happy to just connect the wires and keep the engine running, and learn what I can along the way.  We’ve only had 2 meet-ups so far, but I’m optimistic.  Our next one is December 14, 2017 at 6pm.  If you live in the area, hit us up.

Miscellaneous

As if the entire blog post isn’t already “miscellaneous”.  Shit, my whole life is “miscellaneous” when I get down to it.  But who’s complaining? Okay, I do from time to time.  Anyhow, shotgun blast…

  • PlatyPS is cool.  Once you remember to actually put comments in the right places and import the module before running New-MarkdownHelp for the fifth time and cursing at the monitor for not reading my mind.  (see the snippet after this list)
  • Carbon is still cool.  Even cooler.
  • The Tesla semi is freaking awesome.  The Roadster is obviously cool as well.  I can afford neither.
  • I had my first MSATA failure today.  A Lite On 256 GB card in my HP Elitebook.  RIP.  It was nice having you while you lasted.
  • Shout out to Whitner’s BBQ in Virginia Beach.  Still the best I’ve had anywhere I’ve traveled, and it’s right in my backyard.
  • Shout out to the group of kids who yelled across the busy street “I like your chocolate dog!!”  She loved it too.
  • I need fish food for the aquarium.  Off to the stores on a Saturday.  Wish me luck.
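
Since I mentioned the PlatyPS gotcha above, here’s the order of operations that finally made it behave (using FudgePop as the example module):

Import-Module platyPS
Import-Module FudgePop            # import your module FIRST, or New-MarkdownHelp has nothing to read
New-MarkdownHelp -Module FudgePop -OutputFolder .\docs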

Chocolate dog.  Aka “Dory”


Cloud, Projects, Scripting, System Center, Technology

CMHealthCheck is now a PowerShell Module

What is it

CMHealthCheck is a PowerShell module filled with bubble wrap, Styrofoam peanuts, and 2 possibly useful functions, aimed at collecting a bunch of data from a System Center Configuration Manager site server, and making a pretty report using Microsoft Word.  But doing so without needing to manually download and store a bunch of script files and so on.

It’s still based on the foundations laid by Rafael Perez, with quite a bit of modification, prognostication, prestidigitation, and some coffee.  Special thanks to Kevin (@thenextdotnet) for helping point me in the right direction to move it all from scripts into a module.

Why is it

I get asked (okay, told) to help customers find out why their site servers are running slow, showing red or yellow stuff, or just to get them ready to upgrade from whatever they’re on to whatever is the latest and greatest version of ConfigMgr.  They also like a pretty Word document with a spiffy cover page.

How to use it

  1. Install the module on the ConfigMgr CAS or Primary server you wish to audit –> Import-Module CMHealthCheck
  2. Run the Get-CMHealthCheck function (see documentation on Github – linked below)
  3. Install the module on a Windows computer which has Office 2013 or 2016 installed (hopefully NOT the same computer which was audited)
  4. Run the Export-CMHealthCheck function (see documentation on Github – linked below)
  5. Save the Word document and mark it up.
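
A rough end-to-end pass looks like this (assuming you’re installing from the PowerShell Gallery; see the GitHub docs for the actual function parameters):

Install-Module CMHealthCheck
Import-Module CMHealthCheck
Get-Command -Module CMHealthCheck   # should list Get-CMHealthCheck and Export-CMHealthCheck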

Where to Get it

It’s published to the PowerShell Gallery (https://www.powershellgallery.com/profiles/skatterbrainz/), and the source is on GitHub.

How to Complain About it

Because I know some of you will, but that’s okay.  Without complaints, we have no way of identifying targets.  Just kidding.  I need feedback and suggestions (or winning lottery numbers and free food coupons).  Please use the “issues” link on the GitHub repository to submit your thoughts, gripes and so on.