Random Thoughts and Stuff

While packing for travel I was having a conversation around the “state of IT” with a friend. So I figured (A) it might be worth jotting down some of the key points, and (B) do it before I fly out, in case some underpaid mechanic forgets to tighten that one bolt that holds the engine onto the wing.  Actually, who am I kidding, that mechanic probably earns more than I do.  Anyhow…

this is rambling, so drink plenty of medication and smoke your wine before continuing.

(travel-packing sidenote: Oreo is 15 years old, and like most humans, used to hate my guts.  Eventually, as daughter number 3 moved out, and I became her sole source of attention and food, she has become my friend.  She follows me around all day.  Every time she sees my suitcase out the night before I travel, she does this.  I’ll need to go over that ball cap with a ball of tape in the morning.)

The Future of SCCM and MDT and EMS

This is admittedly a Microsoft-centric topic. The discussion mentioned above was about “how long” will SCCM and MDT be “of interest” to consumers?  Obviously, I do not own a real crystal ball.  I only have a Dollar Tree knock-off, and I don’t have access to real business data, therefore I rely upon what scientists often refer to as ‘shit talking‘.

I’ll admit, after day 1 at MS Ignite this year, I was feeling the angst.  For example, while riding the bus back from the conference center to the hotel, staring out the window, I kept thinking “why, why, why did I leave app-dev to move into this infrastructure rat race?!  wtf was I thinking?!” and “I hope my dog isn’t chewing up my last pair of flip flops right now.”  It really felt like the job market for anyone walking around a typical IT shop today would dry up to around 10% of its current volume within 5 years.  And who really knows?  It could go in any direction.  I think everyone is in agreement that there is already a major tectonic shift in play, with traditional operations moving into software-defined operations.  The thought of learning things like JSON, Git, VSTS, on top of PowerShell and Azure, is adding wrinkles to quite a few faces I see.

The mere mention of “new certs” is probably having a measurable effect on the sales of alcohol, tobacco and firearms, which should keep these guys employed for a long time.  For many I’ve known over the years, the feeling is like being shoved onto a rollercoaster by your drunk buddies, against your will, and now you’re approaching the top of the first peak in the run.

After 3 more days, and soaking up session content, food, beer, and more importantly: engaging vendors in deep discussions out on the expo floor, my initial expectations of SCCM/MDT doom were relaxed quite a bit.  Mainly out of realizing the following points:

  • The majority of imaging work I see being done at most mid-sized and large customers is refresh of existing hardware.  The mix of new vs. reuse is cyclical at most larger shops, due to budget cycles and SLAs around hardware refresh programs.  So at various times in a given year, there’s more new imaging, but for the remainder of the year it seems to be more refresh/reuse.
  • Most of the (above) people I’ve spoken with are very interested in AutoPilot, but quite a few hadn’t yet been allowed access to the preview at the time I spoke with them (I think most are now seeing it in their portals)
  • In-place upgrades are still a minority (far too low in my opinion)
  • The description of “automatic redeployment” got their attention, but most are still tied to the comfort of a “clean wipe and load” for various reasons.
  • Ultimately: Regardless of what anyone says, things which “work” have a very VERY tough time dying in the IT world.  Hence why so many machines still run XP.  I’d also wager my tombstone that Windows 7 will be easy to find running on machines in 2027.  That’s because I’m planning to be cremated on a Walmart grill.  But that’s beside the point.

The weeks after Ignite I’ve made it a point to casually interview my customers to get a feel for where they see the biggest and most immediate changes coming.  It’s a delicate thing to ask, since it can easily smell like a sales pitch.  Sales pitches have a distinct odor that is often confused with bus station toilets or dead cows laying in the sun.  However, most of them are well aware of my dislike for sales people in general (some of my friends are in sales, so there are exceptions).

  • The biggest hurdles they have today are keeping up with device models and drivers, patching, moving to some new system for IT operations (ticketing, change mgt, etc.), and endless training of each new-hire who is replacing three who just left.  See what I did there?
  • The single biggest complaint about imaging in general revolves around drivers.
  • There’s still quite a bit of frustration and confusion I hear around Intune capabilities.  Some of it relates to agent vs. agentless management; some is around app deployment capabilities; some is about inventory reporting.

Windows 10

I still spend way too much time explaining Windows 10 servicing models and the naming convention.  They were just starting to grasp CB, and CBB, but now “Semi-Annual” and “Semi-Annual Targeted” are leaving them in one of two modes: pissed off or chuckling.  The most common response I hear from the former is around the constant renaming of things in general.  “Active, Live, Visual, and now CBB, then Semi Annual Targeted, WUS, then WSUS”, and so on.  Their words, not mine.

I’m always surprised to find so many shops still heavily invested in MDT and doing very well with it.  The other interesting thing is that the majority of them assume they’re doing everything wrong and are panicked when I arrive that I’ll redline their papers.  In fact, most of them are doing very well.  They’ve read the blogs, the tweets, bought the books, watched the videos, done the TechNet labs, and so on.  A few have been lucky enough to attend training and/or conferences as well, but that’s a very small percentage.

The pace at which organizations are getting their Windows 10 rollouts moving is gaining speed and volume.  However, the Office aspect has thrown a wrench into quite a few (see below).


This reminded me of a discussion I had at Ignite with the Office team on the expo floor.  It was around the issue of third-party extensions (add-ons) for Office, and how many are produced by small shops which do not stay current with Office versions.  The result I’ve continued to see is a fair amount of shops who can’t upgrade Office until the third-party vendor puts out an update or a new version.  Then there’s the cost factor (is it free or not?).  In many of those cases, the hold-up triggered the IT department to wait on other projects such as Windows 10.

The number of Access applications interfering with Office upgrades has dropped significantly for me in the past 3 years.  Not just the projects I’m working on, but also from reports I get from other engineers and customers.  That’s a good thing.

Controls vs. Controls

I’m still seeing a continued reliance on inefficient control mechanisms with regards to device and user configuration management.  Way too much effort put into the imaging side, and not enough on the environment side.  Way too much on the “nice to have” side, vs. the “gotta have” side.  Not enough attention is being paid to ‘baseline’ and ‘continuous’ control models, and when to use which one, and how best to apply tools for each.  For example, hours and hours spent on wallpaper, shortcuts, and folders in the imaging process, being manually adjusted with each update cycle, rather than letting Group Policy Preferences step in and mop that shit up with one hand.

I’ve had fairly good results convincing customers to drop the continuous meddling with reference images, instead, (at least trying to)…

  • Change the data storage process to keep users from storing anything valuable on their device
  • Remove as many non-critical configuration controls as possible
  • Move continuous controls to the environment (GPO)
  • Move baseline controls to the environment wherever possible (GPP)
  • Move remaining baseline controls to task sequence steps

There are obviously exceptions that require mucking with the install.wim to make a new .wim, but I’m finding that’s only REALLY necessary in a small percentage of the time.  The vast majority of controls I see are voluntary and serve little functional or operational benefit.  Things like hiding Edge, Forcing IE, hiding Cortana, forcing Start Menu and Taskbar items, etc.  Just educate users to avoid them.  Treat them like adults and who knows, maybe they’ll stop urinating on your office chair.  Try it and see.  The worst you’ll get is whining (you get that anyway), but the best (and most likely) outcome is less work for you and less risk of something breaking.

Role and Salary Compression

It not only continues to thrive, it seems to be accelerating its pace.  Almost every customer I meet tells me how they’re expected to do more with fewer people, less training, less budget, and shorter time constraints.  Most haven’t had a significant raise in a long time.  This seems more prevalent at companies which are publicly-traded than those which are not, but it affects both.  Personally, I see a correlation with public organizations and cost reduction priorities over innovation and revenue increase.  Then again, I have no formal training in such matters, so again, I’m probably talking shit.  Again.


Speaking of Intune: I’ve been spending more time with it this past week, along with Azure AD and the AzureADPreview PowerShell module.  I’ve always liked the concept of what Intune and EMS are aimed at.  The mechanics are still frustrating to me, however.  There are plenty of design quirks that drive me batshit crazy.  Like Devices / All Devices vs. Azure AD devices, and the differences in provisioning an app for iOS or Android vs. Windows, from the Windows Store no less.  As Ricky Bobby would say, “it’s mind-bottling”.

Then again, I’m not at all a fan of the “blade” UI paradigm.  The Blade model is (in my humble, semi-professional, almost coherent opinion) marginally efficient for direct touchscreen use, but for mouse and keyboard it blows chunks of post-Chipotle residuals.  I’m sure that will infuriate some of you.  But that’s just how I feel.  Drop-down menus, panels, heck, even ribbons, are more efficient in terms of hand and finger interaction (mouse, touchpad) for operations involving closely related tasks (settings, options, etc.)  Ask yourself if moving ConfigMgr to a blade UI would make it better?  Or Office?  If you think so, try switching to another brand of model glue.

Back to Intune.  I really look forward to seeing it mature.  Especially in areas like agent vs. agentless device management (it’s very confusing right now, and the differences are weird), AutoPilot, Redeployment/Reset, and expanding the features for deploying applications, remote management (TeamViewer, etc.), and GPO-to-MDM migration.  I’m thinking of Windows desktops and laptops, of course (if you hadn’t already figured that out).  Phones are great, but nobody who isn’t masochistic is going to write a major app using their phone most of the time.  Auto-correct and latency-rich touchscreen typing would cause most PowerShell, C# or Ruby code writers to massage their heads with a running chainsaw.

I think I digressed a bit.  Sorry about that.

Other Stuff

I’ve spent some after-hours time keeping my brain occupied with scripting and app-dev projects.  I’ve been doing Windows 10, Windows Server 2016, MDT and SCCM lab builds and demos for months, along with real implementation projects, and starting to burn out on it all.  I needed a break, so I’ve managed to get a few things done, with a few more in the pipeline:

I ran across a couple of seemingly abandoned Git projects focused on DSC for building ConfigMgr sites, but none of them appear to be factored into a template construct.  Meaning?  They’re still built on specific parameters, or require extensive customization for various roles, configurations and environments.  I’m still poking at them, and I forked one to see what I can make it do.  In the meantime, I’m moving ahead with merging CM_BUILD and CM_SITECONFIG into a new CMBuild PowerShell module.  So far so good.  And when that’s done, I’ll go back to see what I can do with DSC and applying that towards Azure VM extensions.

I’ve come to realize that there’s 4 basic types of southern dialect in America:  Fast, Slow, Twangy and Crooked Jaw.  Think about it, and you’ll see that’s true.

The shortest distance between two points is often the most expensive.

If you never fail, you’re not trying hard enough.

If you fail most of the time, you’re probably in the wrong career path.

I’m on travel next week.  Expect more of the same stupid tweets about mundane stuff.  If you tire of me, unfollow me.  I don’t mind.  It’s just Twitter after all.  I will do my best to keep my camera ready for anything interesting.  I need to watch some air crash documentaries now, to get my mind relaxed for tomorrow.



Pardon the headline and semi-questionable graphic, but it’s all I had to work with on short notice.

As a result of way too much caffeine, tempered with a sudden burst of alcohol and an intense, yet clueless, conversation with a colleague, the following harebrained idea sprang up. This required immediate action, because stupid ideas have to be enacted quickly in order to produce rapid failures and immediate lessons-learned…

Idea: What if you could manage remote, non-domain-joined, Windows 10 computers from a web page control mechanism, for “free”, where the local user has NO local admin rights, to do things like run scripts, deploy or remove applications, etc.?

What if? Aye?

So, Chocolatey came to mind. Partly because I was eating something with chocolate in it, but mostly because I love the Chocolatey PowerShell packaging paradigm potential, and that’s a heavy string of “P”‘s in one sentence.  Anyhow, it felt like one of those sudden Raspberry Pi project urges that I had to get out of my system, so I could move on to more important things, like figuring out what to eat.


  1. The machine would need to be configured at least once by someone with local admin rights.
  2. The machine would need to be connected to the Internet in order to receive updated instructions.
  3. The admin needs a little knowledge of technical things, like proper coffee consumption, shit-talking and locating clean restrooms.

Outline of Stupid Idea

  1. Drop the FudgePack script into a folder on the device (e.g. C:\ProgramData\FudgePack, file is Invoke-FudgePack.ps1)
  2. Create a Scheduled Task to run under the local NT Authority\SYSTEM account
    1. Task should only run if a network connection is active
    2. Task should only run as often as you would need for immediacy
    3. User (device owner) should be able to invoke Task interactively if needed.
  3. Host the control data somewhere on the Internet where the script can access it
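To make the outline concrete, here’s a rough sketch of what the scheduled script’s core loop might look like.  This is my illustration only — the URL, element names and logic below are assumptions, not the actual Invoke-FudgePack.ps1, so treat it as a napkin drawing:

```powershell
# Hypothetical sketch only -- not the actual Invoke-FudgePack.ps1.
# Assumes a control file shaped roughly like:
#   <control><device name="PC1"><deploy>7zip</deploy><remove>vlc</remove></device></control>

$controlUrl = 'https://raw.githubusercontent.com/example/fudgepack/master/appcontrol.xml'
[xml]$control = (Invoke-WebRequest -Uri $controlUrl -UseBasicParsing).Content

# pull the instruction set for this device
$node = $control.control.device | Where-Object { $_.name -eq $env:COMPUTERNAME }

foreach ($pkg in $node.deploy) {
  choco install $pkg -y --no-progress   # Chocolatey handles download + install
}
foreach ($pkg in $node.remove) {
  choco uninstall $pkg -y
}
```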

Procedure for Stupid Idea

  1. Here’s the FudgePack app control XML file.  Make a copy and edit to suit your needs.
  2. Here’s the FudgePack PowerShell script.  Douse it in gasoline and set on fire if you want.
  3. Here’s an example Scheduled Task job file to edit, import, point at, and laugh.

Setting up and Testing the Stupid Idea

  1. Copy the appcontrol.xml file somewhere accessible to the device you wish to test it on (local drive, UNC share on the network, web location like GitHub, etc.)
  2. Edit the appcontrol.xml file to suit your needs (devicename, list of Chocolatey packages, runtime date/time values, etc.)
  3. Invoke the script under the SYSTEM account context (you can use PsExec.exe or a Scheduled Task to do this)
  4. Once you get it working as desired, create a scheduled task to run as often as you like
  5. Send me a box filled with cash – ok, just kidding.  but seriously, if you want to, that’s ok too.
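For reference, a control file along these lines might look something like the fragment below.  The element names here are my own guess for illustration (the real file also carries runtime date/time values per step 2) — use the actual appcontrol.xml from the repo as your starting point:

```xml
<!-- illustrative guess at the shape, not the real schema -->
<control>
  <device name="DESKTOP-01">
    <deploy>7zip</deploy>
    <deploy>notepadplusplus</deploy>
    <remove>vlc</remove>
  </device>
</control>
```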

More Stupid Caveats

  1. It’s quite possible in this day and age, that someone else has done this, and likely done a better job of it than I have.  That’s okay too.  I searched around and didn’t find anything like this, but my search abilities could quite possibly suck.  So, if someone else has already posted an idea identical to this, I will gladly set this aside to try theirs.  After all, it’s about solving a problem, not wasting a sunny, beautiful Saturday behind a keyboard.


CMHealthCheck is now a PowerShell Module

What is it

CMHealthCheck is a PowerShell module filled with bubble wrap, Styrofoam peanuts, and 2 possibly useful functions, aimed at collecting a bunch of data from a System Center Configuration Manager site server, and making a pretty report using Microsoft Word.  But doing so without needing to manually download and store a bunch of script files and so on.

It’s still based on the foundations laid by Raphael Perez, with quite a bit of modification, prognostication, prestidigitation, and some coffee.  Special thanks to Kevin (@thenextdotnet) for helping point me in the right direction to move it all from scripts into a module.

Why is it

I get asked (okay, told) to help customers find out why their site servers are running slow, showing red or yellow stuff, or just to get them ready to upgrade from whatever they’re on to whatever is the latest and greatest version of ConfigMgr.  They also like a pretty Word document with a spiffy cover page.

How to use it

  1. Install the module on the ConfigMgr CAS or Primary server you wish to audit –> Import-Module CMHealthCheck
  2. Run the Get-CMHealthCheck function (see documentation on Github – linked below)
  3. Install the module on a Windows computer which has Office 2013 or 2016 installed (hopefully NOT the same computer which was audited)
  4. Run the Export-CMHealthCheck function (see documentation on Github – linked below)
  5. Save the Word document and mark it up.
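Strung together, the whole flow is just a handful of commands.  I’ve omitted the parameters on purpose here, since they’re covered in the GitHub documentation:

```powershell
# On the CAS or primary site server: collect the data
Install-Module CMHealthCheck
Import-Module CMHealthCheck
Get-CMHealthCheck        # parameters per the GitHub documentation

# On a separate machine with Office 2013/2016: build the Word report
Import-Module CMHealthCheck
Export-CMHealthCheck     # parameters per the GitHub documentation
```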

Where to Get it

How to Complain About it

Because I know some of you will, but that’s okay.  Without complaints, we have no way of identifying targets.  Just kidding.  I need feedback and suggestions (or winning lottery numbers and free food coupons).  Please use the “issues” link on the GitHub repository to submit your thoughts, gripes and so on.

Random Notes from This Week

Warning: Scattered thoughts ensue.

Checking Memory

Calculating total system memory within a virtual machine session might seem like a rather mundane task.  But during a recent troubleshooting call with a customer, I ran across something I had forgotten: Physical machine properties are stored in WMI differently depending upon the context.  Note that running these on an actual physical machine, a virtual machine with static memory allocation, and a virtual machine with dynamic memory allocation, will often yield different results.

Win32_ComputerSystem (TotalPhysicalMemory)

[math]::Round(((Get-WmiObject -Class Win32_ComputerSystem | 
  Select-Object -ExpandProperty TotalPhysicalMemory) | 
    Measure-Object -Sum).Sum/1GB,0)

This returned 4 (GB) from 4,319,100,928

Win32_PhysicalMemory (Capacity)

Using Win32_PhysicalMemory / Capacity returns a different result.

[math]::Round(((Get-WmiObject -Class Win32_PhysicalMemory | 
  Select-Object -ExpandProperty Capacity) | 
    Measure-Object -Sum).Sum/1GB,0)

This returned 16 (GB) from the sum of 17,179,869,184 (13,019,119,616 + 4,160,749,568).


PowerShell, CSV and sparse Label Names

Some of the biggest headaches in IT, besides people, are caused by imposing human habits on technical processes.  For example, insisting on embedded spaces in LDAP names, CSV column headings and SQL table columns.  Just stop it!  Say no to spaces.  The only space we need is the one between our ears, and usually when driving.  However, we don’t always have the luxury of dictating “best practices”, instead, we have to adapt.  So, I had a colleague suffering with a CSV file that a customer provided with column names like “Last Name”, “Dept Number”, and so on.  Why they couldn’t use “LastName”, “LName”, or even “LN”, who knows, but it was “Last Name”.

PS C:\> $csvdata = Import-Csv "C:\Program Files (x86)\Microsoft Visual Studio 14.0\DesignTools\SampleData\en\SampleStrings.csv"
PS C:\> $csvdata[0]

Name          : Aaberg, Jesper
Phone Number  : (111) 555-0100
Email Address : someone@example.com
Website URL   : http://www.adatum.com/
Address       : 4567 Main St., Buffalo, NY 98052
Company Name  : A. Datum Corporation
Date          : November 5, 2003
Time          : 8:20:14 AM
Price         : $100
Colors        : #FF8DD3C7

PS C:\> $csvdata[0]."Email Address"
someone@example.com

PS C:\> $csvdata | Where-Object {$_."Email Address" -like '*fabrikam*'} | 
  Select-Object -ExpandProperty "Email Address"

Thankfully, PowerShell is somewhat forgiving in this regard and allows you to adapt to such challenges easily.  A very small feature but very helpful as well.
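If you’d rather not quote the spaced names every time, you can also remap the headers on import using calculated properties.  The new property names and the file path below are just my illustration:

```powershell
# re-project the spaced columns into space-free property names
$csvdata = Import-Csv 'C:\temp\sample.csv' |
  Select-Object @{Name='EmailAddress'; Expression={ $_.'Email Address' }},
                @{Name='PhoneNumber';  Expression={ $_.'Phone Number'  }}

# from here on, no quoting gymnastics required
$csvdata | Where-Object { $_.EmailAddress -like '*fabrikam*' }
```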

Searching for Apps: Slow vs. Fast

Searching for installed software is usually pretty simple.  However, querying the WMI class Win32_Product is notoriously slow, because enumerating it causes Windows Installer to run a consistency check (and potentially a repair) against every installed MSI package.  If you run a query for “SQL Server Management Studio” on a Windows server or client, then regardless of memory, processor power, or disk performance, WMI will be much slower than searching the registry or file system.  Case in point (using PowerShell for example purposes only):

Get-WmiObject -Class Win32_Product | 
  Where-Object {$_.Name -eq 'SQL Server Management Studio'}

When you run this, if you watch Task Manager at the same time, you may notice WmiPrvSE.exe and msiexec.exe chewing up memory until the query completes.  However, if you search the registry, which may not always be possible in every situation, the performance is typically much better (faster):

Get-ChildItem -Path HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\ -Recurse | 
  Where-Object {$_.GetValue('DisplayName') -eq 'SQL Server Management Studio'}
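One caveat worth noting: on 64-bit Windows, 32-bit installers register under a separate key, so a thorough registry search should cover both hives:

```powershell
# check both the native and the 32-bit (WOW64) uninstall keys
$paths = @(
  'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall',
  'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall'
)
Get-ChildItem -Path $paths |
  Where-Object { $_.GetValue('DisplayName') -eq 'SQL Server Management Studio' }
```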

Another option is to query the file system, which in this example is looking for a specific file and specific version:

$file = "C:\program files\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Microsoft.SqlServer.Configuration.SString.dll"
if (Test-Path $file) { 
  (Get-Item -Path $file).VersionInfo.FileVersion 
}

Running these three statements on my HP EliteBook 9470m, with 16 GB memory and a Samsung 500 GB 850 EVO SSD, the average execution times were:

  • WMI: 20 seconds
  • Registry: 0.98 seconds
  • File System: 0.13 seconds

The point here isn’t that you should avoid WMI, or anything like that.  It’s just that we should always consider other options to ensure we’re choosing the right approach for each situation.  For example, if you really need to invoke the .Uninstall() method, then your options may narrow somewhat.

For a handful of targeted queries it may not matter to you which of these (or others) makes the most sense.  But when querying 5,000, 10,000 or 100,000 machines across a WAN, without the aid of a dedicated inventory product, the choice may have a significant impact.
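If you want to time these on your own hardware, Measure-Command wraps any script block (swap the block contents for the WMI or file system versions):

```powershell
# average the registry search over a few runs
$runs = 1..3 | ForEach-Object {
  (Measure-Command {
    Get-ChildItem -Path HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\ -Recurse |
      Where-Object { $_.GetValue('DisplayName') -eq 'SQL Server Management Studio' }
  }).TotalSeconds
}
($runs | Measure-Object -Average).Average
```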

cm_siteconfig 1.2

I’ve been busy this Labor Day weekend. Besides vacuuming up water from a broken water heater at 4am, mowing a ridiculously big lawn with a ridiculously small lawnmower, and avoiding the oceanfront tourist freakshow (or almost), I spent a fair amount of time on cm_build and cm_siteconfig.  Needless to say, I’m not feeling very funny right now, but I assure you that I will return to my usual tasteless, dry, ill-timed humor after a word from these sponsors.  Even though I don’t have any yet.  Actually, if you blink, you might miss some hidden jestering below.

I’ve already discussed these two PowerShell scripts in a previous blog post, but to update things: cm_build is still at 1.2.02 from 9/2/2017, and cm_siteconfig is now at 1.2.22 from 9/2/2017.  That’s a lot of 2’s.  Anyhow, here’s what each does as of the latest versions:

Note: The version numbers are in parentheses to indicate my lab configuration.  The XML allows *YOU* to configure the installation ANY WAY YOU DESIRE, and to reference ANY VERSION you desire.  The versions below are just what I used to test this thing (so far) about 84 times.

LAB TEST NOTE: Prior to running this script, build the server, assign a name, static IP address, and join to an Active Directory domain.  Then take a snapshot (VMware) or checkpoint (Hyper-V) to roll back if anything spews chunks along the way.

Prep Work

cm_build.ps1 / cm_build.xml

  • Install Windows Server roles and features (except for WSUS)
  • Install Windows 10 ADK (1703)
  • Install Microsoft Deployment Toolkit (MDT) (8443)
  • Install SQL Server (2016)
  • Install SQL Server Management Studio (2017)
  • Configure SQL Server maximum memory allocation
  • Configure SQL Server ReportServer DB recovery model
  • Install WSUS server role
  • Run the WSUS post-install configuration
  • Install System Center Configuration Manager 1702
  • Install ConfigMgr Toolkit 2012 R2
  • Install Recast Right-Click Tools

GENERAL NOTES:  The cm_build.xml structure starts with the “packages” section, which dictates what gets executed and in what order.  The “name” element establishes the package code link used by all of the other sections, such as payloads, detections, files, and features. Note that the files section only requires the pkg key value for SQLSERVER and CONFIGMGR.  Other files can be created without using a matching pkg key if desired.
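To illustrate the shape of that, something like the skeleton below.  The element and attribute details here are my rough approximation for illustration only — the actual cm_build.xml in the repo is the authority:

```xml
<!-- approximation only: the order of package elements dictates execution order,
     and the name value is the key the other sections reference via pkg -->
<packages>
  <package name="SQLSERVER" />
  <package name="CONFIGMGR" />
</packages>
<files>
  <file pkg="SQLSERVER" />
  <file pkg="CONFIGMGR" />
</files>
```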

LAB TEST NOTE: I strongly recommend taking another snapshot (VMware) or checkpoint (Hyper-V) at this point, prior to running cm_siteconfig.ps1.  This will help avoid angst and loss of temper while making iterative changes to cm_siteconfig.xml and retesting.

cm_siteconfig.ps1 / cm_siteconfig.xml

  • Create SCCM accounts
  • Configure the AD Forest connection
  • Configure Server Settings
    • Software Deployment Network Access Account
  • Configure Discovery Methods
    • The template AD User Discovery Method adds AD attributes: department, division, title.
    • The template AD User and System Discovery Methods add filtering for password set and login periods of 90 days each
  • Configure Boundary Groups
    • The template creates 4 sample boundary groups
    • You can enable creating Site Boundaries as well, but the default is to allow the AD Discovery to create subnet/IP range boundaries
  • Configure Site Roles
    • Management Point
    • Service Connection Point
    • Distribution Point (with PXE)
    • Cloud Management Gateway (still in development)
    • Software Update Point
    • Reporting Services Point
    • Application Catalog Web Service Point
    • Application Catalog Website Point
    • Asset Intelligence Synchronization Point
    • (more to come)
  • Client Settings
    • Still in development, but…
    • The template creates two (2) sample device policies: Servers and Workstations
  • Client Push Installation Settings
    • Still in development
  • Create DP groups
    • The template creates 4 sample DP groups
  • Create console Folders
    • The template creates sample folders beneath: Applications, Device Collections, User Collections, Boot Images, Task Sequences, and Driver Packages
  • Create Custom Queries
    • The template creates two (2) sample device queries
  • Create Custom Device Collections
    • The template creates three (3) sample user query-rule collections, and 15 sample device query-rule collections
  • Create Custom User Collections
  • Import OS Images
    • The template imports two (2) OS images: Windows 10 1703 and Windows Server 2016
    • Source media is not included.  Batteries not included.  Just add hot water and stir.
  • Import OS Upgrade Installers
    • The template imports two (2) OS upgrade packages
  • Configure Site Maintenance Tasks
    • Excludes Site Backup, and Database Reindex tasks for now.  I plan to have these enabled soon.
  • Create Application Categories
    • The template includes six (6) sample categories: IT, Developer, Engineering, Finance, General and Sales
    • For now, the detection rules are implemented using a chunk of freshly-cut, carefully seasoned, and slow-roasted PowerShell code, because it’s easier to shoe-horn into this process and provide flexibility and adaptability.  And besides, all those syllables sound kind of impressive after a few mixed drinks.
  • Create Applications
    • The template includes examples for 7-Zip, Notepad++, VLC Player, and Microsoft RDC Manager

What’s Next?

  • I’m still working on this, so more changes/improvements will be coming.

Q & A

  • Is it really “open source”?
    • Yes! Go ahead and pinch yourself.
  • Did you write all this code yourself?
    • Yes, sort of.  Some of the pieces were adapted from, or inspired by, the outstanding work done by other amazing people like Niall Brady, Nickolaj Andersen, Johan Arwidmark, Mikael Nystrom, Maurice Daly, Stephen Owen, Anders Rodland, Raphael Perez, Chrissy LeMaire, Jason Sandys, Sherry Kissinger, and many others I can’t think of right now.  Thanks to Kevin B. and Chris D. for helping me find better ways to solve key areas of the overall project.  The XML constructs and process model are my own hallucinatory work.
  • Can I Make Suggestions / Request Changes?
    • Please use the “Issues” feature in Github to submit bugs, feature changes and enhancements, etc.  I will make every effort to review, assess, feebly attempt, fail to satisfy, cry over insecurities of self-doubt, angrily assign blame, throw objects across room while swearing like a drunk sailor, solemnly accept defeat, and ultimately: try to make it work as requested.
    • Note that creating a Github account is required for submitting Issues.  Github accounts are free and they make you feel warm and fuzzy inside.
  • Does cm_build also download required installation media?
    • No.  I’m too lazy.
    • 99.999999% of my customer engagements involve a ‘kick-off’ call in which we discuss prerequisite action-items prior to beginning work. This typically includes requesting the customer to have all installation media and licensing information ready to go.  Which they typically do, so I didn’t feel the need to bother with that aspect (not to mention, try to keep up with version changes and new URL’s over time)
  • Can cm_build be used to install a Central Administration Site?
    • Yes.
  • Can cm_build be used to install a Secondary Site?
    • Yes.
  • Can cm_build be used to destroy alien civilizations?
    • Probably not.
  • How was this thing Tested?
    • In a small dungeon beneath a floating castle in a lake atop a tall mountain.  Okay, in my home lab, next to the dog’s sofa.
    • It’s been tested about 84 times as of 9/4/2017.  That’s about 55 times for cm_build and 29 times for cm_siteconfig.  But by the time you’ve read this, it’ll have increased again.
  • What was/is your Test Environment like?
    • Windows Server 2016 (Dell R710) server with Hyper-V
    • 3 virtual machines: DC1 (domain controller), FS1 (file server) and CM01 (configuration manager server)
    • CM01:
      • 16 GB memory, 4 disks (C: for OS, E: for apps, F: for content, G: for logs, etc.), 2 vCPUs
      • Windows Server 2016 Standard
    • Me:
      • Coffee cups falling off the desk, on every flat surface, in the trash can, on top of one of my dogs, and a few more in the kitchen sink
      • Empty snack bar wrappers strewn across the room
      • A tattered doggie toy-squirrel hanging on a door knob for some strange reason.
  • What’s the point?
    • It’s been challenging, and fun, to work on.  It saves me time and headache at work and in my home lab.
    • It opens up potential secondary capabilities, like automating installation documentation and building an extract/build process to close the circle of life, open a wormhole, fill it with black holes and jump in for a ride.  I really need to stop listening to so many podcasts.
  • Why should I care?
    • You shouldn’t.  You can go do something fun now instead.
  • Why XML?
    • Because I &^%$ing hate JSON, and INI is too limited.  I thought about YAML, which looks a little bit like JSON, but not like it was punched in the face with a meat tenderizer mallet, but then I had to mow the lawn, and completely forgot why.
  • What have been (or continued to be) the biggest Challenges?
    • Time
    • Sleep
    • Deciding where to draw logical boundaries between automating and leaving out for manual work later
    • Refactoring, refactoring, re-refactoring, and re-re-refactoring before refactoring some more
    • More refactoring
    • Incomplete Microsoft ConfigMgr PowerShell cmdlet reference documentation*
    • Incomplete/inconsistent Microsoft ConfigMgr PowerShell cmdlet features*
    • Incomplete/inconsistent mental capacity (mine)
    • Occasional power and Internet service outages and lack of a power backup system (budget, weather, drunk drivers)
  • Does humor belong in IT?
    • Yeah.  It has to. Over 35 years in this field of work, I’ve seen what happens to people who forget that. It doesn’t end well.

*  I’m not going to beat them up on this, since they are already making Herculean efforts towards modernizing and cleaning up ConfigMgr, so the gravy should have a few lumps.

GPODoc – PowerShell Module

I just posted my first PowerShell module to the PowerShell Gallery, even though it’s been available on my GitHub site for a little while longer.  It’s a small project, with only two (2) functions, so far.  Get-GPOComment and Export-GPOCommentReport.

Get-GPOComment is purely CLI and helps query GPOs to return embedded descriptions, comments and so on.

Export-GPOCommentReport is a CLI function as well, but produces an HTML report, which is technically GUI output, of embedded GPO comments and descriptions.

The module is simple to load by way of the Install-Module and Import-Module cmdlets (see below).  The only requirement is to have the GroupPolicy PowerShell module installed first.  This is normally installed on all AD domain controllers, as well as other servers and workstations which have the Windows 10 RSAT package installed (tip: You can also use Chocolatey to install RSAT:  choco install rsat <or> cinst rsat)

Loading The PowerShell Module

Install-Module GPODoc
Import-Module GPODoc

Trying it out: Get-GPOComment

Get-GPOComment -GPOName '*' -PolicyGroup Policy


Get-GPOComment -GPOName '*' -PolicyGroup Preferences

You can easily modify the output layout since the output is in hash table format…

Get-GPOComment -GPOName '*' -PolicyGroup Preferences | Format-Table

If you want more flexibility in selecting the inputs, you can use the Get-GPO cmdlet and channel the output through a pipeline.  For example…

Get-GPO -All | Where-Object {$_.DisplayName -like "U *"} | 
  Select-Object -ExpandProperty DisplayName | 
    Foreach-Object {Get-GPOComment -GPOName $_ -PolicyGroup Policy}


The parameters for this function are as follows:

  • GPOName
    • [string] single or array of GPO names
  • PolicyGroup
    • [string] (select list) = “Policy”, “Preferences” or “Settings”
    • Policy = query the GPO descriptions only
    • Settings = query the comments attached to GPO internal settings only
    • Preferences = query the comments attached to GPO Preferences internal settings only

This also respects -Verbose if you want to see more background details.
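
For example, the “Settings” option (not demonstrated above) can be combined with -Verbose to watch what’s happening under the hood.  The GPO name here is just a hypothetical placeholder…

Get-GPOComment -GPOName 'Default Domain Policy' -PolicyGroup Settings -Verbose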


The Export-GPOCommentReport function only takes two (2) inputs:

  • GPOName
    • [string] single or array of GPO names
  • ReportFile
    • [string] the path/filename for the HTML report file to create.

It invokes Get-GPOComment, combining “policy”, “preferences” and “settings” output, and writes it all to an HTML report file of your choosing.  The CSS formatting is hard-coded for now, but I’m open to feedback if you want more options. You can specify the GPO name(s) directly, or feed them via the pipeline.
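
Feeding names via the pipeline might look like the following.  The GPO name filter and output path here are made up, so adjust to suit your environment…

Get-GPO -All |
  Where-Object {$_.DisplayName -like "U *"} |
    Select-Object -ExpandProperty DisplayName |
      Export-GPOCommentReport -ReportFile "C:\Reports\gpo-comments.htm"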


Example web report…


If you used 1.0.1 (posted a few days earlier) scrap that and get 1.0.2.  The older version had some major bugs in the Get-GPOComment function, and the Export-GPOCommentReport function wasn’t available until 1.0.2.

Feedback and suggestions are welcome.  Even if it’s “hey man, your shit sucks. Consider a dishwashing job, or something else.”  Any feedback is better than no feedback.

Thank you!


Documenting Your IT Environment (The Easy Way)

One of the most common, and tragically problematic, aspects of today’s IT world is the lack of sufficient documentation. In fact, ANY documentation. It’s not associated with any one area either. It crosses everything from help desk tickets, to change requests, to Active Directory objects, to Group Policy, to SQL Server, to System Center to Azure.

UPDATED 8/8/17 – added SQL and SCCM examples at very bottom.

When and Why Does It Matter?

It matters when things go wrong (or “sideways” as many of my colleagues would say).  There’s the “oh shit!” moment where you or your staff find the problem first.  Then there’s the “oh holy shit!!” moment when your boss, his/her boss, and their bosses find the problem first and come down hard on you and your team.  When shit goes wrong/sideways, the less time you waste answering the what/where/why/who/when questions for the pieces involved with the problem, the better.  To that end, you can document things in other places, like Word documents, spreadsheets, help desk and ITIL systems, post-it notes, whiteboards, etc.  Or you can attach comments DIRECTLY to the things which are changed.  This isn’t possible in all situations, but it is most definitely possible for many of the things that will straight-up shove your day into a shit storm, and I have no idea what that really means, but it just sounds cool.  Anyhow…

The best part of this approach is that you can leverage PowerShell and other (free and retail) utilities to access and search comments you nest throughout the environment.  For example, you can collect and query Group Policy Object comments…

Need to Get the Comment buried in a particular GPO or GPPref setting?  Here’s one example for querying a GPO under User Configuration / Preferences / Control Panel Settings / Folder Options…

#requires -modules GroupPolicy
param (
  [string] $GPOName
)
try {
  $policy = Get-GPO -Name $GPOName -ErrorAction Stop
}
catch {
  Write-Error "Group Policy $GPOName not found"
  break
}
$policyID = $policy.ID
$policyDomain = $policy.DomainName
$policyName  = $policy.DisplayName
$policyVpath = "\\$($policyDomain)\SYSVOL\$($policyDomain)\Policies\{$($policyID)}\User\Preferences\FolderOptions\FolderOptions.xml"
Write-Verbose "loading: $policyVpath"

if (Test-Path $policyVpath) {
  [xml]$SettingXML = Get-Content $policyVpath
  $result = $SettingXML.FolderOptions.GlobalFolderOptionsVista.desc
}
else {
  Write-Error "unable to load xml data"
}
Write-Output $result


So, What Now?

So, in the interest of avoiding more bad Mondays, start documenting your environment now.  Need some examples of where to start?  Here you go…
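
Two quick starters (the names, values and ticket numbers below are completely made up; the first relies on the GPO object’s writable Description property from the GroupPolicy module, the second assumes the ActiveDirectory module is installed)…

# attach a description directly to a GPO
(Get-GPO -Name 'Default Domain Policy').Description = 'Baseline domain settings. Owner: IT Ops. Ref: CHG-1234'

# stamp a comment onto an AD user object via the info attribute
Set-ADUser -Identity 'svc-backup' -Replace @{info = 'Backup service account. Owner: J. Smith. Ref: CHG-1234'}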


Thank you for your support.