Great Moments in IT History


  • “UUEncoding is so cool! You can download an entire JPEG in only 30 minutes!”
  • “Let me tell you, 2400 baud is where it’s at. The Jetsons are now, baby!”


  • “OMG! You need to dump AOL and join CompuServe! It’s the best thing ever! It’ll be around forever!”
  • “No, Prodigy is where the future is”


  • “OMG! You have to try ColdFusion! It’s the best thing ever! It’s here to stay!”


  • “OMG! the new Blackberry is rocking! There’s nowhere to go from here with this perfect phone!”
  • “The only search engine that will be around in ten years is Excite or AltaVista”
  • “No, Yahoo! will outlast all of them.”


  • “OMG! ASP web pages are the ultimate! And this VB6 just blows everything else away! Last programming language you will EVER need to learn!”
  • “ColdFusion is the future.”


  • “OMG! I just got my Novell certification! Now I have my WordPerfect, FoxPro AND dBase certs! Woo hoo! I’m set for life!”
  • “Holy pig squealing shit!!!! Did you know that when 2000 arrives all our computers will implode?!!! We have to spend a metric shit-ton of $$$ on preventing this impending doom!!”


  • “OMFG! Have you seen the latest Sun Sparcstations?! Windows NT doesn’t have a future, trust me!”
  • “No way. SGI is where my money is.”


  • “Hey, what happened with all that Y2K stuff?”
  • “Oooh, the Internet market is the future. No way it could crash now!”


  • “Facebook is never going to take off. Dumb idea”
  • “Twitter is never going anywhere either.”


  • “A touch screen phone? Stupidest idea ever! Who would ever NOT want a real keypad?!”
  • “I’m telling you, Real Estate investments simply CANNOT FAIL. I’m advising everyone to buy buy buy!!”


  • “Sun Microsystems is a name that will be around long after Microsoft, Google and Apple are gone”


  • “A ‘ride-sharing’ business is stupid. Who would ever not want to use a taxi or a bus?!”


  • “The Cloud is all a silly fad. It’ll be forgotten in a year or two.”


  • “Letting employees work from home is a bad idea. Nothing will get done. Businesses will collapse immediately just from that alone.”


  • (let’s just skip past that, mmmkay?)


business, humor, Technology

2001 vs 2021 (aka 2020 CU1)


IT Planning Meeting / Project = Document Management

Teams present: IT Architect, PM, InfoSec, Network, Storage, Accounts, Licensing, Customer stakeholders, Customer stakeholder stakeholders, Stakeholders for other stakeholders, Vendor reps handing out business cards and shaking hands, all packed in one conference room.

Preliminary: Storage and Network teams are blaming InfoSec for system issues. InfoSec is blaming Licensing for holding up a PO. Licensing blames the CFO. PM asks them to keep it down.

Vendor: “(blah blah blah blah blah blah)” (some pointing and hand gestures learned from a sales book).

IT PM: “Thank you for the introduction, Bob.”

Vendor: “Uhhh, it’s Doug, actually.”

IT PM: (turns to stakeholders) “So, what features do you need from Electronic Document Management?”

Customer: “What does it do?”

IT Architect: (talks for 30 minutes, reads directly from PowerPoint slides, attendees coughing, staring at Blackberry phones, texting jokes about the TV sitcom episode from the previous night, sounds of thumbs clicking on physical keypads, spoons clanking against coffee cups) … “Any questions?”

Customer: “We need all of it.”

General IT takeaway: (old-timers: dread. younger folks: excitement)


IT Planning Meeting / Project = Cloud Security

Teams present: Cloud Architect, PM, InfoSec, Cloud Networking, Cloud Identity, Licensing, Customer stakeholders, Customer stakeholder stakeholders, Stakeholders for other stakeholders, Vendor reps, everyone on a Teams call.

Preliminary: Azure/AWS/GCS and M365 teams are blaming InfoSec for system issues. InfoSec is blaming Licensing for holding up a PO. PM asks them to keep it down.

Vendor: “(blah blah blah blah blah blah)” (some more PowerPoint slides and a QR code).

IT PM: “Thanks for the introduction, Juan.”

Vendor: “Uhhh, it’s Carlos, actually.”

IT PM: (turns to stakeholders) “So, what features do you need from Cloud Security?”

Customer: “What does it do?”

Cloud Architect: (talks for 30 minutes, reads directly from PowerPoint slides, attendees not on mute add background sounds of cats, dogs, birds, car horns, kitchen pots and pans, messaging about the Netflix/Hulu/YouTube/Amazon/HBO show from the previous day, crumpling fast food bags, spoons clanking against coffee cups; someone keeps taking a heavy drag on their vape in front of the mic) … “Any questions?”

Customer: “We need all of it.”

General IT takeaway: (old-timers: dread. younger folks: excitement)


A Query for the Weary

Trying to find SolarWinds Orion stuff in your Configuration Manager environment? Probably not, but in case you are, here’s a query to fire against your SQL site database. If you don’t have permissions to query the database, threaten your DBA with promises to post illicit photos of them from the last company party when they passed out and were never told what really happened to them before waking up. That, or a pizza and some beer, either might help.

For more in-depth inspection, check out the blog post by Matt Dowst at Detecting the SolarWinds Compromise Signals with PowerShell – Catapult Systems

  -- NOTE: the SELECT/FROM portion was reconstructed from context; v_GS_INSTALLED_SOFTWARE_CATEGORIZED
  -- is the standard ConfigMgr view that exposes the Normalized* columns
  SELECT
    cdr.Name,
    isc.NormalizedPublisher,
    isc.NormalizedName,
    isc.NormalizedVersion
  FROM dbo.v_GS_INSTALLED_SOFTWARE_CATEGORIZED AS isc
    INNER JOIN dbo.v_CombinedDeviceResources AS cdr ON isc.ResourceID = cdr.MachineID
  WHERE
    (isc.NormalizedPublisher LIKE 'SolarWinds%')

Modify to add an additional filter condition, at no extra charge, but only if your parents call before midnight….

  -- NOTE: SELECT/FROM reconstructed from context
  SELECT
    cdr.Name,
    isc.NormalizedPublisher,
    isc.NormalizedName,
    isc.NormalizedVersion
  FROM dbo.v_GS_INSTALLED_SOFTWARE_CATEGORIZED AS isc
    INNER JOIN dbo.v_CombinedDeviceResources AS cdr ON isc.ResourceID = cdr.MachineID
  WHERE
    (isc.NormalizedPublisher LIKE 'SolarWinds%') AND
    (isc.NormalizedName LIKE '%Orion%')

And for a limited time, for only 30 cereal box tops, shove it into a PowerShell pipeline using a toilet plunger and some good old foot stomping power with module “dbatools”…

$query = "SELECT cdr.Name, isc.NormalizedPublisher, isc.NormalizedName, isc.NormalizedVersion
  FROM dbo.v_GS_INSTALLED_SOFTWARE_CATEGORIZED AS isc
    INNER JOIN dbo.v_CombinedDeviceResources AS cdr ON isc.ResourceID = cdr.MachineID
  WHERE (isc.NormalizedPublisher LIKE 'SolarWinds%') AND
    (isc.NormalizedName LIKE '%Orion%')"
$evil_little_turd_machines = Invoke-DbaQuery -SqlInstance "mysadsqlserver.loser.nowhere" -Database "CM_WOW" -Query $query


Destroying Orphaned OneDrive sites; See them Driven before you, and Hear the Lamentation of the Losers

Today’s sampling of client cases… I’ve had two unrelated clients with the same issue: their Microsoft Cloud App Security alerts went bonkers over potential PCI sensitive file content in user OneDrive folders, where the user accounts had been deleted LONG ago. How long ago? 2015 to be precise.

Since the user account was obliterated long ago, nothing shows in AzureAD, or SharePoint, etc. And since the ownership is (or was) still tied to the missing account, the URL link in each MCAS alert wouldn’t open for the tenant admin (Global or SharePoint administrator).

So, first we needed to find the orphaned turds in the cloud kitty litter box. The following script will dump the turds into a CSV bag for auditing and review. Then pepper spray each one so the new admin person can access the turds in the folders and destroy them.

This was adapted from the examples in https://docs.microsoft.com/en-us/onedrive/list-onedrive-urls and https://www.sharepointdiary.com/2015/08/sharepoint-online-add-site-collection-administrator-using-powershell.html

param (
  [parameter()][string] $TenantUrl = "https://contoso-admin.sharepoint.com",
  [parameter()][string] $CsvFile = ".\contoso_onedrive_users.csv",
  [parameter()][string] $AdminUser = ""
)

try {
  Write-Host "connecting to AzureAD and SharePoint Online"
  Connect-AzureAD | Out-Null
  Connect-SPOService -Url $TenantUrl | Out-Null

  Write-Host "requesting Azure AD users"
  $adusers = Get-AzureADUser -All $True
  Write-Host "requesting SharePoint OneDrive personal sites"
  $odusers = (Get-SPOSite -IncludePersonalSite $True -Limit All -Filter "url -like '-my.sharepoint.com/personal/'")

  Write-Host "comparing site owners with Azure AD users"
  $odata = @()
  $odusers | Foreach-Object {
    # site owner no longer exists in Azure AD = orphaned site
    if ($_.Owner -notin $adusers.UserPrincipalName) {
      if (![string]::IsNullOrEmpty($AdminUser)) {
        Write-Host "adding site collection admin for: $($_.Url)"
        Set-SPOUser -Site $_.Url -LoginName $AdminUser -IsSiteCollectionAdmin $True
      }
      $odata += [pscustomobject]@{
        Url   = $_.Url
        Owner = $_.Owner
      }
    }
  }
  if ($odata.Count -gt 0) {
    $odata | Export-Csv -Path $CsvFile -NoTypeInformation
    Write-Host "$($odata.Count) sites found. Saved to: $CsvFile"
  } else {
    Write-Host "no orphaned sites found"
  }
}
catch {
  Write-Error $_.Exception.Message
}

There are a hundred quadrazillion to the forty-five thousandth power variations for coding this, and this is just one. And more than likely it’s the one instance you will point a finger at and shake your head, thinking “I could’ve done this better”, and you’re right, you could have done this better. Go ahead and pat yourself on the back, I’ll wait…

Ok, so you may have noticed I used a cheap method for controlling the part of the process where it changes ownership on the sites, by checking for the $AdminUser value being blank (or not). Just run it as-is (well, change the defaults for your environment first) to get the OneDrive sites without a corresponding Azure AD user.

Then run it again with a UPN assigned to the $AdminUser parameter to apply ownership changes. Keep in mind that the $AdminUser must have SharePoint Administrator role membership.
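Put together, the two passes might look like this (the script filename here is hypothetical; save the script above under whatever name you like):

```powershell
# Pass 1: audit only -- export orphaned OneDrive sites to CSV, change nothing
.\Get-OrphanedOneDrives.ps1 -TenantUrl "https://contoso-admin.sharepoint.com"

# Pass 2: same search, but also grant site collection admin rights to an
# account that holds the SharePoint Administrator role
.\Get-OrphanedOneDrives.ps1 -TenantUrl "https://contoso-admin.sharepoint.com" -AdminUser "sp-admin@contoso.com"
```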

And now I need to go for a walk. I forgot how to use my legs.


Discount Sci-Fi Tales: Inter-planetary adventures

Adventure 0.01

Once upon an intergalactic time, there was a planet, far from Earth, orbiting its sun, as it had for trillions of Earth years, when it was first discovered by relentless and underpaid scientists. And upon discovering this new planet, the scientists excitedly informed their superiors, who in turn informed their superiors, and their superiors, and so on. Days later, after their corporate superiors had been informed, they ordered their government underlings to prepare a mission to the new planet.

The teams of highly-paid corporate engineers, consultants, analysts and lobbyists spent trillions of taxpayer dollars on many years of intense planning, designing, testing and refining, and other words that would rhyme in cool ways to make advertising jingles for yet more marketing revenue.

Finally, the day had come when the space vehicle was ready for launch. The crowd of anxious onlookers ceremoniously bid farewell to the brave astronauts, as they hugged and kissed their loved ones, before embarking on their golf carts toward the shiny contraption as it stood majestically alongside a frame, with hoses connected, spewing clouds from the super-cooled liquids fed into it.

Their much-anticipated entry was delayed while they were each required to sign a 30 page release form, indemnifying the corporate owners and shareholders of any liabilities should the vehicle self-destruct, fail in open space, or arrive at their destination only to be greeted by creatures that found them to be both nutritious and delicious. Their words, not mine.

An hour later, they were allowed to board. However, because this vehicle was owned by Earth Aerospace, formerly American Airlines, their launch was delayed at the gate for several more hours while they waited for their luggage to be loaded. Finally, the space vehicle was ignited, the connections removed, and with a furious blast of smoke and thunderous noise, it lifted off and escaped the clouds and entered open space. And then, a commercial break for their sponsors.

And back again. As the space ship traveled, a reality show was beamed back to Earth, which garnered high ratings and spawned fan clubs and meet-ups, along with expensive merchandise, clothing and special access to communicate with the astronauts at only $100 per minute, with a special discount for the first 100,000 new members.

Several years later, the space ship arrived at the remote planetary system and entered its calculated orbit. Sensors revealed possible signs of life on the surface below! Viewers back on Earth were ecstatic! A probe was readied, and launched. The crew watched with heavy anticipation of what it would report back from this strange new world using its many cameras and sensors. The viewers back on Earth watched the delayed “live” video feed, gathered together in homes, schools and pubs. Just kidding. They were watching it from their own phones, completely isolated.

Everyone watched as the probe skimmed the outermost edges of the planet’s atmosphere, creating a vapor trail and eventual glow from surface friction. And then, as the probe eventually immersed itself into the rich gaseous layer, they suddenly realized that the planet was surrounded by a layer of absolutely-pure Oxygen. At that point, the probe’s rocket burner ignited the atmosphere, incinerating the entire planet, and all of its inhabitants.

The End

Cloud, Technology, windows

Using PowerShell to Find Out If You’re About to get a Beating

The goal of this blog post is to export your Azure subscription billing data into Microsoft Excel, so that you can apply some cool charts and distract your significant other from killing you when you avoid telling them how much you “accidentally” spent on Azure services, and they end up finding out later when looking at the bank statement.

Let’s get started.

What You Will Need

  • An Azure subscription of any kind. I use a Pay-as-you-Go plan, which my wife now calls the “You-promised-it-wouldn’t-happen-again-last-time” plan.
  • A machine with PowerShell 5.1 or later (I’m using PowerShell Core 7.0.1 for this, on Windows 10)
  • If you want some “real world” data, set up a VM and configure Bastion on it, and leave it running for a few hours each day for about 5 days. I chose a D2s v3, Windows 10 machine, with a “Premium” SSD data disk (defaults on everything else). Then I installed a bunch of apps and scripts. Yes, you read correctly. A “Premium SSD”. That was a critical mistake. I may force myself to watch 40 episodes of The View in a row as self-punishment. But not right now….

Poking a Stick at it

  1. Open a PowerShell console using “Run as Administrator”
  2. Check that you have the modules AZ and ImportExcel installed. If you don’t, install them now. If already installed, make sure they’re the latest version (e.g. use Find-Module to check)
  3. Authenticate with Azure
  4. Query for your billing information
  5. Export it to Excel

Let’s dive in a little deeper. Remember, the deeper you go, the harder it is for someone who’s upset with you to kill you. Deep breath before you go under.
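Steps 1 through 3 above, sketched in PowerShell (Az and ImportExcel are the actual gallery module names; the rest is just one way to go about it):

```powershell
# Step 2: compare what's installed against what's current in the PowerShell Gallery
Get-Module -Name Az, ImportExcel -ListAvailable | Select-Object Name, Version
Find-Module -Name Az, ImportExcel | Select-Object Name, Version

# Install them if they're missing (or Update-Module if they're stale)
Install-Module -Name Az, ImportExcel -Scope CurrentUser

# Step 3: authenticate with Azure (this may hand you a device-login code)
Connect-AzAccount
```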


You may get a prompt to open a website and enter an authentication code, so whatever happens here, you’re on your own. And no, the code shown below won’t work for you, trust me.

Once you’re authenticated, you can query some information. Let’s fetch your billing information. You can output the results to the screen if you want, but I prefer to capture it to a variable for further tinkering.

$cu = Get-AzConsumptionUsageDetail

Filter down to the most important parts. A lot of entries will have a $0 cost, so I don’t care about those. I also don’t need all of the properties, just the name of the resource (InstanceName), its type (ConsumedService), the Billing Period, usage period and quantity, and finally: the cost (PreTaxCost). For now I’ll sort on the cost in descending order so I can spend a minute freaking out and hyperventilating while my dog snores.

$cu | Where {$_.PreTaxCost -gt 0} | Select InstanceName,ConsumedService,BillingPeriod,UsageStart,UsageEnd,UsageQuantity,PreTaxCost | Sort PreTaxCost -Descending

Now we can export this to an Excel worksheet, so I don’t need to sort the results (you can if you want) but Excel can do that just as well…

$cu | Where {$_.PreTaxCost -gt 0} | Select InstanceName,ConsumedService,BillingPeriod,UsageStart,UsageEnd,UsageQuantity,PreTaxCost | Export-Excel -Path ".\AzureBilling.xlsx" -WorksheetName "2020-06" -ClearSheet -AutoSize -AutoFilter -FreezeFirstColumn -BoldTopRow -Show

The Export-Excel parameters are pretty self-explanatory. But the “ClearSheet” parameter might be less obvious. It wipes the target worksheet (i.e. “2020-06”) before populating with fresh data, so I can re-run the code over and over without losing data on other worksheet tabs (same spreadsheet file).
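For example, next month’s run can land on its own tab of the same workbook (the worksheet names here are just placeholders), and -ClearSheet only resets the tab being written:

```powershell
# July's data gets its own tab; the "2020-06" tab is left alone
$cu | Where {$_.PreTaxCost -gt 0} |
  Select InstanceName,ConsumedService,BillingPeriod,UsageStart,UsageEnd,UsageQuantity,PreTaxCost |
  Export-Excel -Path ".\AzureBilling.xlsx" -WorksheetName "2020-07" -ClearSheet -AutoSize -AutoFilter -FreezeFirstColumn -BoldTopRow -Show
```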

In my case, the surprise for me was where I forgot the costs of maintaining a “jump server” in Azure with a data disk and Bastion access. Even with automatic (scheduled) daily shutdowns, the storage and network costs accumulated while I was focused on beer, grilling, Netflix, Twitter, YouTube, and blasting on my drumkit as if I know how to play. A few days of that (six to be exact) grew my normal 50 cents/month bill to $94.

A somewhat prettier look shows the chronological details like roadkill…

Point A is where I set up the so-called “cheap” Windows 10 VM with a “premium” data disk (you thought I’d skip over that, didn’t you?). I only ran it a few hours each day, shutting it off (deallocated) in between. But just like listening to my neighbor’s terrible music, the damage had already been done.

Point B is the day after my wife walked by and saw the “Cost Analysis” view on my screen, paused and said, “Um, what’s that? That doesn’t look like an Amazon order? Two-hundred and forty-wtf dollars?!” (for the record, she rarely drops F-bombs, even after being around me)

Point C is where I’d be making arrangements for cremation or burial (for the record, I asked for cremation by explosives with a BBQ party, at a safe distance. I don’t think the ATF will approve. Plan B is a gasoline tanker truck and a bonfire, at a safe distance of course)

Anyhow, you can see there’s a tiny gap at point C (May 27) where my (please God I promise I’ll be good from now on!) costs are diverging from the projected (oh no no no nooo!!!!) cost trajectory. I’ll be monitoring this every day because I’m in super-turbo-hyper-ultra-maxi paranoid mode now.


Some of the comments I’ve received from casual conversation with friends:

“You should’ve set a budget and alerts” – I did, but with so much email I don’t check my other inboxes often enough. I will now.

“Didn’t Mike Teske say to use a pre-paid card with a hard limit?” – Yes, he did.

“Hey, didn’t Mike Teske warn about using a pre-paid card with a hard limit?” – Yes, he sure did.

“Hey, didn’t…?” – Yes, yes, he did! Damn it!

“Geez. What a dumbass.” – (sigh).

“Actually, my bill last month was four times that. But I’m single.” – I hate you.

Anyhow, I should’ve known better. I watched Mike Teske give a fantastic presentation on avoiding surprise costs with Azure (at the PowerShell Saturday weekend conference in Raleigh, it was fantastic!), and yet, still, I went full-stupid.

I’m never too proud to admit my mistakes (unlike my friends who are single and have no debt). I’ve learned my “don’t get out of the boat” lesson* about taking cloud costs seriously. Pay attention to the disk types, the secondary costs (storage, Bastion, etc.), and listen to Mike.

(* line from Apocalypse Now)


Traveling Bones

I had a tooth extracted last week. It wasn’t a planned event. I had spent considerable money on my lower left molar getting it capped, and then it had to get a root canal (through the cap), and then it broke in half, laterally, below the gum line. The dentist who did all that kept trying to “address” the last visit, but kept billing me and I got tired of fighting with insurance, etc.

So I picked a new dentist. This one has the highest public ratings in our city, and with COVID-19 keeping everyone at home, his appointment book was pretty empty. So I got right in, since it qualified under the current state legal atmosphere as an “emergency visit”. And it got us all out of our houses for a few hours (including the assistants). Refreshing as well as therapeutic.

So, after a brief consultation over the phone (the day before), I opted for the “bone graft” procedure, which provides for future enhancements, like implants and certain other things. I like chewing on shit, so I thought it might be a good idea. The extraction alone was $75. The graft is another $500.

I pictured a sliver of human bone, because I’ve watched too many TV shows with people wearing blue lights, latex gloves, and Dollar Tree vision goggles. But I was surprised to learn it’s actually granulated bone. Cadaver bone, to be exact. Cadaver is Latin for dead person, I’m told.

So they numbed me up, we did some humorously slurred small talk, and they got to work. Pulling, twisting, crunching sounds, and me trying to keep my lip from getting smashed between the pliers, hands and teeth, during all the leveraging. It came out, followed by a suction tube, some stitches, more suction tube, and cleanup. Then the follow-up instructions:

  • No hot foods or liquids for a few days
  • Chew on the other side for a while (as if I didn’t expect that one)
  • Ice pack and alternate Tylenol/Ibuprofen
  • Don’t freak if some of the granules leak out

Yep. That last one didn’t freak me out at all, but made me think of the following…

Some person was born and raised, years ago, living a full life (I’m assuming, but who knows), traveled the world, fought in a war or two, raised a family, built a house, a business, a farm, retired, fell over dead while watching Wheel of Fortune, got carted away to a cold room.

This person might have been a famous person. A bank robber. An actor. A CEO. A chef. A doctor. An astronaut. Or any sort of “ordinary” person like myself (stop it! I can hear you laughing right now). What sort of adventures did this person live? Where had they been? Where was their family? Had I ever met them or known them?

He or she probably signed off donation rights long before, so the staff likely went to work harvesting whatever was of use: organs, tissue, bone.

And from the bones, a select few were sanitized, ground into powder, packaged, labeled, sold, shipped, unpacked, scooped into my empty tooth socket, packed, closed up with sutures, and traveled back to my home afterwards. All for a nominal fee, after insurance deductible.

Then I forgot about the hot food part, which happened to be eggs and bacon with hot coffee. And, of course, some of the granules slipped out, and ended up on my finger, where I stared at that first granule for a full minute, contemplating that fateful journey by which that person took from cradle to grave, to my tooth socket. Then into a napkin and into the trash can.

That granule traveled by truck to the city landfill, inside a plastic kitchen trash bag, under piles of other trash and dirt. Ashes to ashes. Dust to dust.

The end.


PowerShell, Graph, Intune Data, Windows devices, ImportExcel, More Excel, and Excel at Not having to use Excel

…with french fries.

Ok. In the time it took to write that silly title I’ve finished a glass of super high-quality, ultra-premium, 5-star, platinum series, Trader Joe’s Cab-Sauv at $7.99 a bottle. Yes, I know, I’m livin’ the life.

I’m overdue for posting something technical. I’m also overdue for posting something completely stupid. So I decided to combine them into a complete technically stupid post. And I promise you that by the time you’re done reading this you will either be smarter than me, or asleep.

Anyhow, what’s this about? It’s about sticking some wine-infused brain cells together with some PowerShell chewing gum and making something almost sort of useful: a device inventory report built from Microsoft Endpoint Manager (Intune) Graph API data using PowerShell. Say that last sentence 5 times. Seriously, it’s generating a device inventory report from Intune data using PowerShell. How boring did that sound?

You’ll need some things to get started.

The Ingredients

  • Microsoft Intune (yes, with some active Windows 10 devices, smarty pants)
  • A glass of wine (substitute beer, liquor or paint thinner)
  • A computer running Windows with PowerShell 5.1 (I haven’t tested this with PowerShell 6 or 7)
  • A functional Internet connection
  • NOTE: You do NOT need to have Microsoft Excel installed on your cheap computer in order to run this, unless you use the -Show option (you can thank Doug Finke for that, go ahead, I’ll wait)

Preliminary Stuff

  • Open a PowerShell console using “Run as administrator” (hopefully that’s you)
  • Install PowerShell module psIntune (This will also install PowerShell modules: AzureAD, MSOnline, and ImportExcel, if they’re not already installed)

The 100,000,000 foot view of this…

This process involves querying a (your) Intune tenant via the Microsoft Graph API, to fetch all the managed Windows devices. In addition, it will query all of the installed software from those devices, and as much of that juicy, sweet, hardware inventory data as it has hanging from its giant tree (not that much actually, but let’s keep going, I’m still on my 2nd glass now).

Install-Module psIntune

Query Devices and Applications Inventory. This part is probably going to cause some uber-geeks to burst a skull cap, but I landed on this by trial and error. Trial = beating head against REST API wall for hours and hours, and Error = 504 gateway time-out responses from taking too big of bites. So… I found that separating the device and app queries has been the most consistently reliable when device count is > 1200.

Query Devices

$devs = Get-PsIntuneDevice -UserName "dumbass@contoso.com" -Detail Detailed -DeviceOS Windows -ShowProgress

Query the Apps for each Device

$apps = Get-PsIntuneDeviceApps -UserName "dumbass@contoso.com" -Devices $devs -ShowProgress

Then, stitch it all together into a nice and fat Excel worksheet file…

Write-PsIntuneDeviceReport -Devices $devs -Apps $apps -Title "ContosoKicksAss" -DeviceOS Windows -AzureAD -Overwrite -Show

If you don’t have Excel installed on the machine where this is run, leave off the -Show parameter or it will puke on your shoes. Actually, it’ll probably do nothing at the end, but it’ll be thinking about puking on your shoes, trust me.

You may be thinking, “Why the ********** **** did he use -AzureAD and not make a Get-PsIntuneAzureADDevice function?” Well, because the name would be stupid looking, and just like that, I’m on glass 3. Cheers! Oh, and the -AzureAD parameter will prompt you to provide real, honest, pure, and wholesome AzureAD credentials, not those imitation kind.

So, when it’s all done, if it didn’t crash and explode on you, you should have a cool-ass spreadsheet like this, only without the blurry stuff (yours will be crystal clear, trust me).

The tabs as of now (and I mean right now, this very micro-second) include the following:

  • Summary
  • Intune Devices
  • AaDevices (AzureAD devices)
  • AaDevicesUnique (removes duplicate names)
  • AaDevicesDuplicates (shows duplicate names)
  • IntuneModels
  • IntuneStaleDevices (haven’t said shit in over XX days, customizable)
  • IntuneDuplicates (yep)
  • IntuneOrphaned (named “User deleted for this device”, because I think the user was deleted for those devices, but don’t quote me)
  • IntuneSoftware
  • IntuneLowDisk (customizable)
  • IntuneInstallCounts (app install counts)
  • IntuneSoftwareUnique (unique product names/versions only)
  • IntuneMissing (not registered in AzureAD)
  • AADMissing (not registered in Intune)

And as always, if anything is missing or needs improvement, drop me a line at your local neighborhood GitHub repo, under the sign labeled “Issues“, because we all have issues. I just have more than most of you.

I should mention that I didn’t write the core parts of the PowerShell Graph API code, but adapted them from the Microsoft samples, and added a few cans of Dollar Tree tomato sauce and microwaved it for ten minutes.

If you were offended by anything posted herein, I’d say it’s your fault for reading this far. You could’ve been watching Netflix instead.


System Center, Technology

Cranky Dave sez Knock that Shit Off

Maybe it’s because I had a string of calls which rubbed the same raw nerve endings too close together. Idk. But I feel like the message isn’t getting out there. Please, if you work with customers, and see these things, urge the living shit out of them to reconsider. If that doesn’t work, duct tape and white vans are still available.

  • Doing things “just because” it’s been that way since 2010
  • Giving out local admin rights like candy. If you want to be a drug dealer, run for Congress or Senate.
  • Using login scripts as first response to every need.
  • Repackaging every app installer, even when it only needs “/S” to run quiet
  • Stop over-complicating things just to be cute/clever. Look at me virtualizing my virtual servers inside another virtual server with 3 databases, and I only have 100 devices to manage.
  • Read the #(*$&@(*#$&(_(*@(#*$&) docs and follow the “supported” terms. Stop assuming you’re smarter than a room full of MVPs, *and* a yacht filled with drunk attorneys who all graduated from Harvard.
  • If your environment has to be complicated, it’s most likely because your business is over-complicated, and possibly broken. (Paul’s rule: If you automate a broken process, you will only ever get an automated broken process).
  • Stay within support – Don’t cry for help with your SQL 2005 “mission-critical” database. If it was “mission-critical” it would be running on a supported version.
  • Keep your shit patched – SQL 2016 is nice, but RTM is like 2 service packs and 12 cumulative updates behind! This isn’t 1962. We finished the Moon program a while ago.
  • Do I sound cranky? I’m grabbing a Snickers.

databases, Scripting, System Center, Technology

Basic ConfigMgr HealthChecks using PowerShell


I’m long overdue for a deep-dive (pardon the pun), so drink-up and let’s move on…

The power provided by PowerShell comes from a lot of different directions, one of them being that you can leverage a ton of built-in functionality without having to buy additional software licensing, or even write all the messy code. That’s right, once again, I’m on my “modules are freaking fantabulously increditastical” soap box. I’ll be using a few different modules to do the heavy lifting:

And even though I won’t be showcasing it in this post, if you wish to export anything to Excel, rather than hopping through CSV first, take a look at the module ImportExcel by Doug Finke (the Export-Excel function in particular).

Heads-Up: This is not intended to be a “solution” that you simply download and run. I prefer to share the basic pieces and a direction, and let you take it and run with it however (and wherever) you like. Sharing a fully-bolted, polished solution doesn’t leave you with room to explore and customize without a lot of reverse engineering. Here’s the bricks, have fun building.

If you’re wondering why I’m not covering CMHealthCheck, it’s because (A) it would violate the “heads-up” goal mentioned above, and (B) that module is becoming a bit dated anyway (I’m working on a replacement, feedback is always welcome).

And Now for a Word on Modules

I’ve been in a few discussions about “make vs. buy” or “build vs. borrow” view of scripting. For decades (yes, I’m that freaking old, so you’ll have to speak up), I had always leaned towards building everything. Even when finding a near-perfect match online, I would rewrite it to my tastes. Not anymore. Time is more precious, and I’m not too proud to accept someone else might have provided a better option than I would have built.

In 2020, the state of online sharing is 1000 times what it was 10 years ago. It’s now to the point where not finding a close match for a desired technical solution is the exception, rather than the norm. Only the newest emerging things are lagging behind, mostly due to the trend of over-stimulated coke-snorting CI/CD fanaticism, but I’ll leave that for another episode of “Old man says GTFO my lawn you little CI/CD pipeline bastards!” But, I digress.

To me, modules are like car parts. Even when you build, or restore, a car, you’re not likely going to make EVERY single part from scratch (unless you own a smelting factory, chrome dip tank, a cow farm for leather, and so on). Most components are built by someone else. So, building things from parts is just a natural thing to me. It’s organic. Okay, soap box session is done. Let’s move on.

Getting Things Ready

To perform almost any health assessments, you’ll need sufficient access to the resources. In a typical ConfigMgr environment (if there is a typical ConfigMgr environment), this will translate into:

  • Full Administrator (in ConfigMgr)
  • ServerAdmin (in the SQL instance)
  • Local Administrator (on the site servers)

These are often granted to the account which was used to install the Configuration Manager site. Hopefully, it’s not an actual “user” account (that a human logs in with every day), but a service-type account. If you are not a DBA (or the DBA-ish person who “owns” the SQL instance) you may want to confer with them first, to make sure you don’t step on any toes. Pro-tip: bring doughnuts and fresh jokes.

When I say “Local Administrator”, I don’t mean adding your domain account directly into the local Administrators group, although that does work. It’s better to gain membership via a domain group, following that whole AG(U)DLP chain of delegation that Microsoft has recommended for decades.

I already mentioned the PowerShell modules I’ll reference, so those need to be installed on your working computer (not on the site servers or database server, unless that’s all you’re working on).

To save on repetitive typing, let’s define some variables to use throughout the following examples. Replace the string values with whatever your TEST LAB environment uses:

$dbhost = "cm01.contoso.local" # site SQL host FQDN
$cmhost = "cm01.contoso.local" # CM primary site host FQDN
$site   = "P01" # CM site code
$cmdb   = "CM_P01" # CM site database

MECM Site Information

To help with finding things in SQL, mainly the default Views, I recommend running the following snippet, so you can use the results to search more easily:

$dbviews = Get-DbaDbView -SqlInstance $dbhost -Database $cmdb -ExcludeSystemView

An example for finding views which relate to something like “site”…

$dbviews | Where {$_.Name -match 'site'} | select name

You can also pass this into a cheap GridView (only $0.99 while supplies last) to pick-and-run your favorite view…

$view = $dbviews | Out-GridView -Title "Pick a View to Query" -OutputMode Single
if ($view) {
  $query = "select * from $($view.Name)"
  Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query $query
}

I have a slightly fancier version of the above sample, as a function, up on my GitHub at http://bit.ly/2SYYOOL. You can load it directly into a console session, and run it, using Invoke-Expression…

iex (New-Object System.Net.WebClient).DownloadString('http://bit.ly/2SYYOOL')
Invoke-CmDbView -SqlInstance $dbhost -Database $cmdb

Site Information Summary

Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query "select * from v_Site"

General Client Information

I recommend saving the output of the following script to a variable, for use as a baseline for other operations (rather than requesting new data for each sub-query). I’m using $cmdevices for this example…

$cmdevices = Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query "select * from v_CombinedDeviceResources where (name not like '%unknown%') and (name not like '%provisioning device%') order by name" | 
select Name,MachineID,SerialNumber,MACAddress,DeviceOS,DeviceOSBuild,CoManaged,ClientVersion,IsVirtualMachine,ADSiteName,LastMPServerName,LastPolicyRequest,LastDDR,LastHardwareScan,LastSoftwareScan,LastActiveTime,LastClientCheckTime,ClientCheckPass

From this you can filter on things like the following examples.

Devices with Old or Missing Hardware Inventory

Find devices which haven’t reported hardware inventory yet…

$cmdevices | Where {[string]::IsNullOrEmpty($_.LastHardwareScan)}

Find devices which have reported hardware inventory in the past, but not within the past 30 days…

$cmdevices | Where {(-not[string]::IsNullOrEmpty($_.LastHardwareScan)) -and ((New-TimeSpan -Start $_.LastHardwareScan -End (Get-Date)).Days -gt 30)}
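If you run this check often, the age filter is easy to wrap in a small helper. Here’s a minimal sketch (the function name Select-StaleDevice and the 30-day default are my own choices, not from any module), demonstrated against a couple of in-memory fake devices so you can verify the logic before pointing it at $cmdevices…

```powershell
# returns devices whose LastHardwareScan is empty, or older than $MaxDays
function Select-StaleDevice {
    param(
        [Parameter(ValueFromPipeline)] $Device,
        [int]$MaxDays = 30
    )
    process {
        if ([string]::IsNullOrEmpty($Device.LastHardwareScan)) {
            $Device   # never reported inventory
        }
        elseif ((New-TimeSpan -Start $Device.LastHardwareScan -End (Get-Date)).Days -gt $MaxDays) {
            $Device   # reported, but too long ago
        }
    }
}

# quick sanity check with fake devices
$sample = @(
    [pscustomobject]@{ Name = 'PC1'; LastHardwareScan = (Get-Date).AddDays(-2) },
    [pscustomobject]@{ Name = 'PC2'; LastHardwareScan = (Get-Date).AddDays(-90) },
    [pscustomobject]@{ Name = 'PC3'; LastHardwareScan = $null }
)
$sample | Select-StaleDevice | select Name   # PC2 (too old) and PC3 (never scanned)
```

Against real data, $cmdevices | Select-StaleDevice -MaxDays 45 works the same way.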

Compare Device Coverage with AD

$adComps = Get-ADComputer -Filter * -Properties lastlogontimestamp,whenCreated,operatingsystem,description

I included some additional attributes in case I want to also compare last-login dates, and so on. But anyhow, to use this to compare devices between AD and MEM, you can run some super-basic tests like this…

$adComps | Where {$_.Name -notin $cmdevices.Name} | select Name
$cmdevices | Where {$_.Name -notin $adComps.Name} | select Name

The example above shows I have more devices in Active Directory which are not in the ConfigMgr database, than I have devices in ConfigMgr which are not in Active Directory. What kind of “health” is this? It’s a measure of how clean and controlled your environment really is.
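The same two-way diff can also be done in one pass with Compare-Object. A minimal sketch, using made-up name lists so the behavior is easy to see (swap in $adComps.Name and $cmdevices.Name for the real thing)…

```powershell
# <= means the name exists only in AD; => means it exists only in ConfigMgr
$adNames = @('PC1','PC2','PC3')
$cmNames = @('PC2','PC3','PC4')

Compare-Object -ReferenceObject $adNames -DifferenceObject $cmNames |
    ForEach-Object {
        [pscustomobject]@{
            Name   = $_.InputObject
            Source = if ($_.SideIndicator -eq '<=') { 'AD only' } else { 'ConfigMgr only' }
        }
    }
```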

General Windows Host Information

Get-CimInstance -ClassName "Win32_ComputerSystem" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_OperatingSystem" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_SystemEnclosure" -ComputerName $cmhost
# caution: querying Win32_Product can trigger a Windows Installer reconfiguration on the target, so use it sparingly
Get-CimInstance -ClassName "Win32_Product" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_BIOS" -ComputerName $cmhost

Disks and Disk Space

Get-CimInstance -ClassName "Win32_LogicalDisk" -ComputerName $cmhost | % {
    [pscustomobject]@{
        Drive  = $_.DeviceID
        Name   = $_.VolumeName
        SizeGB = [math]::Round(($_.Size / 1GB),2)
        FreeSpaceGB = [math]::Round(($_.FreeSpace / 1GB),2)
        PctFree = [math]::Round($_.FreeSpace / $_.Size, 2)
    }
}
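Once you have those numbers, a simple free-space threshold makes the output actionable. A quick sketch using hypothetical values (the 10% floor is just my habit, pick whatever suits your environment)…

```powershell
# flag any volume with less than 10% free space
$minFreePct = 0.10
$disks = @(
    [pscustomobject]@{ Drive = 'C:'; SizeGB = 120; FreeSpaceGB = 4 },
    [pscustomobject]@{ Drive = 'D:'; SizeGB = 500; FreeSpaceGB = 300 }
)
$lowDisks = $disks | Where-Object { ($_.FreeSpaceGB / $_.SizeGB) -lt $minFreePct }
$lowDisks | select Drive   # only C: is below the 10% floor
```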

Network Connection Properties

Get-CimInstance -ClassName "Win32_NetworkAdapterConfiguration" -ComputerName $cmhost | 
    Where {$_.IPEnabled -eq $True} | ForEach-Object {
        [pscustomobject]@{
            IPAddress   = $_.IPAddress -join ','
            IPGateway   = $_.DefaultIPGateway -join ','
            IPSubnet    = $_.IPSubnet -join ','
            MACAddress  = $_.MACAddress
            DNSServers  = $_.DNSServerSearchOrder -join ','
            DNSSuffixes = $_.DNSDomainSuffixSearchOrder -join ','
        }
    }

File Shares

Get the file shares, folder and share permissions. This information can be used to further automate for “drift” reporting and remediation, when someone (or some process) modifies them for whatever reason. (Note: The following example has no exception handling. You may want to add some nested try/catch handling inside the foreach-object (%) section.)

$shares = Get-CimInstance -ClassName "Win32_Share" -ComputerName $cmhost | 
  where {$_.Name -ne 'IPC$'} | % { 
    $spath = "\\$cmhost\$($_.Name)"
    $fpath = "\\$cmhost\$($_.Path -replace ':','$')"
    $perms1 = Get-CPermission -Path $spath
    $perms2 = Get-CPermission -Path $fpath
    [pscustomobject]@{
      Name = $spath
      Path = $_.Path
      Description = $_.Description
      SharePermissions = $perms1
      FilePermissions = $perms2
    }
  }

Stopped or Failed Services

Another common check is looking for services which are set to “automatic” but are not currently running…

Get-CimInstance -ClassName Win32_Service -ComputerName $cmhost |
  Where {$_.StartMode -eq 'Auto' -and $_.State -ne 'Running'}

Ooooh. Missing Updates?

What about those pesky Windows updates on your site systems? Yeah, they need them. And SQL Server updates too.

Get-WindowsUpdate -ComputerName $cmhost -WindowsUpdate
# note: if the -ComputerName connection fails, try using Enter-PSSession instead

Event Logs

The Windows Event Log is a gold mine for finding current and potential issues with a Windows Server.

# system log "critical","warning" and "error" entries in the last 24 hours...
$xfilter = "*[System[(Level=1 or Level=2 or Level=3) and TimeCreated[timediff(@SystemTime) <= 86400000]]]"
$sysEvents = Get-WinEvent -LogName "System" -ComputerName $cmhost -FilterXPath $xfilter

# application log "critical","warning" and "error" entries in the last 24 hours...
$xfilter = "*[System[(Level=1 or Level=2 or Level=3) and TimeCreated[timediff(@SystemTime) <= 86400000]]]"
$appEvents = Get-WinEvent -LogName "Application" -ComputerName $cmhost -FilterXPath $xfilter
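Once collected, a quick group-by turns that pile of entries into a summary you can actually eyeball. A sketch using stand-in event objects (a real $sysEvents would come from the Get-WinEvent call above)…

```powershell
# fake events standing in for real Get-WinEvent output
$events = @(
    [pscustomobject]@{ LevelDisplayName = 'Error';   ProviderName = 'Disk' },
    [pscustomobject]@{ LevelDisplayName = 'Warning'; ProviderName = 'DNS Client' },
    [pscustomobject]@{ LevelDisplayName = 'Error';   ProviderName = 'Disk' }
)

# noisiest level/provider combinations first
$events | Group-Object LevelDisplayName, ProviderName |
    Sort-Object Count -Descending |
    select Count, Name
```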

ConfigMgr Server Logs

Oh boy. This part isn’t fun. You can search the server and component status events within the site database, which is often faster, but I’ll save that for another post.

For this one, I borrowed a very nice script by Adam Bertram, aka “Adam the Automator” (Twitter: @adbertram) and modified it slightly (okay, I poured dumb sauce all over it) to read server logs instead of client logs. I realize that some logs don’t follow a consistent internal format, so if you know of a better alternative, please chime in?

iex (New-Object System.Net.WebClient).DownloadString('http://bit.ly/2vIhtXk')
$logs = ('sitecomp','dataldr','hman','distmgr','smsexec','wsyncmgr')
$logs | % {
  Get-CmServerLog -ComputerName $cmhost -SiteCode $site -LogName $_ | ? {$_.Category -eq 'Error'}
}

Database Information

A Configuration Manager site isn’t much good without a SQL Server database. And a SQL Server database isn’t much good if it’s suffering from issues resulting from mis-configuration, neglect of maintenance and updates, and so on. So any real “health check” of a system implies checking all the parts which it depends on, which in this case is the site database.

SQL Instance Summary

This will return basic version and update information, such as version, build, service pack, cumulative update and KB levels, and support status.

Get-DbaBuildReference -SqlInstance $dbhost

Getting SQL Server update compliance can be tricky. At least it has been for me, and that may just be me. But if you find it tricky too, then maybe it’s something else. Anyhow, here’s one way…

# dbatools has a handy function for this...
Test-DbaBuild -SqlInstance $dbhost -Latest

CM Database Summary

This will return a summary of your SQL database, such as name, instance, status, recovery model, compatibility level, collation, owner, and basic backup info.

Get-DbaDatabase -SqlInstance $dbhost -Database $cmdb

Connection Authentication Scheme

Test-DbaConnectionAuthScheme -SqlInstance $dbhost

SQL Instance Memory Allocation

This will return summary information about the current maximum memory limit, and current usage for the instance (in megabytes).

Get-DbaMaxMemory -SqlInstance $dbhost

You can also retrieve current memory usage stats…

Get-DbaMemoryUsage -ComputerName $dbhost

Database File Information

This will return details about each .mdf and .ldf file for your CM database, such as path, size, status, reads/writes, and more.

$dbfiles = Get-DbaDbFile -SqlInstance $dbhost -Database $cmdb

Database File Auto-Growth Information

This is basically an extension of the example above, which dives more into the auto-growth aspects.

$dbfiles | select LogicalName,Size,Growth,GrowthType,UsedSpace,NextGrowthEventSize,TypeDescription

Database Index Fragmentation

This will return the current fragmentation state of your database indexes (indices?). I prefer to break this into two (2) parts: a query file, and the script code. The query file contains only the SQL statement, which the script code imports using the -File parameter. The first example below is the SQL statement, followed by the PowerShell script.

SELECT
  dbschemas.[name] as 'Schema',
  dbtables.[name] as 'Table',
  dbindexes.[name] as 'Index',
  indexstats.avg_fragmentation_in_percent as 'FragPct',
  indexstats.page_count as 'PageCount' 
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL) AS indexstats
  INNER JOIN sys.tables dbtables on dbtables.[object_id] = indexstats.[object_id]
  INNER JOIN sys.schemas dbschemas on dbtables.[schema_id] = dbschemas.[schema_id]
  INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
  AND indexstats.index_id = dbindexes.index_id
WHERE indexstats.database_id = DB_ID()
ORDER BY indexstats.avg_fragmentation_in_percent desc

$qfile = "[ENTER_THE_SCRIPT_PATH_HERE]\Documents\indexfrag.sql"
$threshold = 40 # index frag percent baseline, whatever you prefer
$stats = Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -File $qfile
[math]::Round(@($stats | where {$_.FragPct -gt $threshold}).Count / @($stats).Count, 2)
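That last line is just a ratio: the count of indexes above the threshold, divided by the total. To make the math obvious, here’s the same calculation against fake stats rather than a live database (the index names and percentages are invented for illustration)…

```powershell
# fraction of indexes exceeding the fragmentation threshold
$threshold = 40
$stats = @(
    [pscustomobject]@{ Index = 'IX_1'; FragPct = 85 },
    [pscustomobject]@{ Index = 'IX_2'; FragPct = 42 },
    [pscustomobject]@{ Index = 'IX_3'; FragPct = 5 },
    [pscustomobject]@{ Index = 'IX_4'; FragPct = 12 }
)
$fragRatio = [math]::Round(@($stats | Where-Object {$_.FragPct -gt $threshold}).Count / @($stats).Count, 2)
$fragRatio   # 0.5 = half the indexes could use a rebuild or reorganize
```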

Failed SQL Agent Jobs (last 24 hours)

Get-DbaAgentJobHistory -SqlInstance $dbhost -StartDate (Get-Date).AddHours(-24) | Where {$_.Status -ne "Succeeded"}

Database Backup History

Get-DbaDbBackupHistory -SqlInstance $dbhost


I think I’ve talked enough for now, and I’m out of coffee. As I mentioned earlier (I think), this is only a sampling of some of the things you can bolt together using off-the-shelf modules, and some minimal touch-up work.

As the MECM or MEM/CM team adds more to the Management Insights library of tools, you can expect to peel off a few custom tools, but that may be a gradual process. Keep an eye on this feature with each new build that you install.

This isn’t at all restricted to MEM/CM/ConfigMgr, or even SQL Server (even though I spent a lot of time on those here).

Now, put on your best pair of Latex gloves, and smile. 🙂