Technology, windows

Windows 11 – Part 2: The Electric Boogaloo

I’ve already shared my thoughts on Windows 11, but that was pre-release. This is now.

WARNING: May contain language not suitable for intelligent people.


What I like about Windows 11

Pretty much the same as my previous post. The user interface is nice. It’s clean. It’s refined. It’s fresh. It’s a lot like MacOS, and I don’t even use a Mac (my son uses Mac exclusively, and loves to joke about Windows 11 now).

The new Settings app is finally what it should’ve been from day one with Windows 10. It looks like it was kidnapped in a white van from camp Linux, but it’s really nice. I like it.

The centered taskbar is not really a deal-breaker for me. I know some people hate it, but I just don’t care if the icons are in the middle or to the left/right/top/bottom, etc. But if you want some (really nice) PowerShell help with the UI, check out CustomizeWindows11 by Jaap Brasser. (Install-Module CustomizeWindows11).

“Windows Terminal” has replaced “Windows PowerShell” as the default Start menu right-click option. I like Terminal a lot.

Finally! The “Enterprise” SKU doesn’t install a bunch of stupid games like Windows 10 did early on (Candy Crush, etc.) Someone finally listened to the business customers.

What I don’t like about it

It’s still version 10.x (10.0.22000 to be exact). As if vendors can’t do the ($version -ge 10.0) math? Why?
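If you're a vendor stuck doing that math, here's a minimal sketch of telling the two apart. The 22000 build cutoff is the only reliable signal, since the major version still says 10:

$os = [System.Environment]::OSVersion.Version
if ($os.Major -eq 10 -and $os.Build -ge 22000) { "Windows 11" } else { "Windows 10 or older" }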

PowerShell 5.1 is the only version installed by default. Why?

The same old prompt to run “Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force”. I tweeted that this is like selling cars with 1 lug nut missing on each wheel, every single year, and requiring buyers to install the nuts themselves. This is approaching government-level stupidity. Why?

TLS is still “SystemDefault”, not 1.2. Shouldn’t it default to 1.2 and let customers downgrade if needed? I thought “security” is a major selling point of this version.
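Until that changes, this is the line most of us end up pasting in anyway, to force TLS 1.2 for the current PowerShell session (a per-session workaround, not a fix):

# force TLS 1.2 for this session only
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12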

PowerShell still isn’t woven into the fabric of the UX, it’s still an add-on. For example, in some apps like SSMS, you can right-click and generate a script of a desired action (create, delete, etc.), but why not in Explorer, REGEDIT, or Event Viewer? Frivolous? Maybe. But still… aside from die-hard PowerShell geeks, how much cumulative time is spent every year by users Google-searching how to update a registry value, or export event log search results?

WinGet is flaky. Yes, flaky. Consider item 1, your honor:

winget install --name notepad++ 

This did not work.

This returns an error that there are “Multiple packages found matching input criteria”. No, there are not. There is only one with the name “Notepad++”. The other is “Notepad+” (one plus sign). You have to use --id notepad++.notepad++ (kind of like Little Caesar’s pizza pizza) for this one. This tells me that the input filter handler is flawed (defaulting to regex or something that’s not swallowing the “++” properly).

Also, silent installs are NOT the default. You will need to be careful to add --silent, and possibly --accept-package-agreements and --accept-source-agreements. The default appears to be what MSI folks would call /qb! (basic UI with no cancel button).

The winget silent install command that worked for me

winget install --id notepad++.notepad++ --silent --accept-package-agreements

I’m aware this is nit-picky, but Chocolatey (my favorite package manager so far) doesn’t have this issue. The corresponding command to run a silent install of Notepad++ with Chocolatey is

cinst notepad++ -y

The other complaints I mentioned in the previous blog post still stand. For example,

  • No “Edit” option on right-click menus for PowerShell scripts. Why?
  • File extensions are still hidden by default. Why?

And speaking of hidden file extensions: It’s time once again for…

Exam Question of the Day

You call a (non-technical) user to advise them to perform certain actions while troubleshooting an issue on their shiny-new Windows 11 laptop. You have to speak loudly to overcome the sounds of cats on their end.

You: “I need you to edit the PowerShell script I just copied to your Documents folder. It’s named ‘Sample.ps1’” (see Figure 1)

Them: “Ok, How do I edit it?”

You: “Right-click on it and choose ‘Edit'”

Them: “There is no ‘Edit’. Should I use ‘Open’?”

You: “No!!!!!!!!! It’s got to be modified before you run it!”

Them: “I don’t like your tone. I’m hanging up and calling HR.”

You: (starts working on resume and updating LinkedIn profile)

Try this again…

You: “No. Just right-click on the one without an extension on the name.”

Them: “What’s an extension?”

You: (rubbing your temples with a 9mm and a glass of whiskey) “It’s…. uhhh…. hold on (sips, clicking sound)…. “okay, happy thoughts…”

Them: “What? Are you okay?”

You: “Yeah, I’m fine. It’s just, oh, never mind. So, I’ll need to ask you to click on ‘View’ at the top of the Explorer window, then click on ‘Show’ at the bottom of that popup menu, then click on ‘File name extensions’.”

Them: “That’s dumb. So many clicks? Why aren’t these extensions turned on by default?”

You: “YES!!!! OMG! OMFG!! I love you!!!” (drops headset, and knocks over shot glass)

Them: “I’m married! You need to calm down!”

You: “Yes, I apologize. All good. Let’s continue…”

Which of the following will satisfy the requirement?

A. Quit this IT job and pursue woodworking or guitar playing

B. Ask the user to marry you

C. Continue drinking and playing with your gun

D. Use a GPO or MDM policy to show file extensions, because they should have been displayed by default, from the very first version of Windows to contain Explorer. Then complain to everyone else on StackOverflow, Slack, Discord, Twitter, Reddit, Facebook, Twitch and maybe even LinkedIn, then post on your blog about it.
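For what it’s worth, whatever GPO or MDM policy you pick for option D ultimately flips one per-user registry value. Here’s a hedged one-machine equivalent in PowerShell:

# HideFileExt = 0 shows extensions; Explorer picks up the change after a restart
Set-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced' -Name 'HideFileExt' -Value 0
Stop-Process -Name explorer -Force # Explorer restarts itself automatically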

(Figures 1–4)

As with everything software, it’s never done. Windows 11 will continue to evolve, like all other software products. So I would expect some of these complaints to be remediated over time. It’s just annoying that some of these complaints have been around for a very long time (file extensions, PowerShell version), and some appear to be regressions (right-click menus). That’s a sign of one of two possible things going on:

  • Internal knowledge sharing is broken (old team to new team, or product1-team to product2-team)
  • The customer feedback system is broken (user feedback is not being collected and/or assessed properly)

Maybe Windows 11 22H1 or 22H2 will take care of this.

Scripting, Technology

Ridiculous Redactions

Today’s waste of time will focus on using PowerShell to crawl through a bunch of Microsoft Excel workbook files, commonly referred to as “Excel Database Files” by many MBA recipients, to effectively replace one text string pattern with another. A typical scenario might be that the workbook files contain hyperlinks to documents which have been moved to a new location, or maybe someone got fired, and the CEO hates them enough to demand their name be stricken from every workbook file.

For demonstration purposes let’s assume that CEO Jim Useless wants to replace every occurrence of the name of the former Sales Manager, Mike Hunt, with “Sales Manager”. Because, nothing makes a new employee feel valued and welcomed as much as being referred to only by their job title. So, an example source table (top example below) would be modified to look like the example below that.

Let’s drink a whole pot of coffee (or a whole box of Keurig cups) and get started.

Step 1 is setting up some variables to use later on.

[string]$Path = "C:\Temp\Workbooks" # the folder to crawl (pick your own)
[string]$SearchFor = "Mike Hunt"
[string]$ReplaceWith = "Sales Manager"

Step 2 is getting all the .xlsx files within the folder.

[array]$files = Get-ChildItem -Path $Path -Filter "*.xlsx" -ErrorAction Stop

Step 3 is making sure Excel is installed. I was hoping to use the ImportExcel module for this, but it doesn’t provide the functionality for doing this sort of thing yet, so we need to make sure Excel is available to handle the heavy-lifting.

try {
  Write-Verbose "Opening excel application session"
  $excel = New-Object -ComObject Excel.Application
  $excel.Visible = $False
}
catch {
  Write-Error "Excel could not be found on this cheap-ass computer."
  break
}

Step 4 is putting on some Latex gloves and facemask to begin crawling through all those stinky workbook files. Keep in mind that the Open() method may need to be modified to suit password-protected files, and other such silliness. Then fetch the worksheets within each workbook to start the search and replace fun.

foreach ($file in $files) {
  $workbook = $excel.Workbooks.Open($file.FullName)
  $worksheets = $workbook.Worksheets
  # add more stuff here to crawl through each worksheet in the workbook - see Step 5
}
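Speaking of password-protected files, here’s a hedged sketch of passing a password to Open(). The password is the fifth positional argument, so pad the unused ones with Missing (and “P@ssw0rd” is obviously a placeholder):

# Open(FileName, UpdateLinks, ReadOnly, Format, Password)
$missing = [System.Reflection.Missing]::Value
$workbook = $excel.Workbooks.Open($file.FullName, $missing, $missing, $missing, "P@ssw0rd")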

Step 5 is searching each worksheet for the text you want to replace. This will be shoved into the “add more stuff here” line in Step 4, wrapped in a loop over each worksheet in the workbook (see the merged version below).

# insert into the worksheet loop for each workbook
$found = $worksheet.UsedRange.Find($SearchFor)
if ($null -ne $found) {
  $address1 = $found.Address(0,0,1,1)
  $row = $found.Row
  $col = $found.Column
  [string]$textvalue = $worksheet.Cells($row,$col).Formula
  [string]$newvalue = $textvalue.Replace($SearchFor, $ReplaceWith)
  if ($textvalue -ne $newvalue) {
    $worksheet.Cells($row,$col).Formula = $newvalue
  }
  while ($found = $worksheet.UsedRange.FindNext($found)) {
    $address2 = $found.Address(0,0,1,1)
    if ($address1 -eq $address2) { break } # no more matches found on this worksheet.. skip to the next one
    $row = $found.Row
    $col = $found.Column
    [string]$textvalue = $worksheet.Cells($row,$col).Formula
    [string]$newvalue = $textvalue.Replace($SearchFor, $ReplaceWith)
    if ($textvalue -ne $newvalue) {
      $worksheet.Cells($row,$col).Formula = $newvalue
    }
  }
}
# end of code insertion

The final merged slurry of ingredients should look something like this…

foreach ($file in $files) {
  $workbook = $excel.Workbooks.Open($file.FullName)
  $worksheets = $workbook.Worksheets
  foreach ($worksheet in $worksheets) {
    # search and replace within the current worksheet
    $found = $worksheet.UsedRange.Find($SearchFor)
    if ($null -ne $found) {
      $address1 = $found.Address(0,0,1,1)
      $row = $found.Row
      $col = $found.Column
      [string]$textvalue = $worksheet.Cells($row,$col).Formula
      [string]$newvalue = $textvalue.Replace($SearchFor, $ReplaceWith)
      if ($textvalue -ne $newvalue) {
        $worksheet.Cells($row,$col).Formula = $newvalue
      }
      while ($found = $worksheet.UsedRange.FindNext($found)) {
        $address2 = $found.Address(0,0,1,1)
        if ($address1 -eq $address2) { break } # wrapped around to the first match.. skip to the next worksheet
        $row = $found.Row
        $col = $found.Column
        [string]$textvalue = $worksheet.Cells($row,$col).Formula
        [string]$newvalue = $textvalue.Replace($SearchFor, $ReplaceWith)
        if ($textvalue -ne $newvalue) {
          $worksheet.Cells($row,$col).Formula = $newvalue
        }
      } # while
    } # if
  } # foreach worksheet
} # foreach file

Step 6 is saving and closing the workbook file after all of its worksheets have been processed (this goes at the bottom of the outer foreach loop, once per file).

if (!$workbook.Saved) { $workbook.Save() }
$workbook.Close($False)

Step 7 is to close Excel and clean up the spewage it may leave behind. But also make sure to release those roach-infested COM objects in reverse order (from how they were instantiated).

$excel.Quit()
while ([System.Runtime.InteropServices.Marshal]::ReleaseComObject($worksheet) -gt 0) {}
while ([System.Runtime.InteropServices.Marshal]::ReleaseComObject($worksheets) -gt 0) {}
while ([System.Runtime.InteropServices.Marshal]::ReleaseComObject($workbook) -gt 0) {}
[void][System.GC]::Collect()
# and if Excel just won't die...
Get-Process 'excel' | Stop-Process -Force

Step 8 is to grab something to eat. Because coding and talking make me hungry.

Step 9 is to assemble all of this into one script – which, if you’re as lazy as I am, you can get here. It also contains some additional exception handling sauce and comment seasoning.

I hope you enjoyed this mindless rambling. The code provided at the link above has been tested and works, for me at least, so if it doesn’t work for you, I’ll respond with “that’ll be addressed in the next build.” Seriously, if you see a problem with the mess above, please share a comment below. Thank you!

Enjoy!

business, Scripting, Technology

Replace Excel Data using PowerShell

The full title of this should be, “Replace Excel Data in Workbooks using PowerShell with the ImportExcel Module”, but that’s what the introduction paragraph is for. That should be clear enough, but just in case you are still unsure what this is about: I’m going to demonstrate how to replace hyperlinks in an Excel workbook, using PowerShell. And the best part of this? You don’t even need Excel (or Office) to be installed, in order for this to work. In fact, the better best part is that you don’t even have to do any of this. You could go find something else to do that’s way more fun.

(image borrowed from a Google search showing corporate people excited to look at your new spreadsheet)

Ingredients

You’ll need a few things to run this demo:

  • A Windows device (I don’t think ImportExcel will work on a Linux machine, but I don’t know for sure)
  • PowerShell 5.1 or later
  • ImportExcel (PowerShell module)
  • A sample workbook (.xlsx file)
  • No life whatsoever, oops, I mean time. You’ll need a few minutes of nothing better to do

My sample file looks like this (below) and resides on a worksheet named “Sheet1”. Close your eyes and imagine this has thousands of rows of juicy corporate data links, dripping with exciting corporatey information. Way more than you’d want to edit manually, because that would cut into your Facebook time.

CFO Jim Facefister called to say that the intranet URL is changing because “corp” sounds too “corpy” and he prefers “general”, because it’s more general sounding. So he wants the URLs under the “Link” column updated in his favorite “Excel database” (don’t correct him or he’ll beat you with a golf club), and you want to stay employed long enough to buy that new Ford F150 Lightning your neighbor would envy for at least a few months.

So, you need to change the links to replace “/corp/” items with “/general/” and leave the rest of the contents unchanged. And if it makes the CFO happy, maybe he’ll actually call you Brad instead of Bob, every time he passes you in the hall, even though your name is Susan. But never let a valuable compliment go to waste.

Here’s a view of the source code captured from PowerShell ISE, even though I use Visual Studio Code. I do this because I know it infuriates some of my colleagues, and cheap entertainment is all I can afford right now.

For those who don’t like images, here’s the 100-level, no-exception-handling, poorly formatted, grade D, organic, GMO-free source code…

$XlFile = "\\intranet\corporatystuff\corp\corpdata\corpy-corp-corp\Company Handbook.xlsx"

if (-not(Test-Path $XlFile)) { Write-Warning "File not found: $XlFile"; break }

$xldata = Import-Excel -Path $XlFile -WorksheetName "Sheet1"

$newdata = $xldata | Foreach-Object {
    [pscustomobject]@{
        Title = $_.Title
        Link  = $($_.Link -replace '/corp/', '/general/')
    }
}

$newdata | Export-Excel -Path $XlFile -WorksheetName "Sheet1" -ClearSheet -AutoSize

Let’s walk through this stuff (or crawl through it, depending on how much you’ve had to drink today). First we define the file (path and name) as $XlFile. Then we check to see if it exists, because your colleagues probably moved or renamed it by the time you recovered from last night’s drinking. Then we import the data using Doug Finke’s super-fantastic ImportExcel module function “Import-Excel”. Then we loop through it (the ForEach-Object block) to replace the original “Link” values with the new values. Then we shove it out the door with Export-Excel like, well, shoving it out the door. I’m too tired for more analogies right now.

Pausing right after the Import-Excel call, we can see $xldata holds the original content.

Pausing again just before the Export-Excel call, we can see $newdata holds the modified content.

So that’s pretty much it. Now you can impress your C-level folks and maybe earn a chance to serve drinks at their next party. Next request they’ll send you is to “fix the internet” because it’s broken again.

Cheers!

business, humor, Technology

2001 vs 2021 (aka 2020 CU1)

2001

IT Planning Meeting / Project = Document Management

Teams present: IT Architect, PM, InfoSec, Network, Storage, Accounts, Licensing, Customer stakeholders, Customer stakeholder stakeholders, Stakeholders for other stakeholders, Vendor reps handing out business cards and shaking hands, all packed in one conference room.

Preliminary: Storage and Network teams are blaming InfoSec for system issues. InfoSec is blaming Licensing for holding up a PO. Licensing blames the CFO. PM asks them to keep it down.

Vendor: “(blah blah blah blah blah blah)” (some pointing and hand gestures learned from a sales book)

IT PM: “Thank you for the introduction, Bob.”

Vendor: “Uhhh, it’s Doug, actually.”

IT PM: (turns to stakeholders) “So, what features do you need from Electronic Document Management?”

Customer: “What does it do?”

IT Architect: (talks for 30 minutes, reads directly from PowerPoint slides, attendees coughing, staring at Blackberry phones, texting jokes about TV sitcom episode from previous night, sounds of thumbs clicking on physical keypads, spoons clanking against coffee cups) … “Any questions?”

Customer: “We need all of it.”

General IT takeaway: (old-timers: dread. younger folks: excitement)

2021

IT Planning Meeting / Project = Cloud Security

Teams present: Cloud Architect, PM, InfoSec, Cloud Networking, Cloud Identity, Licensing, Customer stakeholders, Customer stakeholder stakeholders, Stakeholders for other stakeholders, Vendor reps, everyone on a Teams call.

Preliminary: Azure/AWS/GCP and M365 teams are blaming InfoSec for system issues. InfoSec is blaming Licensing for holding up a PO. PM asks them to keep it down.

Vendor: “(blah blah blah blah blah blah)” (some more PowerPoint slides and a QR code)

IT PM: “Thanks for the introduction, Juan.”

Vendor: “Uhhh, it’s Carlos, actually.”

IT PM: (turns to stakeholders) “So, what features do you need from Cloud Security?”

Customer: “What does it do?”

Cloud Architect: (talks for 30 minutes, reads directly from PowerPoint slides, attendees not on mute add background sounds of cats, dogs, birds, car horns, kitchen pots and pans, messaging about the Netflix/Hulu/YouTube/Amazon/HBO show from the previous day, crumpling fast food bags, spoons clanking against coffee cups, someone taking a heavy drag on their vape in front of the mic) … “Any questions?”

Customer: “We need all of it.”

General IT takeaway: (old-timers: dread. younger folks: excitement)

Cloud, Technology, windows

Using PowerShell to Find Out If You’re About to Get a Beating

The goal of this blog post is to export your Azure subscription billing data into Microsoft Excel, so that you can apply some cool charts and distract your significant other from killing you when you avoid telling them how much you “accidentally” spent on Azure services, and they end up finding out later when looking at the bank statement.

Let’s get started.

What You Will Need

  • An Azure subscription of any kind. I use a Pay-as-you-Go plan, which my wife now calls the “You-promised-it-wouldn’t-happen-again-last-time” plan.
  • A machine with PowerShell 5.1 or later (I’m using PowerShell Core 7.0.1 for this, on Windows 10)
  • If you want some “real world” data, set up a VM and configure Bastion on it, and leave it running for a few hours each day for about 5 days. I chose a D2s v3, Windows 10 machine, with a “Premium” SSD data disk (defaults on everything else). Then I installed a bunch of apps and scripts. Yes, you read correctly. A “Premium SSD”. That was a critical mistake. I may force myself to watch 40 episodes of The View in a row as self-punishment. But not right now….

Poking a Stick at it

  1. Open a PowerShell console using “Run as Administrator”
  2. Check that you have the modules Az and ImportExcel installed. If you don’t, install them now. If already installed, make sure they’re the latest version (e.g. use Find-Module to check; a quick sketch follows this list)
  3. Authenticate with Azure
  4. Query for your billing information
  5. Export it to Excel
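Here’s a rough sketch of step 2, assuming the default PSGallery source (and that the modules were originally installed via Install-Module):

foreach ($m in @('Az','ImportExcel')) {
  # newest locally installed copy, if any
  $installed = Get-Module -ListAvailable -Name $m | Sort-Object Version -Descending | Select-Object -First 1
  if (-not $installed) {
    Install-Module -Name $m -Scope CurrentUser
  }
  elseif ([version](Find-Module -Name $m).Version -gt $installed.Version) {
    Update-Module -Name $m
  }
}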

Let’s dive in a little deeper. Remember, the deeper you go, the harder it is for someone who’s upset with you to kill you. Deep breath before you go under.

Connect-AzAccount

You may get a prompt to open a website and enter an authentication code, so whatever happens here, you’re on your own. And no, the code from my session won’t work for you, trust me.

Once you’re authenticated, you can query some information. Let’s fetch your billing information. You can output the results to the screen if you want, but I prefer to capture it to a variable for further tinkering.

$cu = Get-AzConsumptionUsageDetail

Filter down to the most important parts. A lot of entries will have a $0 cost, so I don’t care about those. I also don’t need all of the properties, just the name of the resource (InstanceName), its type (ConsumedService), the Billing Period, usage period and quantity, and finally: the cost (PreTaxCost). For now I’ll sort on the cost in descending order so I can spend a minute freaking out and hyperventilating while my dog snores.

$cu | Where {$_.PreTaxCost -gt 0} | Select InstanceName,ConsumedService,BillingPeriod,UsageStart,UsageEnd,UsageQuantity,PreTaxCost | Sort PreTaxCost -Descending

Now we can export this to an Excel worksheet. I don’t need to sort the results first (you can if you want), since Excel can do that just as well…

$cu | Where {$_.PreTaxCost -gt 0} | Select InstanceName,ConsumedService,BillingPeriod,UsageStart,UsageEnd,UsageQuantity,PreTaxCost | Export-Excel -Path ".\AzureBilling.xlsx" -WorksheetName "2020-06" -ClearSheet -AutoSize -AutoFilter -FreezeFirstColumn -BoldTopRow -Show

The Export-Excel parameters are pretty self-explanatory. But the “ClearSheet” parameter might be less obvious. It wipes the target worksheet (i.e. “2020-06”) before populating with fresh data, so I can re-run the code over and over without losing data on other worksheet tabs (same spreadsheet file).

In my case, the surprise was that I forgot about the costs of maintaining a “jump server” in Azure with a data disk and Bastion access. Even with automatic (scheduled) daily shutdowns, the storage and network costs accumulated while I was focused on beer, grilling, Netflix, Twitter, YouTube, and blasting on my drumkit as if I know how to play. A few days of that (six to be exact) grew my normal 50 cents/month bill to $94.

A somewhat prettier look shows the chronological details like roadkill…

Point A is where I set up the so-called “cheap” Windows 10 VM with a “premium” data disk (you thought I’d skip over that, didn’t you). I was only running it a few hours each day, and shutting it off (deallocated). But just like listening to my neighbor’s terrible music, the damage had already been done.

Point B is the day after my wife walked by and saw the “Cost Analysis” view on my screen, paused and said, “Um, what’s that? That doesn’t look like an Amazon order? Two-hundred and forty-wtf dollars?!” (for the record, she rarely drops F-bombs, even after being around me)

Point C is where I’d be making arrangements for cremation or burial (for the record, I asked for cremation by explosives with a BBQ party, at a safe distance. I don’t think the ATF will approve. Plan B is a gasoline tanker truck and a bonfire, at a safe distance of course)

Anyhow, you can see there’s a tiny gap at point C (May 27) where my (please God I promise I’ll be good from now on!) costs are diverging from the projected (oh no no no nooo!!!!) cost trajectory. I’ll be monitoring this every day because I’m in super-turbo-hyper-ultra-maxi paranoid mode now.

Takeaways

Some of the comments I’ve received from casual conversation with friends:

“You should’ve set a budget and alerts” – I did, but with so much email I don’t check my other inboxes often enough. I will now (and there’s a budget sketch below).

“Didn’t Mike Teske say to use a pre-paid card with a hard limit?” – yes, he did.

“Hey, didn’t Mike Teske warn about using a pre-paid card with a hard limit?” – yes, he sure did.

“Hey, didn’t…?” – yes, yes, he did! damn it!

“Geez. What a dumbass.” – (sigh).

“Actually, my bill last month was four times that. But I’m single.” – I hate you.
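About that first comment: here’s a hedged sketch of setting a budget with the Az.Billing cmdlets (a $50 monthly budget that nags me at 80%; names, amounts and email are all placeholders):

# assumes New-AzConsumptionBudget from Az.Billing; StartDate must be the first of a month
New-AzConsumptionBudget -Name "dont-kill-me" -Amount 50 -Category Cost -TimeGrain Monthly `
  -StartDate (Get-Date -Day 1).Date -ContactEmail "me@contoso.local" `
  -NotificationKey "warn80" -NotificationEnabled -NotificationThreshold 80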

Anyhow, I should’ve known better. I watched Mike Teske give a fantastic presentation on avoiding surprise costs with Azure (at the PowerShell Saturday conference in Raleigh), and yet, still, I went full-stupid.

I’m never too proud to admit my mistakes (like having friends who are single and debt-free), but aside from that, I’ve learned my “don’t get out of the boat” lesson* about taking cloud costs seriously. Pay attention to the disk types, the secondary costs (storage, Bastion, etc.) and listen to Mike.

(* line from Apocalypse Now)

System Center, Technology

Cranky Dave sez Knock that Shit Off

Maybe it’s because I had a string of calls which rubbed the same raw nerve endings too close together. Idk. But I feel like the message isn’t getting out there. Please, if you work with customers, and see these things, urge the living shit out of them to reconsider. If that doesn’t work, duct tape and white vans are still available.

  • Doing things “just because” it’s been that way since 2010
  • Giving out local admin rights like candy. If you want to be a drug dealer, run for Congress or Senate.
  • Using login scripts as first response to every need.
  • Repackaging every app installer, even when it only needs “/S” to run quiet
  • Over-complicating things just to be cute/clever. Look at me virtualizing my virtual servers inside another virtual server with 3 databases, and I only have 100 devices to manage.
  • Read the #(*$&@(*#$&(_(*@(#*$&) docs and follow the “supported” terms. Stop assuming you’re smarter than a room full of MVPs, *and* a yacht filled with drunk attorneys who all graduated from Harvard.
  • If your environment has to be complicated, it’s most likely because your business is over-complicated, and possibly broken. (Paul’s rule: If you automate a broken process, you will only ever get an automated broken process).
  • Stay within support – Don’t cry for help with your SQL 2005 “mission-critical” database. If it was “mission-critical” it would be running on a supported version.
  • Keep your shit patched – SQL 2016 is nice, but RTM is like 2 service packs and 12 cumulative updates behind! This isn’t 1962. We finished the Moon program a while ago.
  • Do I sound cranky? I’m grabbing a Snickers.

databases, Scripting, System Center, Technology

Basic ConfigMgr HealthChecks using PowerShell


I’m long overdue for a deep-dive (pardon the pun), so drink-up and let’s move on…

The power provided by PowerShell comes from a lot of different directions, one of them being that you can leverage a ton of built-in functionality without having to buy additional software licensing, or even write all the messy code. That’s right, once again, I’m on my “modules are freaking fantabulously increditastical” soap box. I’ll be using a few different modules to do the heavy lifting:

  • dbatools (all of the Get-Dba*/Invoke-DbaQuery goodness)
  • Carbon (Get-CPermission)
  • PSWindowsUpdate (Get-WindowsUpdate)
  • ActiveDirectory (Get-ADComputer, part of RSAT)

And even though I won’t be showcasing it in this post, if you wish to export anything to Excel, rather than hopping through CSV first, take a look at the module ImportExcel by Doug Finke (the Export-Excel function in particular).

Heads-Up: This is not intended to be a “solution” that you simply download and run. I prefer to share the basic pieces and a direction, and let you take it and run with it however (and wherever) you like. Sharing a fully-bolted, polished solution doesn’t leave you with room to explore and customize without a lot of reverse engineering. Here’s the bricks, have fun building.

If you’re wondering why I’m not covering CMHealthCheck, it’s because (A) it would violate the “heads-up” goal mentioned above, and (B) that module is becoming a bit dated anyway (I’m working on a replacement, feedback is always welcome).

And Now for a Word on Modules

I’ve been in a few discussions about “make vs. buy” or “build vs. borrow” view of scripting. For decades (yes, I’m that freaking old, so you’ll have to speak up), I had always leaned towards building everything. Even when finding a near-perfect match online, I would rewrite it to my tastes. Not anymore. Time is more precious, and I’m not too proud to accept someone else might have provided a better option than I would have built.

In 2020, the state of online sharing is 1000 times what it was 10 years ago. It’s now to the point where not finding a close match for a desired technical solution is the exception, rather than the norm. Only the newest emerging things are lagging behind, mostly due to the trend of over-stimulated coke-snorting CI/CD fanaticism, but I’ll leave that for another episode of “Old man says GTFO my lawn you little CI/CD pipeline bastards!” But, I digress.

To me, modules are like car parts. Even when you build, or restore, a car, you’re not likely going to make EVERY single part from scratch (unless you own a smelting factory, chrome dip tank, a cow farm for leather, and so on). Most components are built by someone else. So, building things from parts is just a natural thing to me. It’s organic. Okay, soap box session is done. Let’s move on.

Getting Things Ready

To perform almost any health assessments, you’ll need sufficient access to the resources. In a typical ConfigMgr environment (if there is a typical ConfigMgr environment), this will translate into:

  • Full Administrator (in ConfigMgr)
  • ServerAdmin (in the SQL instance)
  • Local Administrator (on the site servers)

These are often granted to the account which was used to install the Configuration Manager site. Hopefully, it’s not an actual “user” account (that a human logs in with every day), but a service-type account. If you are not a DBA (or the DBA-ish person who “owns” the SQL instance) you may want to confer with them first, to make sure you don’t step on any toes. Pro-tip: bring doughnuts and fresh jokes.

When I say “Local Administrator”, I don’t mean adding your domain account directly into the local Administrators group, although that does work. It’s recommended that you are a member via domain group membership and that whole AG(U)DLP chain of handling that Microsoft has recommended for decades.

I already mentioned the PowerShell modules I’ll reference, so those need to be installed on your working computer (not on the site servers or database server, unless that’s all you’re working on)

To save on repetitive typing, let’s define some variables to use throughout the following examples. Replace the string values with whatever your TEST LAB environment uses:

$dbhost = "cm01.contoso.local" # site SQL host FQDN
$cmhost = "cm01.contoso.local" # CM primary site host FQDN
$site   = "P01" # CM site code
$cmdb   = "CM_P01" # CM site database

MECM Site Information

To help with finding things in SQL, mainly the default Views, I recommend running the following snippet, so you can use the results to search more easily:

$dbviews = Get-DbaDbView -SqlInstance $dbhost -Database $cmdb -ExcludeSystemView

An example for finding views which relate to something like “site”…

$dbviews | Where {$_.Name -match 'site'} | select name

You can also pass this into a cheap GridView (only $0.99 while supplies last) to pick-and-run your favorite view…

$view = $dbviews | Out-GridView -Title "Pick a View to Query" -OutputMode Single
if ($null -ne $view) {
  $query = "select * from $($view.Name)"
  Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query $query
}

I have a slightly more fancy version of the above sample, as a function, up on my GitHub at http://bit.ly/2SYYOOL. You can load it directly into a console session, and run it, using Invoke-Expression…

iex (New-Object System.Net.WebClient).DownloadString('http://bit.ly/2SYYOOL')
Invoke-CmDbView -SqlInstance $dbhost -Database $cmdb

Site Information Summary

Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query "select * from v_Site"

General Client Information

I recommend saving the output of the following script to a variable, for use as a baseline for other operations (rather than requesting new data for each sub-query). I’m using $cmdevices for this example…

$cmdevices = Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -Query "select * from v_CombinedDeviceResources where (name not like '%unknown%') and (name not like '%provisioning device%') order by name" | 
select Name,MachineID,SerialNumber,MACAddress,DeviceOS,DeviceOSBuild,CoManaged,ClientVersion,IsVirtualMachine,ADSiteName,LastMPServerName,LastPolicyRequest,LastDDR,LastHardwareScan,LastSoftwareScan,LastActiveTime,LastClientCheckTime,ClientCheckPass

From this you can filter on things like the following examples.

Devices with Old or Missing Hardware Inventory

Find devices which haven’t reported hardware inventory yet…

$cmdevices | Where {[string]::IsNullOrEmpty($_.LastHardwareScan)}

Find devices which have reported hardware inventory in the past, but not with the past 30 days…

$cmdevices | Where {(-not[string]::IsNullOrEmpty($_.LastHardwareScan)) -and ((New-TimeSpan -Start $_.LastHardwareScan -End (Get-Date)).Days -gt 30)}

Compare Device Coverage with AD

$adComps = Get-ADComputer -Filter * -Properties lastlogontimestamp,whenCreated,operatingsystem,description

I included some additional attributes in case I want to also compare last-login dates, and so on. But anyhow, to use this to compare devices between AD and MEM, you can run some super-basic tests like this…

$adComps | Where {$_.Name -notin $cmdevices.Name} | select Name
$cmdevices | Where {$_.Name -notin $adComps.Name} | select Name

The example above shows I have more devices in Active Directory which are not in the ConfigMgr database, than I have devices in ConfigMgr which are not in Active Directory. What kind of “health” is this? It’s a measure of how clean and controlled your environment really is.

General Windows Host Information

Get-CimInstance -ClassName "Win32_ComputerSystem" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_OperatingSystem" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_SystemEnclosure" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_Product" -ComputerName $cmhost
Get-CimInstance -ClassName "Win32_BIOS" -ComputerName $cmhost

Disks and Disk Space

Get-CimInstance -ClassName "Win32_LogicalDisk" -ComputerName $cmhost | % {
  [pscustomobject]@{ 
    Drive  = $_.DeviceID
    Name   = $_.VolumeName
    SizeGB = [math]::Round(($_.Size / 1GB),2)
    FreeSpaceGB = [math]::Round(($_.FreeSpace / 1GB),2)
    Used   = [math]::Round(($_.Size - $_.FreeSpace) / $_.Size, 2)
  }
}

Network Connection Properties

Get-CimInstance -ClassName "Win32_NetworkAdapterConfiguration" -ComputerName $cmhost | 
    Where {$_.IPEnabled -eq $True} | 
        Select IPAddress,DefaultIPGateway,IPSubnet,MACAddress,DNSServerSearchOrder,DNSDomainSuffixSearchOrder | ForEach-Object {
            [pscustomobject]@{
                IPAddress   = $_.IPAddress -join ','
                IPGateway   = $_.DefaultIPGateway -join ','
                IPSubnet    = $_.IPSubnet -join ','
                MACAddress  = $_.MACAddress
                DNSServers  = $_.DNSServerSearchOrder -join ','
                DNSSuffixes = $_.DNSDomainSuffixSearchOrder -join ','
            }
        }

File Shares

Get the file shares, folder and share permissions. This information can be used to further automate for “drift” reporting and remediation, when someone (or some process) modifies them for whatever reason. (Note: The following example has no exception handling. You may want to add some nested try/catch handling inside the foreach-object (%) section.)

$shares = Get-CimInstance -ClassName "Win32_Share" -ComputerName $cmhost | 
  where {$_.Name -ne 'IPC$'} | % { 
    $spath = "\\$cmhost\$($_.Name)"
    $fpath = "\\$cmhost\$($_.Path -replace ':','$')"
    $perms1 = Get-CPermission -Path $spath
    $perms2 = Get-CPermission -Path $fpath
    [pscustomobject]@{
      Name = $spath
      Path = $_.Path
      Description = $_.Description
      SharePermissions = $perms1
      FilePermissions = $perms2
    }
  }

Stopped or Failed Services

Another common check is looking for services which are set to “automatic” but are not currently running…

Get-CimInstance -ClassName Win32_Service -ComputerName $cmhost |
  Where {$_.StartMode -eq 'Auto' -and $_.State -ne 'Running'}

Ooooh. Missing Updates?

What about those pesky Windows updates on your site systems? Yeah, they need them. And SQL Server updates too.

Get-WindowsUpdate -ComputerName $cmhost -WindowsUpdate
# note: if the -ComputerName connection fails, try using Enter-PSSession instead

Event Logs

The Windows Event Log is a gold mine for finding current and potential issues with a Windows Server.

# system log "critical","warning" and "error" entries in the last 24 hours...
$xfilter = @'
<QueryList>
  <Query Id="0" Path="System">
    <Select Path="System">*[System[(Level=1 or Level=2 or Level=3) and TimeCreated[timediff(@SystemTime) &lt;= 86400000]]]</Select>
  </Query>
</QueryList>
'@
$sysEvents = Get-WinEvent -FilterXml ([xml]$xfilter) -ComputerName $cmhost

# application log "critical","warning" and "error" entries in the last 24 hours...
$xfilter = @'
<QueryList>
  <Query Id="0" Path="Application">
    <Select Path="Application">*[System[(Level=1  or Level=2 or Level=3) and TimeCreated[timediff(@SystemTime) &lt;= 86400000]]]</Select>
  </Query>
</QueryList>
'@
$appEvents = Get-WinEvent -FilterXml ([xml]$xfilter) -ComputerName $cmhost

ConfigMgr Server Logs

Oh boy. This part isn’t fun. You can search the server and component status events within the site database, which is often faster, but I’ll save that for another post.

For this one, I borrowed a very nice script by Adam Bertram, aka “Adam the Automator” (Twitter: @adbertram) and modified it slightly (okay, I poured dumb sauce all over it) to read server logs instead of client logs. I realize that some logs don’t follow a consistent internal format, so if you know of a better alternative, please chime in?

iex (New-Object System.Net.WebClient).DownloadString('http://bit.ly/2vIhtXk')
$logs = ('sitecomp','dataldr','hman','distmgr','smsexec','wsyncmgr')
$logs | % {
  Get-CmServerLog -ComputerName $cmhost -SiteCode $site -LogName $_ | ? {$_.Category -eq 'Error'}
}

Database Information

A Configuration Manager site isn’t much good without a SQL Server database. And a SQL Server database isn’t much good if it’s suffering from issues resulting from mis-configuration, neglect of maintenance and updates, and so on. So any real “health check” of a system implies checking all the parts which it depends on, which in this case is the site database.

SQL Instance Summary

This will return basic version and update information, such as version, build, service pack, cumulative update and KB levels, and support status.

Get-DbaBuildReference -SqlInstance $dbhost

Getting SQL Server update compliance can be tricky. At least it has been for me, and probably because I’m dense AF. But if you find it tricky too, then maybe it’s from something else, but anyhow, here’s one way…

# compare the installed build against the latest available build...
Test-DbaBuild -SqlInstance $dbhost -Latest

CM Database Summary

This will return a summary of your SQL database, such as name, instance, status, recovery model, compatibility level, collation, owner, and basic backup info.

Get-DbaDatabase -SqlInstance $dbhost -Database $cmdb

Connection Authentication Scheme

Test-DbaConnectionAuthScheme -SqlInstance $dbhost

SQL Instance Memory Allocation

This will return summary information about the current maximum memory limit, and current usage for the instance (in megabytes).

Get-DbaMaxMemory -SqlInstance $dbhost

You can also retrieve current memory usage stats…

Get-DbaMemoryUsage -ComputerName $dbhost

Database File Information

This will return details about each .mdf and .ldf file for your CM database, such as path, size, status, reads/writes, and more.

$dbfiles = Get-DbaDbFile -SqlInstance $dbhost -Database $cmdb

Database File Auto-Growth Information

This is basically an extension of the example above, which dives more into the auto-growth aspects.

$dbfiles | select LogicalName,Size,Growth,GrowthType,UsedSpace,NextGrowthEventSize,TypeDescription

Database Index Fragmentation

This will return the current fragmentation state of your database indexes (indices?) I prefer to break this into two (2) parts: a query file, and the script code. The query file contains only the SQL statement, which the script code imports using the -File parameter. The first example below is the SQL statement, followed by the PowerShell script.

SELECT 
  dbschemas.[name] as 'Schema',
  dbtables.[name] as 'Table',
  dbindexes.[name] as 'Index',
  indexstats.avg_fragmentation_in_percent as 'FragPct',
  indexstats.page_count as 'PageCount' 
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL) AS indexstats
  INNER JOIN sys.tables dbtables on dbtables.[object_id] = indexstats.[object_id]
  INNER JOIN sys.schemas dbschemas on dbtables.[schema_id] = dbschemas.[schema_id]
  INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
  AND indexstats.index_id = dbindexes.index_id
WHERE 
  indexstats.database_id = DB_ID()
ORDER BY 
  indexstats.avg_fragmentation_in_percent desc
$qfile = "[ENTER_THE_SCRIPT_PATH_HERE]\Documents\indexfrag.sql"
$threshold = 40 # index frag percent baseline, whatever you prefer
$stats = Invoke-DbaQuery -SqlInstance $dbhost -Database $cmdb -File $qfile
[math]::Round(($stats | where {$_.FragPct -gt $threshold}).Count / $stats.count, 2)

Failed SQL Agent Jobs (last 24 hours)

Get-DbaAgentJobHistory -SqlInstance $dbhost -StartDate (Get-Date).AddHours(-24) | Where {$_.Status -ne "Succeeded"}

Database Backup History

Get-DbaDbBackupHistory -SqlInstance $dbhost

Conclusion

I think I’ve talked enough for now, and I’m out of coffee. As I mentioned earlier (I think), this is only a sampling of some of the things you can bolt together using off-the-shelf modules, and some minimal touch-up work.

As the MECM or MEM/CM team adds more to the Management Insights library of tools, you can expect to peel off a few custom tools, but that may be a gradual process. Keep an eye on this feature with each new build that you install.

This isn’t at all restricted to MEM/CM/ConfigMgr, or even SQL Server (even though I spent a lot of time on those here).

Now, put on your best pair of Latex gloves, and smile. 🙂

databases, Projects, Scripting, System Center, Technology

Cool SQL Tricks with DbaTools and MEM ConfigMgr

If you’ve ever wanted to export information from a Configuration Manager site database, you’ve probably found many different ways to do it. Sometimes the management console provides an easy method, sometimes not. Sometimes the SMS Provider (WMI) makes it easy, sometimes not (think set-join operations, for example). Then there’s SQL queries.

One of the best features of SQL, as a language, is the ability to perform dataset joins, or combining results from multiple source tables/views/functions as if the source data was all in one place (it kind of is, but kind of isn’t). Then comes the “how do I run a query against the database from PowerShell?” question.

A lot of examples use .NET ADO methods like connections, datareaders, data adapters, and so on. Which is fine, but yuck! That’s a lot of messy code and I just took a shower. I’m lazy, and I hate writing more lines of code than I really need. Enter PowerShell modules.
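Just to show what I mean by messy, here’s a minimal sketch of the ADO.NET route (the server, database and query are placeholders from my lab):

$conn = New-Object System.Data.SqlClient.SqlConnection("Server=cm01.contoso.local;Database=CM_P01;Integrated Security=True")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "select CollectionID, Name from v_Collection"
$reader = $cmd.ExecuteReader()
$results = while ($reader.Read()) {
  # shape each row into an object by hand... yuck
  [pscustomobject]@{
    CollectionID = $reader['CollectionID']
    Name         = $reader['Name']
  }
}
$reader.Close()
$conn.Close()

Compare that with the one-line Invoke-DbaQuery examples further down.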

Modules are basically toolboxes, filled with tools to help with certain tasks, by taking care of the messy or complicated stuff for you in the background.

If you’ve ever used PowerShell, you’ve used modules. Just type Get-Module and press Enter to see the ones currently loaded in memory. To see all of the modules installed add -ListAvailable after it (e.g. Get-Module -ListAvailable).

One module I absolutely love is dbatools. It’s a toolbox of functions for dealing with SQL Server from inside and outside. By “inside” I mean interacting with tables, views, functions, agent jobs, and so forth. By “outside” I mean server configuration, statistics, backups, processes, memory allocation, version and build info, and much more. It really is amazing. (follow @psdbatools on Twitter for updates and tips)

Install-Module dbatools

Then type “Get-Command -Module dbatools” and lean closer to your screen. Slide a plate under your mouth to catch the drool.

I use a lot of functions in that module, and even so, only about 10% of the overall list. That’s still enough for my needs, and when new needs arise, it’s usually ready for me. For a full list of commands – click here. (note: if it’s missing something, you can suggest it, or contribute via their GitHub site).

Invoke-DbaQuery

The Invoke-DbaQuery function submits a query to a specified database from either a file or text. To use a file, specify the full path/name using the -File parameter. To use text, specify the -Query parameter. Here’s an example for showing all the Collections, along with their ID and membership count…

Note that the -SqlInstance parameter refers to the SQL server hostname (and instance name, if not referencing the “default” instance), and the -Database parameter is the name of the database (I’m not being snarky, I really meant that).

$query = "select CollectionId,Name,MemberCount from v_Collection order by Name"
Invoke-DbaQuery -SqlInstance "cm01.contoso.local" -Database "cm_p01" -Query $query

Before you get too excited, and start getting twitchy fingers, please heed the following recommendation:

  • Do not EVER submit changes directly to the database for a ConfigMgr site. No writing or deleting, renaming, altering of any kind. Just read operations.
  • Limit your read operations to just what you need, and keep in mind that every request you submit adds to the overhead it is already dealing with.
  • TEST EVERYTHING IN AN ISOLATED, NON-PRODUCTION LAB ENVIRONMENT THOROUGHLY BEFORE YOU EVER THINK ABOUT TRYING IT IN PRODUCTION (unless you win the PowerBall and just don’t care, but don’t forget me?)

WMI vs SQL vs REST (Admin Service)

Some of you may ask about the differences between SQL, WMI and REST when it comes to reading data. WMI (SMS Provider) has been the go-to since, well, forever. The problems with WMI for intensive data operations are:

  • WMI is slower than SQL. Typically a lot slower. Compared with SQL queries, it’s like listening to Mitch McConnell talk after he’s had 4 martini’s, and then listening to John Moschitta Jr. talk after he’s had 4 cans of Red Bull. And WMI queries (WQL) don’t support compound join operations.
  • The Admin Service (REST API) is newer, and will eventually replace the SMS Provider. It provides a robust channel using a web service, which, like the SMS Provider, acts as a broker between the requestor and the actual SQL database. If you’ve used the Modern Driver Management solution from SCConfigMgr.com, the task sequence step which runs a PowerShell script is using a custom web service which provides the same function (only at a smaller scale)
  • SQL is by far the fastest for returning large datasets, and excels at complex join operations. But it’s also tapping into the very heart of ConfigMgr and should be done carefully.

The Admin Service is probably going to become the go-to path for interacting, programmatically, with MEM/CM databases, and it is very cool indeed. But for me, there will always be a soft spot for using SQL. So don’t take this as a “you should use SQL always for everything” suggestion. It’s really just ONE of several ways you can get data out of the site.

For more on the Admin Service, read Sandy‘s article here, and Adam Gross‘s article here.

Back to SQL for a moment… Another example, this one for querying the general MEM/CM site information:

$query = "select * from v_Site"
Invoke-DbaQuery -SqlInstance <server\instance> -Database <name> -Query $query

As of Tech Preview build (5.00.8946.1000) there are 1757 non-system views in the CM_xxx database schema. Sometimes it can be a challenge to search (or filter) within SSMS to find one you need. PowerShell can make it a little easier, especially with Get-DbaDbView.

$views = Get-DbaDbView -SqlInstance "cm01" -Database "cm_p01" -ExcludeSystemView | select Name | sort Name

Because this is often a slow return, I recommend saving the output to a variable (as shown above). This makes it less painful to filter/sort than sending a new request.

Getting more serious, like putting-my-pants-on serious, let’s check the database index fragmentation status. For this, since the query is a bit longer, I will dump it in a file and save it as “indexfrag.sql” (shown below) and use the -File parameter to submit the contents to the SQL instance. I also want to save the output to a variable, so I can perform some filtering and sorting:

SELECT 
  dbschemas.[name] as 'Schema',
  dbtables.[name] as 'Table',
  dbindexes.[name] as 'Index',
  indexstats.avg_fragmentation_in_percent as 'FragPct',
  indexstats.page_count as 'PageCount' 
FROM sys.dm_db_index_physical_stats (DB_ID(), NULL, NULL, NULL, NULL) AS indexstats
  INNER JOIN sys.tables dbtables on dbtables.[object_id] = indexstats.[object_id]
  INNER JOIN sys.schemas dbschemas on dbtables.[schema_id] = dbschemas.[schema_id]
  INNER JOIN sys.indexes AS dbindexes ON dbindexes.[object_id] = indexstats.[object_id]
  AND indexstats.index_id = dbindexes.index_id
WHERE 
  indexstats.database_id = DB_ID()
ORDER BY 
  indexstats.avg_fragmentation_in_percent desc

$stats = Invoke-DbaQuery -SqlInstance "cm01" -Database "cm_p01" -File "C:\users\clueless\Documents\indexfrag.sql"

Now I can filter the results to see how bad my fragmentation really is:

[math]::Round(($stats | where {$_.FragPct -gt 40}).Count / $stats.Count, 2)

My lab database doesn’t seem too bad (only about 3% of my indexes are more than 40% fragmented), but that’s because I use Ola Hallengren‘s DB Maintenance Solution (as explained by Steve Thompson)

By the way, speaking of Ola Hallengren’s solution, dbatools has a built-in function for installing it. Check out Install-DbaMaintenanceSolution. One caveat is that it doesn’t create the nice Maintenance Plan that Steve describes, and some of the jobs aren’t configured with all the options (e.g. IndexOptimize – USER_DATABASES).

To list the Agent jobs, use Get-DbaAgentJob:

Get-DbaAgentJob -SqlInstance "cm01" # dumps a lot of details for each job

Gluing things together

These are just a few building blocks, and you can easily start building all sorts of diabolical planet-saving awesomeness to earn your next bonus check. For example:

  • Query for Devices which meet (or don’t meet) a given condition
  • Output to a CSV, XLSX file, or HTML file
  • Attach to an email and send it on a schedule
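For instance, here’s a rough sketch that glues those three bullets together (the version baseline, addresses and server names are all made up; schedule it with Task Scheduler or a Runbook, and Export-Excel comes from the ImportExcel module):

$query = "select Name, ClientVersion, LastPolicyRequest from v_CombinedDeviceResources where ClientVersion < '5.00.9000.1000' order by Name"
$stale = Invoke-DbaQuery -SqlInstance "cm01.contoso.local" -Database "CM_P01" -Query $query
$report = "$($env:TEMP)\StaleClients.xlsx"
$stale | Export-Excel -Path $report -WorksheetName "StaleClients" -ClearSheet -AutoSize
Send-MailMessage -To "memadmins@contoso.local" -From "reports@contoso.local" `
  -Subject "Clients below baseline version" -Attachments $report -SmtpServer "smtp.contoso.local"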

Aside from basic things like this, you can install the module on ANY machine in your environment which has access to the network, it doesn’t need to be on the SQL host (actually, it’s preferred that you don’t install things on your site servers unless absolutely necessary).

Having dbatools on a workstation, or utility server, allows you to consume data from multiple instances and databases. You can use modules like dbatools for building custom automation processes. For example, you can install it on an Azure Automation Hybrid Worker to leverage it with Runbooks. This is handy for pulling data from disconnected environments (think multiple AD forests without lateral trusts). Even shaping the extracted data for uploading into Log Analytics (now Azure Monitor).

The sky is the limit. The world is your oyster. Now, drink some super-caffeinated stuff and get busy!

Cloud, Devices, Scripting, System Center, Technology, windows

Rant 42 – Getting In Tune with Intune

Keep in mind that everything I’m about to say is within the context of the following:

  • This is entirely focused on managing Windows desktops and laptops. This has nothing to do whatsoever with mobile devices.
  • This is based on querying the Graph API “beta” version, but much of it applies to v1.0 as well
  • I’m more comfortable crawling inside Configuration Manager and SQL Server databases
  • It’s entirely possible that I’m a complete idiot and haven’t the slightest clue what I’m talking about
  • I’m struggling to be funny, but I’m tired and pissed off trying to get inventory data for a customer and keep landing on “if this was ConfigMgr I’d have it in 5 seconds”
  • After three weeks of no beer, wine or anything fun, I just finished two very tasty beers. And now my fingers want to type some shit out.

In the Beginning

(Morgan Freeman voice here) Most of the last 20 years I’ve worked in “IT” have been spent muddling with computer devices of various kinds; not necessarily from a hardware aspect (calm down), but from a software and infrastructure angle. Deploying, configuring, managing, repairing, replacing, decommissioning. Okay, and breaking, and swearing at. Mostly those which run some flavor of Microsoft Windows.

As such, there are many commonplace scenarios I’ve dealt with, which may seem very familiar to you:

  • Inventory
  • Health (prescriptive and reactionary)
  • Disposition
  • Compliance
  • Financial
  • Legal
  • Control

The first three are pretty common everywhere, regardless of what hardware or software brands are used. The fourth is emerging as the top contender. And while financial and legal aspects are important, ask any attorney, it’s the last item, Control, that takes the lead. After all, you can’t really guarantee any of the others without having control in some respect.

After almost 25 years of getting involved with various aspects of inventory management, what I find most interesting is a seemingly disconnected view of Control and Inventory. Many organizations seem to view them as semi-related, or unrelated. But they are actually entirely related. Like politicians and pedophiles, they’re impossible to separate.

You need inventory to gain control, and you need control to gather inventory. And for decades, traditional (on-prem) solutions have evolved and matured to meet the needs of almost every customer in every environment when it comes to inventory and control.

Most modern cloud-based offerings provide a wide variety of control capabilities, but are lacking when it comes to inventory. This is arguably due to having a mobile device focus, at least during their inception. Mobile phones typically don’t incur the same depth of inventory concern as desktop and laptop computers. This is partly due to the platform restrictions and licensing costs for available apps (consider the most expensive phone app vs. the most expensive desktop apps).

The operational aspects are different, as are the lifecycle management aspects. Think of the typical sequence of events in the life of a mobile phone and compare that to a typical laptop or workstation. Before you get your panties tied around your neck in angst, compare deploying most mobile apps with deploying things like ArcGIS, AutoCAD, Inventor, USB-related software, device drivers, language packs, multiple user profiles, and so on. Yeah, they’re not quite the same animals.

For decades, Active Directory was the backbone of device inventory management, but in 2020 that’s no longer exclusive. Not that AD was a comprehensive or robust solution in that regard, but that it was a back-plane on which other applications, scripts, databases, and so on were extended. AD was, and still is, a foundation for managing devices (and users and groups). And now there’s Azure AD.

More and more customers, particularly smaller organizations, don’t want any “servers” to manage. Many larger organizations have smaller teams that they’d like to be less encumbered with on-premises infrastructure. They want devices to live in the cloud, and be managed entirely, and exclusively, in the cloud.

The challenge is that cloud solutions were built, initially, to address a different need: mobile devices (mostly phones and tablets). Traditional desktops and laptops have enjoyed decades of maturing management tools. And now the market is pushing one solution to cover a different need.

The more that customers come knocking on our doors with questions about replacing on-prem inventory and management systems with Intune, the more challenges I encounter with gaps in functionality. This is particularly true for inventory management. And since a huge number of customers rely on inventory for a variety of business needs, they demand a solution that provides comprehensive and accurate inventory data.

That said, here’s what I’ve compiled from customer discussions, as well as my own (limited) experiences. Intune management features for Windows computers could use some improvements in at least the following areas:

  • More comprehensive inventory:
    • BIOS information (vendor, version, dates, etc.)
    • Driver information (name, vendor, version, dates, etc.)
    • Applications: Just copy what ConfigMgr gathers, that’ll nail it
    • Custom file inventory (by extension, by wildcard, etc.)
    • User Profiles
    • Windows Services
    • NIC device info, adapter config info (IP addresses, gateway, DNS, etc.)
    • Address some quirks: physicalMemoryInBytes always shows zero (0), ethernetMacAddress is empty for almost all devices, and chassisTypes shows “unknown” (see the sketch after this list)
    • Windows Events (filtering, reporting)
  • Win32 application deployments
    • More robust and simplified troubleshooting (log access/viewing, custom logs)
    • More robust lifecycle management: upgrades, complex batch execution (task sequences), and so on.
  • Patching could be better, particularly having the ability to decline/reject specific updates
  • Run PowerShell scripts on a schedule, view/edit scripts in the portal, and view a history of all past-deployed PowerShell scripts per device
  • I would add REST API (Graph) changes as well, but these would depend on the preceding suggestions
  • I realize some of this is possible with things like ATP and other premium tools, but as a base product, it needs more from the start.
  • I could go on, but I’ll just point you to UserVoice
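
If you want to verify those quirks yourself, here’s a minimal sketch against the Graph beta endpoint. It assumes you’ve already obtained a Graph auth header in $authToken (for example, the way the Microsoft Intune PowerShell sample scripts do it):

# note how often these properties come back as 0, empty, or "unknown"
$uri = "https://graph.microsoft.com/beta/deviceManagement/managedDevices?`$select=deviceName,physicalMemoryInBytes,ethernetMacAddress,chassisTypes"
(Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get).value |
    Select-Object deviceName, physicalMemoryInBytes, ethernetMacAddress, chassisTypes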

Hallucinatory Thoughts


Ever since Microsoft made a bold push for Intune, and the “cloud first, mobile first” motto was announced, the ConfigMgr masses reacted quickly with a message that “ConfigMgr is dead”. I don’t think Microsoft really intended (or expected) that perception to spread so quickly.

The dilemma this likely caused was two-fold: Internal and External

First, the ConfigMgr team was pushing through an enormous backlog of new features and fixes, investing heavily in beefing up the product. Allowing the public perception to go unchecked could risk impacting sales and revenue, as well as team morale, at a critical time. No one wants to work on a project with a doomed future. Valuable people would leave, and potentially-valuable people would avoid it entirely.

Second, imagine you’re a business looking for direction on how to manage (mostly Windows-based) devices on-prem and out in the wild. You have an older version of ConfigMgr, and are struggling to decide whether it’s worth upgrading, or finding a new solution. A pervasive message of a “dead product” would almost certainly steer you towards another solution. Again, impacting sales revenue and so on.

To avoid both risks, Microsoft had to shore up the message that there’s a future, and that future is both products combined as a unified solution. And rightly so. It makes perfect sense. They’ve admitted this is a journey, and one worth sticking to if you want to reap the benefits of a superior solution at a better (more affordable) cost.

However…

There’s still that group of customers who weren’t using ConfigMgr to begin with. They’re using <insert any name of inexpensive product> to manage inventory and reporting, software deployments, patch management, and so on. I’ll paraphrase a quote from one of my customers, but it matches the sentiment of most customers in a similar situation:

“When you compare Intune with an on-prem solution, like PDQ or LAN Sweeper, as limited as they are compared with ConfigMgr, they still outshine Intune for management of, and reporting on, Windows desktops and laptops. And we already own them. We need a compelling reason to drop this and pay a subscription for something new. It’s just not there yet.”

To be honest, most of them I’ve spoken with would love to switch to Intune. The notion of eliminating on-prem infrastructure, and using a web browser from anywhere, is very appealing. They’re not saying “no”, but “not yet”. They’re keeping an eye on it, and many have trial accounts or smaller paid subscriptions, to continue testing and learning, and comparing.

But Why?


I really don’t know why Intune hasn’t been pushed as hard and fast as ConfigMgr development has (I mean, holy McShit!). People still ask me, “Hey Dan!” (“It’s Dave”) “Right. Doug, why don’t you pick on Configuration Manager anymore?”

My answer is “Because, (A) my name is Dave and (B) there’s really nothing to pick on anymore.” That’s right. The CM team is knocking it out of the park. If you don’t believe me, come to MMS and find out for yourself. I did.

I’m sure the Intune team has equally lovely, hard-working people, lovingly hard at work to make the love work like it should. Or something like that. But I haven’t sat at a table with them, so they don’t know my face or what kind of car I drive either. So, for now at least, I can focus on them.

So, Now What?

Some of the limitations (missing capabilities) can be addressed using things like PowerShell script deployments and Win32 app deployments. But these are also limited. Let’s say you wanted to work around the hamstrung inventory capabilities with a custom PowerShell script, an Azure Function, or an Automation Runbook.

Of these, the PowerShell script makes the most sense for two reasons:

  • Most likely you’re working with laptops, which roam around and aren’t accessible over the WAN 24/7, like desktops typically are.
  • Initiating the process from the remote client ensures the greatest probability of success, due to timing (when it’s on and connected, it runs, instead of you poking at it from afar and retrying over and over)

Pulling data from a device isn’t difficult, thanks to CIM/WMI and PowerShell (a quick sketch follows the list below). But you need to store the inventory (and other) query results somewhere if you want to make use of them. Excluding the use of a third-party tool, that leaves you with two primary options:

  • PowerShell + Azure Storage Account (e.g. blob container)
  • PowerShell + Log Analytics
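
On the collection side, a minimal sketch of what a client-side script might gather (these property names are illustrative, not from any particular module):

# gather a few of the items Intune doesn't inventory natively
$inventory = [pscustomobject]@{
    ComputerName = $env:COMPUTERNAME
    BIOSVersion  = (Get-CimInstance Win32_BIOS).SMBIOSBIOSVersion
    Model        = (Get-CimInstance Win32_ComputerSystem).Model
    MemoryGB     = [math]::Round((Get-CimInstance Win32_ComputerSystem).TotalPhysicalMemory / 1GB, 2)
    Services     = Get-CimInstance Win32_Service | Select-Object Name, State, StartMode
}
$inventory | ConvertTo-Json -Depth 3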

Both of those storage options require some sort of credentials to access and import data. Whether that’s a username/password or a key, it has to be embedded in the script, which makes it vulnerable and risky.
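
For illustration, here’s roughly what the Log Analytics route looks like, following the documented HTTP Data Collector API pattern and reusing the $inventory object from the sketch above. $WorkspaceId and $SharedKey are placeholders, and they’re exactly the embedded credentials I’m warning about:

# build the HMAC-SHA256 signature the Data Collector API requires
$json      = $inventory | ConvertTo-Json -Depth 3
$bodyBytes = [Text.Encoding]::UTF8.GetBytes($json)
$rfc1123   = [DateTime]::UtcNow.ToString("r")
$stringToHash = "POST`n$($bodyBytes.Length)`napplication/json`nx-ms-date:$rfc1123`n/api/logs"
$hmac      = New-Object System.Security.Cryptography.HMACSHA256
$hmac.Key  = [Convert]::FromBase64String($SharedKey)
$signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($stringToHash)))
$headers   = @{
    "Authorization" = "SharedKey ${WorkspaceId}:$signature"
    "Log-Type"      = "CustomInventory"   # data lands in a CustomInventory_CL table
    "x-ms-date"     = $rfc1123
}
Invoke-RestMethod -Uri "https://$WorkspaceId.ods.opinsights.azure.com/api/logs?api-version=2016-04-01" -Method Post -ContentType "application/json" -Headers $headers -Body $bodyBytes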

Another issue is that a PowerShell script deployed from Intune only runs one time per device. Implementing a recurring/scheduled update requires either recreating the same script as a new assignment, or creating a scheduled task and hoping it doesn’t stop working for whatever reason. Either way, it’s reinventing a wheel that’s been around for DECADES. Sort of like “Here’s a shiny new car! Wheels are still in planning, but you can make your own.”
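
The scheduled-task flavor of that wheel is at least easy to reinvent. A minimal sketch (the task name, schedule, and script path are all hypothetical):

# register a daily run of the inventory script as SYSTEM
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\ProgramData\Inventory\Get-Inventory.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 9am
Register-ScheduledTask -TaskName "Custom Inventory" -Action $action -Trigger $trigger -User "SYSTEM" -RunLevel Highest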

Summarizing my Conclusion

Once again, paraphrasing my colleagues and customers, and anyone who doesn’t run away as soon as I start talking to them, I would say MOST people trying to manage Windows devices would LOVE to do so from a nice and pretty web portal, and without a lot of on-prem infrastructure to mess with. Buy the new device, join it to a cloud party, and manage the open bar from the cloud.

In 2020, that vision is closer than it has ever been to becoming a reality. But the other reality is that some serious technical challenges (rural connectivity, bandwidth, idiot users, crappy applications, bullshit drivers, more idiots, decaf coffee, JRE, McAfee, etc.) persist, and they aren’t going away anytime soon. If the economy doesn’t tank in the meantime, I think within the next five years this will be as commonplace as mobile phones are today. I hope so.

But then again, in 2020, I still have to fiddle with BIOS settings, and the ConfigMgr Query Rule dialog box may outlive today’s most powerful cockroaches. So, while I’m hopeful, I still don’t have a flying car, or that paperless office I was promised 20 years ago.

Fingers crossed.

Namaste!

Cloud, Devices, Scripting, Technology, windows

Export HW/SW Inventory Data from Intune Devices using PowerShell

What is this recent torrent of Intune gibberish coming from this foul-mouthed idiot? Is he some sort of “expert”? Bah! Nope! I’m just working with it a bit more lately, so I figured I’d brain-dump on it while I can (and to help me recall things if I step away from it for a few months).

Background and Setup

The inventory data for Intune-managed Windows 10 devices is stored in Azure and exposed through the Graph API. And while it can seem challenging to find good examples for accessing it with PowerShell, there is in fact a very nice repository of example scripts on the Microsoft GitHub site at https://github.com/microsoftgraph/powershell-intune-samples

Given that I’m still learning my way around Intune and Graph, the first examples I leaned on were ManagedDevices_Get.ps1 and ManagedDevices_Apps_Get.ps1, under the ManagedDevices folder. Both were very helpful, and I was able to pull the data I needed.

However, since I needed to query 1800+ devices, I noticed the default “page” limit returns only the first 1000 records (devices). Then I found they also posted a nice example, ManagedDevices_Get_Paging.ps1, which I merged with ManagedDevices_Get.ps1 to pull all of the devices at once. The main part that makes this work is lines 179 to 187 (below)…

# keep requesting pages until Graph stops returning an @odata.nextLink
$DevicesNextLink = $DevicesResponse."@odata.nextLink"
while ($null -ne $DevicesNextLink){
    $DevicesResponse = (Invoke-RestMethod -Uri $DevicesNextLink -Headers $authToken -Method Get)
    $DevicesNextLink = $DevicesResponse."@odata.nextLink"
    $Devices += $DevicesResponse.value   # append this page of results to the running device list
}

After that, I added the 2 or 3 lines of code to query the installed applications and add them to an output object (a master set of data for each device, including hardware, operating system, and applications). I wrapped this in a new function (below) to return the data for further processing.

function Get-DsIntuneDeviceData {
	[CmdletBinding()]
	param(
		[parameter(Mandatory)][string] $UserName,
		[parameter()][switch] $ShowProgress,
		[parameter()][switch] $Detailed
	)
	# Get-DsIntuneAuth is expected to populate $authToken for the Graph calls below
	Get-DsIntuneAuth -UserName $UserName
	$Devices = Get-ManagedDevices
	Write-Host "returned $($Devices.Count) managed devices"
	if ($Devices){
		$dx = 1
		$dcount = $Devices.Count
		foreach ($Device in $Devices){
			if ($ShowProgress) { 
				Write-Progress -Activity "Found $dcount" -Status "$dx of $dcount" -PercentComplete $(($dx/$dcount)*100) -id 1
			}
			$DeviceID = $Device.id
			# expand detectedApps to pull the software inventory along with the device record
			$uri = "https://graph.microsoft.com/beta/deviceManagement/manageddevices('$DeviceID')?`$expand=detectedApps"
			$DetectedApps = (Invoke-RestMethod -Uri $uri -Headers $authToken -Method Get).detectedApps
			$dx++
			if ($Detailed) {
				$disksize  = [math]::Round(($Device.totalStorageSpaceInBytes / 1GB),2)
				$freespace = [math]::Round(($Device.freeStorageSpaceInBytes / 1GB),2)
				$mem       = [math]::Round(($Device.physicalMemoryInBytes / 1GB),2)
				[pscustomobject]@{
					DeviceName   = $Device.DeviceName
					DeviceID     = $DeviceID
					Manufacturer = $Device.manufacturer
					Model        = $Device.model 
					MemoryGB     = $mem
					DiskSizeGB   = $disksize
					FreeSpaceGB  = $freespace
					SerialNumber = $Device.serialNumber 
					OSName       = $Device.operatingSystem 
					OSVersion    = $Device.osVersion
					Ownership    = $Device.ownerType
					Category     = $Device.deviceCategoryDisplayName
					Apps         = $DetectedApps
				}
			}
			else {
				# basic output only; the disk and free-space values aren't used here, so skip those calculations
				[pscustomobject]@{
					DeviceName   = $Device.DeviceName
					DeviceID     = $DeviceID
					OSName       = $Device.operatingSystem 
					OSVersion    = $Device.osVersion
					Apps         = $DetectedApps
				}
			}
		}
	}
	else {
		Write-Host "No Intune Managed Devices found..." -f green
		Write-Host
	}
}

The full trainwreck can be safely viewed here. Be sure to wear rubber gloves while handling it.

With that, I decided to drop it into a new module to make it easier to access and reuse. I also added a few more functions, with the help of examples from Matthew Dowst and Eli Shlomo and some calls to PowerShell module ImportExcel, by Doug Finke. I named this module ds-intune.
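
For what it’s worth, the ImportExcel side of that is pleasantly simple. This is just an illustration of the call pattern, not the module’s actual internals (the file name is hypothetical):

# dump a device summary to a spreadsheet
$devices | Select-Object DeviceName, OSName, OSVersion | Export-Excel -Path ".\IntuneDevices.xlsx" -WorksheetName "Devices" -AutoSize -FreezeTopRow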

Example

This example was tested on ds-intune 0.3.

Install-Module ds-intune
Get-Command -Module ds-intune

The two functions I’ll use below are Get-DsIntuneDeviceData and Export-DsIntuneAppInventory.

$CustomerName = "Contoso"
$UserName = "<your_AzureAD_UserPrincipalName>"
Connect-AzureAD
# be patient, this step can take a while if you have more than 50 machines
$devices = Get-DsIntuneDeviceData -UserName $UserName -ShowProgress -Detailed
Export-DsIntuneAppInventory -DeviceData $devices -Title $CustomerName -UserName $UserName -Overwrite -Show -Verbose

As always: Please post comments or corrections, winning lottery numbers, tasteless jokes, and happy thoughts. Here or at the GitHub repo.

Tomorrow I’m off to Ft. Myers for 3 days of work. Wish me luck.

Cheers!