Facebook – The New American Sofa

If you’re posting on Facebook, like I am right now, it means you’re not getting shit done. If you’re not getting shit done, it means you’re part of the problem. The more you’re not getting shit done, the bigger the problem.

I don’t count Twitter in this fight, because Twitter constrains everyone to small bites.  Facebook encourages long, rambling, mindless rants, just short of being a full blog, but definitely longer than most TV and radio commercials.  Also, comparing Twitter and Facebook, the percentage of content focused on “sharing information” (useful information) rather than fluff/politics/religion/rehashed-news/fake-news/re-faked-news etc. isn’t even in the same ball park.  Not even the same planetary system.

In one hour I get more helpful information from Twitter than from an entire year on Facebook.  I have made my decision about where to spend my “not getting shit done” time.

GPODoc – PowerShell Module

I just posted my first PowerShell module to the PowerShell Gallery, though it’s been available on my GitHub site for a little while already.  It’s a small project, with only two (2) functions so far: Get-GPOComment and Export-GPOCommentReport.

Get-GPOComment is purely CLI and queries GPOs to return embedded descriptions, comments, and so on.

Export-GPOCommentReport is a CLI function as well, but produces an HTML report, which is technically GUI output, of embedded GPO comments and descriptions.

The module is simple to load by way of the Install-Module and Import-Module cmdlets (see below).  The only requirement is to have the GroupPolicy PowerShell module installed first.  This is normally installed on all AD domain controllers, as well as other servers and workstations which have the Windows 10 RSAT package installed (tip: You can also use Chocolatey to install RSAT:  choco install rsat <or> cinst rsat)

Loading The PowerShell Module

Install-Module GPODoc
Import-Module GPODoc

Trying it out: Get-GPOComment

Get-GPOComment -GPOName '*' -PolicyGroup Policy

[screenshot: get-gpocomment1]

Get-GPOComment -GPOName '*' -PolicyGroup Preferences

[screenshot: get-gpocomment2]

You can easily modify the output layout since the output is in hash table format…

Get-GPOComment -GPOName '*' -PolicyGroup Preferences | Format-Table

If you want more flexibility in selecting the inputs, you can use the Get-GPO cmdlet and channel the output through a pipeline.  For example…

Get-GPO -All | Where-Object {$_.DisplayName -like "U *"} | 
  Select-Object -ExpandProperty DisplayName | 
    Foreach-Object {Get-GPOComment -GPOName $_ -PolicyGroup Policy}

[screenshot: get-gpocomment3]

The parameters for this function are as follows:

  • GPOName
    • [string] single or array of GPO names
  • PolicyGroup
    • [string] (select list) = “Policy”, “Preferences” or “Settings”
    • Policy = query the GPO descriptions only
    • Settings = query the comments attached to GPO internal settings only
    • Preferences = query the comments attached to GPO Preferences internal settings only

This also respects -Verbose if you want to see more background details.

Export-GPOCommentReport

This function only takes two (2) inputs:

  • GPOName
    • [string] single or array of GPO names
  • ReportFile
    • [string] the path/filename for the HTML report file to create.

It invokes Get-GPOComment, combining “Policy”, “Preferences” and “Settings” output, and writes it all to an HTML report file of your choosing.  The CSS formatting is hard-coded for now, but I’m open to feedback if you want more options. You can specify the GPO name(s) directly, or feed them via the pipeline.
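For example, a typical invocation might look like the following (the report path here is just an example — point it wherever you like):

```powershell
# example: report on every GPO, writing the HTML file to the temp folder
Import-Module GPODoc
Export-GPOCommentReport -GPOName '*' -ReportFile "$env:TEMP\gpo-comments.html"
```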

[screenshot: export-gpocommentreport1]

Example web report…

[screenshot: export-gpocommentreport2]

If you used 1.0.1 (posted a few days earlier) scrap that and get 1.0.2.  The older version had some major bugs in the Get-GPOComment function, and the Export-GPOCommentReport function wasn’t available until 1.0.2.

Feedback and suggestions are welcome.  Even if it’s “hey man, your shit sucks. Consider a dishwashing job, or something else.”  Any feedback is better than no feedback.

Thank you!

 

Documenting Your IT Environment (The Easy Way)

One of the most common, and tragically problematic, aspects of today’s IT world is the lack of sufficient documentation. In fact, ANY documentation. It’s not associated with any one area either. It crosses everything from help desk tickets, to change requests, to Active Directory objects, to Group Policy, to SQL Server, to System Center to Azure.

UPDATED 8/8/17 – added SQL and SCCM examples at very bottom.

When and Why Does It Matter?

It matters when things go wrong (or “sideways” as many of my colleagues would say).  There’s the “oh shit!” moments where you or your staff find the problem first.  Then there’s the “oh holy shit!!” moments when your boss, his/her boss, and their bosses find the problem first and come down hard on you and your team.  When shit goes wrong/sideways, the less time you waste on finding out the what/where/why/who/when questions for the pieces involved with the problem, the better.  To that end, you can document things in other places, like Word documents, spreadsheets, help desk and ITIL systems, post-it notes, whiteboards, etc.  Or you can attach comments DIRECTLY to the things which are changed.  This isn’t possible in all situations, but it is most definitely possible for many of the things that will straight-up shove your day into shit storm, and I have no idea what that really means, but it just sounds cool.  Anyhow…

The best part of this approach is that you can leverage PowerShell and other (free and retail) utilities to access and search comments you nest throughout the environment.  For example, you can collect and query Group Policy Object comments…

Need to get a comment buried in a particular GPO or GPPref setting?  Here’s one example of querying a GPO under User Configuration / Preferences / Control Panel Settings / Folder Options…

#requires -modules GroupPolicy
<#
.SYNOPSIS
  Get-GPPrefFolderOptionsComments.ps1
#>
param (
  [parameter(Mandatory=$True)]
  [ValidateNotNullOrEmpty()]
  [string] $GPOName
)
try {
  $policy = Get-GPO -Name $GPOName -ErrorAction Stop
}
catch {
  Write-Error "Group Policy $GPOName not found"
  break
}
$policyID = $policy.ID
$policyDomain = $policy.DomainName
$policyName  = $policy.DisplayName
$policyVpath = "\\$($policyDomain)\SYSVOL\$($policyDomain)\Policies\{$($policyID)}\User\Preferences\FolderOptions\FolderOptions.xml"
Write-Verbose "loading: $policyVpath"

if (Test-Path $policyVpath) {
  [xml]$SettingXML = Get-Content $policyVpath
  $result = $SettingXML.FolderOptions.GlobalFolderOptionsVista.desc
  Write-Output $result
}
else {
  Write-Error "unable to load xml data"
}

Example…
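A sample invocation might look like this (the GPO name below is hypothetical — substitute one of your own):

```powershell
# example: pull the embedded Folder Options comment from a (hypothetical) GPO
.\Get-GPPrefFolderOptionsComments.ps1 -GPOName 'Workstation Folder Options' -Verbose
```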

So, What Now?

So, in the interest of avoiding more bad Mondays, start documenting your environment now.  Need some examples of where to start?  Here you go…

 

Thank you for your support.

Poor Man’s IT Chain Reactions


Challenge:

Make sure every machine in the enterprise (connected to LAN or always-on VPN) has the latest version of psexec.exe on the local C: drive.

Why?

Why not?  That’s why.

Option 1:

AKA – the semi-automated, safety switch turned off, fully-loaded, drunk guy holding the trigger option.

  1. Download psexec.exe from live.sysinternals.com (or direct: https://live.sysinternals.com/psexec.exe) and place into AD domain SYSVOL scripts folder (e.g. \\contoso.com\netlogon)
    Example…

    $WebClient = New-Object System.Net.WebClient
    $WebClient.DownloadFile("https://live.sysinternals.com/psexec.exe","\\contoso.com\netlogon\psexec.exe")
  2. Create Group Policy Object (GPO) with Computer Preferences setting to copy psexec.exe from the SYSVOL share to a location on the local C: drive. Configure to “update” so that future version updates will be passed down to the clients.
  3. Create a Scheduled Task to keep the SYSVOL copy up to date with the latest version.
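Step 3 could be roughed out with the ScheduledTasks cmdlets — a sketch only, where the script path and schedule are placeholders, and C:\Scripts\Update-PsExec.ps1 is assumed to contain the download snippet from step 1:

```powershell
# sketch: register a nightly task that re-downloads psexec.exe to SYSVOL
# (assumes C:\Scripts\Update-PsExec.ps1 holds the WebClient download from step 1)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
  -Argument '-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Update-PsExec.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'Update PsExec' -Action $action -Trigger $trigger
```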

Pros

  • Cheap (free)
  • Fairly automated (just add water, makes its own sauce / set it and forget it)

Cons

  • Smells like duct tape and coat hanger wire

Option 2:

AKA – The “I have a budget, so kiss my butt” option.

  1. SCCM package or application deployment

Pros

  • You look cool pulling it off, but not as geeky as option 1.

Cons

  • More moving parts under the hood.
  • May require additional steps to maintain a consistent current version across all devices.

Option 3:

AKA – The “I don’t have a budget, so kiss my butt” option.

  1. Include within image configuration (MDT, SCCM, Ghost, Acronis, etc.)

Pros

  • Easy

Cons

  • Difficult to maintain a consistent and current version across the enterprise

Option 4:

AKA – the “most fun to laugh about during the next beer-meeting” option

  1. Send the new guy around with a USB thumb drive

Pros

  • Great fun in the office

Cons

  • Do I really need to spell this out?

 

Interviews – Will IT Be More or Less Fun in 10 Years?

Question: “Do you expect that IT work in 10 years from now, will be more fun or less fun than it is today, and why?”

Mark Aldridge

I think that it will be more fun as there will be so much more technology to learn in 10 years’ time and so many amazing features in ConfigMgr 2706!

DareDevelOPs

I think it will be more fun as open source projects become more the norm. Also, as Infrastructure as Code becomes all-things-as-code (ATaC?), the challenge level is going to go up. We will be more focused on solving the problem than running the infrastructure. Also the kinds of industries will change from country to country as the Human Development Index shifts and Nation State rankings change. The level of virtualization in all areas of life will continue down Moore’s Law’s critical path, even as the compute hardware reaches Moore’s limit. The things we use in IT will get shinier. …or Skynet.

Stephen Owen

More fun! IT has only gotten more interesting and varied, with the introduction of mobile devices and tons of new form factors. I think in ten years we all will finally know what we’re doing with Windows updates, and probably have a better handle on security practices.

I think the Wild West days of IT are behind us, and I for one am happy about it. My phone definitely rings less on weekends now than it did five years ago.

Damien Van Robaeys

When I see all available technologies, I can imagine that, in 10 years, the working environment will also change.
This will be the time of mixed reality, even if Minority report won’t be for now.
I hope we will work with Holographic computers, like the Hololens.

Maybe computers, like laptops or desktops, if they still exist, will use a holographic screen and holographic keyboards.
Imagine your computer, in a small box that will display a screen above, and a keyboard on your desk.

Meetings would be done with a holographic system, like in Star Wars, with a system that will allow you to say “Hey, call Mr. X,” and Mr. X will appear in front of you in hologram, like Obi-Wan Kenobi.

Rob Spitzer

Fun is such a relative thing. I’ve met DBAs that are super passionate about their jobs yet I can’t imagine how that could be any fun. Conversely I’ve been asked on multiple occasions how I deal with Exchange every day. It’s just something I found that I enjoy doing.

There’s no doubt IT is changing. We’ve seen this happen before. We rarely build hardware anymore and now we’re seeing things like software installation and configuration go away as we move more to the cloud. I’ve seen Exchange change a lot over the last 20 years but, at its heart, it’s still the same thing I’ve enjoyed all along, even in the cloud.

You just need to make sure you find a role that you’re passionate about. If you have a hard time putting it down at the end of the day, odds are you found it.

Ami Casto

IT, fun? What? IT has been and will always be what you make of it. 10 years from now you’ll still be fixing some idiot policy you didn’t create but have to clean up the mess now that the poo has hit the fan. You’ll just have to keep looking for the things that make you passionate about what you do.

Arnie Tomasovsky

I expect it to be less fun, as thanks to AI, everything will be a lot more automated. BUT human beings will remain as the end users, therefore the fun won’t disappear 🙂

Johan Arwidmark

I expect it to be more fun, and more complex. Why? Hopefully less politics, and more ongoing maintenance/upgrades, and more automation.

Nicke Kallen

There are two directions that this can go in… either we aim for a specialized knowledge set where employees will continue tinkering as they do today. The number will not be as many as we have today, but larger corporations will still depend on this knowledge and for the people that have actively developed this skillset – it’s a lot more fun.

The other option is that we are somewhere down the journey to be completely commoditized. Perhaps a few service providers have staff, but apart from that we define business requirements and ensure the logistics part of delivering IT works. It’s most likely not the cup of tea for today’s IT workers…

Mike Terrill

I think IT will be even more fun 10 years from now. The reason for this is because our field is growing at a rapid pace and will continue to do so over the next 10 years. Just imagine some of the gadgets we will have in the future and how much AI will have progressed.

Rod Trent

A: <Beavis and Butthead mode on…> Hehe…you said work in IT is fun </Beavis and Butthead mode off>

Chris DeCarlo

So I’m sure everyone you asked this question will say “More fun…”, so I’ll play devil’s advocate here and say less fun. AI is already making decent strides, and with the great progress of robots and VR already, I envision AI being fully integrated into robotics in the next 10 years. These AI-enhanced robots will take over our call centers and end user support roles with 24×7 support and no need for breaks or health care. From there AI will be integrated into the Windows OS and automatically Google (or, I mean, Bing) and fix any errors that appear on your server/SCCM software, leaving us “organ sacks” or “blood bags” with basic tasks such as lubricating the robots’ joints, and polishing the robots’ shiny metal ….

Skatterbrainz

More fun for some.  Less fun for others.  More work for software folks, less work for hardware folks.  In all, I think there will be some serious reduction in IT staffing for many data center roles, as those things morph into “Software-Defined <x>” while evaporating into the cloud.  Then again, it’s not inconceivable that some unforeseen events could trigger a massive reversion from cloud back to on-prem.  Government intrusion, for one, might have that sort of impact.

IT Pro Tips for Android Users

Warning: The following information may contain extreme and dangerous technical terminology. Read at your own risk.

  1. Do not submerge your device in water unless it is encased in a waterproof case.
  2. Do not smack the touch screen with a large, rhinestone-embossed, 10k gold-plated pimp ring.
  3. Do not operate your device without some sort of protective case.  A protective case does not include duct tape or ziplock baggies.
  4. Do not puke on your device.
  5. Do not use your device as a substitute car jack.
  6. Do not use your device to crush dangerous insects or spiders.
  7. Do not use your device to pry open the jaws of an angry pit bull.
  8. Do not pound nails with your device.
  9. Do not use the flashlight feature on your phone to search in the dark for your phone.
  10. Do not attempt to clean a soiled device by dunking it in boiling water, or paint thinner.
  11. Do – Remove Facebook apps to increase battery life from 3 hours to 7 hours.
  12. Do  – Remove all apps to increase battery life from 7 to 400 hours.
  13. Do – Disable NFC unless you’re one of those weirdos that has an NFC device.
  14. Do – Disable Bluetooth unless you have Bluetooth devices to pair it with.
  15. Do – Install a real keyboard app on it as soon as possible (e.g. Swiftkey)
  16. Do – Avoid conversations with iPhone users. If it cannot be avoided, be sure to respond to the other person’s first question (regardless of the topic) with “You know, Android has been scientifically proven to extend penises.”
  17. Do – Avoid conversations with other Android device users if they’re not from the same manufacturer.  For example, if an LG device touches a Samsung device, it may cause a matter/anti-matter implosion.
  18. Do – Perform a factory-reset after you reach level 40 on Candy Crush Soda Saga.
  19. Do – enable cloud backup services.  If you cannot enable cloud backup services, toss the device in the nearest fire.
  20. Do not forget to enjoy using your new Android device.

SCCM, SQL, DBATools, and Coffee

Warning:  This article is predicated on (A) basic reader familiarity with System Center Configuration Manager and the SQL Server aspects, and (B) nothing better to do with your time.

Caveat/Disclaimer:  As with most of my blog meanderings, I post from the hip.  I fully understand that it exposes my ignorance at times, and that can be painful at times, but adds another avenue for me to learn and grow.


I don’t recall exactly when I was turned onto Ola Hallengren, or Steve Thompson, but it’s been a few years, at least.  The same could be said for Kent Agerlund, Johan Arwidmark, Mike Niehaus, and others.  None of whom I’ve yet to meet in person, but maybe some day.  However, that point in time is when my Stevie Wonder approach to SQL “optimization” went from poking at crocodiles with a pair of chopsticks, to saying “A-Ha!  THAT’s how it’s supposed to work!”

As a small testament to this, while at Ignite 2016, I waited in line for the SQL Server guy at his booth, like an 8-year-old girl at a Justin Bieber autograph signing, just to get a chance to ask a question about how to “automate SQL tasks like maintenance plans, and jobs, etc.”  The guy looked downward in deep thought, then looked back at me and said “Have you heard of Ola Hallengren?”  I said “Yes!” and he replied, “he’s your best bet right now.”

Quite a lot has changed.

For some background, I was working on a small project for a customer at that time focusing on automated build-out of an SCCM site using PowerShell and BoxStarter.  I had a cute little gist script that I could invoke from the PowerShell console on the intended target machine (virtual machine), and it would go to work:

  • Install Windows Server roles and features
  • Install ADK 10
  • Install MDT 2013
  • Install SQL Server 2014
  • Adjust SQL memory allocations (min/max)
  • Install WSUS server role and features
  • Install Configuration Manager
  • Install ConfigMgr Toolkit 2012 R2
  • and so on.

Since it was first posted, it went through about a dozen iterative “improvements” (translation: breaking it and fixing and improving and breaking and fixing, and repeat).

The very first iteration included the base build settings as well, such as naming the computer, assigning a static IPv4 address, DNS servers and gateway, join to an AD domain, etc.  But I decided to pull that part out into a separate gist script.

The main thing about this experiment that consumed the most time for me was:

  1. On-the-fly .INI construction for the SQL automated install
  2. On-the-fly .INI construction for the SCCM install
  3. On-the-fly SQL memory allocation configuration

Aside from the hard-coding of content sources (not included on this list), item 2 drove me nuts because I didn’t realize the “SA expiration” date property was required in the .INI file.  The amount of coffee I consumed in that 12 hour window would change my enamel coloring forever.  Chicks dig scars though, right?  Whatever.

Then came item 3.  I settled on the following chunk of code, which works…

$SQLMemMin = 8192
$SQLMemMax = 8192
...
write-output "info: configuring SQL server memory limits..."
write-output "info: minimum = $SQLMemMin"
write-output "info: maximum = $SQLMemMax"
try {
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') | Out-Null
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
  $SQLMemory = New-Object ('Microsoft.SqlServer.Management.Smo.Server') ("(local)")
  $SQLMemory.Configuration.MinServerMemory.ConfigValue = $SQLMemMin
  $SQLMemory.Configuration.MaxServerMemory.ConfigValue = $SQLMemMax
  $SQLMemory.Configuration.Alter()
  write-output "info: SQL memory limits have been configured."
}
catch {
  write-output "error: failed to modify SQL memory limits. Continuing..."
}

But there’s a few problems, or potential problems, with this approach…

  1. It’s ugly (to me anyway)
  2. The min and max values are static
  3. If you change this to use a calculated/derived value (reading WMI values) and use the 80% allocation rule, and the VM has dynamic memory, it goes sideways.

Example:

$mem = $(Get-WmiObject -Class Win32_ComputerSystem).TotalPhysicalMemory
$tmem = [math]::Round($mem/1024/1024,0)
...

I know that option 2 assumes a “bad practice” (dynamic memory), but it happens in the real world and I wanted to “cover all bases” with this lab experiment.  The problem that it causes is that the values returned from a WMI query can fluctuate along with the host memory allocation status, so the 80% value can be way off at times.
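To illustrate, the 80% calculation itself is trivial — the catch, as noted above, is that on a dynamic-memory VM the TotalPhysicalMemory value this math starts from is a moving target (a sketch only):

```powershell
# sketch: 80% of the currently visible physical RAM, in MB
# (on a dynamic-memory VM this value drifts with the host's balancing)
$mem       = (Get-WmiObject -Class Win32_ComputerSystem).TotalPhysicalMemory
$tmem      = [math]::Round($mem / 1MB, 0)
$SQLMemMax = [math]::Round($tmem * 0.8, 0)
```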

Regardless, forget all that blabber about static values and dynamic tragedy.  There’s a better way.  A MUCH better way.  Enter DBATools.  DBATools was the brainchild of Chrissy LeMaire, which is another name to add to any list that has Ola’s name on it. (side note: read Chrissy’s creds, pretty f-ing impressive). There are other routes to this as well, but I’ve found this one to be most user friendly for my needs. (Feel free to post better suggestions below, I welcome feedback!)

Install-Module dbatools
$sqlHost = "cm01.contoso.com"
$sqlmem = Test-DbaMaxMemory -SqlServer $sqlHost
if ($sqlmem.SqlMaxMB -gt $sqlmem.RecommendedMB) {
  Set-DbaMaxMemory -SqlServer $sqlHost -MaxMB $sqlmem.RecommendedMB
}

This is ONLY AN EXAMPLE, and contains an obvious flaw: I’m not injecting an explicit 80% derived value for the -MaxMB parameter.  However, this can be accomplished (assuming dynamic memory is not enabled) as follows…

Install-Module dbatools
$sqlHost = "cm01.contoso.com"
$sqlmem = Test-DbaMaxMemory -SqlServer $sqlHost
$totalMem = $sqlmem.TotalMB
$newMax = $totalMem * 0.8
if ($sqlmem.SqlMaxMB -ne $newMax) {
  Set-DbaMaxMemory -SqlServer $sqlHost -MaxMB $newMax
}

Here’s the code execution results from my lab…

[screenshot: sqlmemory2]

You might have surmised that this was executed on a machine which has dynamic memory enabled, which is correct.  The Hyper-V guest VM configuration is questionable…

[screenshot: hyperv_setup1]

This is one of the reasons I opted for static values in the original script.

Thoughts / Conclusions

Some possible workarounds for this mess would be to try detecting dynamic memory (from within the guest machine), which might be difficult, or to insist on a declarative static memory assignment.
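One possible (unverified) approach to the detection idea is to probe for the Hyper-V dynamic memory performance counters that the integration services expose inside a guest — a sketch, assuming that counter set is only present when dynamic memory is active:

```powershell
# sketch: guess whether dynamic memory is enabled by probing guest counters
# (assumption: this counter set only exists when dynamic memory is active)
$dynMem = $false
try {
  $null = Get-Counter -ListSet 'Hyper-V Dynamic Memory Integration Service' -ErrorAction Stop
  $dynMem = $true
}
catch { }
if ($dynMem) { Write-Output 'dynamic memory detected - prefer a static assignment' }
```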

Another twist to all of this, and one reason I kind of shelved the whole experiment, was a conversation with other engineers regarding the use of other automation/sequencing tools like PowerShell DSC, Ansible, and Terraform.

The final takeaway of this is to try and revisit any projects/code which are still in use, to apply newer approaches when it makes sense.  If that means shorter code, improved security and performance, more capabilities, greater abstraction/generalization (for reuse), or whatever, it’s good to bring newer ideas to bear on older tools.  In this example, it was just replacing a big chunk of raw .NET reflection code with cleaner and more efficient PowerShell module code.  Backing out 10,000 feet, the entire gist could be replaced with something more efficient.

More Information

DBATools – Twitter | Slack | YouTube | GitHub

Ola Hallengren – web  (Ola doesn’t tweet much, yet)

My Twitter list of super awesometacular increditastical techno-uber genius folks – HERE

Back to my coffee.  I hope you enjoyed reading this!  Please post comments, thoughts, criticisms, stupid jokes, or winning lottery numbers below.  If nothing else, please rate this article using the stars above? – Thank you!