Projects, Scripting, System Center, windows

sktools

UPDATE: 1/14/2019 – version 1901.13.2 was posted to address a problem with the previous upload.  Apparently, I posted an out-of-date build initially, so I’ll call this the “had another cup of coffee build”.

Dovetailing from the previous idiotic blog post, I’ve taken some time off to retool, rethink, redesign and regurgitate “skattertools” as a single PowerShell module.  The new version blends PoSHServer into the module and removes the need to perform a separate install for the local web listener.  The first version of this is 1901.13.1 (as in 2019, 01 = January, 13th day, 1st release).

How to Install and Configure sktools

  • Open a PowerShell console using Run as Administrator
  • Type: Install-Module sktools
  • Type: Import-Module sktools
  • Type: Install-SkatterTools (this creates a default “sktools.txt” configuration file in your “Documents” folder)
  • Type: Start-SkatterTools
  • Open your browser and navigate to http://localhost:8080
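
If you’d rather paste the whole sequence at once, it looks like this (a sketch; assumes PowerShell 5.x and access to the PowerShell Gallery):

# run from an elevated PowerShell console
Install-Module sktools
Import-Module sktools
Install-SkatterTools   # creates the default sktools.txt in your Documents folder
Start-SkatterTools     # starts the local web listener
Start-Process "http://localhost:8080"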

This next part is only temporary, and will be improved upon soon:

  • Once the web console is open, expand “Support” and click “Settings” and modify to suit your Configuration Manager site environment.
  • Close and reopen the PowerShell console (still “Run as Administrator”)
  • Type: Start-SkatterTools
  • Refresh your web browser session

Work will continue until morale is eliminated.  Easter eggs are included, sort of.  Thoughts, feedback, bug reports, enhancement requests, and angry snarky comments are all welcome.  Enjoy!

humor, Personal, Scripting, Technology

$HoHoHo = ($HoList | Do-HoHos -Days 12) version 1812.18.01

UPDATE: 2018.12.18 (1812.18.01) – Thanks to Jim Bezdan (@jimbezdan) for adding the speech synthesizer coolness!  I also fixed the counter in the internal loop.  Now it sounds like HAL 9000, but without getting your pod locked out of the mother ship. 😀

I’m feeling festive today.  And stupid.  But they’re not mutually exclusive, and neither am I, and so can you!   Let’s have some fun…

Paste all of this sticky mess into a file and save it with a .ps1 extension.  Then put on your Bing Crosby MP3 list and run it.

Download from GitHub: https://raw.githubusercontent.com/Skatterbrainz/Utilities/master/Invoke-HoHoHo.ps1

The function…

function Write-ProperCounter {
    param (
      [parameter(Mandatory=$True)]
      [ValidateRange(1,12)]
      [int] $Number
    )
    if ($Number -gt 3) {
        return $([string]$Number+'th')
    }
    else {
        switch ($Number) {
            1 { return '1st'; break; }
            2 { return '2nd'; break; }
            3 { return '3rd'; break; }
        }
    }
}

The bag-o-gifts…

$gifts = (
    'a partridge in a Pear tree',
    'Turtle doves, and',
    'French hens',
    'Colly birds',
    'gold rings',
    'geese a-laying',
    'swans a-swimming',
    'maids a-milking',
    'ladies dancing',
    'lords a-leaping',
    'pipers piping',
    'drummers drumming'
)
# the sleigh ride...
Add-Type -AssemblyName System.Speech
$Speak = New-Object System.Speech.Synthesis.SpeechSynthesizer

for ($i = 0; $i -lt $gifts.Count; $i++) {
    Write-Host "On the $(Write-ProperCounter $($i + 1)) day of Christmas, my true love gave to me:"
    $Speak.Speak("On the $(Write-ProperCounter $($i + 1)) day of Christmas, my true love gave to me,")
    $mygifts = [string[]]$gifts[0..$i]
    [array]::Reverse($mygifts)
    $x = $i + 1
    foreach ($gift in $mygifts) {
        if ($x -eq 1) {
            $thisGift = $gift
        }
        else {
            $thisGift = "$x $gift"
        }
        Write-Host "...$thisGift"
        $Speak.Speak($thisGift)
        $x--
    }
}

Enjoy!

Projects, Scripting, Technology

The Little (Code) Stuff That (Sometimes) Matters

As a follow-up to the post about tuning PowerShell scripts, this one is going to be more general (language-neutral).  I’d like to run through some of the “efficiency” or optimization techniques that apply to all programming/scripting languages, due to how they’re parsed and executed at the lowest layer of a conventional x86/x64 system.

Why?  Good question.  I’ve been digging into some of the MIT OpenCourseware content and it brought back (good) memories from college studies.  So I figured, why not.

Condition Prioritization

Performance isn’t mentioned as much these days outside of gaming or content-streaming topics.  But any iterative or selective task that deals with larger volumes of data can still benefit greatly from some very simple techniques.

Place the most-common case higher in the condition tests.  This is also a part of heuristics, which is basically intuition or educated guessing, etc.  Using pseudo-code, here’s an example:

while ($rownum -lt $total) {
  switch ($dataset[$rownum].SomeProperty) {
    value1 { Do-Something; break; }
    value2 { Do-SomethingElse; break; }
    default { Fuck-It; break; }
  }
  $rownum++
}

Let’s assume that “value2” is found in 90% of the $dataset rows.  In this basic while-loop with a switch-case condition test, a small data set (chewed up into $dataset) won’t reveal much in terms of prioritizing the switch() tests.  Remember, that mess above is “pseudo-code”, so don’t yell at me if it blows up when you try to run it.

Anyhow, what happens when you’re chewing through 400 billion rows of terabytes of data? The difference between putting “value2” above “value1” can be significant.
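
For example, if “value2” is the 90% case, reordering the switch() block from the pseudo-code above puts the common case first:

while ($rownum -lt $total) {
  switch ($dataset[$rownum].SomeProperty) {
    value2 { Do-SomethingElse; break; }  # the 90% case is now tested first
    value1 { Do-Something; break; }
    default { Fuck-It; break; }
  }
  $rownum++
}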

This is most commonly found with initialization loops.  Those are when you start with a blank or unassigned value, and as the loop continues, the starting value is incremented or modified.  There is often a test within the iteration that checks whether the value has been modified from the original.  Since the initial (null) value may only exist until the first cycle of the iteration, it makes sense to move the condition [is modified] above [is not modified], since it skips an unnecessary test on each subsequent iteration cycle.
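
Here’s a minimal, runnable sketch of that initialization-loop pattern, assuming a hypothetical $rows collection whose items have a Value property:

$maxSoFar = $null
foreach ($row in $rows) {
  if ($null -ne $maxSoFar) {
    # [is modified]: true on every pass except the very first, so test it first
    if ($row.Value -gt $maxSoFar) { $maxSoFar = $row.Value }
  }
  else {
    # [is not modified]: can only happen once, so it goes last
    $maxSoFar = $row.Value
  }
}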

Make sense?  Geez.  I ran out of coffee 3 hours ago, and it almost makes sense to me.  Just kidding.

Sorted Conditions / Re-Filtering

Another pattern that you may run across is when you need to check if a value is contained within an Array of values.  For most situations, you can just grab the array and check if the value is contained within it and all is good.  But when the search array contains thousands or more elements, and you’re also looping another array to check for elements, you may find that sorting both arrays first reduces the overall runtime.  That’s not all, however.

What happens when the search value begins with “Z” and your search array contains a million records starting with “A”?  You will waste condition testing on A-Y.

What if you instead add a step within the iteration (loop) to essentially “pop” the previously checked items off of the search array?  So, after getting to search value “M”, the search array only contains elements which begin with “M” and on to “Z”, etc.

Figure 1 – Static Target Search Array

Figure 2 – Reduction Target Search Array

To help explain the quasi-mathematical gibberish above: S = Search Time, R = Array Reduction Overhead Time, N = Elements in Search Set.  So R+1 denotes the time incurred by calculating the positional offset and moving the search array starting index to the calculated value, whereas S alone indicates just starting each iteration on the first element of the (static) target array and incrementing until the matching value is found.

So, what does this look like with PowerShell?  Here’s *one* example…

Figure 3 – PowerShell sample code

param (
  [parameter(Mandatory=$False, HelpMessage="Pretty progressbar, but slower to run!")]
  [switch] $PrettyProgress
)
# build an array of ("A1","A2",...,"A1000","B1","B2",...) up to 26 x 1000 = 26,000 elements

$searchArray = @()
$elementCount = 1000
$tcount = $elementCount * 26
$charArray = @()
$m = 0  # running element counter used for the progress bar

cls

Write-Host "building search array..."
for ($i = 65; $i -le (65+25); $i++) {
  $c = [char]$i
  $charArray += $c
  for ($x = 1; $x -le $elementCount; $x++) {
     $cc = "$c$x"
     $searchArray += $cc
     $m++
     if ($PrettyProgress) { Write-Progress -Activity "$($charArray -join ' ')" -Status "Building array set" -CurrentOperation "$c $x" -PercentComplete $(($m / $tcount) * 100) }
  }
}
# define list of search values...
$elementList = @("A50","C99","D75","K400","M500","T600","Z900")
$randomList  = @("T505","C99","J755","K400","A55","U401","Z960")  # unsorted variation (not used below)

Write-Host "`nStatic search array"
foreach ($v in $elementList) {
  $t1 = Get-Date
  $test = ($v -in $searchArray)
  $t2 = Get-Date
  Write-Output "$v = $((New-TimeSpan -Start $t1 -End $t2).TotalSeconds)"
}

# protect the original target array for possible future use...
$tempArray = $searchArray

Write-Host "`nReduction search array"
foreach ($v in $elementList) {
  $t1 = Get-Date
  $test = ($v -in $tempArray)
  $t2 = Get-Date
  # this is the real "R"...
  $pos = [array]::IndexOf($tempArray, $v)
  $tempArray = $tempArray[$pos..$tempArray.GetUpperBound(0)]
  Write-Output "$v = $((New-TimeSpan -Start $t1 -End $t2).TotalSeconds)"
}

Figure 4 – PowerShell example process output

The time values are in seconds, and will vary with each run depending upon thread processing overhead incurred by the host background processes.  But in general, the delta between the matched values in each iteration will be roughly the same.  To see this visually, here’s an Excel version…

Figure 5 – Spreadsheet table and Graph result

It’s worth noting that the impact of R may vary by language, as well as by processing platform (hardware, operating system, etc.), but within the iteration tests the relative differences should be roughly similar.

There are other methods to reduce the target array as well, which may depend upon the software language used to process the tasks.  For example, whether the interpreter or compiler makes a complete copy of the search array in the background in order to provide the index offset starting point to the script.

Again, this is all relatively meaningless for smaller data sets, or less complex data structures.  And it really only provides significant value for sequential (ordered) search operations, not for random search operations.

So, some questions might arise from this:

  1. If the source array is not sorted, does the sorting operation itself wipe out the aggregate time savings of the reduction approach?
  2. Where is the “tipping point” that would cause this approach to be of value?

These are difficult to answer.  The nature of the data within the array will have an impact, I’m sure, as might the manner in which the array is provided (on demand or static storage, etc.).  To paraphrase a Don Jones statement: “try it and see.”

Now that I’m done pretending to be smart, I’m going to grab a beer and go back to being stupid.  As always – I welcome your feedback.  – Enjoy your weekend!

Scripting, Technology

The Basic Basics of Evolving a Basic (PowerShell) Script, Basically Speaking

In case it wasn’t obvious from the heading: This is very very very very basic basic stuff.  This is intended for people just starting to work with PowerShell.  Typical scenario:

  • Person creates a script to perform a single task (example: copy files)
  • Person doesn’t consider the future of that script (additional uses)
  • Person reads this article and decides their script may have a future
  • Person reads this article and decides to increase their alcohol consumption rate

I put PowerShell in parentheses because this topic is really language-agnostic. I’m basing much of this on one of the course lectures from back when I attended Christopher Newport University years ago. In fact, that was when “software” was made from leather, “hardware” was either stone or wood, and processors ran on coal.

But anyhow, the point of this is to revisit something I often see with people who are just starting out with programming or scripting.  That is, how to give a basic script a tune-up, to make it more useful, and develop better coding habits going forward.

The format of this article will take an example script, and gradually (iteratively) modify it to address a few basic aspects that are commonly overlooked.  The example script is “copy-stuff.ps1”.

Version 1.0 – no diaper. poo everywhere, plays with loaded guns and broken liquor bottles in traffic.  But so, sooooooo cute…

$TargetPath = "\\fs02\docs\files"
$files = Get-ChildItem "c:\foo" -Filter "*.txt"
foreach ($file in $files) {
  copy $file.FullName $TargetPath
}

Example usage:

.\Copy-Stuff.ps1

This little chunk of tasty goodness works fine, and you just want to pinch its little fat cheeks and say stupid baby-talk things.  But it’s really rough around the edges.  It sticks forks in electrical outlets, yanks the dog’s tail, and keeps puking on everything.

Some things that would help make this script more useful:

  • Portability
    • What if you wanted to use this same script for different situations?
  • Exception handling
    • What if some (or all) of the things expected cannot be found at runtime?
    • What if the user doesn’t have permissions to source or target locations?
    • What if the target location doesn’t have enough free disk space?
    • What if you want to enforce some safeguards to prevent killing your network or disk space?
  • Self-describing information (help)
    • How can you make this easier for a new person to “figure out”?
  • Gold Teeth
    • Add a little spit-shine polish with your roommate’s best t-shirt

Version 1.1 – Portability and diaper added, learning “da da” already.

param (
  $TargetPath = "\\fs02\docs\files",
  $SourcePath = "c:\foo",
  $FileType = "*.txt"
)
$files = Get-ChildItem $SourcePath -Filter $FileType
foreach ($file in $files) {
  copy $file.FullName $TargetPath
}

Now, the script can be called with -TargetPath, -SourcePath and -FileType parameters to work with different paths and file types.

Example usage:

.\Copy-Stuff.ps1 -TargetPath "c:\folder2" -SourcePath "c:\folder1" -FileType "*.jpg"

But this still doesn’t help with Error Handling.  For example, what if the user enters “x:\doofusbrain” or “y:\YoMamaSoBigSheGotLittleMamasOrbitingHer” and they don’t actually exist in the corporeal reality we call “Earth”?

Version 1.2 – Error Handling and utensils added, in a high-chair with a cold beer

param (
  $TargetPath = "\\fs02\docs\files",
  $SourcePath = "c:\foo",
  $FileType = "*.txt"
)
if (!(Test-Path $SourcePath) -or !(Test-Path $TargetPath)) {
  Write-Warning "check those paths son. you might be on drugs."
  break
}
$files = Get-ChildItem $SourcePath -Filter $FileType
foreach ($file in $files) {
  copy $file.FullName $TargetPath
}

At this point, it’s portable and checking for things before putting both feet in.  But it still needs some body work.  For example, suppose that the user staggers in from a night in jail, drops their liquor bottle and tries to invoke your script, but instead of putting in a non-existent -SourcePath value, they enter “” (an empty string).

.\Copy-Stuff -TargetPath "dog" -SourcePath "" -FileType "I'm sooo wastsed"

It’s time to add some parameter input validation gyration to this…

Version 1.3 – Kevlar-lined diapers, baby bottle converts into a pink “Hello Kitty!” RPG launcher…

param (
  [parameter(Mandatory=$False)]
    [ValidateNotNullOrEmpty()]
    [string] $TargetPath = "\\fs02\docs\files", 
  [parameter(Mandatory=$False)]
    [ValidateNotNullOrEmpty()]
    [string] $SourcePath = "c:\foo", 
  [parameter(Mandatory=$False)]
    [ValidateSet('TXT','JPG')]
    [string] $FileType = "TXT" 
)
if (!(Test-Path $SourcePath) -or !(Test-Path $TargetPath)) {
  Write-Warning "check those paths son. you might be on drugs."
  break
}
$files = Get-ChildItem $SourcePath -Filter "*.$FileType"
$filecount = $files.Count
$copycount = 1
foreach ($file in $files) { 
  copy $file.FullName $TargetPath 
  Write-Output "copied $copycount of $filecount files"
  $copycount++
}

The indentation of each [ValidateNotNullOrEmpty()] and [string] within the param() block is really not necessary.  I added it for visual clarity.  In fact, you could put the entire param() block on a single line, as long as you use comma separators and inhale enough paint solvent fumes first.  I recommend Xylene.
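
For example, the single-line variation (functionally identical, paint solvent optional):

param( [ValidateNotNullOrEmpty()][string]$TargetPath = "\\fs02\docs\files", [ValidateNotNullOrEmpty()][string]$SourcePath = "c:\foo", [ValidateSet('TXT','JPG')][string]$FileType = "TXT" )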

Notice that I switched from nude beach free-for-all party time on -FileType to a suit-wearing, neatly groomed, conservative business person variation using ValidateSet().  This takes away the loaded gun and gives the baby a squirt gun with only a few teaspoons of clean, luke-warm water.

Note: You could swap the [ValidateNotNullOrEmpty()] stuff with [ValidateScript()] and apply some voodoo toilet water magic to test for valid path references *before* diving into the murkiness.  But I already spent $5 on the (Test-Path) bundle and didn’t want to waste it.  Option C would be to inform every user that intentional misuse may result in their vehicle experiencing sudden loss of paint and tire pressure.
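
A quick sketch of that [ValidateScript()] variation, in case you kept the receipt for the (Test-Path) bundle:

param (
  [parameter(Mandatory=$False)]
    [ValidateScript({Test-Path $_})]
    [string] $SourcePath = "c:\foo"
)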

But – there’s still at least one more “error” case to consider.  What if the copy operations can’t be completed, no matter what?

What if the script is invoked by a user or service account/context, which doesn’t have sufficient permissions to the source or target locations to read and/or copy (write) the files?  Or what if the target location doesn’t have enough free disk space to allow the files to be copied?  So many “what-if’s”.

Version 1.4 – Old enough to drink, and shoot guns, but still getting carded at the door

param (
  [parameter(Mandatory=$False)]
    [ValidateNotNullOrEmpty()]
    [string] $TargetPath = "\\fs02\docs\files",
  [parameter(Mandatory=$False)]
    [ValidateNotNullOrEmpty()]
    [string] $SourcePath = "c:\foo",
  [parameter(Mandatory=$False)]
    [ValidateSet('TXT','JPG')]
    [string] $FileType = "TXT" 
)
if (!(Test-Path $SourcePath) -or !(Test-Path $TargetPath)) {
  Write-Warning "check those paths son. you might be on drugs."
  break
} 
$files = Get-ChildItem $SourcePath -Filter "*.$FileType"
$filecount = $files.Count
$copycount = 1
foreach ($file in $files) { 
  try {
    copy $file.FullName $TargetPath -ErrorAction Stop
    Write-Output "copied $copycount of $filecount files"
    $copycount++
  }
  catch {
    Write-Error $Error[0].Exception.Message
    break
  }
}

There’s much more you can do with error (exception) handling.  You could enforce restrictions on file types, or file sizes.  You could check for the error type and display more targeted explanations, rather than just dumping the $Error[0].Exception.Message content.  For more on this topic, I recommend this.

Version 1.5 – Self-Describing Help with french fries and a lobster bib

param (
  [parameter(Mandatory=$False, HelpMessage = "Destination Path")]
    [ValidateNotNullOrEmpty()]
    [string] $TargetPath = "\\fs02\docs\files",
  [parameter(Mandatory=$False, HelpMessage = "Source Path")]
    [ValidateNotNullOrEmpty()]
    [string] $SourcePath = "c:\foo",
  [parameter(Mandatory=$False, HelpMessage = "File extension filter")]
    [ValidateSet('TXT','JPG')]
    [string] $FileType = "TXT" 
)
if (!(Test-Path $SourcePath) -or !(Test-Path $TargetPath)) {
  Write-Warning "check those paths son. you might be on drugs."
  break
} 
$files = Get-ChildItem $SourcePath -Filter "*.$FileType"
$filecount = $files.Count
$copycount = 1
foreach ($file in $files) { 
  try {
    copy $file.FullName $TargetPath -ErrorAction Stop
    Write-Output "copied $copycount of $filecount files"
    $copycount++
  }
  catch {
    Write-Error $Error[0].Exception.Message
    break
  }
}

Now the script can be poked to display information that describes the purpose of each parameter.  For example (one way to poke it):
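
Get-Help .\Copy-Stuff.ps1 -Full

With Mandatory parameters, typing !? at the value prompt will also display the HelpMessage text.  There’s so much more we can do to this, but hit pause for a second…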

> Why bother?  What’s wrong with a simple “copy from-this to-that” script?

The point of developing a skill/craft/caffeine-habit is to expand your capabilities and your value as a technical resource.  Anyone can fix a leak with duct tape.  But the person who can fix it with duct tape, while chugging a six-pack of beer, and singing Bohemian Rhapsody at the same time, is going to make a higher income.  And besides, it’s just cool stuff to learn.

Version 1.6 – the spit-polish, hand-rubbed, gluten-free version

[CmdletBinding(SupportsShouldProcess=$True)]
param (
  [parameter(Mandatory=$True, HelpMessage = "Destination Path")]
    [ValidateNotNullOrEmpty()]
    [string] $TargetPath,
  [parameter(Mandatory=$True, HelpMessage = "Source Path")]
    [ValidateNotNullOrEmpty()]
    [string] $SourcePath,
  [parameter(Mandatory=$False, HelpMessage = "File extension filter")]
    [ValidateSet('TXT','JPG')]
    [string] $FileType = "TXT" 
)
$time1 = Get-Date
if (!(Test-Path $SourcePath) -or !(Test-Path $TargetPath)) {
  Write-Warning "check those paths son. you might be on drugs."
  break
} 
$files = Get-ChildItem $SourcePath -Filter "*.$FileType"
$filecount = $files.Count
$copycount = 1
foreach ($file in $files) { 
  $pct = $($copycount / $filecount) * 100
  try {
    copy $file.FullName $TargetPath -ErrorAction Stop
    Write-Progress -Activity "Copying $copycount of $filecount files" -Status "Copying Files" -PercentComplete $pct
    $copycount++
  }
  catch {
    Write-Error $Error[0].Exception.Message
    break
  }
}
$time2 = Get-Date
Write-Verbose "completed in $([math]::Round((New-TimeSpan -Start $time1 -End $time2).TotalSeconds,2)) seconds"

Example usage (using -WhatIf):

.\Copy-Stuff.ps1 -SourcePath "c:\folder1" -TargetPath "x:\folder3" -Verbose -WhatIf

Some of the changes added to this iteration:

  • [CmdletBinding(SupportsShouldProcess=$True)] added so we can use Write-Verbose to toggle output display only when we really need it, and use -WhatIf to see what would happen if it actually did happen.
  • Write-Progress added for impressing people who like visual progress indication.
  • Displays total run time at the end, for those who are impatient.

[CmdletBinding()] vs. [CmdletBinding(SupportsShouldProcess=$True)] ?

If your code is going to modify things somewhere, and you’d like to have an option to try it in a “what-if?” mode first, use the longer form above.  If you only want to see verbose output (a la debugging/testing), you can use the shorter form above.  For more detail about this cool feature, and the other options it provides, click here.
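
Here’s a minimal sketch of the difference, using a hypothetical deletion script:

[CmdletBinding(SupportsShouldProcess=$True)]
param (
  [string] $Path
)
# ShouldProcess() returns $False when -WhatIf is used, so nothing is actually deleted
if ($PSCmdlet.ShouldProcess($Path, "Delete")) {
  Remove-Item $Path
}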

Anyhow, I hope this was at least mildly helpful or amusing.  I’m sure half of you didn’t read this far, and half of those that did are rolling your eyes “He should’ve ____.  What a loser.”

Updated: Changed highlight color from dark red to turquoise because it sounds better. 🙂

Updated 2: Fixed “$FileType.*” to “*.$FileType” – thanks to @gpunktschmitz for catching that!

Anyhow, post feedback if you would like.  I’m once again weighing the future of this blog by the feedback (or lack thereof).  It’s starting to feel like talking into an empty room again.

Cloud, Projects, Scripting, Technology

Part 2 – Copy Azure Blob Containers between Storage Accounts using Azure Automation with Fries and a Drink

So, my previous post was about using PowerShell from a compute host (physical/virtual computer) to connect to an Azure subscription and copy containers between storage accounts.  I call that “part 1”.  This will be “part 2”, as it takes that smelly pile of compost and shovels it into Azure Automation.

In short, the basic changes from the previous example:

  • Not nearly as much fuss with credentials within the PowerShell code
  • Configuration settings are stored in Azure as Variables, rather than a .json file
  • Less code!

The previous article refers to the diagram on the left.  This one refers to the one not on the left.

Assumptions

  • You have access to an Azure subscription
  • In Azure, you have at least one (1) Resource Group, having two (2) Storage Accounts (one for “source” and the other for “destination”.  The “backup” this performs is copying from “source” to “destination”)
  • You somehow believe I know what I’m talking about
  • You stopped laughing and thought “Shit. Maybe this idiot doesn’t know what he’s talking about?”
  • After a few more minutes you thought “Why am I reading what I’m actually thinking right now?  How does he know what I’m thinking?  It’s like he’s an idiot savant!  Maybe he counts toothpicks on the floor while brushing his teeth…”
  • You consider that this was written in November 2018, and Azure could have changed by the time you’re reading this.

Basic Outline

The basic goals of this ridiculous exercise in futility are (still):

  • Copy all (or selected) containers from Storage Account 1 to Storage Account 2 using an Azure Automation “runbook”, once per day.
  • The copy process will append “yyMMdd” datestamps to each container copied to Storage Account 2
  • The copy process will place the destination containers under a container named “backups”.  For example, “SA1/container1” will be copied to “SA2/backups/container1-181117”
  • Both storage accounts should be within the same Azure Resource Group, and in the same Region
  • New Goal: Eliminate a dedicated host machine for running the script in lieu of an Azure Automation Runbook.

Important!

This is a demo exercise only.  DO NOT perform this on a production Azure tenant without testing that absolute living shit out of it until your fingers are sore, your eyes are bloodshot and you’ve emptied every liquor bottle and tube of model glue in your house/apartment.

The author assumes NO responsibility or liability for any incidental, accidental, intentional or alleged bad shit that happens resulting from the direct or indirect use of this example.  Batteries and model glue not included.

Preparation

First, we need to set up the automation account and some associated goodies.  Some of the steps below can be performed using Azure Storage Explorer, or PowerShell, but I’m using the Azure portal (web interface) for this exercise.  Then we’ll create the Runbook and configure it, and run a test.

  1. From the Azure portal, click “All Services” and type “Automation” in the search box.
  2. Click the little star icon next to it.  This adds it to your sidebar menu (along the left)
  3. Click on “Automation Accounts
  4. Click “Add” near the top-left, fill in the Name, select the Resource Group, Location and click Create
  5. From the Automation Accounts blade (I hate the term “blade”, in fact I hate the Azure UI in general, but that’s for another paint-fume-sniffing article), click on the new Automation Account.

Credentials and Variables

  1. Scroll down the center menu panel under “Shared Resources” and click on “Credentials“, and then click “Add a credential” at the top. Fill in the information and click Create.  This needs to be an account which has access to both of the storage accounts, so you can enter your credentials here if you like, since this is only a demo exercise.
  2. Go back to “Automation Accounts” (bread crumb menu along top is quickest)
  3. Go back to the Automation Account again, and scroll down to “Variables
  4. Add the variables as shown in the example below.  All of the variables for this exercise are “String” type and Encrypted = “No”.  This part is a bit tedious, so you should consume all of your illicit substances before doing this step.

The Runbook

  1. Go back to the Automation Account again and click on “Runbooks
  2. Click “Add a runbook” from the menu at top, then click “Quick Create / Create a new runbook” from the middle menu pane.
  3. Enter a name and select “PowerShell” from the Runbook type list.  Enter a Description if you like, and click Create.

When the new Runbook is created, it will (should) open the Runbook editor view.  This will have “> Edit PowerShell Runbook” in the heading, with CMDLETS, RUNBOOKS, and ASSETS along the left, and line 1 of the editor form in the top-middle.

  1. Copy/Paste the code from here into the empty space next to line 1 in the code editor.
  2. Make sure the variable names at lines 10-16 match up with those you entered in the Variables step above.  If not, then for each variable that needs to be corrected: delete the code to the right of the equals sign (“= Get-AutomationVariable -Name …”), place the cursor after the “=”, click Assets > Variables, then click the “…” next to the variable you want, and select “Add “Get Variable” to canvas” (see the snippet below).
  3. After entering the code and confirming the variable assignments, click Save.  Don’t forget to click Save!
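
For reference, each assignment that the “Add “Get Variable” to canvas” action produces looks roughly like this (the variable name is just an assumption, mirroring the part 1 configuration):

$ResourceGroupName = Get-AutomationVariable -Name 'ResourceGroupName'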

Testing

  1. Click “Test pane” to open the test pane (I’m shocked they didn’t call it the “test blade”) – Tip: If you don’t see “Test pane” go back to the Runbook editor, it’s at the top (select the Runbook, click Edit).
  2. Click “Start” and wait for the execution to finish.  (Note: Unlike running PowerShell on a hosted computer, Azure Automation doesn’t show the output until the entire script is finished running)

Code Note: You may notice that line 97 ($copyJob = Start-AzureStorageBlobCopy…) is commented out.  This is intentional so as to mitigate the chances of you accidentally copying an insane amount of garbage and running your Azure bill into the millions of dollars.

Testing Note: Since line 97 is commented out, the test should simply show what was found, but no copies are actually processed.  In the last image example (below) you will still see “copy completed” for each container set, but that’s just more glue-sniffing imaginary hallucination stuff for now.  Once you remove the comment, that becomes very real.  As real as Kentucky Fried Movie 3D punching scenes.

When you’ve tested this to your satisfaction, simply uncomment that line (or better yet, add $WhatIfPreference = $True at the top of the script, just below the $VerbosePreference line).
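
In other words, the top of the runbook would look something like this sketch (exact placement may vary):

$VerbosePreference = 'Continue'
$WhatIfPreference  = $True   # flip to $False when you're ready to copy for real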

(screenshots of each step omitted here; there was a sale on red arrows and I couldn’t say no)

Cloud, Projects, Scripting, Technology

Backup Azure Blob Containers Between Storage Accounts

This is NOT an article which is intended to say “this is how you do it”. This is an article that says “this is ONE way you COULD do it, if you’re dealing with the same conditions”. Okay. Step away from the ledge and sit down. You’re blocking my spot on the ledge.

The Challenge

Customer asked for the following:

  • Copy blob containers from Storage Account 1 (SA1) to Storage Account 2 (SA2)
  • Source containers should be copied to folders on SA2 such that they’re renamed to append a date stamp (e.g. “Container1-181114”)
  • Should be able to explicitly control includes and excludes for groups of source containers
  • Should support job scheduling
  • Should support using external configuration files (Azure subscription, Resource Group, Storage Accounts, etc.)
  • Should support running from a Windows Server VM running in Azure (IaaS)
  • You have 2 hours to accomplish this

Options

  • Third-party backup products
  • Azure Function App
  • PowerShell script

Rationale

Third-party backup products cost money and require installation, learning curves, etc.  Azure Function App is arguably the ideal option, but I haven’t worked with it/them enough to meet the requirements in the allotted time.  I chose PowerShell because it requires the least effort and impact for on-boarding (installation, configuration, learning curve, cost, etc.).

Approach

  • A single script (portable)
  • A set of .JSON configuration files
  • AzureRM module and Azure credentials

I had some code laying around from a past project that involved uploading and downloading content to Azure RM storage.  This time it was copying between two Azure RM storage accounts using a VM host residing in Azure.  So I decided to map out the inputs/outputs and draft the configuration data file.

Sample Configuration File

{
    "ResourceGroupName": "Toilet",
    "SourceStorageAccount": "turdstorage1",
    "DestinationStorageAccount": "turdstorage2",
    "StorageAccountKey1": "<insert really long key here>",
    "StorageAccountKey2": "<insert another really long key here>",
    "CustomerName": "StinkTech",
    "AzureUserID": "you@yourcompany.com",
    "SubscriptionName": "YourSubscriptionName",
    "DestinationContainer": "backups",
    "BackupDateFormat": "yyMMdd",
    "IncludeContainers": "",
    "ExcludedContainers": ["azure-webjobs-hosts","junk"]
}

So, what does this “say”?  Besides mentioning fecal matter a few times, it describes the following:

  • The Azure RM resource group name
  • The two (2) storage account names
  • The two (2) storage account access (primary) keys
  • I’ll explain the “customername” later
  • The Azure Subscription name
  • The Azure Subscription user ID
  • The Destination container name (beneath DestinationStorageAccount)
  • The date-stamp format for appending to the destination folders
  • Lists of (source) container names to constrain the total list and/or exclude from the total list

Sample

  • Container1
  • Container2
  • Container3

If "IncludeContainers" = ["Container1","Container3"] then only those two will be copied/backed-up.

If "ExcludedContainers" = ["Container3"] then only Container1 and Container2 will be copied/backed-up.

If "IncludeContainers" = ["Container1","Container3"] and "ExcludedContainers" = ["Container2","Container3"] then the final (result) list of containers that will be copied is "Container1".  And if "IncludeContainers" = "ExcludedContainers" then you’re just fucked.  I’m sorry, but life can be hard sometimes.
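
Here’s a hedged sketch of that filtering math, using the sample container names above (the variable names are mine, not the script’s):

$all     = @('Container1','Container2','Container3')
$include = @('Container1','Container3')
$exclude = @('Container2','Container3')
$result  = $all | Where-Object { ($include.Count -eq 0 -or $include -contains $_) -and ($exclude -notcontains $_) }
# $result --> Container1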

“CustomerName” is used for a rather handy little feature buried inside this mess: Azure login credential storage and recall.  Basically, the first time you run the script, you will be prompted to enter credentials to connect to the specified Azure tenant.  This will use the “SubscriptionName” and “AzureUserID” values from the .JSON file to process the login request.  If the request is successful, the password is encrypted and stored in a separate .JSON file, which uses “CustomerName” as the base name.  So if “CustomerName”: “StinkTech” then the credentials file will be named “StinkTech.json”.

WARNING: Be careful not to assign “CustomerName” to the same name as your configuration .JSON file!  This will cause the credential storage to overwrite the configuration file.  I could add some exception-avoidance code for that, but that wouldn’t be any fun.  People often go to NASCAR races with hopes of seeing a good crash at some point (as long as nobody gets hurt).

Using the example above (Container names), along with the sample JSON configuration file, the containers would be copied over to destination Container “backups” and be named [ContainerName]+[YYMMDD].  So “Container1” will be copied to “backups/Container1-181114” and so on.  Why not just copy like-for-like and not shove them under a mid-level container?  Because, that’s what the customer insisted.

The Code

#requires -version 5.0
#requires -modules AzureRM.Storage
<#
.DESCRIPTION
    Backup Azure Blob Containers from one Storage Account to Another.
    Copies each source container and blob to a date-stamped name in the second account.
    Example: SA1\container1\folder\133.txt --> SA2\backups\container1-181114\folder\133.txt
.PARAMETER ConfigFile
    Path to JSON configuration file
.PARAMETER Force
    Force overwrite of existing targets (destinations) if they already exist
.EXAMPLE 
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json
.EXAMPLE
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json -Force -Verbose -WhatIf
.EXAMPLE
    .\Copy-AzureBlobs.ps1 -ConfigFile .\myconfig.json -Force -ResetCredentials
.NOTES
    1.0.0 - 2018/11/14 - First release (skatterbrainz)
#>
[CmdletBinding(SupportsShouldProcess=$True)]
param (
    [parameter(Mandatory=$True, HelpMessage="Path to configuration file")]
    [ValidateNotNullOrEmpty()]
    [string] $ConfigFile,
    [parameter(Mandatory=$False, HelpMessage="Force backups even when targets already exist")]
    [switch] $Force,
    [parameter(Mandatory=$False, HelpMessage="Force credential reset")]
    [switch] $ResetCredentials
)
$time1 = Get-Date
$Script:countall = 0
$Script:ccount = 0
function Invoke-AzureBlobBackup {
    [CmdletBinding(SupportsShouldProcess=$True)]
    param (
        [parameter(Mandatory=$False, HelpMessage="List of source containers to exclude from backups")]
            [string[]] $ExcludeContainers
    )
    Write-Verbose "connecting to storage accounts"
    $context1 = New-AzureStorageContext -StorageAccountName $SourceStorageAccount -StorageAccountKey $StorageAccountKey1
    $context2 = New-AzureStorageContext -StorageAccountName $DestinationStorageAccount -StorageAccountKey $StorageAccountKey2
    Write-Verbose "getting storage containers"
    $sc1 = Get-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $SourceStorageAccount
    if ($IncludeContainers.Count -gt 0) {
        Write-Verbose "filtering list of source containers"
        $sc1 = $sc1 | ?{$IncludeContainers -contains $_.Name}
        Write-Verbose "containers: $($($sc1).Name -join ',')"
    }
    if ($ExcludeContainers.Count -gt 0) {
        Write-Verbose "removing excluded containers from source list"
        $sc1 = $sc1 | ?{$ExcludeContainers -notcontains $_.Name}
        Write-Verbose "containers: $($($sc1).Name -join ',')"
    }
    Write-Verbose "validating destination container [$DestinationContainer]"
    try {
        $sc2 = Get-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $DestinationStorageAccount -Name $DestinationContainer -ErrorAction Stop
        Write-Verbose "container [$DestinationContainer] exists in destination"
        $destBlobs = (Get-AzureStorageBlob -Container $DestinationContainer -Context $Context2).Name
        Write-Verbose "$($destBlobs.count) destination blobs found in [$DestinationContainer]"
        Write-Verbose $($destBlobs -join ',')
    }
    catch {
        Write-Verbose "container [$DestinationContainer] not found in destination, creating it now"
        try {
            $c2 = New-AzureRmStorageContainer -ResourceGroupName $ResourceGroupName -StorageAccountName $DestinationStorageAccount -Name $DestinationContainer
        }
        catch {
            $stopEverything = $True
            Write-Error $Error[0].Exception.Message
            break
        }
    }
    Write-Verbose "enumerating source containers"
    $Script:countall = 0
    $Script:ccount = 0
    foreach($sc in $sc1) {
        $sourceContainer = $sc.Name
        Write-Verbose "source container: $sourceContainer"
        $srcBlobs  = Get-AzureStorageBlob -Container $sourceContainer -Context $context1
        Write-Verbose "------------------------- $sourceContainer ---------------------------------"
        Write-Verbose "$($srcBlobs.count) source blobs found in [$sourceContainer]"
        #$srcBlobs
        Write-Verbose "copying blobs to [$DestinationContainer]..."
        foreach ($blob in $srcBlobs) {
            $Script:countall++
            $srcBlob = $blob.Name
            $destPrefix = $sourceContainer+'-'+(Get-Date -f $BackupDateFormat)
            $destBlob = "$destPrefix`/$srcBlob"
            if ($Force -or ($destBlobs -notcontains $destBlob)) {
                Write-Verbose "[$sourceContainer] copying [$srcBlob] to [$destBlob]"
                try {
                    $copyjob = Start-AzureStorageBlobCopy -Context $context1 -SrcContainer $sourceContainer -SrcBlob $srcBlob -DestContainer $DestinationContainer -DestBlob "$destBlob" -DestContext $context2 -Force -Confirm:$False
                    Write-Verbose "copy successful"
                    $Script:ccount++
                }
                catch {
                    Write-Error $Error[0].Exception.Message
                }
            }
            else {
                Write-Verbose "blob [$destBlob] already backed up"
            }
        }
    }
}
function Get-AzureCredentials {
    [CmdletBinding()]
    param (
        [parameter(Mandatory=$True, HelpMessage="Azure Subscription UserName")]
        [ValidateNotNullOrEmpty()]
        [string] $AzureUserID,
        [parameter(Mandatory=$True, HelpMessage="Azure Subscription Name")]
        [ValidateNotNullOrEmpty()]
        [string] $SubscriptionName,
        [parameter(Mandatory=$False, HelpMessage="Credential file basename")]
        [ValidateNotNullOrEmpty()]
        [string] $CredentialName = "cred",
        [parameter(Mandatory=$False, HelpMessage="Force credentials reset")]
        [switch] $ForceUpdate
    )
    $ProfilePath = ".\$CredentialName.json"
    Write-Verbose "searching for $ProfilePath"
    if (Test-Path $ProfilePath) {
        if ($ForceUpdate) {
            Write-Verbose "deleting credential storage file: $ProfilePath"
            try {
                Get-Item -Path $ProfilePath -ErrorAction SilentlyContinue | Remove-Item -Force -WhatIf:$False
            }
            catch {}
            Write-Verbose "stored credential removed. prompt for credentials to create new file"
            try {
                $pwd = Get-Credential -UserName $AzureUserID -Message "Azure Credentials" -ErrorAction Stop
                $pwd.password | ConvertFrom-SecureString | Set-Content $ProfilePath -WhatIf:$False -ErrorAction Stop
                Write-Verbose "$ProfilePath has been updated"
            }
            catch {
                Write-Warning "$ProfilePath was NOT updated!"
            }
            try {
                $pwd = Get-Content $ProfilePath | ConvertTo-SecureString -Force
                $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $pwd
            }
            catch {
                Write-Error $Error[0].Exception.Message
                break
            }
        }
        else {
            Write-Verbose "$ProfilePath was found. importing contents"
            try {
                $pwd = Get-Content $ProfilePath | ConvertTo-SecureString -Force
                $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $pwd
            }
            catch {
                Write-Error $Error[0].Exception.Message
                break
            }
        }
    }
    else {
        Write-Verbose "$ProfilePath not found. prompt for credentials to create new file"
        try {
            $pwd = Get-Credential -UserName $AzureUserID -Message "Azure Credentials" -ErrorAction Stop
            $pwd.password | ConvertFrom-SecureString | Set-Content $ProfilePath -WhatIf:$False -ErrorAction Stop
            Write-Verbose "$ProfilePath has been updated"
        }
        catch {
            Write-Warning "$ProfilePath was NOT updated!!"
        }
        try {
            $pwd = Get-Content $ProfilePath | ConvertTo-SecureString -Force
            $azCred = New-Object System.Management.Automation.PSCredential -ArgumentList $AzureUserID, $pwd
        }
        catch {
            Write-Error $Error[0].Exception.Message
            break
        }
    }
    try {
        $azLogin = Connect-AzureRmAccount -Subscription $SubscriptionName -Credential $azCred -Environment $EnvironmentName -WhatIf:$False
        Write-Verbose "azure credentials verified"
    }
    catch {
        Write-Warning "azure credentials have expired. Prompt for new credentials"
        $azLogin = Connect-AzureRmAccount -Subscription $SubscriptionName -Environment $EnvironmentName -WhatIf:$False
    }
    $azLogin
}
function Get-AzureBackupConfig {
    param(
        [parameter(Mandatory=$True, HelpMessage="Path to configuration JSON file")]
        [ValidateNotNullOrEmpty()]
        [string] $FilePath
    )
    if (!(Test-Path $FilePath)) {
        Write-Warning "$FilePath not found!!"
        break
    }
    Get-Content -Raw -Path $FilePath | ConvertFrom-Json
}
if ($config = Get-AzureBackupConfig -FilePath $ConfigFile) {
    Write-Verbose "reading configuration data from file $ConfigFile"
    $config.psobject.properties | ForEach-Object{
        Set-Variable -Name $_.Name -Value $_.Value -Scope Script -WhatIf:$False
    }
    if ($ResetCredentials) {
        Get-AzureCredentials -AzureUserID $AzureUserID -SubscriptionName $SubscriptionName -CredentialName $CustomerName -ForceUpdate
    }
    if (Get-AzureCredentials -AzureUserID $AzureUserID -SubscriptionName $SubscriptionName -CredentialName $CustomerName) {
        Invoke-AzureBlobBackup -ExcludeContainers $ExcludedContainers
    }
    else {
        Write-Warning "run Set-AzureCredentials to update credential store and try running again"
    }
}
$time2 = Get-Date
Write-Verbose "completed. $($Script:countall) total objects processed. $($Script:ccount) were copied"
Write-Verbose "total runtime $($(New-TimeSpan -Start $time1 -End $time2).TotalSeconds) seconds"

The source is available on GitHub here.

Sample Usage

.\Copy-AzureBlobs.ps1 -ConfigFile ".\toilets.json" -Verbose -WhatIf

This will run the script with verbose output enabled (lots of detailed output), and using -WhatIf shows what it *would* do, without actually doing anything.  Well, actually, it will import, process and update the credentials file, but no blob copying will be processed.

The nice thing about the external configuration file approach is that you can prepare and use multiple configurations by simply calling different configuration files, each using its own credentials file, so they don’t need to have anything in common such as Azure tenant, resource group, storage accounts, or Azure user credentials.
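
For example (file names are hypothetical):

.\Copy-AzureBlobs.ps1 -ConfigFile ".\customerA.json" -Verbose
.\Copy-AzureBlobs.ps1 -ConfigFile ".\customerB.json" -Verbose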

What If?

What if you’ve run the script a few times and it’s conveniently reusing the credentials without prompting, but then someone changes the password on the account in Azure?  You can do one of two things:

  • Delete (or rename) the credentials .json file
  • Use the -ResetCredentials switch parameter

Either of these will force a login prompt to update the credentials, and then save a new file.
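
For example:

.\Copy-AzureBlobs.ps1 -ConfigFile ".\toilets.json" -ResetCredentials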

Other Notes

  • The -Verbose option also displays individual blob names as they’re being copied (or would be copied, if -WhatIf is also used)
  • The -Verbose option also displays total counts of blob objects, how many were copied and total runtime in seconds.
  • You can wrap the execution (with -Verbose) within Start-Transcript to capture all the details for analysis, logging, auditing, and troubleshooting (see the sketch after this list)
  • If there’s any measurable interest/support for this, I may post it to PowerShell Gallery for easier use (let me know?)
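
A minimal transcript wrapper looks like this (the log path is an assumption):

Start-Transcript -Path ".\CopyAzureBlobs.log"
.\Copy-AzureBlobs.ps1 -ConfigFile ".\toilets.json" -Verbose
Stop-Transcript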

Thank you!

humor, Personal, Society, Technology

Cranky AFaaS

I’m starting to use this “aaS” suffix more and more in casual conversation now.  I’m not just stooping to bag my dog’s fecal dispersions, I’m providing Feces-as-a-Service, or FaaS.  I’m not talking shit around the coffee pot anymore, I’m providing BSaaS.  That’s right, I claim it as the first official use of “Bull-Shit-as-a-Service”, even though, technically, the act itself was perfected by the US government a hundred years ago.  Nobody can touch them now.

So, this week has immersed me in a series of, shall I say, annoyances.  The kind that spin my brain platter around to that classic tune: “Stupid AF but we’ve gotten so used to it that it seems normal now”.

Like this…

and this…

…and that’s only the beginning.

Then I heard a clerk at the grocery store talking to a customer ahead of me.  It went a little like this…

Clerk: “No maam, once you write the check out for the actual amount, I can’t give you cash back, unless you write another check.”

Maam: “This shit is bullshit!”

Clerk: “Well, I suppose that it has to be some kind of shit. But that’s all I can do.

Now, technically he was absolutely correct.  I don’t think his manager was amused, but he obviously agreed with his employee, and dammit, my beer was getting warm on that slimy conveyor belt waiting for her to move on.

Then I found out that “SCCM” has been hijacked, like all good initialisms/acronyms, by some glue-sniffing, child-abducting gang calling themselves “Society of Critical Care Medicine“.  The nerve of those people thinking their silly medical skills somehow matter more in this dangerous world than deploying patches to machines over shitty WAN/VPN/Wi-Fi links at 3am.

For the love of caffeine, can we get someone to form an official group to manage all these acronyms which now have multiple meanings?

Then!

Then I was walking my dog, Dory, who at 100 lbs actually walks me, but that’s beside the point, and one of my neighbors stopped me on the street…

Her: “OMG.  Did you see the rabid fox running around here?!  It chased me into the house with my two little dogs dragging behind me!”

Me: “Ummm…”

Her: “So, I called the police, they said I had to call Animal Control, who said unless I could keep my eyes directly on it, they can’t come out to do anything.  And I said…” (this is where I started to glaze over and pictured my dog getting mauled by some rabid animal and me trying to fend it off with a roll of poopoo bags in a plastic container…) “and so I just wanted to let you know.  Be careful!”

She went inside, I kept walking (getting walked by) my dog, and then saw the rabid fox limping around like it had finished off a case of beer or something, about 100 yards to my right.  I called our action-packed police department…

311: “Police non-emergency.  What’s the problem you wish to report?”

Me: “We have a rabid fox running around our neighborhood.”

311: “I’ll patch you through to Animal Control.  If you get put on hold too long, their direct number is (insert “1-800-IDGAF”).  Please hold…”

20 minutes, no answer.  Repeated recording about how important my call is.

Hang up.  Call back.  10 minutes on hold.  Another call, 5 minutes.  Never mind.  At this point, I’m hoping it bites the first city employee that drives through the area, but I don’t really mean that, it just sounds snarky.

So I tweet our tax-paid folks with my complaint…

It’s now 4:51 PM ET on a Friday, which means those folks left work about 5 days ago.

Anyhow.  I’m staying away from work this weekend, but I will be doing something.  Maybe cleaning up my Github tragedy, or rebuilding my lab catastrophe, or staring at my belly button and thinking “I was once connected by a cable!”

Seriously, taking the wife and two of our kids to see Bohemian Rhapsody tonight.  I hope it’s good.