GPODoc – PowerShell Module

I just posted my first PowerShell module to the PowerShell Gallery, though it’s been available on my GitHub site for a little while already.  It’s a small project with only two functions so far: Get-GPOComment and Export-GPOCommentReport.

Get-GPOComment is purely CLI and queries GPOs to return embedded descriptions, comments and so on.

Export-GPOCommentReport is a CLI function as well, but it produces an HTML report (which is technically GUI output) of embedded GPO comments and descriptions.

The module is simple to load by way of the Install-Module and Import-Module cmdlets (see below).  The only requirement is to have the GroupPolicy PowerShell module installed first.  That module is normally present on all AD domain controllers, as well as on other servers and workstations which have the Windows 10 RSAT package installed.  (Tip: you can also use Chocolatey to install RSAT: choco install rsat, or cinst rsat.)

Loading The PowerShell Module

Install-Module GPODoc
Import-Module GPODoc

Trying it out: Get-GPOComment

Get-GPOComment -GPOName '*' -PolicyGroup Policy


Get-GPOComment -GPOName '*' -PolicyGroup Preferences

You can easily modify the output layout since the output is in hash table format…

Get-GPOComment -GPOName '*' -PolicyGroup Preferences | Format-Table

If you want more flexibility in selecting the inputs, you can use the Get-GPO cmdlet and channel the output through a pipeline.  For example…

Get-GPO -All | Where-Object {$_.DisplayName -like "U *"} | 
  Select-Object -ExpandProperty DisplayName | 
    Foreach-Object {Get-GPOComment -GPOName $_ -PolicyGroup Policy}


The parameters for this function are as follows:

  • GPOName
    • [string] single or array of GPO names
  • PolicyGroup
    • [string] (select list) = “Policy”, “Preferences” or “Settings”
    • Policy = query the GPO descriptions only
    • Settings = query the comments attached to GPO internal settings only
    • Preferences = query the comments attached to GPO Preferences internal settings only

This also respects -Verbose if you want to see more background details.

Export-GPOCommentReport

This function only takes two (2) inputs:

  • GPOName
    • [string] single or array of GPO names
  • ReportFile
    • [string] the path/filename for the HTML report file to create.

It invokes Get-GPOComment, combining “Policy”, “Preferences” and “Settings” output, and writes it all to an HTML report file of your choosing.  The CSS formatting is hard-coded for now, but I’m open to feedback if you want more options. You can specify the GPO name(s) directly, or feed them via the pipeline.
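For example, a quick sketch of both forms (the GPO name here is hypothetical, and the pipeline form assumes the function binds GPO names from pipeline input):

```powershell
# Direct invocation (GPO name is an example only)
Export-GPOCommentReport -GPOName 'Workstation Baseline' -ReportFile "$env:TEMP\GpoComments.htm"

# Or feed names through the pipeline from Get-GPO
Get-GPO -All |
  Select-Object -ExpandProperty DisplayName |
    Export-GPOCommentReport -ReportFile "$env:TEMP\GpoComments.htm"
```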


If you used 1.0.1 (posted a few days earlier), scrap that and get 1.0.2.  The older version had some major bugs in the Get-GPOComment function, and the Export-GPOCommentReport function wasn’t available until 1.0.2.

Feedback and suggestions are welcome.  Even if it’s “hey man, your shit sucks. Consider a dishwashing job, or something else.”  Any feedback is better than no feedback.

Thank you!

 

Documenting Your IT Environment (The Easy Way)

One of the most common, and tragically problematic, aspects of today’s IT world is the lack of sufficient documentation. In fact, ANY documentation. It’s not associated with any one area either. It crosses everything from help desk tickets, to change requests, to Active Directory objects, to Group Policy, to SQL Server, to System Center to Azure.

UPDATED 8/8/17 – added SQL and SCCM examples at very bottom.

When and Why Does It Matter?

It matters when things go wrong (or “sideways” as many of my colleagues would say).  There’s the “oh shit!” moment where you or your staff find the problem first.  Then there’s the “oh holy shit!!” moment when your boss, his/her boss, and their bosses find the problem first and come down hard on you and your team.  When shit goes wrong/sideways, the less time you waste on answering the what/where/why/who/when questions for the pieces involved with the problem, the better.  To that end, you can document things in other places, like Word documents, spreadsheets, help desk and ITIL systems, post-it notes, whiteboards, etc.  Or you can attach comments DIRECTLY to the things which are changed.  This isn’t possible in all situations, but it is most definitely possible for many of the things that will straight-up shove your day into a shit storm, and I have no idea what that really means, but it just sounds cool.  Anyhow…

The best part of this approach is that you can leverage PowerShell and other (free and retail) utilities to access and search comments you nest throughout the environment.  For example, you can collect and query Group Policy Object comments…

Need to get the comment buried in a particular GPO or GPPref setting?  Here’s one example for querying a GPO under User Configuration / Preferences / Control Panel Settings / Folder Options…

#requires -modules GroupPolicy
<#
.SYNOPSIS
  Get-GPPrefFolderOptionsComments.ps1
  Returns the comment (desc) from a GPO's Folder Options preference settings.
#>
param (
  [parameter(Mandatory=$True)]
  [ValidateNotNullOrEmpty()]
  [string] $GPOName
)
try {
  $policy = Get-GPO -Name $GPOName -ErrorAction Stop
}
catch {
  Write-Error "Group Policy $GPOName not found"
  exit 1
}
$policyID     = $policy.ID
$policyDomain = $policy.DomainName
$policyVpath  = "\\$($policyDomain)\SYSVOL\$($policyDomain)\Policies\{$($policyID)}\User\Preferences\FolderOptions\FolderOptions.xml"
Write-Verbose "loading: $policyVpath"

if (Test-Path $policyVpath) {
  [xml]$SettingXML = Get-Content $policyVpath
  # the comment lives in the desc attribute of the Vista+ global folder options node
  $result = $SettingXML.FolderOptions.GlobalFolderOptionsVista.desc
  Write-Output $result
}
else {
  Write-Error "unable to load xml data"
}

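Invoking the script might look like this (the GPO name is hypothetical):

```powershell
# Query the Folder Options comment from a single GPO, with verbose path output
.\Get-GPPrefFolderOptionsComments.ps1 -GPOName 'User Desktop Settings' -Verbose
```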

So, What Now?

So, in the interest of avoiding more bad Mondays, start documenting your environment now.  Need some examples of where to start?  Here you go…

 

Thank you for your support.

Poor Man’s IT Chain Reactions


Challenge:

Make sure every machine in the enterprise (connected to LAN or always-on VPN) has the latest version of psexec.exe on the local C: drive.

Why?

Why not?  That’s why.

Option 1:

AKA – the semi-automated, safety switch turned off, fully-loaded, drunk guy holding the trigger option.

  1. Download psexec.exe from live.sysinternals.com (or direct: https://live.sysinternals.com/psexec.exe) and place into AD domain SYSVOL scripts folder (e.g. \\contoso.com\netlogon)
    Example…

    $WebClient = New-Object System.Net.WebClient
    $WebClient.DownloadFile("https://live.sysinternals.com/psexec.exe","\\contoso.com\netlogon\psexec.exe")
  2. Create Group Policy Object (GPO) with Computer Preferences setting to copy psexec.exe from the SYSVOL share to a location on the local C: drive. Configure to “update” so that future version updates will be passed down to the clients.
  3. Create a Scheduled Task to keep the SYSVOL copy up to date with the latest version.
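Step 3 could be sketched as a scheduled task that re-runs the download daily.  This is an assumption-laden example, not a prescription: the task name, schedule, and UNC path are all placeholders.

```powershell
# Register a daily task (running as SYSTEM) that refreshes the SYSVOL copy of psexec.exe.
# Task name, time, and paths are examples only.
$cmd = "(New-Object System.Net.WebClient).DownloadFile('https://live.sysinternals.com/psexec.exe','\\contoso.com\netlogon\psexec.exe')"
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument "-NoProfile -Command `"$cmd`""
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName 'Refresh-PsExec' -Action $action -Trigger $trigger -User 'SYSTEM'
```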

Pros

  • Cheap (free)
  • Fairly automated (just add water, makes its own sauce / set it and forget it)

Cons

  • Smells like duct tape and coat hanger wire

Option 2:

AKA – The “I have a budget, so kiss my butt” option.

  1. SCCM package or application deployment

Pros

  • You look cool pulling it off, but not as geeky as option 1.

Cons

  • More moving parts under the hood.
  • May require additional steps to maintain a consistent current version across all devices.

Option 3:

AKA – The “I don’t have a budget, so kiss my butt” option.

  1. Include within image configuration (MDT, SCCM, Ghost, Acronis, etc.)

Pros

  • Easy

Cons

  • Difficult to maintain a consistent and current version across the enterprise

Option 4:

AKA – the “most fun to laugh about during the next beer-meeting” option

  1. Send the new guy around with a USB thumb drive

Pros

  • Great fun in the office

Cons

  • Do I really need to spell this out?

 

SCCM, SQL, DBATools, and Coffee

Warning:  This article is predicated on (A) basic reader familiarity with System Center Configuration Manager and the SQL Server aspects, and (B) nothing better to do with your time.

Caveat/Disclaimer:  As with most of my blog meanderings, I post from the hip.  I fully understand that it exposes my ignorance at times, and that can be painful at times, but adds another avenue for me to learn and grow.


I don’t recall exactly when I was turned onto Ola Hallengren, or Steve Thompson, but it’s been at least a few years.  The same could be said for Kent Agerlund, Johan Arwidmark, Mike Niehaus, and others, none of whom I’ve met in person yet, but maybe some day.  However, that point in time is when my Stevie Wonder approach to SQL “optimization” went from poking at crocodiles with a pair of chopsticks to saying “A-Ha!  THAT’s how it’s supposed to work!”

As a small testament to this, while at Ignite 2016, I waited in line for the SQL Server guy at his booth, like an 8 year old girl at a Justin Bieber autograph signing, just to get a chance to ask a question about how to “automate SQL tasks like maintenance plans, and jobs, etc.”  The guy looked downward in deep thought, then looked back at me and said “Have you heard of Ola Hallengren?”  I said “Yes!” and he replied, “he’s your best bet right now.”

Quite a lot has changed.

For some background, I was working on a small project for a customer at that time focusing on automated build-out of an SCCM site using PowerShell and BoxStarter.  I had a cute little gist script that I could invoke from the PowerShell console on the intended target machine (virtual machine), and it would go to work:

  • Install Windows Server roles and features
  • Install ADK 10
  • Install MDT 2013
  • Install SQL Server 2014
  • Adjust SQL memory allocations (min/max)
  • Install WSUS server role and features
  • Install Configuration Manager
  • Install ConfigMgr Toolkit 2012 R2
  • and so on.

Since it was first posted, it went through about a dozen iterative “improvements” (translation: breaking it and fixing and improving and breaking and fixing, and repeat).

The very first iteration included the base build settings as well, such as naming the computer, assigning a static IPv4 address, DNS servers and gateway, join to an AD domain, etc.  But I decided to pull that part out into a separate gist script.

The main thing about this experiment that consumed the most time for me was:

  1. On-the-fly .INI construction for the SQL automated install
  2. On-the-fly .INI construction for the SCCM install
  3. On-the-fly SQL memory allocation configuration

Aside from the hard-coding of content sources (not included on this list), item 2 drove me nuts because I didn’t realize the “SA expiration” date property was required in the .INI file.  The amount of coffee I consumed in that 12 hour window would change my enamel coloring forever.  Chicks dig scars though, right?  Whatever.
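For item 1, the on-the-fly .INI construction boiled down to writing a here-string to disk before calling setup.exe.  The sketch below is illustrative only; the feature list, accounts, and paths are placeholders, not the original script’s values.

```powershell
# Minimal sketch of generating an unattended SQL Server setup INI (values are examples)
$ini = @"
[OPTIONS]
ACTION="Install"
FEATURES=SQLENGINE,RS
INSTANCENAME="MSSQLSERVER"
SQLSYSADMINACCOUNTS="CONTOSO\Administrator"
IACCEPTSQLSERVERLICENSETERMS="True"
"@
$ini | Out-File -FilePath "$env:TEMP\ConfigurationFile.ini" -Encoding ASCII
# then: setup.exe /ConfigurationFile="$env:TEMP\ConfigurationFile.ini"
```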

Then came item 3.  I settled on the following chunk of code, which works…

$SQLMemMin = 8192
$SQLMemMax = 8192
...
write-output "info: configuring SQL server memory limits..."
write-output "info: minimum = $SQLMemMin"
write-output "info: maximum = $SQLMemMax"
try {
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') | Out-Null
  [System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
  $SQLMemory = New-Object ('Microsoft.SqlServer.Management.Smo.Server') ("(local)")
  $SQLMemory.Configuration.MinServerMemory.ConfigValue = $SQLMemMin
  $SQLMemory.Configuration.MaxServerMemory.ConfigValue = $SQLMemMax
  $SQLMemory.Configuration.Alter()
  write-output "info: SQL memory limits have been configured."
}
catch {
  write-output "error: failed to modify SQL memory limits. Continuing..."
}

But there are a few problems, or potential problems, with this approach…

  1. It’s ugly (to me anyway)
  2. The min and max values are static
  3. If you change this to use a calculated/derived value (reading WMI values) and use the 80% allocation rule, and the VM has dynamic memory, it goes sideways.

Example:

$mem = $(Get-WmiObject -Class Win32_ComputerSystem).TotalPhysicalMemory
$tmem = [math]::Round($mem/1024/1024,0)
...

I know that option 2 assumes a “bad practice” (dynamic memory), but it happens in the real world and I wanted to “cover all bases” with this lab experiment.  The problem that it causes is that the values returned from a WMI query can fluctuate along with the host memory allocation status, so the 80% value can be way off at times.

Regardless, forget all that blabber about static values and dynamic tragedy.  There’s a better way.  A MUCH better way.  Enter DBATools.  DBATools was the brainchild of Chrissy LeMaire, which is another name to add to any list that has Ola’s name on it. (side note: read Chrissy’s creds, pretty f-ing impressive). There are other routes to this as well, but I’ve found this one to be most user friendly for my needs. (Feel free to post better suggestions below, I welcome feedback!)

Install-Module dbatools
$sqlHost = "cm01.contoso.com"
$sqlmem = Test-DbaMaxMemory -SqlServer $sqlHost
if ($sqlmem.SqlMaxMB -gt $sqlmem.RecommendedMB) {
  Set-DbaMaxMemory -SqlServer $sqlHost -MaxMB $sqlmem.RecommendedMB
}

This is ONLY AN EXAMPLE, and contains an obvious flaw: I’m not injecting an explicit 80% derived value for the -MaxMB parameter.  However, this can be accomplished (assuming dynamic memory is not enabled) as follows…

Install-Module dbatools
$sqlHost = "cm01.contoso.com"
$sqlmem = Test-DbaMaxMemory -SqlServer $sqlHost
$totalMem = $sqlmem.TotalMB
# round to a whole number of MB, since -MaxMB expects an integer
$newMax = [math]::Round($totalMem * 0.8, 0)
if ($sqlmem.SqlMaxMB -ne $newMax) {
  Set-DbaMaxMemory -SqlServer $sqlHost -MaxMB $newMax
}

Running this in my lab, you might have surmised that it executed on a machine which has dynamic memory enabled, which is correct.  The Hyper-V guest VM configuration was questionable, to say the least.

This is one of the reasons I opted for static values in the original script.

Thoughts / Conclusions

Some possible workarounds for this mess would be trying to detect dynamic memory (from within the guest machine) which might be difficult, or insist on a declarative static memory assignment.
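On the detection idea: one possible sketch (an assumption on my part, not something I’ve tested everywhere) is to look for the Hyper-V dynamic memory performance counters that the integration services expose inside the guest.

```powershell
# Sketch: if the guest exposes the Hyper-V Dynamic Memory Integration Service
# counter set, dynamic memory is likely enabled. The counter-set name is an assumption.
$dynamicMemory = $null -ne (Get-Counter -ListSet 'Hyper-V Dynamic Memory Integration Service' -ErrorAction SilentlyContinue)
if ($dynamicMemory) {
  Write-Warning "Dynamic memory detected; a derived 80% value may fluctuate."
}
```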

Another twist to all of this, and one reason I kind of shelved the whole experiment, was a conversation with other engineers regarding the use of other automation/sequencing tools like PowerShell DSC, Ansible, and Terraform.

The final takeaway of this is to try to revisit any projects/code which are still in use, and apply newer approaches when it makes sense.  If that means shorter code, improved security and performance, more capabilities, greater abstraction/generalization (for reuse), or whatever, it’s good to bring newer ideas to bear on older tools.  In this example, it was just replacing a big chunk of raw .NET reflection code with cleaner and more efficient PowerShell module code.  Backing out to 10,000 feet, the entire gist could be replaced with something more efficient.

More Information

DBATools – Twitter / Slack / YouTube / GitHub

Ola Hallengren – web  (Ola doesn’t tweet much, yet)

My Twitter list of super awesometacular increditastical techno-uber genius folks – HERE

Back to my coffee.  I hope you enjoyed reading this!  Please post comments, thoughts, criticisms, stupid jokes, or winning lottery numbers below.  If nothing else, please rate this article using the stars above? – Thank you!

SCCM and Chocolatey


Trying to leverage goodness from various mixtures of Chocolatey with SCCM is definitely not new. Others have been playing around with it for quite some time. However, I wanted to pause from a month of mind-numbing work-related things to jot down some thoughts, realizations, pontifications, gyrations and abbreviations on this.

Much of this idiotic rambling that ensues hereinafter is based on the free version of Chocolatey.  There is also a “Business” version that offers many automation niceties which you might prefer.  There’s a lot more to this Chocolatey thing than I can possibly blabber out in one blog post (even for yappy little old me), such as the Agent Service features, packaging, and so more.  Visit http://chocolatey.org for more.

1 – Is it “Better”?

No.  It’s just different.  But, regardless of whether it “fits” a particular need or environment, it’s often nice to know there’s another option available “just in case”.

2 – Who might this be of use to?

I can’t list every possible scenario, but I would say that if the potential benefits are lined up it kind of points to remote users without the use of a public-facing (or VPN-exposed) distribution point resource.  It also somewhat negates the need for any distribution resource, even cloud based (Azure, AWS), since there’s no need for staging content unless you want to do so.

3 – How does SCCM fit?

At this point (build 1703) it’s best suited for use as a Package object, since there’s no real need for a detection method, or making install/uninstall deployment types.  A Program for installation, and another for uninstallation, are pretty much all that’s needed.

4 – How does an Install or Uninstall work via SCCM?

As an example, to install Git, you would make a Package with no source content, and then create one Program as (for example only) “Install Git” using command “choco install git -y”, and another as “Uninstall Git” using “choco uninstall git -y”.  (Caveat: some packages incur dependencies, which may throw a prompt during an uninstall.  For those you can add -x before the -y, but refer to the Chocolatey documentation for more details.)

5 – How do you push updates to Chocolatey apps via SCCM?

You can use the above construct with a third Program named “Update Git” (for example) with command “choco upgrade git -y”.  Another option (and my preference) is to deploy a scheduled task that runs as the local System account, to run “choco upgrade all -y” at a preferred time or event (startup, login, etc.).  And, as you might have guessed by now (if you haven’t fallen asleep and face-planted into your cold pizza), someone has done this for you.
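The scheduled-task option could be sketched like this, calling schtasks.exe from PowerShell; the task name and schedule are examples only, not a recommendation.

```powershell
# Create a daily 2 AM task (running as SYSTEM) that upgrades all Chocolatey packages.
# Task name and start time are placeholders.
schtasks /Create /TN "Choco-UpgradeAll" /TR "choco upgrade all -y" /SC DAILY /ST 02:00 /RU SYSTEM /F
```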

6 – Can you “bundle” apps with Chocolatey with or without SCCM?

Absolutely.  There’s a bazillion examples on the Internet, but here’s one I cobbled together for a quick lab demo a while back.  This one feeds a list of package names from a text file. You can also hard-code the list, or pull it from anywhere that PowerShell can reach it (and not just PowerShell, but any script that you can run on the intended Windows device).
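The demo I mentioned boils down to something like this (the file path is hypothetical; one package name per line in the text file):

```powershell
# Install every Chocolatey package named in a text file
$packages = Get-Content -Path "C:\Tools\packages.txt"
foreach ($pkg in $packages) {
  choco install $pkg -y
}
```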

7 – What about MDT?

Here’s a twist: you can deploy Chocolatey packages using MDT, or deploy MDT using Chocolatey.  How freaking cool is that?  If you sniff enough glue, you might even construct a Rube Goldberg system that deploys itself and opens a wormhole to another dimension.  By the time you find your way back, America will be a subsidiary of McDonald’s and we’ll have real hoverboards.

8 – What about applying this to Windows Server builds?

You can.  I’d also recommend taking a look at BoxStarter, and Terraform.  I built a few BoxStarter scripts using GitHub Gists for demos a while back.  Here’s one example for building an SCCM primary site server, but it’s in need of dusting off and a tune-up.  You can chop this up and do things all kinds of different (and probably better) ways than this.

The list of automation tools for building and configuring Windows computers is growing by the day.  By the time you read this sentence, there’s probably a few more.  Hold on, there’s another one.

PS – If you get really, really, reeeeeeally bored, and need something to either laugh at, ridicule or mock, you can poke around the rest of my Github mess.  I don’t care as long as you put the seat back down after flushing.

TextPad – the other editor

Like many developers and script kiddies (God, I hated that term for so long, but have gotten over that finally), over the years, I’ve used a great many different code editors in the course of getting work done on time.  I’ve forgotten more than I can recall, but some that come to mind include emacs, vi, pico, Aurora, EDLIN (ha ha! smack!), good old DOS EDIT (the crackbaby born from EDLIN in a dumpster behind a Walmart), Notepad++, Sublime, Eclipse, Komodo, PrimalScript, Wise Script Editor, ColdFusion, and dozens of app-embedded editors from AutoCAD/VLIDE, to Office VBA, to Visual Studio.  And I’m not counting the interesting graphical experiments like MIT App Inventor.

Today I typically use Visual Studio Code, PowerShell ISE, the PowerShell console, Notepad++, and Visual LISP IDE (VLIDE) when the need arises.

But one that I always seem to miss is TextPad.

One reason for that, is that I was (at the time) bouncing frequently between a lot of different languages, and I found the need to standardize on one editor that was:

  • Flexible / Customizable
  • Powerful and Capable
  • Cheap!! (very, very cheap)

But here’s the rub: Helios (the developer) insists it’s not a “code editor” at all, but rather, a “text editor”.  Never mind all the syntax and snippet add-ons available for more “code” languages than “human” languages, they stand by that assertion.

TextPad has offered many standard code editing features since its early days, and is now at version 8.x, with updates emerging several times a year (so far).

Workspaces. Dictionaries. Macros.  Syntax definitions.  External tools.  Regex string functions.  Customizable toolbars and panels.  In short, it’s very flexible and easy to customize.  In my (humblest of humbled) opinion, TextPad has the easiest and “best” (subjective) snippet management features of any editor.  Some others today are equal to it, but it had those features ten years ago.  And I’m a curmudgeon that barks “newer does *not* always mean *better*” hrmpff!! (and then I turn slowly and run away as fast as I can).


PowerShell

Here’s an example scenario for adding a language and support features for PowerShell v5 on Windows 10.  After installing TextPad, don’t launch it just yet.

  • Download the syntax files for languages you want to use from here.
  • Download my PowerShell v5 syntax file from here (because they’re slow to add new submissions from customers)
    • Note: Rather than being C/C++ based, this one works best with a PERL mapping, hence the “PERL=1” parameter at the top.
  • Drop the PowerShell5.syn file under %USERPROFILE%\AppData\Roaming\Helios\TextPad\8\
  • Open TextPad, click the Configure menu item and select “New Document Class”.
    • Class name: “PowerShell” or “PowerShell 5”
    • Class members: *.ps1, *.psm1, *.psd1
    • Enable syntax highlighting: check
      • Select the syntax file from drop-down list
    • Finish
  • Click Configure / Preferences
    • Note: for general editor settings, look under General, File, Editor and View at the top of the list.
    • Expand Document Classes
    • Expand “PowerShell” (or whatever you named it)
    • Select “Colors”
      • Set desired colors for Keywords 1 to 6, etc.
    • Select “Tabulation”
      • Modify tab settings as desired
  • Scroll to the bottom of the Preferences list (left-hand panel) and select “Tools”
    • Click “Add” > “Program”
      • C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
      • Parameters: -ExecutionPolicy ByPass -File "$File"
      • Initial folder: $FileDir
      • Capture output: check
      • Suppress output until completed: uncheck
      • Sound alert when completed: (check or uncheck, you decide)
    • Click OK

Now, open a sample PowerShell script, and click Tools / External Tools / PowerShell (you can also press CTRL+1 to execute PowerShell, or CTRL+n, where n is the ordered number of the external tool in your list).  For my setup, I could press CTRL+2 to run the script.  Note that there is only a full script run (like F5 in other editors), but no partial execution (like F8) for now.


Notes

  • You can change the path to use a OneDrive, Google Drive or other cloud sync folder, and then drop your files there for portability, much like Notepad++ offers.  To configure this, click on Configure / Preferences / Folders.
  • You can modify all support folder paths as well, including macros, snippets, etc.
  • I also have a PowerShell 5 snippet sample file for TextPad available here.

Summary

Is it “better” than other editors?  That’s a subjective question, for the most part.  As with programming languages, each seems to be better at some things than the alternatives.  The real measure is focusing on which features matter most and then comparing on just those areas.

Interview: Jon Szewczak


Name: Jon Szewczak

Job Title: Official: Programmer III
Job Title: Unofficial:  SQL Server DBA / .NET Web Architect / Windows / Intel Server Administrator / Pain in the a$$

Preface

I first met Jon somewhere between 2000-2004, while working on a rather large CAD software development project.  Me being older, and somewhat stuck in my ways, at first I had a tough time being questioned about “why” when it came to API choices and strategic decisions.  But it turned out to be a life-changing experience for me.  I had forgotten the adaptive mindset that has to exist when working with software, relying instead on hard-worn habits, some of which were from lack of being immersed in more dynamic environments.

After a few months of being a one-man-team, I had a tough time getting used to someone asking questions and offering other ideas to the project.  But Jon has a way of presenting ideas that make you listen, rather than just shoving it in your face.

We initially disagreed on quite a few technical aspects, but over time our thinking became more in sync, which I attribute much towards me learning to listen more.  Everyone I’ve ever worked with has rubbed off on me in various ways, and Jon is one of those who left a positive influence on me (I don’t have many positive attributes, so even one is better than none).  Anyhow, let’s get to it…

1. Describe what you do for a living – to someone who has no idea what it means.

Hmmm. That can be hard. My job title is Programmer III – like Superman III only not as cool and no Richard Pryor. That title means that I make computers do things by typing in commands that it can understand after a bunch of translating.

I have designed and implemented a vast majority of the programming code that runs the complex website at http://www.mdvnf.com. I also develop, maintain, and support several custom desktop applications that the associates at my company use on a daily basis.

But over the years, I have taken on other roles within the IT department. When I was brought in, I was immediately the “subject matter expert” on SQL Server, by virtue of having worked with it in my previous job. I was by no means an expert. However, lots of querying and reading and researching allowed me to actually morph into a much more competent data professional.

I manage the non-mainframe data warehouse, and I make sure that any applications or users that are touching it, do so in a manner that is nearly transparent to all parties involved. It’s a really tough job now because of aging hardware and increasing demands.

A few years ago, our parent company decided to implement a “Shared Services” IT model, which means that all of the Network and Server support teams across many different locations were merged into one team – including the few that worked at my office. What that effectively did was make all of the team members work everywhere but my office and server room. The servers were suffering from neglect. So, since it was critical to my SQL Server(s), I started taking over the admin duties.

2. How did you get into this type of work?

I originally didn’t set out to be a computer programmer or an IT person. I was originally going to be a drafter. I went to school for it and earned an Associate’s degree in Computer-Aided Drafting (CAD) and Design. While I was there I took one class in CAD programming with AutoCAD. It was interesting, but I didn’t really see the huge potential of it at the time.

When I graduated and got my first job as a drafter in the Shipbuilding industry, I went to work for a place that used AutoCAD software, but it was highly-customized with the same programming techniques as I saw in college. That’s where I met an individual who is still a veritable genius in CAD programming.  This guy took me under his wing and showed me how AutoCAD could be made to do things that I never even dreamed of.  I started working right away on programming AutoCAD to do all kinds of things for me. Anything that I did more than once, I tried to figure out how to make it a one step command.

When some of my co-workers saw that, they wanted the shortcuts and macros and programs that I had developed too. So I started to share. Many more years later, I met Dave Stein (of this illustrious Blog [edit: his words, not mine, I promise, and no, I didn’t pay him for that]) and we (along with a few others) started really working on ShipWorks. ShipWorks was an automation tool-suite for AutoCAD that put the phrase “tool-suite” to shame. It was more of an application unto itself than a tool-suite.

Anyway, my CAD programming went on for many more years, but it was never my main job. It always was filler work. That is until I finally got an opportunity to program full-time – in the Modeling and Simulation arena.

3. What area or aspect of technology are you most excited about?

That’s kind of tough. There is so much cool programming out there that I look at and say “how did they do that? I wanna do that!” I am fascinated by wireless tech, and the way it has interconnected so many aspects of our life. Game programming is also another arena that amazes me. Getting 3D graphical characters to do things on the television screen with so much realism is just incredible.

4. What gives you the most satisfaction today?

I like to see things working the way they were meant to. Whether it’s an API, or a web page, or a desktop application, it doesn’t matter. I like to see it work and work efficiently. There is so much “just get it done” crap code in my company that it is really hard to describe. The people who originally wrote the legacy applications really had no idea what they were doing to make an efficient application – it bugs me every time I have to fix a bug or something. I have to fight the desire to rip it all apart and do it right.

5. Name the 3 most inspiring people in your life or career?

The first would be Brad Hamilton. He is the individual who took me under his wing as a “wet behind the ears” kid and showed me how and encouraged me to really dig into CAD programming to make things better, quicker, more robust, and more efficient.

The next would be Dave Stein – and no that’s not just a shameless “suck up” plug. Dave welcomed me as a partner in the ShipWorks venture and then handed the management of it over to me when he needed to move on. This allowed me to grow as an application manager and showed me that there is much, much, much more to programming and application development than just typing some lines of code.

The last, and most important is my wife. Without her I would not be where I am, I would not be as successful as I am, I would not be anything.

[edit: I’m hoping to get Brad involved with this interview effort as well.  Like Jon describes, Brad is someone who made a huge impact on me for many years.  Words like genius, visionary, and Grateful Dead fan, don’t begin to describe him.  I’m not so sure about that Dave guy.  But 2 of 3 isn’t bad.]

6. If I hadn’t gone into this field, I’d probably be ____?

Still working as a CAD designer in the shipbuilding industry. I am not one who changes things often or lightly, so I probably would have stuck to it. I am so glad that I did not.

7. Favorite place to travel?

I don’t really have one. Some place that is relaxing. I don’t do much of that, and I always think it would be nice to find a place where I can do nothing – guilt free.

8. What 3 books, movies or other works have influenced you most in life?

I am not a person who reads a lot of self-help or motivational things. I watch movies for the escapism, so there’s hardly anything influential there. I love the well-written poetry of Robert Frost, Edgar Allan Poe and others.

But, really, the only two influential things I can think of here are controversial, depending on your personal beliefs and stances.  The first is The Bible. And I’m more specifically talking about the New Testament and the teachings of Jesus Christ. I am Catholic – but I’m also a progressive Catholic. I don’t always agree with everything the Catholic Church teaches or espouses, but overall I am in line with it. At any rate, the most important things I have taken from Christ’s teachings are acceptance and a need to care for those who cannot do it for themselves.

The other book that I always come back to is Six Hours One Friday by Max Lucado. In it he makes this point: Life is not Futile, Failure is not Fatal, and Death is not Final. It’s a wonderful way to try to live.

9. There’s never enough ______.

Time.

10. There’s way too much _____.

Stress.

But, That’s Not All…

I sometimes do work on the side for people, setting up websites and what not. One of the sites I helped out with is for THE UNBATTLE PROJECT (http://theunbattleproject.org). It is a non-profit organization helping to provide much needed counselling and therapy services to Veterans and Active Duty Military members.

It’s a very worthy cause, and (full disclosure) I am friends with the CEO of the organization. It’s in its beginning phases and could use all of the publicity and help that can be provided. So please spread the word.

Dave: Thank you!