Personal, Projects, Scripting, Technology

FudgePack

Pardon the headline and semi-questionable graphic, but it’s all I had to work with on short notice.

As a result of way too much caffeine, tempered with a sudden burst of alcohol and an intense, yet clueless, conversation with a colleague, the following harebrained idea sprang up. It required immediate action, because stupid ideas have to be enacted quickly in order to produce rapid failures and immediate lessons learned…

Idea: What if you could manage remote, non-domain-joined, Windows 10 computers from a web page control mechanism, for “free”, where the local user has NO local admin rights, to do things like run scripts, deploy or remove applications, etc.?

What if? Aye?

So, Chocolatey came to mind. Partly because I was eating something with chocolate in it, but mostly because I love the Chocolatey PowerShell packaging paradigm potential, and that’s a heavy string of “P”‘s in one sentence.  Anyhow, it felt like one of those sudden Raspberry Pi project urges that I had to get out of my system, so I could move on to more important things, like figuring out what to eat.

Caveats:

  1. The machine would need to be configured at least once by someone with local admin rights.
  2. The machine would need to be connected to the Internet in order to receive updated instructions.
  3. The admin needs a little knowledge of technical things, like proper coffee consumption, shit-talking and locating clean restrooms

Outline of Stupid Idea

  1. Drop the FudgePack script into a folder on the device (e.g. C:\ProgramData\FudgePack, file is Invoke-FudgePack.ps1)
  2. Create a Scheduled Task to run under the local NT Authority\SYSTEM account
    1. Task should only run if a network connection is active
    2. Task should run only as often as you need for timely results
    3. User (device owner) should be able to invoke Task interactively if needed.
  3. Host the control data somewhere on the Internet where the script can access it
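For step 2, a minimal sketch of registering such a task with PowerShell might look like the following. The task name, one-hour interval, and script path are my assumptions; adjust to suit your environment.

```powershell
# Sketch: register a scheduled task that runs Invoke-FudgePack.ps1 as SYSTEM.
# Task name, repetition interval, and paths are assumptions -- adjust to taste.

$scriptPath = 'C:\ProgramData\FudgePack\Invoke-FudgePack.ps1'

$action = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File `"$scriptPath`""

# Repeat every hour, starting now
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Hours 1)

# Run as SYSTEM, and only when a network connection is available (caveat #2)
$principal = New-ScheduledTaskPrincipal -UserId 'NT AUTHORITY\SYSTEM' -RunLevel Highest
$settings  = New-ScheduledTaskSettingsSet -RunOnlyIfNetworkAvailable -AllowStartIfOnBatteries

Register-ScheduledTask -TaskName 'FudgePack' -Action $action -Trigger $trigger `
    -Principal $principal -Settings $settings
```

A standard user can still kick the task off interactively (step 2.3) from Task Scheduler, or with `Start-ScheduledTask -TaskName 'FudgePack'` from an elevated session.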

Procedure for Stupid Idea

  1. Here’s the FudgePack app control XML file.  Make a copy and edit to suit your needs.
  2. Here’s the FudgePack PowerShell script.  Douse it in gasoline and set on fire if you want.
  3. Here’s an example Scheduled Task job file to edit, import, point at, and laugh.

Setting up and Testing the Stupid Idea

  1. Copy the appcontrol.xml file somewhere accessible to the device you wish to test it on (local drive, UNC share on the network, web location like GitHub, etc.)
  2. Edit the appcontrol.xml file to suit your needs (devicename, list of Chocolatey packages, runtime date/time values, etc.)
  3. Invoke the script under the SYSTEM account context (you can use PsExec.exe or a Scheduled Task to do this)
  4. Once you get it working as desired, create a scheduled task to run as often as you like
  5. Send me a box filled with cash – ok, just kidding.  but seriously, if you want to, that’s ok too.
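For step 3, one way to test under the SYSTEM account context is Sysinternals PsExec; the script path below is the one from the outline above.

```powershell
# From an elevated command prompt, start an interactive SYSTEM-context PowerShell session:
psexec.exe -i -s powershell.exe

# Inside that new session, confirm the context and run the script manually:
whoami    # should return: nt authority\system
& 'C:\ProgramData\FudgePack\Invoke-FudgePack.ps1'
```

Once the script behaves as expected under SYSTEM, wire it up to the scheduled task and let it run unattended.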

More Stupid Caveats

  1. It’s quite possible in this day and age, that someone else has done this, and likely done a better job of it than I have.  That’s okay too.  I searched around and didn’t find anything like this, but my search abilities could quite possibly suck.  So, if someone else has already posted an idea identical to this, I will gladly set this aside to try theirs.  After all, it’s about solving a problem, not wasting a sunny, beautiful Saturday behind a keyboard.

Cheers!

Cloud, Projects, Scripting, System Center, Technology

Invoke CM_Build over the Web

Updated 10/15/2017 – Added -Override example

First off: WTF is CM_BUILD?

CM_BUILD is a PowerShell script that turns a “vanilla” Windows Server machine into one with Configuration Manager Current Branch installed.  This includes the ADK, MDT, Server Roles and Features (WSUS, BITS, etc.), SQL Server, ConfigMgr itself, and a few goodies like Right-Click Tools and the ConfigMgr Toolkit.  The GitHub repo has a cute readme markdown page filled with overcaffeinated gibberish on how to use it.  CM_SiteConfig is the “part 2” to CM_BUILD, which configures ConfigMgr into a semi-functional site.

Short answer: https://github.com/Skatterbrainz/CM_Build

Okay, why CM_BUILD?

I don’t know.  Why do we do anything?  For the thrills? I could have taken up robbing banks, raising a crocodile farm, or breaking world records for swilling down cans of Four Loko while working on electrical equipment.  But I chose the boring life.  And while I’m bored, I hate clicking buttons repeatedly, so …

I got inspired by Johan and Mikael’s ConfigMgr Hydration Kits and Deployment Fundamentals Vol. 6 book examples, and Niall’s noob scripts (I know it’s not actually called that, but it sounds cool to say “Niall’s noob scripts“), and after 45 cups of terrible coffee I said “I can shove all that into an XML file and call my JSON friends up and laugh hysterically at them, saying things like ‘You and your snotty little JSON drivel!  Always mocking poor, starving little XML.  Well, I’ll have you know I can still write XML, and probably even a little COBOL! So what do you think of that?!  Hello?  Hello?  Did you just hang up on me?!! WTF!’”

Anyhow…. Hold on, I need to get my dog outside before she has an accident….

okay, I’m back.

Why Invoke it over the Web?

There are several potential reasons for wanting to do this:

  • I was really bored and it’s been raining all freakin day, and…
  • It’s 3am and I can’t sleep, and…
  • I saw this, and …
  • I wanted to pull this off within Azure, using a VM extension, without having to import any actual files, and it would be cool to tie all this together with a runbook so I can send a text message “new lab configmgr p01“, to fire off a lab build in Azure and have it text me back “your stupid lab is ready, now leave me alone!” then I can forget it’s still running and it runs all my MSDN credits down to $0 until the next monthly cycle, and…
  • I scrolled through Dan Bilzerian’s twitter feed just long enough to hate my boring life, and needed a distraction, and…
  • It seemed like something cool to try

Example

Time to put on a poker face and act serious now.  The example below calls the cm_build.ps1 script from the GitHub master branch URL, converts it into a -ScriptBlock object, and passes in the -XmlFile parameter value using the GitHub Gist raw URL (you can make your own by copying the cm_build.xml into your own “secret” Gist, so you don’t openly share sensitive information with the whole world).

$ps1file = 'https://raw.githubusercontent.com/Skatterbrainz/CM_Build/master/cm_build.ps1'
$xmlfile = '<your-gist-raw-url>'

$script = Invoke-WebRequest $ps1file
$scriptBlock = [ScriptBlock]::Create($script.Content)
Invoke-Command -ScriptBlock $scriptBlock -ArgumentList @($xmlfile, $True, $True, $True)

But you can also invoke the interactive gridview menu using the -Override parameter, by simply appending one more $True to the -ArgumentList array.

Invoke-Command -ScriptBlock $scriptBlock -ArgumentList @($xmlfile, $True, $True, $True, $True)

Then you get this budget-sized, corner-cutting, hack of a menu to choose from…

You may see a red warning about “Split-Path : Cannot bind argument to parameter ‘Path’ because it is null.”  I’ll fix that soon.  It only impacts the log output, but nobody reads log files anyway, right?

Anyhow, it’s 3:33 am, and I’m still typing, which is probably bad for my health, but if two people read this and it actually provides useful information to one of you, mission accomplished.  Actually, I know for a fact this is bad for my health.  Regardless, I ran the above snippet (with a real URL in the $xmlfile assignment) in my Hyper-V duct-tape and chewing gum lab at home, and it worked like a charm.  Now I can log into the server, open the ConfigMgr console and proceed with CM_SiteConfig, or apply real world tactics and break the ConfigMgr site entirely and start over.

zzzz

Personal

Short (But True) Stories – Beer Cup

I was cleaning up old hard drives and found a “diary” of sorts and it reminded me of a bunch of stories I’ve shared with people around me, but never online. Anyhow, this hasn’t been a good week for me, and I need something to get my mind off of stupid crap and bad news, so here’s a bit of therapy. I hope you enjoy!

In 1984, I had been playing drums in a local rock band in Hampton, Virginia.  I had a day job, but the music gig was fun and I was earning enough for gas, food, and drum sticks, while my day job paid for the other important things. Our band was asked to play at a squadron picnic at the local Air Force base. Two of the band members, the singer and bassist, were active-duty at the base, so they brought a bigger crowd than we usually had.  There was plenty of food, beer and families with kids, the weather was fantastic and it was a lot of fun.

After our second of three sets, we took a break and were standing around talking and kidding around. Then one friend of ours, playfully bumped the bass player, and knowing it was in fun, set his Dixie cup of beer down and went after the other guy to wrestle. After a few minutes of tussling around, they got back up, laughing, and our bassist went back to his beer and we continued kidding around.

A few minutes later, he started turning blue in the face and couldn’t talk. We thought he was pranking us, as he often did. Then he slowly went down on the ground, rolled over on his back, and started making gurgling sounds. One of the guys bumped his arm with his foot, saying something like “come on, man, that’s not funny.” But he looked even worse.

Suddenly one of the guys rolled him over and patted him on his back really hard, thinking he may have choked on something. After a few hard slaps on his back, he coughed up a large bumble bee. It had landed in his beer cup while he was play-wrestling, and he swallowed it, where it apparently stung the back of his throat.

Being that we were in the middle of a military base, we were lucky to get EMT help fast, and after a few hours in the emergency room, he was allowed to go home.

Good times.

Interviews, Surveys, Technology

Interviews – Will IT Be More or Less Fun in 10 Years?

Question: “Do you expect that IT work in 10 years from now, will be more fun or less fun than it is today, and why?”

Mark Aldridge

I think that it will be more fun, as there will be so much more technology to learn in 10 years’ time and so many amazing features in ConfigMgr 2706!

DareDevelOPs

I think it will be more fun as open source projects become more the norm. Also, as Infrastructure as Code becomes all things as code (ATaC?), the challenge level is going to go up. We will be more focused on solving the problem than running the infrastructure. Also, the kinds of industries will change from country to country as the Human Development Index shifts and nation-state rankings change. The level of virtualization in all areas of life will continue down Moore’s Law’s critical path, even as the compute hardware reaches Moore’s limit. The things we use in IT will get shinier. …or Skynet.

Stephen Owen

More fun! IT has only gotten more interesting and varied, with the introduction of mobile devices and tons of new form factors. I think in ten years we all will finally know what we’re doing with Windows updates, and probably have a better handle on security practices.

I think the Wild West days of IT are behind us, and I for one am happy about it. My phone definitely rings less on weekends now than it did five years ago.

Damien Van Robaeys

When I see all the available technologies, I can imagine that, in 10 years, the working environment will also change.
This will be the time of mixed reality, even if Minority Report won’t be for now.
I hope we will work with holographic computers, like the HoloLens.

Maybe computers, like laptops or desktops, if they still exist, will use a holographic screen and a holographic keyboard.
Imagine your computer in a small box that displays a screen above it, and a keyboard on your desk.

Meetings would be done with a holographic system, like in Star Wars, with a system that allows you to say “Hey, call Mr. X,” and Mr. X will appear in front of you as a hologram, like Obi-Wan Kenobi.

Rob Spitzer

Fun is such a relative thing. I’ve met DBAs that are super passionate about their jobs yet I can’t imagine how that could be any fun. Conversely I’ve been asked on multiple occasions how I deal with Exchange every day. It’s just something I found that I enjoy doing.

There’s no doubt IT is changing. We’ve seen this happen before. We rarely build hardware anymore and now we’re seeing things like software installation and configuration go away as we move more to the cloud. I’ve seen Exchange change a lot over the last 20 years but, at its heart, it’s still the same thing I’ve enjoyed all along, even in the cloud.

You just need to make sure you find a role that you’re passionate about. If you have a hard time putting it down at the end of the day, odds are you’ve found it.

Ami Casto

IT, fun? What? IT has been and will always be what you make of it. 10 years from now, you’ll still be fixing some idiot policy you didn’t create, but you’ll have to clean up the mess now that the poo has hit the fan. You’ll just have to keep looking for the things that make you passionate about what you do.

Arnie Tomasovsky

I expect it to be less fun, as, thanks to AI, everything will be a lot more automated. BUT human beings will remain as the end users, therefore the fun won’t disappear 🙂

Johan Arwidmark

I expect it to be more fun, and more complex. Why? Hopefully less politics, and more ongoing maintenance/upgrades, and more automation.

Nicke Kallen

There are two directions that this can go in… either we aim for a specialized knowledge set where employees will continue tinkering as they do today. The number will not be as many as we have today, but larger corporations will still depend on this knowledge and for the people that have actively developed this skillset – it’s a lot more fun.

The other option is that we are somewhere down the journey to being completely commoditized. Perhaps a few service providers have staff, but apart from that, we define business requirements and ensure the logistics part of delivering IT works. It’s most likely not the cup of tea for today’s IT workers…

Mike Terrill

I think IT will be even more fun 10 years from now. The reason for this is because our field is growing at a rapid pace and will continue to do so over the next 10 years. Just imagine some of the gadgets we will have in the future and how much AI will have progressed.

Rod Trent

A: <Beavis and Butthead mode on…> Hehe…you said work in IT is fun </Beavis and Butthead mode off>

Chris DeCarlo

So I’m sure everyone you asked this question will say “More fun…”, so I’ll play devil’s advocate here and say less fun. AI is already making decent strides, and with the great progress in robotics and VR, I envision AI being fully integrated into robotics in the next 10 years. These AI-enhanced robots will take over our call centers and end-user support roles with 24×7 availability and no need for breaks or health care. From there, AI will be integrated into the Windows OS and will automatically Google (or, I mean, Bing) and fix any errors that appear on your server/SCCM software, leaving us “organ sacks” or “blood bags” with basic tasks such as lubricating the robots’ joints and polishing the robots’ shiny metal ….

Skatterbrainz

More fun for some.  Less fun for others.  More work for software folks, less work for hardware folks.  In all, I think there will be some serious reduction in IT staffing for many data center roles, as those things morph into “Software-Defined <x>” while evaporating into the cloud.  Then again, it’s not inconceivable that some unforeseen events could trigger a massive reversion from cloud back to on-prem.  Government intrusion, for one, might have that sort of impact.

Personal

On This Day – 1991

On July 21, 1991, I received a small box in the mail from the Kill Devil Hills, NC police department.  I wasn’t expecting any packages, let alone something from a police department. I opened it and found an old, weather-beaten leather wallet, with credit cards, a driver’s license, and one dollar.  There was a small amount of sand in the folds. There was also a hand-written note that said “Your wallet was found on the roadside and turned in by a good Samaritan. We used $4 from the inside to cover the postage; we hope you don’t mind. – KDHPD”

The credit cards had been cancelled four years earlier, when I had also gotten a new driver’s license.  That was early summer of 1987.

I lost the letter during Hurricane Isabel in 2003, and the wallet long before that, but I ran across a scrap of paper that mentioned the event and date while cleaning out some boxes.

I was a working musician in 1987; playing percussion mostly.  On that day, our band was in Nags Head, Manteo and Kill Devil Hills (North Carolina).  Staying in a cheap hotel, I ran out for lunch at the KFC a few miles up the road, with our guitarist, Jim.  We stopped on the way back to the hotel, at an overlook, to eat and watch the shorebreak. I stepped out and (stupidly) set my wallet and flip-flops on the roof of my 1985 Toyota pickup truck. I forgot the wallet when we got back in and drove back to the hotel.  It had blown off the roof along one of the (back then) desolate cross roads (between the beach road and the bypass road).  There were no houses or apartments back then along that stretch of land.  Just small dunes and seagrass, and each of those cross roads looked exactly like the next.  When we got back to the hotel, I realized the mistake, and we drove back out and spent three hours driving and walking up and down each cross road until we gave up.

Today, that section of North Carolina doesn’t have a single cross road without a mass of townhomes or condominiums.  Yet they still look exactly alike.

What a journey it’s been from then to now.  After our first baby came along, I sold my gear and haven’t played since (aside from trips to Sam Ash and Guitar Center, every now and then). That led to drafting, which led to CAD, which led to programming, which led to my first IT job, which led to college, which led to more IT jobs, and on to consulting.  Cue the ridiculous sentimental soundtrack…. actually, someone in the house fed something bad to the dog and she has gas and won’t leave my office.  I have to get out.

Business, Technology

The 5 Immutable Laws of IT Life

1 – The person you need most will be unavailable when you need them.
2 – The problem will stop as soon as you try to show it to someone else.
3 – The simplest task will end up taking the most time.
4 – The feature you need most will be the least documented.
5 – That which saves you time, will cost more money (and vice versa).

Personal, Technology

Flashback Time Again

Drink up – and follow me into the wormhole of utter pointless reminiscence…

So, in the 1980’s I was working as a “senior engineering technician”, which was US Navy speak for “senior draftsman” or “senior drafter”.  We worked with various UNIX based workstations and mainframe systems to develop 3D models of US naval warship things.  Some of the names back then were CADAM, Pro/Engineer, Intergraph, and CADDS 4 or CADDS 5.

Interesting to note that, while AutoCAD existed in the late 1980’s, the US Navy strictly forbade the use of any “PC-based CAD tools”, as they were deemed unreliable, inaccurate “toys”, as one admiral put it.  Over time, they gradually allowed the use of AutoCAD and, later, MicroStation, DesignCAD, Drafix, and a few others, but only for “textual document data” such as title sheets (containing only tables, notes, and mostly text), while the actual design data was still restricted to UNIX-based products.  Sometime in the early 1990’s, the Navy finally gave in and allowed PC-based CAD products for all design work.

While my job title sounded like a typical office job, it wasn’t that typical. We often split our time 50/50 with going aboard ships (all over the place), and climbing into the dirtiest, darkest, hottest, coldest and sometimes most dangerous places, on almost every kind of surface vessel the Navy had at the time.  From small frigates and supply ships, even hydrofoil patrol boats, up to aircraft carriers and commercial cargo ships.

In most cases, the scheduling worked out perfectly to send us to somewhere around 100F at 95% humidity to do this, which works great with a morning-after hangover (I was in my 20’s then).  By the time I left that industry, I had set foot into almost every space on a CVN68 class aircraft carrier, and about half of the spaces on LHA and LHD class ships.  I’ve seen a lot of interesting stuff, and got the bumps and scars to remember it by.

Anyhow, back in the office, one of the popular CAD systems of the time was CADDS 4 or CADDS-4X, sold by the Computervision corporation, which was somewhat affiliated with DEC/Digital.

The workstations we used were priced around $35,000 each at the time.  We had around 20 of them in our office.  The mainframe components, the annual subscription, and the annual support costs, were nearly 4 times the cost of the workstations combined.  Hence the “4X”, ha ha ha!  Good thing I didn’t have the checkbook then.

Turf Battles

One of the cool features was the digitizer tablet and pen setup, which looked like the (linked picture), except we had multi-color monitors.  The tablet consisted of a frame, a paper menu matte, and a clear plastic cover/surface.  The center of the tablet area was for drawing and manipulation.  The left, top, and right outer regions were filled with menu “buttons”, which acted like on-screen buttons (no touch screens back then).

The buttons were programmable.  😀

We ran three (3) daily shifts to cover the projects, which were on a tight time schedule.  Kevin, Timmy, and I split the shifts on workstation P1, for “Piping Systems Division, Station 1”.  Every month or so, we’d swap shifts to keep from going insane.  During one month, I worked first shift (8am – 4pm), Kevin had second shift (4pm – midnight), and Timmy had third shift (midnight to 8am).

We met to discuss logistics, and so on, and agreed that we would claim a particular section of the tablet menu to use for our own custom button macro/command assignments.  First world problems, of course.

Timmy didn’t like being confined.  He considered himself a free-range drafter.

Each night, Timmy would change the button assignments on the sections Kevin and I had agreed to claim.  This caused some angst, and I’ll explain why…

Sidebar –

Back then, the combination of hardware limitations (processing power, memory caching, storage I/O performance, and network I/O) resulted in slow work, particularly when it came to opening and saving model data.  A typical “main machinery room” (aka engine room) space model would take around 35-40 minutes to open.  The regular morning process was as follows:

  • 7:30 AM – Arrive at office
  • 7:35 AM – Log into workstation terminal
  • 7:37 AM – Open model file and initiate a graphics “regen all”
  • 7:39 AM – Search for coffee and sugary stuff
  • 7:45 AM – Discuss latest TV shows, movies, sports game, news story
  • 7:59 AM – Run for nearest restroom
  • 8:19 AM – Emerge from restroom
  • 8:20 AM – Model is generated and ready to begin work
  • 8:21 AM – Make first edit
  • 8:21:05 AM – Save!

Now, here’s the rub:  In 1987-88, there was no concept of an “undo” or “redo” in most of the CAD/CAM systems of the day.  So, we made sure to “save often and be careful with every edit”.

Second Rub:  Timmy liked to modify our programmed menu keys, which caused us a lot of headaches.  For example, clicking a button labeled “Draw Line” might invoke “Translate Layer X” (select all objects on layer “X” and move them to another layer), a drastic operation that required exiting the part (no save) and re-opening it.

Third Rub: Closing a model took about five (5) minutes.   So anytime a mistake occurred that required dropping out and coming back in, it meant roughly 45-50 minutes of wasted time.  Well, not totally wasted.  It gave us more time to discuss the latest GNR album, Ozzy, whatever movies were out, and so on, and more coffee, and sugary stuff.

So, after repeated attempts to educate Timmy without success, Kevin and I agreed to modify all of Timmy’s command buttons to “exit part no file”.  So Timmy would open his model file and after 40-45 minutes, click “draw circle” and watch his screen go blank with a prompt showing “part file exited successfully” or something like that.

After one (1) day of that, Timmy didn’t mess with our menu buttons again.

By the way, in one of those 3D models, buried way inside one of the machinery spaces, there just might be a pressure gauge dial, among a cluster of other gauge dials, on an obscure corner bulkhead (ship-speak for “wall”), which has Mickey Mouse’s face and hands on it.  An Easter egg from 1988, lying dormant in an electronic vault in a dark warehouse at an undisclosed location.

Business, Interviews, Technology

What I’ve Learned from Doing IT Interviews

WARNING: My humor tank is running low today.  This one is a semi-quasi-serious post with sub-humor ramifications and subtle uses of pontificatory inflection.  cough cough…

Like many (most) of you, for years, I’ve been the one sweating through an interview.  I’ve had bad interview experiences, and good ones; maybe even a great one, once or twice.

On the bad list was one with a well-known hardware vendor, where I was introduced to three “tech reviewers” on the call who regularly speak at pretty much EVERY IT conference on Earth, and have written enough books for me to climb a stack and change a light bulb.  I was in over my head, but thankfully, they appreciated my humility and sense of humor (had an interesting follow-on conversation at the end as well, but I’ll leave that for another time).

On the good list was the most recent interview I had (my current job), where the interviewer took the time to share some fantastic technical advice, which helped me on the project I was working on with my previous employer.  More than an interview, it was like a mini-training session.  Needless to say, he liked my mental problem-solving process enough to offer me this job.  Very, very much appreciated.

But this post is really about the flip-side of the interview process; what I’ve learned from interviewing others for various types of positions.  At a former place I was the administrative “lead” of a team of six (6) incredibly skilled people.  Part of my role was to interview new hires for a very uncommon set of skills to fit into that project.

At my current employer, I’ve been interviewing like mad to help a customer fill staffing needs for another set of uncommon skills. Not that the individual skills are necessarily uncommon, but the mix of skills in a single person seems to be uncommon.  I have to say, it’s been both enjoyable, and educational for me.

I hope that this experience helps me with future interviews when I go looking for a new job (or a promotion).

I’ve tried to apply the “good” experiences from my interviewee past as much as possible.  For example, not just grilling candidates to make them sweat, but help them along the way, in a give-and-take discussion.  Not a lecture.  And not a cross-examination.  It’s been eye-opening for me, to say the least.  So here’s what I’ve learned:

1 – Keep it Simple

When asked to respond with a “what would you do if…” scenario, start with the most basic step.  A classic example question is “You have a web server, that relies on a separate SQL host, to support a web application.  After working fine for a while, it now shows an error that it can no longer connect to the SQL host.  What would your first step be?”

Bad answers: “I’d check the SQL logs”.  “I’d confirm the SQL security permissions”, “I’d verify that the SQL services were running on the SQL host”, “I’d Telnet to the SQL host”

Better answer: “I’d try to ping the SQL host from the web server”
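On a modern Windows box, that first step is a one-liner or two; a quick sketch (the host name “SQL01” and the default port 1433 are placeholders for this example):

```powershell
# Basic first checks, run from the web server toward the SQL host ("SQL01" is a placeholder):

# Can we reach the host at all?
Test-Connection -ComputerName 'SQL01' -Count 2

# If ICMP is blocked, test the actual SQL listener port instead:
Test-NetConnection -ComputerName 'SQL01' -Port 1433
```

Start at the bottom of the stack; if the network path is fine, then climb up to services, permissions, and logs.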

2 – Know the Basic Basics of your Platform

If the role involves system administration (aka “sysadmin”) duties, you should be familiar with at least the names of features, components, and commands.  You don’t necessarily have to know every syntactical nuance of them, just what they are, and what they’re used for.  For example, “what command would you use to register a DLL?” or “What command would you use to change the startup type of a service?”

If the interviewer doesn’t focus on scripting aspects, ask whether they want to know the command or the PowerShell cmdlet.  Then take it from there.  If they ask about the command, just give them the command.  You don’t need to describe the various ramifications of using the command, or how it would be better/easier/cooler to do it with PowerShell.  If they ask about PowerShell methods, answer with the appropriate cmdlet, or just describe the script code at a 100,000-foot level.  That said, if the interviewer is focused on your PowerShell acumen, dive deeper, but ask if that’s what they want to hear first.
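As a sketch of those two example questions, with the classic command and the PowerShell flavor side by side (the DLL path and service name below are placeholders):

```powershell
# Register a DLL -- classic command (path is a placeholder):
regsvr32.exe C:\Windows\System32\somefile.dll

# Change a service's startup type -- classic command:
sc.exe config Spooler start= disabled    # note: the space after "start=" is required

# ...and the PowerShell equivalent:
Set-Service -Name Spooler -StartupType Disabled
```

Knowing which tool does what, at that level, is usually all the interviewer is fishing for.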

3 – Don’t be Afraid to say “I Don’t Know”

If the interview question leaves you stumped, don’t hem and haw, and don’t make something up.  Just say “I don’t know”, but, and I mean BUT… follow that with some next-step direction.  For example, “I don’t know, but I would research that by going to ___ and searching for ____.”

4 – Ask Questions

A lot of the time, the interviewer is also looking for indications of how the candidate interacts with a situation, such as an interview.  They want to know if you’re inclined to question and discover each situation, rather than just react to it.  Sometimes, the interviewer will ask you “Do you have any questions?”, and sometimes they won’t.  Regardless, it’s often good to ask at least one or two questions, even if it’s just “what’s the next step?”

5 – Get a Critique if Possible

At the end of the interview, unless you feel certain you nailed it, like this, I always recommend asking the interviewer for some feedback on how you did.  Ask if there were any areas where you could have responded better.  Don’t worry about getting granular details; just general responses can be very helpful.  Whether it’s technical, personal, or otherwise, anything is pure GOLD when it comes to this.

It’s a rare chance to get some tips that will help you on future interviews.  This is particularly true when you feel pretty sure that the employer isn’t going to make you an offer.  That doesn’t mean you’re a failure; it just means you didn’t show what they’re looking for in the position they’re trying to fill.

Society, Technology

NTP and DateTime and Space Colonies

I just finished up migrating a customer from Windows Server 2008 R2 to 2016 Active Directory.  Thankfully, it was only a single AD forest and domain; nothing too complex in that regard.  I also migrated their DFS namespace from 2000 to 2008 mode.  Afterwards, we gently wrapped their 2008 R2 domain controllers (virtual machines) in duct tape, smothered them in imaginary rags soaked in ether, and carefully loaded them onto little imaginary rafts to paddle out into the river, where they’d be sunk with an imaginary RPG round.

During the process, we ran through the usual checklists of things; DNS, replication, and of course time.  Time, as in NTP.  w32tm, and all that.  Aside from having spent a lot of time on the micro-implications of time back in the early 2000’s, getting immersed in the concepts of NTP stratum, drift, huff-n-puff, and polling intervals, I still think about the macro-implications today.  This is particularly apropos with the increased talk about SpaceX, Blue Origin, and so on, and all the talk about Mars expeditions.
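For the curious, the usual w32tm sanity checks during that sort of migration look something like this (run from an elevated prompt on a domain controller):

```powershell
# Typical Windows Time service checks after a domain migration:

w32tm /query /status          # current source, stratum, and last successful sync
w32tm /query /peers           # configured NTP peers
w32tm /query /configuration   # effective time service settings
w32tm /monitor                # compare time offsets across the domain's DCs
w32tm /resync                 # force an immediate resync
```

If the PDC emulator isn’t chasing a reliable external source, everything downstream drifts with it.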

That got me to thinking about time on Earth.  Julian dates, 24-hour time, leap years, and so forth.  These Earth-bound notions of seconds, minutes, hours, days, weeks, months, seasons/quarters, years, decades, centuries, and millennia.  And, in turn, that got me into thinking about meetings.  After all, time and meetings go together like politicians and eggs, or ham and drugs, or one of those.  And, for the record, meetings are most closely associated with the time construct we refer to as an “eon“.

Imagine this:

50 or 100 years from now, we may have a colony on another moon or planet.  And that moon or planet is very likely NOT going to share the same cyclical frequency of rotations and revolutions as Earth.  In other words, the relative time from one day to the next, or one orbit around the Sun (or host planet), won’t be the same as that of Earth.  Their local “day” may be only a few hours of that on Earth, or may be much, much longer.

That said, will the concepts of an hour, a week, or a month, be relevant?

What if this imaginary colony rests on a planet that has a pattern of daylight that equates to 48 hours on Earth?  Or it orbits the Sun (or again, host planet) once every 3.5 months of Earth time?  What if one “year” on that remote place equates to less or more than a year on Earth?

Some would argue that their relative (local) perception wouldn’t be significant.  But that’s assuming they wouldn’t have seasons either.  Seasons are what give weight and meaning to relative dates and times on Earth.  Cold and Hot.  Crops grow or wither.  Animals graze or migrate.  You get the idea.  So, seasons have a HUGE impact on the significance of “annual” cycles, because they dictate much of the things on which human life depends.

Just because Earth has seasons, and the only remote places we’ve seen (Moon, Mars, Jupiter, etc.) don’t appear to have any reference of a “season”, doesn’t mean that in 50-100 years we wouldn’t have landed on (and colonized) another place that does have such a phenomenon.

Will the locals of those colonies still insist on marking “time” and “date” in Earth units?  If so, why?  And for how long?

Keep in mind that even on Earth, we differ from one region to another on a great many things.  This includes social/civil things, like marriage, drinking, voting, enlisting for armed services, driving, and so on.  We also differ on time.  Some places recognize Daylight Saving Time, and some do not, while others impose a half-hour offset rather than a full hour.  The basic point here is that even on this one ball of dirt and water, we don’t have uniform rules.

Now, add to that, our history of colonization and divestiture.  By that, I mean colonies that fought hard to win their independence (a-hem, cough-cough, no names please).  Some of those fared better than others of course, but, the takeaway is that many of the rules imposed by the former overlord were replaced or banished by the new management.

So, getting back to the plotted course of this diatribe, even if the initial colonization were established with strict Earth-centric rules, there’s no reason to expect that, with enough time, the colonists wouldn’t decide those rules make no sense and replace them.

Now comes the fun part.

Imagine the interim period between the era in which the colonists follow the same rules (minutes, hours, days, weeks, months, years) as on Earth, and the time when they revolt and declare full independence from Earth-mandated taxes, fees, and regulations (akin to 1604-1776, let’s say).

There would likely be some business interests that exist on the colonized planet or moon which remain in contact with their Earth counterparts.  I would assume these would be contractors who are initially part of the expedition, much like those who are embedded with today’s exploration and military engagements (you can guess their names, I’m sure).  Probably related to things like telecommunications, mineral extraction, and human support (medical, subsistence, housing, entertainment, etc.).

At some point, one of the project teams on Earth will be scheduling a meeting with their counterparts on the remote colony.  They’ll click “Friday, April 23, 2117” and “9:00 AM EDT” and when it arrives in the inbox on the other end, what will that mean?

Pick up the voice comm…

(crackling sound)… “Hey!  How are you guys doing?”

“Great!  You sound pretty clear and your video feed is clear as well.  How are you?”

“Not bad. Not bad.  Say…. We were wondering if you guys are available next Tuesday, say…. around 9:00 AM our time?  That would be like 14:35 AM your time, tomorrow.”

“Hold on.  That’s actually around 14:55 AM, but yeah, we should be good.”

“What’s it like there ?  I mean…. how do you sleep and work and all that?”

“Oh yeah.  So, an hour for us, is like 4.25 hours for you, but 4 days for you is like 0.99 days for us.  So our sleep patterns are very different from yours.”

You get the idea.
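The scheduling headache in that exchange is really just unit conversion.  Here’s a minimal sketch of the translation layer, assuming a made-up colony whose “day” lasts 48 Earth hours; the ratio, the epoch, and the 24-hour colony clock are all invented for illustration:

```python
from datetime import datetime, timezone

# Invented parameters for an imaginary colony: its local "day" lasts
# 48 Earth hours, and both calendars share an arbitrary reference epoch.
COLONY_DAY_IN_EARTH_HOURS = 48.0
EPOCH = datetime(2100, 1, 1, tzinfo=timezone.utc)  # hypothetical shared epoch

def earth_to_colony_days(earth_time: datetime) -> float:
    """Elapsed colony days since the shared epoch, for an Earth UTC timestamp."""
    elapsed_hours = (earth_time - EPOCH).total_seconds() / 3600.0
    return elapsed_hours / COLONY_DAY_IN_EARTH_HOURS

def colony_local_hour(earth_time: datetime, hours_per_colony_day: float = 24.0) -> float:
    """Time-of-day on the colony clock, assuming it splits its day into 24 'hours'."""
    days = earth_to_colony_days(earth_time)
    return (days % 1.0) * hours_per_colony_day

# The meeting invite from the story: "Friday, April 23, 2117, 9:00 AM EDT" (13:00 UTC)
meeting = datetime(2117, 4, 23, 13, 0, tzinfo=timezone.utc)
print(round(earth_to_colony_days(meeting), 2), round(colony_local_hour(meeting), 2))
```

The real problem, of course, is that no colony’s rotation would be a tidy fixed ratio of Earth’s, which is exactly the Y2K-ish contracting bonanza in question.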

I think like this all the time.  Like whenever I see a Sci-Fi movie and the aliens are always humanoid (a head, 2 arms, 2 legs, etc.)  I think “what if we can’t even imagine other life forms?”  Even the Star Wars bar scenes are filled with loose variations of this bi-pedal form, with other (Earth-centric) animal features glued onto various parts of the body.

What if they look like a coffee cup?

What if they “talk” in a way that, to humans, sounds like farting?

What if they think extending a hand for shaking is a gesture for sexual activity?

What if an Asian-protocol bow of formality is seen as a request to be attacked or eaten?

What if?

So, all digression digestion aside, back to the time and date thing.  How will computers be configured?  How much Y2K-ish work will be in heavy demand in order to handle such date/time offsets, especially across more than one remote colony?

I’m guessing, too, that AI/ML will be so commonplace by then that we won’t even be aware of the translations that occur in the background.  It’ll be something we’re taught in school as a “just in case the machines crash” scenario, and then forgotten.

Date and Time.


Projects, Technology

Analysis Time: GUI vs CLI, round 3

I did a round 1 several years ago, and I can’t remember when round 2 happened, so I played it safe by calling this round 3.  Anyhow, the point of this is what exactly?


The point is, I’m an emphatic proponent of anti-dogma.  It started as a child, when others would say to me “G.I. Joe cannot really fly.” and I set out to prove them wrong.  Then they said “Plastic model battleships can’t really do battle or explode and sink” and again, I proved them wrong (and a big thanks to the local fire department for helping to bring that sea battle to a safe close).

I just had my first cup of real coffee in 48 hours.  I can’t explain why.  I think I was having some sort of self-hatred phase or something.  So, I’m having what Leonard, in the movie Awakenings, would call a “moment of clarity”.  Or, maybe that was Jules in Pulp Fiction, anyhow…

In this case, it’s the dogma that CLI is inherently better than GUI.  It’s the same argument that one programming language is “the best”.  Or that only one kind of food is objectively “the best”.  To me that’s like saying one tool in the entire hardware store is “the best”.  Being yet another kind of tool, as I’m often called, I’d say a CLI or GUI is “best” only with regard to the context.

For scalable repetition, CLI is often the obvious choice.  And for single-instance scenarios, a GUI is often ideal.  But there’s that big middle ground, where the Red Bull drinkers debate the Supplement powder drinkers for world domination of the “my way is the only way” argument.  Those are indeed first world dilemmas.

First off, let’s keep things in perspective.  Just as the early-mid 1960’s were rife with the phrase, “it’s a man’s world“, and don’t hate me, I’m just offering an observation from watching Mad Men; 2017 is still a GUI world.  You can argue this if you want, but if CLI were truly the king of all that is software, you would NEVER see a packaged application installer with a GUI face, and your “smartphone” would only be a command prompt, and nothing more.  Touchscreens, video games, music production software, credit card swipers, ATMs, airline ticket kiosks, all are immersed in the GUI bathtub.

That said, CLI is obviously the king of process automation.  The efficiency comes in with zero-interaction operations.  Pre-configured parameters.  Even when a GUI is optimal for a given task, the trend today is to equip the GUI to save parameters to a file in order to leverage that captured parameter state for CLI repetition.

As one friend of mine would say, “if you’re pulling one engine, you rent a hoist, and buy a case of beer.  If you’re pulling 1,000 engines, you get a loan and build an assembly line.”

Let’s break the GUI down a bit.  One of the common points of debate is around item selection:  Radio buttons, check boxes, single and multi- select lists, dials, sliders, date pickers, range approximation, color pickers, and so on.

Much of the following is derived from a project I worked on back in college.  We were discussing the merits of CLI vs GUI and our professor said “prove it.  with numbers.”  So we consumed sufficient quantities of caffeine and sugar, and went to work.

If you use a stop watch, and compare two engineers, equally caffeinated and infused with sugary snacks, and have them execute the same tasks in both the GUI and CLI, excluding differences in relative typing speed/accuracy, or mouse control skills, the completion times will lean towards one or the other based on the circumstances:

Disclaimer: This is based on a “first run” scenario, where no preconfigured data is yet available or determined.

  • Check boxes.  If using “Select All” or “Select None” the difference is zero.  But sporadic, non-contiguous selections, are quicker by GUI clicks than by typing in non-contiguous names.  This is further differentiated by the relevance of string pattern consistency.  If RegEx or wildcards can be used, it can lean in the favor of CLI.  But if the selections provide no consistent patterns, the GUI will win hands down.
  • Radio buttons are a wash.
  • Single-select lists are a wash if the list does not require scrolling in the GUI.  If the list requires scrolling, the CLI is faster.
  • Multi-select lists are only marginally quicker by GUI, due to CLI having tab-completion.  This is predicated on a static set of options to select, which can be fed into an IntelliSense cache.
  • Sliders are faster via CLI (direct input).
  • Date selection is faster via CLI (direct input).
  • Calendar pickers are faster via CLI (direct input).
  • Color pickers will vary by pre-determined color values.  If you don’t know the color, but you are selecting by visual acuity only, the GUI is faster.  If you know the color name or code value, the CLI is faster.

The caveat from here on out is that once the GUI is used, if the inputs are captured, then the CLI steps in and knocks it out of the park.  This goes back to the repetition claim.
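That capture-once, replay-forever pattern is simple to sketch.  Here’s a minimal illustration; the parameter names, file name, and the `run_task` stand-in are all invented for the example:

```python
import json
import sys

def save_params(params: dict, path: str) -> None:
    """What the GUI would do on first run: persist the user's selections to a file."""
    with open(path, "w") as f:
        json.dump(params, f, indent=2)

def load_params(path: str) -> dict:
    """What the CLI replay would do: read the captured selections back."""
    with open(path) as f:
        return json.load(f)

def run_task(params: dict) -> str:
    """Stand-in for the real work; just summarizes what it would do."""
    return f"deploying {params['app']} to {len(params['targets'])} machines"

if __name__ == "__main__":
    # First run: the GUI gathers these interactively, then captures them to a file.
    captured = {"app": "7zip", "targets": ["PC-01", "PC-02", "PC-03"]}
    save_params(captured, "task-params.json")
    # Every run after that: no GUI needed, just replay the captured file.
    path = sys.argv[1] if len(sys.argv) > 1 else "task-params.json"
    print(run_task(load_params(path)))
```

The design point is that the GUI and the CLI share one parameter format, so the GUI becomes the authoring tool and the CLI becomes the assembly line.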

So, what does this really mean?  Aside from turning capable IT professionals into non-productive windbags (until the boss walks in), it means that anyone writing software that uses a GUI, and expects their product to be used repeatedly, should be building in a means to capture inputs to a file to facilitate reuse via CLI.  If they are not doing this, they can only fall into one of the following categories:

  • They’re stupid or lazy, or both
  • They don’t really care about customer satisfaction
  • They need to be beaten with a pepper spray can and tasered until the battery runs out

Until next time: happy CLI-ing.