sir_shower_cat

First off, what qualifies me to speak on such a topic?  Well, I have an IT Bachelor’s Degree from an accredited US public university, and I’ve been working in the IT field since the 1980s.  I’m old as dirt, and I’ve seen my share of industry trends, technology innovations, and all that.  When someone jokes about DOS 3 or OS/2, I chuckle along with full understanding.  I respect Assembler programmers much like WWII veterans (I may even hold doors open for them).

Blah blah blah.  Enough already.  This may get deep, so if you need to run and fetch your drugs and liquor now I can wait…

Anyhow, the premise of this article comes from a common question I hear from younger folks, looking to pursue an “IT career”:

Does a college degree ‘help’ in pursuing an IT career?

You young people.  Always acting young and stuff.  Sheesh!

The media is constantly blasting commercials at us from all sides, trying to make a “degree” seem obligatory for almost any field.  Nursing.  Automotive mechanics.  Public Safety.  IT.  I won’t argue that a degree isn’t valuable.  But the value isn’t in the degree itself as much as in what you do with it and where you do it (or try to do it; you get the point).  The best wrench on Earth is worthless if you never use it.  It’s also worthless if there’s nothing around to use it on.

So, if you study books, blogs, web sites, tutorials and pass all your certifications, can you still land a kick-ass, well-paying job?  Absolutely.  Often without even that much investment.  So, what then does a college or higher-education experience give you (okay, sell you) that makes it a value-add proposition?  Depth of knowledge, and rigorous exposure.

On the Depth of Knowledge part:

It teaches you the theory and basis on which technology exists.  Why things work the way they do.  How they were derived or discovered.  What they’re really made of.  The “when” and “where” aren’t as intrinsically valuable, but they’re helpful for understanding the various contextual implications.

Does that mean you’ll be putting your college experience to use every day at your shiny new IT job?  Not necessarily.  But employers often look at that college experience the way they do in any other profession.  When they ask someone to do the unthinkable, they want someone with experience AND depth of knowledge.  A sort of artistic nuance is implied (not always, but I’ve had a beer so I get to wax poetic a bit).

On the Rigorous Exposure part:

One of the most valuable features of a classroom (or online) atmosphere is that you are often assigned tasks to solve which are uncomfortable.  By that, I mean that they will often be focused on things you don’t find appealing.  Left to our own devices, humans often gravitate towards what makes them happy, or saves them time, rather than what someone else wants.  That someone else is (going to be) your next employer.

This is why the military uses drill instructors: they force you to do what you probably wouldn’t do on your own.  Sure, you could start running, working out, and all that, but someone pointing a gun at you, or stepping a boot on your back, provides an additional boost of motivation that you won’t likely find on your own.

Imagine how your coding practices would change if someone made you do push-ups with their foot on your back for each bug they found.  Ha ha ha!  I’d love to see it (from a distance, of course).

If you expect to walk into your first “real” IT job and only be assigned the kinds of tasks YOU want to work on, well, you’re living the dream.  A very, very, very rare and drug-induced dream.  The rest of us mortals are usually paid to solve other people’s problems, not our own personal problems.

Where was I?  Oh yeah…

Finally, there’s the historical aspect.  It helps to know who, how and why things were created and how they evolved into the things you use today.  While knowing the names and contributions of people like Babbage, Stroustrup, McCarthy, Berners-Lee, Turing, Hopper, Thompson, and Ritchie isn’t required in order to become a successful professional IT geek, it does help you gain deeper insight into how those tools are best used.

But beyond the names, places and dates, there’s the real meat (or soy, if you’re vegan): computer science.  CS, as it were, is based on mathematics.  That’s right.  Computers are math machines.  Layers upon layers of math, until the numbers mash together to make forms, buttons, documents, pictures and movies.  It’s all math and numbers.

Ever done those ugly IP subnet problems with the binary calculations?  How about those arrays and their silly base-zero indexes?  Or what about the syntactical ordering of a SQL statement (select this from that where this is not that, etc.)?  Heuristics.  Predictive analytics.  Linguistics and phrasing associativity.  Fuzzy logic.  Regular expressions.  Derivations.  Parallel and pipeline processes.  blah blah blah… Yep.  All math.  Nerdy nerdy math.
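
To make the subnet example concrete, here’s a tiny Python sketch of my own (the addresses are hypothetical, purely for illustration): that “ugly binary calculation” is nothing more than a bitwise AND of the address and its mask.

```python
# A minimal sketch of the classic subnet problem: the network address is
# just (IP AND mask), octet by octet.  The addresses here are made up.

def network_address(ip: str, mask: str) -> str:
    """Return the IPv4 network address for an ip/mask pair."""
    ip_octets = [int(o) for o in ip.split(".")]
    mask_octets = [int(o) for o in mask.split(".")]
    # Pure binary arithmetic -- no magic, just math.
    return ".".join(str(i & m) for i, m in zip(ip_octets, mask_octets))

print(network_address("192.168.37.14", "255.255.255.0"))  # -> 192.168.37.0
```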

Today, however, you may only see passing traces of that legacy, as they’ve been hermetically sealed and encased in age-proof capsules, spray-coated with a clean polymer shell, dipped in acronym gel, flame-hardened into API modules, and buried deep inside that thing you call Visual Studio (or Eclipse, etc.).  You just click, drag and drop.  Not a clue what’s inside those little morsels of magic.

Taking this progression (or digression) further, you can see where the ship is sailing: ever more abstraction, ever less visibility into what’s underneath.  There’s good and bad to be debated here, of course.

(As a side note, I really HATE the semi-SQL quasi-language features many IDE tools use today, in order to distance the developer from touching that icky, stinky, smelly, putrid SQL code.  Don’t drink from the shiny cup!  SQL is better. SQL is best handled by SQL.  Learn it. Know it. Live it.)  Anyhow…
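
For what it’s worth, here’s the kind of thing I mean, sketched with Python’s built-in sqlite3 module (the table and rows are purely hypothetical): the query stays in plain, visible SQL instead of being generated by some quasi-language layer.

```python
# A minimal sketch using Python's built-in sqlite3 module; the table and
# rows are made up purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary REAL)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Ada", "IT", 90000.0), ("Grace", "IT", 95000.0), ("Bob", "Sales", 70000.0)],
)

# The query is written as SQL, in the open: select this from that where
# this is not that, with bound parameters instead of string concatenation.
rows = conn.execute(
    "SELECT name, salary FROM employees WHERE dept = ? ORDER BY salary DESC",
    ("IT",),
).fetchall()
print(rows)  # -> [('Grace', 95000.0), ('Ada', 90000.0)]
```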

So much emphasis now is being placed on convenience and pleasantry.  Pretty UI/UX tools, slick animations and workflows.  One old friend of mine joked with this analogy: you wake up in bed, proclaim you are hungry, and a foodbot responds by “materializing” your favorite meal onto a tray upon your lap.  How those eggs, bacon and toast were made doesn’t matter.  Not even the chickens, pigs or flour.  It only matters that it’s still hot and tastes good.

Where it once mattered more to build everything from scratch, it’s now considered more artful, and even more “efficient”, to assemble things from building blocks rather than to know how those blocks were made (and how they work, under the hood).  You’re placing all of your trust in hidden things.  (This is where the open source folks jump out from a trap door and shout, “a-ha!”  But even they aren’t immune to this trend.)

Origins are fading.  Convenience is king.

This brings me to the part I find most interesting…

Coding vs. Developing

This was the crux of a deep discussion I had with some colleagues a while back.  Actually, over the course of several separate discussions.

  • Is coding (or programming) the same thing as developing?
  • Is there a distinction?
  • Should there be?

Basically, is it “programming” when you drag widgets and controls from a palette and drop them on a workspace form and add some filler code?

Is it “programming” when you start with a template?

Where does “programming” begin?  At the machine code level?  At the ASCII, source code layer?  At the wire-frame or diagram phase?  On the napkin at the lunch table?

Aside from the declarative “is it” stuff, another question to ask is a semi-imperative, “does it really matter?”  Going back to my earlier statement: Is there a distinction? And, should there be?

Should we teach children how numbers are added, or just teach them how to use a calculator?

Does anyone need to know anything about how cell phones communicate?  Or just how to turn them on and use the apps?

Forget the qualifying aspects for a moment.  How do you quantify a value for knowing the basis of something which is utilized only at a higher layer of abstraction? Is it tangible?

And when developmental evolution continues to slide the abstraction layer to the right (assuming a left-to-right vector here, excuse me), where do we, as a society, draw the line as to how far ‘back’ evolutionary knowledge needs to be imparted to the next generation of users?  To fix cars, do you need to understand the progression from horse and buggy, to steam coach, to gasoline engine?

Taking a more relevant perspective:

A common example I cite when this discussion goes deeper is the novice who doesn’t stop to consider the performance implications of choosing one processing method over another.  Maybe it’s a layered condition branch (if/else/elseif/…) versus a linear condition branch (switch/select/cond), or maybe it’s Do-While versus Do-Until.  Whatever.  When you start asking yourself which option (when there’s more than one) will yield the optimal result, and optimal performance, you’re starting to think like a programmer, rather than a developer.
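
To put that in code terms, here’s a toy Python sketch of my own (the regions and rates are hypothetical): the layered chain re-tests conditions one by one, while the table-driven version, Python’s nearest thing to a switch, jumps straight to its answer no matter how large the table grows.

```python
# A toy illustration of layered vs. linear branching; the data is made up.

def shipping_cost_chain(region: str) -> float:
    # Layered condition branch: the worst case walks every test in order.
    if region == "US":
        return 5.00
    elif region == "EU":
        return 9.00
    elif region == "APAC":
        return 12.00
    else:
        return 20.00

_RATES = {"US": 5.00, "EU": 9.00, "APAC": 12.00}

def shipping_cost_table(region: str) -> float:
    # Table-driven branch: one dictionary lookup, regardless of table size.
    return _RATES.get(region, 20.00)

assert shipping_cost_chain("APAC") == shipping_cost_table("APAC") == 12.00
```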

Jumping back outside a bit: you’re building a house, but now you pause to ask what wood the studs are made from and how it holds up in the climate where the house is being built.  Or you can just continue nailing the boards in place and go home to drink beer.  The material aspects are “someone else’s problem”.

None of this is “official” obviously.  We’re slogging our way through murky soup, and there’s no diagram or instruction manual for most of it.  Just pieces and scraps.

It’s somewhat analogous to being a cook or a chef.  Or a painter versus an artist.  The general distinction is that one leans towards assembling things from prepared components, while the other leans towards building the prepared components.

One colleague of mine explained it something like this…

If you go to the grocery store and buy chopped onions, canned olives, chopped lettuce, crumbled Feta, and a bottle of dressing, toss those into a bowl, and mix them up, you have a ‘Greek salad’, or something close to it.  Is that ‘cooking’?

To the children at the table, staring you down in desperate hunger, it is chef school magic at its best.  To Emeril, well, not so much.

Herein lies the real question:  What do you want to focus on?

If you’re waiting for someone else to tell you what’s valuable, you’re asking the wrong question.  If college interests you, go after it.  If not, go after something else.

zzzzzz…


One thought on “College vs. Computer Science”

  1. My perspective…

    I’m not quite as old as dirt, and I’ve left the fold of IT folk. Even so, my 10+ years of experience makes me (arguably) qualified to voice my opinion. So, in the spirit of my new career field (law)…

    Do you need a college degree? It depends.

    I got into IT without a college education, beyond an AS in Social Science that was rather useless to IT. I found that employers were more impressed with performance and skill (and those certifications) than with college degrees. However, the fact that I -had- a college degree seemed to please them nonetheless; I suspect that the employer that we shared hired me in part because I happened to have one.

    For someone looking to break into system administration, though, I’d say… focus on your certifications first. They’re cheaper, you’ll get hired, and you can take classes at night to backfill that IT degree if you like.
