Enough kidding around for tonight. I have to get serious for a few minutes. Stop laughing. I’m trying to be serious here. This is serious stuff.
Ever since I learned to write program code, sometime back around the War of 1812, I’ve heard didactic claims thrown like snowballs at a blind kid about “never ever do this!” and “doing that is always bad!” It’s always about coding practices, not sexual or dietary idiosyncrasies, so you can relax.
Authors. Professors. Senior programmers. Dishwashers. Yep. Repeating it the way coffee pours through a filter and still comes out as coffee. I’m not confusing this with my 5 Rules for Writing Code, and things like documenting and indenting code. Those are always good. But even those rules have exceptions (gasp!).
If you write code you’ve probably heard a few. Things like “using global variables is the devil’s work” and “console echoing will make you burn in hell.” Bah! I’m throwing the red bullshit flag on the field and calling for a review. I suppose that should be a brown flag, but red is more serious-looking.
What some folks call a global variable others call a system variable, or application scope, or an environment setting. So what is a global variable? Simple. It’s a variable defined so that it’s accessible to all moving parts inside and outside of the code that defines it. Windows uses them. Linux, iOS, and nearly every programming language use them too. Don’t believe me? Change the PATH or WINDIR system variable and reboot your computer. (tee hee, chuckle chuckle). Actually, don’t do it. I was just kidding.
There’s a longer, more official-sounding definition, but I tend to fall asleep before I can finish saying it. Basically, it’s globally accessible and therefore is in global scope. Global variable, system environment setting: a pig with lipstick on is still a pig. Heck, the HKEY_LOCAL_MACHINE registry hive is one big-ass wagon of global variables.
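If you want the whole “global scope” idea in a few lines of code, here’s a quick sketch (Python, purely for illustration; the names are made up):

```python
# A module-level (global) variable: visible to every function in this
# module, and to any other module that imports it. Conceptually it plays
# the same role as PATH or WINDIR, just at a smaller scale.
APP_NAME = "demo"

def greet():
    # Reading a global requires nothing special.
    return f"hello from {APP_NAME}"

def rename(new_name):
    # Writing one requires declaring intent -- Python's small speed bump.
    global APP_NAME
    APP_NAME = new_name

rename("gadget")
print(greet())  # -> hello from gadget
```

Every function in the module sees the same value, which is exactly the point — and exactly what the “never ever!” crowd is worried about.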
It’s the same argument (pardon the code pun for a moment) as with fire, guns and politicians. Each has its uses for good things. Each can also be misused and cause great harm; with the exception of politicians, which are always harmful.
Using things like global variables and console echoing is not intrinsically bad. It’s really a matter of what you’re doing and what you’re trying to do. The more scientifically sound approach would be to advise beginner programmers not to use them indiscriminately. And then explain why. An analogy could be made to compare them with a loaded gun: be careful where you point it. Be even more careful when you squeeze the trigger.
From an abstract aspect, if that’s even grammatically correct to say, something as basic and common as storing and loading application settings using a file, database, or a web service is very similar to a global variable. Only the specific mechanism is different. It’s like storing crap in your own garage, or in a storage unit. It’s still stored crap, and everything else about it is essentially the same (loading, unloading, enumerating, disposing, etc.). I realize that might be a crappy example. 🙂
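To make the garage analogy concrete, here’s a sketch of settings stored in a JSON file (Python again, hypothetical helper names): anything that knows the path can read or clobber these values, which is precisely the property globals get blamed for.

```python
import json
import os
import tempfile

# A settings file playing the role of a global: centrally stored,
# reachable from every corner of the program (or from other programs).

def save_settings(path, settings):
    with open(path, "w") as f:
        json.dump(settings, f)

def load_settings(path):
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.gettempdir(), "demo_settings.json")
save_settings(path, {"theme": "dark", "volume": 7})
print(load_settings(path)["theme"])  # -> dark
```

Swap the JSON file for a database row, a registry key, or a web service call, and only the mechanism changes — the stored crap is the same.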
The value association is stored in a central place (conceptually speaking), and therefore prone to access from all sides. That also means it can (potentially) be modified from all sides. But that’s where YOU, as the developer/architect/coffee-drinker, have the control to constrain access to the value. You control the scope context, and the scope ambience. Ha. Ambience. And you thought “didactic” was all I had up my sleeve.
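One common way to keep a “global” while constraining who can scribble on it is to hide the value behind a tiny accessor that validates writes. A sketch (Python; every name here is invented for illustration):

```python
# Module-private by convention (the leading underscore), but still one
# shared value for the whole program.
_LOG_LEVEL = "info"
_ALLOWED = {"debug", "info", "warn", "error"}

def get_log_level():
    return _LOG_LEVEL

def set_log_level(level):
    # The value is still globally reachable, but every write funnels
    # through this one checkpoint -- that's the scope control part.
    global _LOG_LEVEL
    if level not in _ALLOWED:
        raise ValueError(f"unknown log level: {level!r}")
    _LOG_LEVEL = level

set_log_level("debug")
print(get_log_level())  # -> debug
```

Same global, but now garbage writes bounce off the checkpoint instead of silently corrupting state. That’s the difference between using a global and abusing one.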
The problem I’ve seen tends to be a lack of objectivity. As technical and methodical as we might perceive ourselves to be, especially after high doses of caffeine, we are still humans. We still approach things with tendencies; habits. We call them heuristics and intuition, but they still come in the habit happy meal pack. Usually with a small Red Bull can.
Objectivity, and I don’t mean object-oriented, is just as hard to find in the tech world as it is on the street. Human nature is a bitch. Just ask the next guy you see bent over the hood of a cop car. Just ask any programmer, especially after they take the idiot-bait question: “what’s the best programming language?” (Face palm) (tip: if you’re ever asked that question, there are only two correct answers: “it depends” or “all of them”)
So, if using global declarations in C# or Java is evil; if using Write-Host in PowerShell, or WScript.Echo in VBScript, or princ in LISP is so horribly catastrophic that planets will spin out of their orbits, cats will begin eating dogs, and politicians will start being truthful, then why were those things even made available to us mere mortals? Why not strip them out? Why not make programmers wear a dog shock collar and apply the juice every time they type those horrific things into their doomed code?
Because they’re not “always wrong”.