Earlier this week, Dwight Silverman wrote a post in Techblog about the demise of HAL-PC, which at one time was the largest computer user group in the US. The relevance of HAL-PC, and of computer user groups in general, has fallen so far in recent years that many of you may be surprised HAL-PC hung on in some form well into the 2010s.
From the post:
Bill Jameson, a former board member who spoke by phone from HAL-PC’s South Post Oak offices, confirmed the decision. He reiterated the “changing society” theme in the email, saying “this society we live in now has a different set of interests and goals. Our type of organization is not included in that.”
“Most of our members are older,” Jameson said. “The cultural norms we grew up with are pretty much gone, and as a consequence the organization has not sufficiently adapted to this new culture.”
What led to this? Let’s take a look back at the 1970s and the early-to-mid 1980s, when computers were a new thing and the vast majority of people had no access to the Internet. Most computer-to-computer communication was done via modems over analog telephone lines. Through most of those two decades, 9600 bits per second (bps) was a long-chased ideal modem speed, with 300, 1200, and 2400 bps being far more common. In place of the Internet, there were bulletin board systems (BBSes) and amateur email networks like FidoNet.
But more importantly, there was no hitting the power switch, waiting a minute, coming back to a full-color graphical user interface, and clicking on a few things to launch whatever software one wanted to run. There was the DOS prompt, or on earlier computers a BASIC interpreter (some of the really exotic models didn’t even have that, but only a Forth interpreter or even just an assembler). For the most part there were no mice during this era, much less anything on screen to point at and click. If one wanted the computer to do something, one typed it in. Typing was an unmistakable prerequisite to computer literacy, with knowing MS-DOS commands (or their equivalent on one’s platform) following closely behind. Most people learned a little programming out of necessity, even if it was only MS-DOS batch files or short BASIC programs like the one below.
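To give a sense of what “a little programming” meant in practice, here is a minimal sketch of the sort of short BASIC program a newcomer might have typed in at a GW-BASIC or BASICA prompt of that era. It is my own illustrative example, not something pulled from any particular handout:

    10 REM Ask for a name, then greet the user a few times -- the kind of
    20 REM first program many of us typed in from a magazine listing or a
    30 REM user group handout (illustrative example only).
    40 INPUT "WHAT IS YOUR NAME"; N$
    50 FOR I = 1 TO 5
    60 PRINT "HELLO, "; N$
    70 NEXT I
    80 END

Typing RUN executed it, LIST printed it back, and SAVE "HELLO" wrote it to disk. That edit-run-tinker loop, trivial as it looks today, was the on-ramp to computer literacy that user groups helped so many people across.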
Most importantly, though, the line between programmer (today more often called “developer”) and user was much blurrier than it is today (I’ll cover this in more depth later). And this is where user groups came in: the more advanced users would teach those newer to computing how to get the most out of their gadgets. User groups are a big part of the reason technology is no longer feared the way it once was, at least by those who lived through the era when the groups existed and were still relevant.
Fast forward to the mid-1990s. Microsoft came out with Windows 95, and with it there was no longer a separate MS-DOS product; it was all graphical, and you had to dig for the MS-DOS prompt if you still wanted it. At least through Windows 98 there was still a fair amount of MS-DOS compatibility (I think Windows 98 could still boot into a “command prompt,” as they call it, to run older MS-DOS software). But before too long, the command prompt would become harder and harder to find, and Microsoft, at the very least, would rather have you believe it is simply less useful in modern times (I personally believe Microsoft themselves made it that way on purpose). Instead of being something magical, computers now take their place next to the TVs and stereo systems at stores like Best Buy and Target. For the most part, computers are just another appliance. It makes as much sense to have a computer user group as it does a refrigerator user group or a toaster oven user group.
On one hand, it still amazes me that back in 2008 or so I found a still-usable computer sitting out by the dumpster, apparently for no reason other than some kind of issue with its Windows XP install. Rather than try to fix it, its owner dumped it and bought a new one. That computer eventually became a firewall/router which served us well for a good three-plus years, though those who know me will (correctly) guess that the first thing I did was wipe the Windows XP install and replace it with OpenBSD (4.9 or 5.0, I think, but I could be wrong). On the other hand, it’s a rather sad reflection on the public’s attitude toward computers, and on just how much ease of use has taken a lot of the magic out of learning how to use a computer.
I use a graphical interface now, though I have not kept Windows installed on any computer I’ve considered “mine” for at least 12 years. While I am not quite at “a mouse is a device used to point at the xterm you want to type in,” it’s rare that I don’t have at least one command line open somewhere. In at least one situation, on a computer that wasn’t “mine” and where getting rid of the installed copy of Windows wasn’t an option, I kept an Ubuntu install on a thumb drive and booted that instead of Windows when I needed to use that computer. The installation failed several times in weird and not-so-wonderful ways, but I got it back up and running almost every time. (The one time I didn’t? The thumb drive itself (not the Linux kernel finding filesystem errors) started throwing write protection errors. I got the surviving important data off that drive and, for a while, used a different (very old and underpowered) computer exclusively.)
Personally, I’ve never lost sight of the magic behind computing. I’ll admit it: I get a thrill out of installing a new operating system on either brand-new or new-to-me hardware, which I’ve done for every system up until the last one I received new (the one I’m writing this post on). This one was ordered custom-built with the operating system (Ubuntu GNU/Linux 11.04) already on it, for three reasons: first, because for once it was a realistic option to buy a computer with Ubuntu pre-installed; second, I needed to make immediate use of the computer as soon as it arrived; and third, it was a different kind of thrill to experience the closest equivalent to how most people today get a store-bought PC. The great job Canonical (the company behind Ubuntu) has done in mounting even some kind of challenge to Microsoft’s damn-near-monopoly deserves a post all its own (which I may write sometime in July).
But I think it’s a sad commentary on the state of computing that most computer users in this decade will never even consider building their own computer a realistic option, much less do their own operating system install, much less realize there are many other choices of operating system besides those which come from Microsoft (or Apple). There is a certain degree of intimidation to overcome when staring down an empty computer case and the components that will go into it. I was there myself once; I built a new 80486DX/33 and barely had a freaking clue what the heck I was doing. It helped that I had a friend at the time to guide me through the tricky parts (over the phone). Today’s hardware is, if anything, much friendlier toward do-it-yourself builds: RAM modules, CPUs, power supply connectors, and SATA (hard drive) connectors are all keyed to go in only one way. The only thing decreasing is the number of people actually willing to pick up the screwdriver.
(Quick sidenote here: Apple never did embrace the idea that users could build their own computers. For better or worse, Apple has positioned itself as sort of a “luxury brand” of electronics. The only thing worse than Microsoft’s near-monopoly is that it’s impossible to buy components and build one’s own iMac, or even to buy an Apple computer without Mac OS X. Apple has actually made it a EULA violation to run Mac OS X on unlicensed hardware, even though today’s “PC compatible” computers can run it. This is one reason I point to when I say that I believe Apple has been more harmful to the state of computing than Microsoft has been.)
Another sad commentary is the rather rigid wall that has been built between “user” and “developer” (what we used to call “programmer”). Even “power user” doesn’t have quite the same aura it once did, and it’s used as a derisive term more often than one might think (and way more often than it should be, in my opinion). I find myself slamming into this wall on many occasions: there are things I’d like to be able to do as a user, which I research only to find out one needs to actually be a developer to do them. (Sometimes that means the thing is impossible, or going to be much harder than it needs to be; other times, I simply want to say “no, this shouldn’t be a developer feature, I’m just a user who wants to make full use of the technology.”) For example: Windows (which traces its lineage back to MS-DOS) no longer comes with a BASIC interpreter. Another example: neither Windows nor Mac OS X comes with a compiler suitable for writing one’s own software. (Microsoft makes no-cost versions available for download, but they aren’t easy to find, and in all likelihood they are a thinly disguised excuse to get one bumping into the limits and then shelling out money for the “real” compilers.) It is, in fact, expected that most users will simply spend money (which can run into the hundreds, thousands, or even tens of thousands of dollars) on the appropriate pre-written, shrink-wrapped, proprietary software. This is great for the stockholders of Microsoft, Apple, and other members of the proprietary software cartel like Adobe. It’s lousy if one’s “just a user.”