User groups and the chameleon of technology

Earlier this week, Dwight Silverman wrote a post in TechBlog about the demise of HAL-PC, at one time the largest computer user group in the US. The relevance of HAL-PC, and of computer user groups in general, has sunk so low in recent years that many of you may be surprised to learn HAL-PC hung on in some form well into the 2010s.

From the post:

Bill Jameson, a former board member who spoke by phone from HAL-PC’s South Post Oak offices, confirmed the decision. He reiterated the “changing society” theme in the email, saying “this society we live in now has a different set of interests and goals. Our type of organization is not included in that.”

“Most of our members are older,” Jameson said. “The cultural norms we grew up with are pretty much gone, and as a consequence the organization has not sufficiently adapted to this new culture.”

What led to this? Let’s take a look back to the 1970s and early-to-mid 1980s, when computers were a new thing, an era before the vast majority of people had access to the Internet. Most computer-to-computer communication was done via modems over analog telephone lines. Through most of those two decades, 9600 bits per second (bps) was a long-chased ideal for modem speed, with 300, 1200, and 2400 bps being much more common. In place of the Internet, there were bulletin board systems (BBSes) and amateur email networks like FidoNet.

But more importantly, there was no hitting the power switch, waiting a minute, coming back to a full-color graphical user interface, and clicking on a few things to launch whatever software one wanted to run. There was the DOS prompt, or on earlier computers a BASIC interpreter (some of the really exotic models didn’t even have that, offering only a Forth interpreter or even just an assembler). For most of this era there were no mice, much less anything on screen to point at and click. If one wanted the computer to do something, one typed it in. Typing was an unmistakable prerequisite to computer literacy, with knowing MS-DOS commands (or their equivalent on one’s platform) following closely behind. Most people learned a little programming out of necessity, even if it was only MS-DOS batch files or short BASIC programs.

Most importantly, though, the line between programmer (today more often called “developer”) and user was much blurrier than it is today (I’ll cover this in more depth later). And this is where user groups came in: the more advanced users would teach those newer to computing how to get the most out of their gadgets. User groups are the reason that, among those who lived through the era when the groups were largely relevant, technology is not feared the way it once was.

Fast forward to the mid-1990s. Microsoft came out with Windows 95, and with it there was no separate MS-DOS product any more; it was all graphical, and you had to dig for the MS-DOS prompt if you still wanted it. At least through Windows 98 there was still a fair amount of MS-DOS compatibility (I think Windows 98 could still boot into a “command prompt,” as they call it, to run older MS-DOS software). But before too long, the command prompt would become harder and harder to find, and Microsoft would rather have you believe it is simply less useful in modern times (I personally believe Microsoft made it that way on purpose). Instead of being something magical, computers took their place next to the TVs and stereo systems at stores like Best Buy and Target. For the most part, computers are just another appliance now. It makes as much sense to have a computer user group as it does to have a refrigerator user group or a toaster oven user group.

On one hand, it still amazes me that, back in 2008 or so, I found a still-usable computer sitting out by the dumpster, mainly because of some kind of issue with its Windows XP install. Rather than try to fix it, its owner dumped it and bought a new one. That computer eventually became a firewall/router which served us well for a good three-plus years, though those who know me will (correctly) guess the first thing I did was wipe the Windows XP install and replace it with OpenBSD (4.9 or 5.0, I think, but I could be wrong). On the other hand, it’s a rather sad reflection of the public’s attitude toward computers, and of just how much ease of use has taken the magic out of learning to use one.

I use a graphical interface now, though I have not kept Windows installed on any computer I’ve considered “mine” for at least 12 years. While I am not quite at “a mouse is a device used to point at the xterm you want to type in,” it’s rare that I don’t have at least one command line open somewhere. In at least one situation, on a computer that wasn’t “mine” where getting rid of the installed copy of Windows wasn’t an option, I kept an Ubuntu install on a thumb drive and booted that instead of Windows whenever I needed to use that computer. The installation failed several times in weird and not-so-wonderful ways, but I got it back up and running almost every time. The one time I didn’t, the thumb drive itself started throwing write-protection errors (a hardware failure, not the Linux kernel finding filesystem errors). I got the surviving important data off that drive and, at least temporarily, used a different (very old and underpowered) computer exclusively for a while.

Personally, I’ve never lost sight of the magic behind computing. I’ll admit it: I get a thrill out of installing a new operating system on either brand-new or new-to-me hardware, which I’ve done for every system up until the last one I received new (the one I’m writing this post on). This one was ordered custom-built with the operating system (Ubuntu GNU/Linux 11.04) already on it, for three reasons: first, because for once it was a realistic option to buy a computer with Ubuntu pre-installed; second, I needed to make use of the computer as soon as it arrived; and third, it was a different kind of thrill to experience the closest equivalent to how most people today get a store-bought PC. The great job Canonical (the company behind Ubuntu) has done in mounting even some kind of challenge to Microsoft’s damn-near monopoly deserves a post all its own (which I may write sometime in July).

But I think it’s a sad commentary on the state of computing that most computer users in this decade will never even consider building their own computer a realistic option, much less do their own operating system install, much less realize there are many choices of operating system besides those which come from Microsoft (or Apple). There is a certain degree of intimidation to overcome when staring down an empty computer case and the components that will go into it. I was there once myself; I once built a new 80486DX/33 and barely had a freaking clue what the heck I was doing. It helped that I had a friend at the time to guide me through the tricky parts (over the phone). Today’s hardware is, if anything, much friendlier toward do-it-yourself builds: RAM modules, CPUs, power supply connectors, and SATA (hard drive) connectors are all keyed to go in only one way. The only thing decreasing is the number of people actually willing to pick up the screwdriver.

(Quick sidenote here: Apple never did embrace the idea that users could build their own computers. For better or worse, Apple has positioned itself as sort of a “luxury brand” of electronics. The only thing worse than Microsoft’s near monopoly is that it’s impossible to buy components and build one’s own iMac, or even to buy an Apple computer without Mac OS X. Apple has actually made it a EULA violation to run Mac OS X on non-Apple hardware, even though today’s “PC compatible” computers are capable of running it. This is one reason I point to when I say I believe Apple has been more harmful to the state of computing than Microsoft has been.)

Another sad commentary is the rather rigid wall that’s been built between “user” and “developer” (what we used to call “programmer”). Even “power user” doesn’t have quite the same aura it once did, and it’s used as a derisive term more often than one might think (and way more often than it should be, in my opinion). I find myself slamming into this wall on many occasions: there are things I’d like to be able to do as a user, which I research only to find out one needs to actually be a developer to do them. (Sometimes that means the thing is impossible, or much harder than it needs to be; other times, I simply want to say “no, this shouldn’t be a developer feature, I’m just a user who wants to make full use of the technology.”) For example: Windows (which has lineage back to MS-DOS) no longer comes with a BASIC interpreter. Another example: neither Windows nor Mac OS X comes with a compiler suitable for writing one’s own software. (Microsoft makes no-cost versions available for download, but they aren’t easy to find, and in all likelihood they’re a thinly disguised excuse to get one bumping into the limits and then shelling out money for the “real” compilers.) It is in fact expected that most users will simply spend money (which can run into the hundreds, thousands, or even tens of thousands of dollars) on the appropriate pre-written, shrink-wrapped, proprietary software. This is great for the stockholders of Microsoft, Apple, and other members of the proprietary software cartel like Adobe. It’s lousy if one’s “just a user.”
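
To make the wall concrete, here’s a minimal sketch (my own illustration, not something shipped with either OS): a complete C program of the kind “just a user” could type in and build in minutes, if only a compiler came with the system. The file name and message are made up for the example.

    /* hello.c - a complete, self-contained C program.
       On a stock Linux or BSD system it builds with one command:
           cc hello.c -o hello
       and runs with:
           ./hello
       On Windows or Mac OS X, step one is going out and finding a compiler. */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello from 'just a user'!\n");
        return 0;
    }

That one-command build is the entire barrier to entry on systems that ship a compiler: OpenBSD includes one in the base install, and on Ubuntu it’s one package away. The gulf between that and hunting down a multi-gigabyte development suite is exactly the user/developer wall I’m describing.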

The Flappy Bird saga, or: why some people shouldn’t make games

I was originally going to let all the flap about Flappy Bird sail right over my head and into wherever this stuff goes in cyberspace when it’s done being popular. I am, after all, someone who is very unpicky about exactly which games I play, leaning toward GPL software instead of the latest shrink-wrapped Xbox One, PS4, or Wii titles. I thought this didn’t really concern me, but then I read Dwight Silverman’s post to TechBlog about Flappy Bird.

For some reason, as I was about to read this, I had recent articles about “rape culture” on my mind. I had just finished watching a video about a human trafficking problem in Europe.

And then it all made sense.

I’m saying this as someone who never played Flappy Bird (and probably never will get the chance to, thanks to Mr. Nguyen’s selfish actions).

This is why I’m leery about depending on mobile phone apps:

[Flappy Bird creator Dong] Nguyen said the main reasons for pulling the game were guilt due to its addictive quality, and the fact that the attention has made his life more complicated[…]

Games are supposed to make people happy. To Mr. Nguyen, making Flappy Bird wasn’t about making people happy. No, Flappy Bird, in the end, wasn’t really the game itself, but a piece on Mr. Nguyen’s game board; a piece which, thanks to the design of today’s mobile devices, he could take off the board at his own whim. It’s about control, about the opportunity to impose his own morals on those who partook of the game for whatever reason.

Indeed, I think Mr. Nguyen is exactly the kind of person Richard Stallman is warning us about when he refers to the emotional argument in his essay “Why Software Should Be Free”:

The emotional argument goes like this: “I put my sweat, my heart, my soul into this program. It comes from me, it’s mine!”

This argument does not require serious refutation. The feeling of attachment is one that programmers can cultivate when it suits them; it is not inevitable. Consider, for example, how willingly the same programmers usually sign over all rights to a large corporation for a salary; the emotional attachment mysteriously vanishes. By contrast, consider the great artists and artisans of medieval times, who didn’t even sign their names to their work. To them, the name of the artist was not important. What mattered was that the work was done—and the purpose it would serve. This view prevailed for hundreds of years.

(Richard goes on in his essay to mention the economic argument, which I don’t think applies here, as Mr. Nguyen deleted Flappy Bird in spite of it making him a relatively obscene amount of money.)

What if Mr. Nguyen had been an arcade game programmer in the late 1970s or early 1980s? It would be as if, say, Taito had decided that anyone who hadn’t yet played a game of Space Invaders by a given date could never do so for the rest of their lives, citing a shortage of 100-yen coins in Japan. (Set aside for the moment that the shortage didn’t actually happen; it easily could have, had Space Invaders been as popular in 1978 and 1979 as Flappy Bird, or even something like Angry Birds, is today.) Or as if Atari had decided something similar for Pong or Asteroids during those crazes. You get the idea.

And the probable result? There would have been an outrage. The video game scene succeeded, became what it was, and rebounded as quickly as it did from the 1983 crash because the companies knew their role. Once an arcade game was sold, it was sold, and there was little the companies could really do about how many people got to play it.

So, based on what I have read, and as an electronic game player and historian with over 30 years of experience, it is my expert opinion that Mr. Nguyen has no business making games, and that for him to do so is a detriment to the entire gaming community. It isn’t proper in the least for any game designer to impose their own morals or value judgments on the players of their games. Nobody else has tried to get away with this, and for good reason. Mr. Nguyen clearly doesn’t give a shit about the gaming community. It is most unfortunate indeed that Apple and Google (and, should he ever make Windows Phone games, I assume Microsoft as well) will keep letting him sell games in their respective online stores in spite of this; then again, they don’t have to give a shit either, as they get their cut of the revenue.

The personality of Mr. Nguyen and the personality of the average rapist are one and the same. Rape isn’t about sex, it’s about control. Control over a rape victim, control over a Flappy Bird player… one and the same. If you really love a game you’ve made, set it free (GPL).

A huge step backwards for technology and free speech?

I’m going to start this post with some affirmations of what I believe, and, more importantly, of what I hold to be sane positions regarding technology.

First, computers (which include not only desktop and laptop PCs, but most electronic technology containing a microprocessor or its equivalent) are powerful because they do exactly what their owners tell them to do, and nothing they aren’t told to do. Whether a given action is a good idea, or lawful, or ethical, is decided not by the computer (device) but by its operator.
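
To illustrate with a trivial sketch of my own (not drawn from any particular system): the machine carries out instructions literally, with no opinion on whether they were the instructions you meant to give.

    #include <stdio.h>

    /* The programmer *meant* to print the numbers 1 through 10.
       What the machine was actually told: start at 0 and keep going
       while i <= 10. It obediently prints eleven numbers. */
    int main(void)
    {
        int count = 0;
        for (int i = 0; i <= 10; i++) {
            printf("%d\n", i);
            count++;
        }
        printf("Printed %d numbers.\n", count); /* says 11, not 10 */
        return 0;
    }

Nothing in the machine ever asks whether eleven numbers was the right call; that judgment, like the legal and ethical ones, rests entirely with the operator.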

Second, free speech, including documentation of government actions, is fundamental to holding a government accountable for those actions. The ability of a government agency to arbitrarily prevent such documentation is prima facie evidence of a police state.

If these sound reasonable to you, then a recent story by RT News should be extremely troubling. In a nutshell, Apple now has a patent which, if implemented and used, would allow the police or other officials to arbitrarily disable your phone, including the camera.

This is at odds with legal rights under the First Amendment to the US Constitution and similar rights under Articles 12, 17, and 19 of the UN Universal Declaration of Human Rights. Imagine a world where incidents like what happened to Rodney King are commonplace, yet no video or even still pictures of egregious police brutality ever surface, because as soon as the cops want to beat someone up, they shut off all cameras and phones in the area.

It’s a terrifying thought, isn’t it?

And don’t just say “I obey the law, therefore I have nothing to worry about.” With this kind of power, you can be completely innocent, yet the cops can say anything they want, and there’s no video to prove them wrong. Guilty until proven innocent, with little hope of ever being proven innocent. Already, the badge serves as a “get out of jail free” card for most garden-variety perjury charges. This would turn the badge into carte blanche to silence anyone and everyone for any reason.

The people at Apple who made this patent a reality should be ashamed of themselves. This is a good reason to document abusive behavior by law enforcement like there’s no tomorrow. Because when it comes to accountability, there may not be a tomorrow.

The AppGratis incident: Showing Apple’s opacity yet again

VentureBeat recently reported on AppGratis and its unsuccessful attempts just to get a dialog with Apple after abruptly having its app removed from the App Store. For an iOS app, that basically means dead in the water (it has since been restored, though for how long remains to be seen).

I’ve said many times just how bad it is to place oneself at the mercy of a large corporation. If you’re lucky and don’t run afoul of either the published rules or the whims of Apple, then it might work out okay. Then again, it might not, as Apple can change the rules to make your app non-compliant, like it did to AppGratis. Apple can also decide on a whim simply not to approve your app. Apple rules the iOS platform with an iron fist. This isn’t news; it’s been like this more or less since the beginning.

Google’s Android platform, though not perfect, doesn’t have these issues, in addition to giving consumers a wider choice of manufacturers (though Samsung is far and away the front-runner at the time of writing). Unlike iOS, Android will let you install apps from sources besides the Google Play store; you do have to acknowledge a rather scary-looking warning to do so, but you can do it.

I still wish picking a smartphone platform weren’t about choosing the lesser of two evils. Well, actually, the least of four evils if you’re counting (Windows Phone and BlackBerry are also technically possibilities, but I find them just as repulsive as the iPhone, and for similar reasons).

Thoughts (primarily) on the passing of Steve Jobs

First, I am saddened, as most people are, to hear of the passing of Steve Jobs. Outside of Steve’s contributions to technology, reason enough to be sad would be the relatively young age at which Steve left us; in modern times, 56 is quite a young age at which to pass on, just over two-thirds of the 78.7-year life expectancy in the US.

I agree in principle with, and in fact admire, many of the advancements in technology and user interfaces which Steve played a part in. It is remarkable that Steve took a company on the verge of failure and transformed it into something that has made even Microsoft sit up and take notice. This is no small feat and Steve has earned his legacy in the history of computing and technology. I also agree with the substance of the statements made by President Barack Obama, Bill Gates, Disney president Bob Iger, and Mark Zuckerberg.

I say all this despite the fact that I have been actively boycotting Apple’s products for some time now (for reasons that should be obvious to frequent readers of this blog), and this is unlikely to change for the foreseeable future.

Will this seem odd to many people? Certainly. But this isn’t the first time.

I once had a copy of the book Winning with the P&G 99 by Charles Decker, purchased in the middle of my active boycott of Procter & Gamble (among others) over its sponsorship of the Jenny Jones talk show. That boycott ended upon the show’s cancellation in 2003 (and, thankfully, predated P&G’s acquisition of Gillette, so I never had to quit shaving with Mach3 or Fusion razors as a result of the boycott).

The tactics of a company, including marketing, PR, and basic business strategy, are still relevant to my career as a marketing and PR consultant, whether or not I personally purchase its products. The same general principles apply to Apple now that applied to P&G then (though when I bought the P&G book I was not yet in marketing consulting, just engaged in a more general self-directed study of business). And thus the same general principles apply to the work of Steve Jobs as applied to the staff of P&G during the time of my active boycott of that company’s products.

I have more to say, but it’ll come in a followup post in about a week or so. Right now is simply not an appropriate time.