Wednesday, April 29, 2009

The more things change...

Just to give an idea where I'm coming from:

My first computer experience was playing with a TRS-80 Model I back in 5th grade, when one of my teachers brought it into class for a few days. I followed all the computer magazines I could get my hands on - Byte, 80 Micro, Popular Computing, Creative Computing from time to time, A+, Nibble, and more. I lived my computer dreams vicariously through the magazines, because I couldn't afford a real computer of my own. When I was finally able to buy a TRS-80 Pocket Computer, I still lived vicariously - because nice as it was, it was still horribly underpowered compared to a full desktop computer. (1.9 kilobytes of RAM - that's smaller than the size of this post!) I read the pages and dreamed of machines I wanted to own - the TRS-80 Model 100, the Sinclair ZX Spectrum and QL, the Otrona Attache, the Workslate, the Epson HX-20, and more. (In the last few years, as people began unloading older systems on eBay, I've been able to pick up a number of them for reasonable prices.)

So I've lived through a lot of different waves in the computer world. I missed the initial microcomputer wave - the kit-builders who assembled S-100 systems like the Altair 8800 and the IMSAI 8080 from bags of parts, ran CP/M on them, and hooked up display terminals just to be able to communicate with them - but I did come in on the second wave. The TRS-80 Model I, the Commodore PET, and the Apple ][ were the first mass-produced microcomputers, the first that could arguably claim the "personal computer" label, and marked the first mass expansion of computer adoption. I was able to watch it unfolding, and that's given me a rather jaundiced view of many 'recent' buzz-trends in computing.

In the Beginning Was the Command Line? No. In the Beginning Was the ROM-based BASIC prompt... well, actually, punch cards. And then little toggle switches on the front of the system cabinet. (What, you thought those switches and blinkenlightsen on the front of the IMSAI 8080 were just for show?) But for the first mass-adoption wave that started with the 1977 Trinity, and the home computer wave that followed, the BASIC interpreter built into the computer was the first thing users saw when they turned on the machine. And they could type in program listings that were included in most computer magazines, and it was Good. And it let them write their first "Hello, World!" program, and it was Good.
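And verily, that first program was but a couple of lines. (A sketch from memory of a TRS-80 Level II BASIC session - the exact prompts and syntax varied from machine to machine:)

READY
>10 PRINT "HELLO, WORLD!"
>20 GOTO 10
>RUN
HELLO, WORLD!
HELLO, WORLD!
HELLO, WORLD!

...and so it went, until you hit the BREAK key, and it was Good.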

And yea, the Fans of the Computer were much pleased, and wrote introductory programming articles for Popular Computing starring Sherlock Holmes teaching Dr. Watson to program. (I Kid You Not.) And it was proclaimed that the Age of the Computer was upon us, and that everyone would have a computer in the home, and that everyone would learn to program. And it was said that this would lead to a revolution in society, that learning to program would make everyone better at critical thinking, to the greater benefit of all. And Lo!, the sales of the Atari 400/800 and the Commodore 64 did skyrocket, and the great Cosby did speak for the TI-99/4A, and Coleco did show a TV commercial proclaiming the Adam capable of temporal shifts. ("Adam, my time-travel program!") And all were sure that this would come to pass.

And then the bottom dropped out.

It turned out that by and large, the people buying these millions of first-generation home computers weren't interested in learning how to write their own recipe programs. The home computer market cratered, much like the home videogame market had, and most of the first-generation systems ended up stuffed in a closet somewhere, or on garage sale tables for $10. The 'computer in every home' had to wait another decade or two, and when it finally arrived, it wasn't because society in general had taken an interest in programming - it was because the GUI made computers easy enough, and powerful enough, for average people to use without ever learning to program.

So when I see someone touting an operating system that requires users to get down into the guts and tinker with it to make it work the way they want, and claiming that this will lead to Wonderful Things because the users will know how the computer works, will have more control over it, and can make it do more... you'll pardon me if I'm skeptical.
