I make no secret of the fact that I love old computers, and that I think modern computers have lost their way in terms of providing utility to users. To that end, I write about, and think about, computers and the way to move computers forward without losing sight of the fact that computers should serve their users. I grew up with hand-me-down computers, from Atari to Apple to Dell, and in the process I got to experience a sizable portion of computer history very quickly, in my teen years.
This left me with Opinions.
I often write about things that are informed by these opinions. When I talk about building a World Wide Web analog without the internet, about reviving the BBS, or when I lament the fact that Gopher was doomed to obscurity by the modern web, it is in response to my experiences with an array of computers from the dawn of the home computer revolution up through to the modern age. There was a time when computers were magical.
I had come, in recent months, to suspect that I might just be an old fuddy-duddy. I'm approaching 30 years old, and I had begun to feel like I was looking at modern computers and modern software through the lens of someone who was being left behind, shouting at the sky, shaking my fists at the kids on my lawn. I was coming to the conclusion that my love of these computers of my childhood, and of ones that I had never had the chance to interact with, was some kind of rose-tinted nostalgia.
I had not fully subscribed to this theory, but it seemed more likely that I was romanticizing something that was actually Not Great than that nearly every modern software and hardware platform had inexplicably gotten shittier.
I am now prepared to say, with conviction, that every modern hardware and software platform has gotten shittier, and that it's not inexplicable. I'm going to try to explain how I came to this conclusion, and give some potential explanations.
First, let me lay out a little bit about my technical literacy and my profession; this might help explain some of the assertions that I'm going to make. I started using computers, and the internet, in 1995. Our family computer, my first computer, ran Windows 3.11 (for Workgroups). Later, in the late 90s, I was given an Atari 400 and reams of books and magazines on BASIC, followed shortly by an Apple IIGS and dozens of disks of software.
I loved computers. I loved making weird little programs, and silly applications and games. I'd build things in GW-BASIC or FreeBASIC and distribute them to my friends on floppy disks. Even in the latter half of the 00s, I was passing half-broken games around on floppy disks (or collections on CD-Rs, when I could talk someone into buying some for me). Computers were, by and large, ubiquitous in my life. Nearly everyone had an old one they didn't want, and a new one they didn't understand.
I collected cast-offs from neighbors, from thrift stores, from office upgrades. I rebuilt them, installed useful or fun software on them, and sold them or gave them away. All of my friends had computers of their own, because I had access to these machines, and I cared enough to outfit them with the appropriate software.
(It must be said, at this point, that 'useful' and 'appropriate' are relative terms. In 2009 I gave a good friend a computer that had been built for Windows 98. It was running Puppy Linux from a CD, and saving data to a USB flash drive over USB 1.1. It did word processing, email, and basic web browsing. It had a whopping 64MB of RAM, and was covered in glitter, googly eyes, and carpet samples. But it was free, and it wasn't useless, and that was important.)
Professionally, I deal with modern software and its shortcomings every day. Some of those shortcomings are legitimate bugs. Some of them are bafflingly short-sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature-incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow and cranky. I'm responsible for handling the fallout of this unfortunate situation.
I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.
So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.
The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.
The guests were, frequently, people who were proud of the things they made, or the software they represented.
Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye-opening.
On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.
Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.
And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.
It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas and brought us the Mac.
Nothing is ever new.
There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.
The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.
That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.
Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrases I mentioned earlier were coined: "Human Literate Computers" and "Human Centered Computing."
Ease of Use was the holy grail for a lot of computer companies: a computer so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.
Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.
That, IMO, is the distinction.
I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.
When I was a kid I was brought up with computers that showed you how they worked.
You booted into a command prompt or a programming language, or you could get to one, if you wanted to.
I got to play with GW-BASIC and QBasic and, a little, with HyperCard.
I got to take apart software and put it back together and make things that made people happy.
I often wonder why HyperCard had to die.
It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.
Letting people build their own tools means letting people control their own destiny.
If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?
HyperCard, if you're unfamiliar, is PowerPoint + instructions.
Here's a great introduction/example: http://www.loper-os.org/?p=568
The author walks you through building a calculator app in about 5 minutes, step by step.
Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.
You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.
Want a visual interface for your database of client data? Great! Slap together a Rolodex card, and drop in a search function.
Go from concept to presentation ready in an hour or two (or less, if you've done this before!)
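To make that concrete, here's a rough sketch of the kind of HyperTalk handler you'd attach to a button for that search function. The field name and the prompt text are made up for the sake of the example, not taken from any real stack:

```
on mouseUp
  -- ask the user who they're looking for; the reply lands in "it"
  ask "Find which client?"
  if it is not empty then
    -- jump to the first card whose Client Name field matches
    find it in field "Client Name"
  end if
end mouseUp
```

That handler, plus a card with a "Client Name" field on it, is more or less the whole tool.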
My nephew has an iPad.
He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.
My nephew asked me how to learn to write games.
I gave him a Raspberry Pi and a copy of PICO-8.
Now he writes computer games.
He couldn't do that on his iPad.
In the first episode of Computer Chronicles (https://www.youtube.com/watch?v=wpXnqBfgvPM), the mainframe guy is real adamant about how mainframes are good and micros are bad.
The host, a microcomputer legend, disagrees pretty strongly.
Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSs and peer-to-peer networks.
The mainframe guys are winning.
@ajroach42 that's because the people in power benefit from that economically.
"The clooouuuuud"
@hairylarry @ajroach42 I think that the need to convince people to buy your software over and over, and now to subscribe to updates, means you need to be constantly adding "features". Commercial software can never be "done".
@hairylarry I agree!
And we've settled for ease of use for business reasons, financial reasons, even though it's worse for users in the long run.
Learning to drive a car is also difficult and frustrating and dangerous, but people manage to do that.
There is a better balance available between ease of use and local control, and we gotta find it.
@ajroach42
Disagree. Firstly I'm not persuaded that the mainframe is a bad thing, and secondly pervasive computing infrastructure allows us to do things we couldn't before.
The question is who controls it and to whom are they answerable? Mastodon is like Usenet: control is distributed. There are problems with that, as Usenet found, but they're radically different from the problems created by monopolists like Facebook.
@hairylarry I'm still trying to get it turned into a blog post, but I got distracted by my new toy.
@ajroach42 this is a very insightful outlook, thank you for sharing your thoughts!
@ajroach42 don't forget 20 years of tweaking the CSS for button elements.
@ajroach42 Which is hilarious, considering that in just the past five years, the chip in my iPhone has gotten so fast it could effectively function as a desktop computer, easily, were it plugged into a keyboard/mouse/monitor.
Heck -- with its always-on net connection, it could effectively function as my very own server node, for anything I wanted.
...if I was allowed to run a server on it.
We have insane amounts of computing sitting in our pockets now, that we can't use for what we want.
@ajroach42 we did fuck up the last 10 years, what do you think our geeks role in this #techmess as we actually have some direct power over this. Outside forces are often used as an excuse for #blocking the change challenge that we do have some power over #OMN
@ajroach42 just a heads up; I would love to read these thoughts in a blog, but there's no way I can keep up with them on my home timeline here
@technomancy No worries. I wasn't sure if I was going to do this as a thread, or if I was going to go straight for a blog post.
I decided to start with a thread in order to organize my thoughts. I'll blog it up tomorrow or Friday and post to my regular blog.
@theoutrider @ajroach42 and don't forget mario maker, which was nintendo (somewhat) embracing the mario ROM hacking scene!
@chr @theoutrider Nintendo is super frustrating with this stuff.
They embrace the community, do something novel, get people on board, and then get scared and cut it off.
@theoutrider
@ajroach42
I love SmileBASIC on 3DS! It truly feels like a little Apple or Commodore and can be surprisingly powerful. If it had external I/O (save/load/print), I think I could use it as a primary computer.
@ajroach42 I completely agree
@ajroach42 this is very much in the same spirit as pico-8. for iPad and iPhone http://lowres.inutilis.com
@ajroach42 that said, raspberry pi + pico8 is probably the more forward-looking and flexible choice!
@bunnyhero That's really cute! It also addresses what is normally my primary concern with iOS dev environments by giving you a way to share your applications.
I was under the impression that that was very much against Apple's rules.
@ajroach42 same. i'm surprised that app lets you share code
@ajroach42 Please forgive me for wading into your thread half-way without reading all of it (I promise I come in good faith), but—"He couldn't do that on his iPad"—I don't think this is true:
- https://codea.io
- Not to name-drop but John Carmack has talked about writing Lisp (Scheme actually) on iPad: https://twitter.com/ID_AA_Carmack/status/352791343342436354
@ajroach42 I think, as others have said, this isn't exactly true?
Grasshopper (https://grasshopper.codes/) is a way to create games on an iPad
Hopscotch is another.
Heck, Apple even released "Swift Playgrounds"
@PrincessRaspberry I'm not saying it's impossible, but rather that it isn't worth the effort.
I appreciate the links! I hadn't heard of these before, so I'm looking forward to exploring them.
My larger point re: the iPad is less "you can't make things" and more "it's harder than it should be to make things, and harder still to share them."
'possible' and 'manageable' are worlds apart, you know? But that's a larger discussion.
The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our toolbox could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using the tools.