This is a thread about computers.

I have a lot to say.

I might not be able to finish right now.

I'm going to post unlisted.

I make no secret of the fact that I love old computers, and that I think modern computers have lost their way in terms of providing utility to users. To that end, I write about, and think about, computers and the way to move computers forward, without losing sight of the fact that computers should serve their users. I grew up with hand-me-down computers, from Atari to Apple to Dell, and in the process I got to experience a sizable portion of computer history very quickly, in my teen years.

This left me with Opinions.

I write about things that are informed by these opinions often. When I talk about building a World Wide Web analog without the internet, about reviving the BBS, or when I lament the fact that Gopher was doomed to obscurity by the modern web, it is in response to my experiences with an array of computers from the dawn of the home computer revolution up through to the modern age. There was a time when computers were magical.

I had come, in recent months, to suspect that I might just be an old fuddy-duddy. I'm approaching 30 years old, and I had begun to feel like I was looking at modern computers and modern software through the lens of someone who was being left behind, shouting at the sky, shaking my fists at the kids on my lawn. I was coming to the conclusion that my love of these computers of my childhood, and of ones that I had never had the chance to interact with, was some kind of rose tinted nostalgia.

I had not fully subscribed to this theory, but it seemed more likely that I was romanticizing something that was actually Not Great than that nearly every modern software and hardware platform had inexplicably gotten shittier.

I am now prepared to say, with conviction, that every modern hardware and software platform has gotten shittier, and that it's not inexplicable. I'm going to try to explain how I came to this conclusion, and give some potential explanations.

First, let me lay out a little bit about my technical literacy and my profession; this might help explain some of the assertions that I'm going to make. I started using computers, and the internet, in 1995. Our family computer, my first computer, ran Windows 3.11 (for Workgroups). Later, in the late 90s, I was given an Atari 400 and reams of books and magazines on BASIC, followed shortly by an Apple IIGS and dozens of disks of software.

Later still, I started collecting computer detritus, assembling frankensteined Linux boxes, and emulating some of the machines I read about in the magazines I had as a kid.

I loved computers. I loved making weird little programs, and silly applications and games. I'd build things in GW-BASIC or FreeBASIC, and distribute them to my friends on floppy disks. Even in the latter half of the 00s, I was passing half-broken games around on floppy disks (or collections on CD-Rs, when I could talk someone into buying some for me.) Computers were, by and large, ubiquitous in my life. Nearly everyone had an old one they didn't want, and a new one they didn't understand.

For a teenager and an aspiring computer programmer, the 00s were a great time to learn.

I collected cast offs from neighbors, from thrift stores, from office upgrades. I rebuilt them, installed useful or fun software on them, and sold them or gave them away. All of my friends had computers of their own, because I had access to these machines, and I cared enough to outfit them with the appropriate software.

(It must be said, at this point, that 'useful' and 'appropriate' are relative terms. In 2009 I gave a good friend a computer that had been built for Windows 98. It was running Puppy Linux from a CD, and saving data to a USB flash drive over USB 1.1. It did word processing, email, and basic web browsing. It had a whopping 64MB of RAM, and was covered in glitter, googly eyes, and carpet samples. But it was free, and it wasn't useless, and that was important.)

I went to school to become a programmer, and discovered that I don't enjoy programming as it exists today. I understand it well enough, and I *can* do it, but I don't *want* to. I make websites, and I build tools to help other people use computers.

I make my living as a systems administrator and support engineer. (and I'm looking for a new gig, if you're hiring.) That's a fancy way of saying that I solve people's computer problems. Professionally, I'm responsible for identifying and mitigating the shortcomings of various computer systems.

Guess what?
There are a lot of these shortcomings. Like, a lot. More than I ever expected.

Some of these shortcomings are legitimate bugs. Some of them are bafflingly short-sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature-incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow and cranky. I'm responsible for handling the fallout of this unfortunate situation.


I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.

So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.

Before we go any further, let's talk about The Computer Chronicles.

The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing Computer industry. It was hosted by people with Opinions.

The guests were, frequently, people who were proud of the things they made, or the software they represented.

Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye opening.

On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.

Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.

And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.

It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas, and brought us the Mac.

Nothing is ever new.

The whole video of Engelbart's oN-Line System (NLS) demo is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our toolbox could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using the tools.

That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the Microcomputer Market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrases I mentioned earlier were coined: "Human Literate Computers," or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies: a computer so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with HyperCard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that make me happy.

Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why HyperCard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.


@drwho How wild is it that Jobs let something like that out of the building?

@ajroach42 also, as a final comment, it looks like someone is making an open-source HyperCard called ViperCard


I hope you are preserving this thread in one place. It is a thoughtful and valuable series of observations.

I am sorry that you felt it was good to remove from public timelines AND I am glad I'm following you so I get to see it.

Thank you.

@Algot I posted unlisted because it's a Lot of posts, and I didn't want to clog the FTL. Feel free to boost.

@Algot I will be posting it as a blog post eventually, once these ideas are a little less raw.

I'll post about that when it's ready.


Thank you.

Much of what you say resonates with my own experiences.

TRS-80 was my beginning point --> Apple II+ --> Mac --> DOS --> Windows --> GNU/Linux --> 3D printing and RPi

I feel like I'm getting back to my beginnings.


Easy to use may actually stand in the way of being a learning tool when understanding the tool itself is the goal.

This last bit isn't exactly true: making computers accessible to and usable by students specifically was a big deal (and received enormous amounts of government funding), and it's exactly this research (rather than the defense research) that became the basis for the most important stuff in computer history.

BASIC came out of an attempt at Dartmouth to give non-technical users access to a computer they had obtained -- first, Dartmouth students, then later, undergraduates at other universities along the East Coast, and then ultimately, high school students and prisoners across the northeastern United States. The process of making these time-sharing systems usable molded BASIC into a state where, years later, putting it on home computers was a no-brainer.

PARC's Alto & Smalltalk programs grew out of the same project at the ARPA level (though ARPA got rid of it not long after), hence the focus on catering to school children.

A small tight-knit circle of developers and computer scientists were absolutely responsible for the bulk of the progress made during that period, but many of them cared deeply about non-technical users; the people who didn't (and still don't) care about non-technical users are not the researchers but the corporate devs, since in many ways business protects software from needing to become usable & functional.

@enkiv2 @ajroach42 There's also PLATO! Even though PLATO was about using computers for education, they needed people to be able to create lessons for it without a huge amount of training, and as a result students were able to create all kinds of software for it.

@freakazoid @enkiv2 I learned about PLATO after writing this thread. I haven't spent much time with the emulator, but what I've seen is impressive.

@ajroach42 I really need to finish reading The Friendly Orange Glow. I'm about halfway through. It's REALLY LONG. has a bunch of PLATO-related material. It would be awesome to see an emulator and a bunch of the software on there! Closest I can find is what appears to be a port of a course from PLATO to the Apple ][.

Have you thought about creating a collection on there that exemplifies your thinking in this thread?


@freakazoid @enkiv2

There's a web based emulator. I'd have to find it, but it exists.

@ajroach42 Yes, with your favorite hypercard stacks, etc. Software that exemplifies what it means for a computer to be usable AS a computer.


@freakazoid @enkiv2 That's an idea I had not considered.

Examples of human centered computing.

Yeah I should probably put some stuff together.


@freakazoid @enkiv2 @ajroach42 as a kid in the early 80s PLATO was hot stuff. Sure, you could have any color you liked as long as it was orange and black, but a 512x512 dot-addressable display with custom character sets and interactive multiuser applications was way different from 8-bit micros. Sunday mornings biking over to the CERL building to play Empire was living in the future.

@enkiv2 @ajroach42

I think you're both right; but it is important to remember that, as much as Dartmouth and BASIC were about getting computers into the minds and hands of people, that effort was focused on the US, and it came at a time when most people in the US only knew of the computer from the census or the space program.

There was no installed base of existing users. It was all about creating new users.

Where a tremendous amount of public involvement with computers was happening at the time was Canada and the UK. They had the whole Commodore PET thing going, the Amstrads, plus the BBC Micro and the Sinclair kits.

I guess the point is, it wasn't at all a sure thing back then. Selling computers to the common folk was risky, because the computer itself was the killer app of the day; and if you weren't practically engaged with that philosophy already when you got it, you probably wouldn't stick with it for long.

@ajroach42 I'm kind of sad that one part of that demo never caught on, that being the chording keyboard. Having a one handed chording device makes quite a bit of sense when combined with a mouse in the other hand.

@LilFluff yes! The CyKey or the Microwriter made good strides here and then disappeared.

@ajroach42 my sister used to regularly use a brailler, which gave me a bit of a bug for the idea at a young age. (If you haven't seen one, there are 8 regular keys (for the 6- and 8-dot varieties of braille), a space bar, and on some models single-character advance/backspace keys, or else a clutch & slider to move along the line. If you want to type an "r" you simultaneously press keys 1, 2, 3, and 5 to emboss those dots at the current position.)

@ajroach42 with the six dots of English Braille you can type all 26 letters of the alphabet, numbers, punctuation, and several common two and three letter combinations. Numbers are done using a character that says the following is a number and then a-j are reused for the ten digits. Likewise there's a capital sign that says the following letter is a capital. So despite six dots only having 64 combinations, standardized English Braille has around 250 'characters'.
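That prefix-sign scheme takes remarkably little machinery. Here's a rough Python sketch of the idea as described above (the dot patterns for a, b, and r match standard English Braille; the tiny three-letter table and the `encode` helper are purely illustrative, not a real braille translator):

```python
# Sketch of Braille's prefix-sign scheme: six dots give only 64
# patterns, so numbers and capitals reuse letter cells behind a
# prefix cell. Cells are modeled as frozensets of dot positions 1-6.
LETTERS = {
    "a": frozenset({1}),
    "b": frozenset({1, 2}),
    "r": frozenset({1, 2, 3, 5}),  # the "r" cell mentioned above
}
NUMBER_SIGN = frozenset({3, 4, 5, 6})   # "what follows is a number"
CAPITAL_SIGN = frozenset({6})           # "next letter is a capital"
DIGITS = "1234567890"                   # digits reuse the cells for a-j

def encode(text):
    """Translate a short string into a list of Braille cells."""
    cells = []
    in_number = False
    for ch in text:
        if ch.isdigit():
            if not in_number:
                cells.append(NUMBER_SIGN)  # one prefix covers the run
                in_number = True
            letter = "abcdefghij"[DIGITS.index(ch)]
            cells.append(LETTERS[letter])
        else:
            in_number = False
            if ch.isupper():
                cells.append(CAPITAL_SIGN)
            cells.append(LETTERS[ch.lower()])
    return cells
```

So "Ab12" becomes six cells: capital sign, a, b, number sign, then the a and b cells doing double duty as 1 and 2. The prefixes are how 64 patterns stretch to ~250 characters.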

@LilFluff The Microwriter used something akin to braille chording, IIRC.

I *really* wanted one when I was in high school, but I'm less enamored these days. I'd rather see a modern recreation, I think.

@ajroach42 I started using computers in 1982 and had a similar experience to you. I think your experience is rare among folks who started using computers in '95, but not at all rare in someone who started using them 10-20 years earlier.

@ajroach42 I read what you wrote but have not dug through all the responses, so please forgive me if I say something that's already been mentioned.

I think a lot of the problem is that the nature of the people programming computers has changed. Back in the 60s-80s, nearly everyone writing software was a tinkerer. Nowadays "programmers" are mass produced, and by and large they are neither tinkerers nor engineers; they are laborers.

@ajroach42 So we now have a system for churning out software using laborers. Naturally, the vast majority of tools are built for those laborers.

@ajroach42 The tools are also built *by* laborers. The thing you want would have to be a product of an entirely alien evolutionary line. I think Smalltalk is a pretty good example of a tool like that that is still in active use. But it stays small because if you want to actually work in programming (i.e. be a laborer) or build tools for the laborers, Smalltalk isn't particularly useful for it.

@ajroach42 I've been thinking about the idea of trying to provide something analogous to that experience of booting to the OK prompt on modern hardware without severely restricting functionality, and I think Genode might be on its way to providing a good base for such a thing, because it could run both your "shell" with very little underneath as well as VMs running full-blown OSes that could be started from the shell.
