This is a thread about computers.
I have a lot to say.
I might not be able to finish right now.
I'm going to post unlisted.
I make no secret of the fact that I love old computers, and that I think modern computers have lost their way in terms of providing utility to users. To that end, I write about, and think about, computers and ways to move them forward, without losing sight of the fact that computers should serve their users. I grew up with hand-me-down computers, from Atari to Apple to Dell, and in the process I got to experience a sizable portion of computer history very quickly, in my teen years.
This left me with Opinions.
I write about things that are informed by these opinions often. When I talk about building a World Wide Web analog without the internet, about reviving the BBS, or when I lament the fact that Gopher was doomed to obscurity by the modern web, it is in response to my experiences with an array of computers from the dawn of the home computer revolution up through the modern age. There was a time when computers were magical.
I had come, in recent months, to suspect that I might just be an old fuddy-duddy. I'm approaching 30 years old, and I had begun to feel like I was looking at modern computers and modern software through the lens of someone who was being left behind, shouting at the sky, shaking my fists at the kids on my lawn. I was coming to the conclusion that my love of these computers of my childhood, and of ones that I had never had the chance to interact with, was some kind of rose tinted nostalgia.
I had not fully subscribed to this theory, but it seemed more likely that I was romanticizing something that was actually Not Great than that nearly every modern software and hardware platform had inexplicably gotten shittier.
I am now prepared to say, with conviction, that every modern hardware and software platform has gotten shittier, and that it's not inexplicable. I'm going to try to explain how I came to this conclusion, and give some potential explanations.
First, let me lay out a little bit about my technical literacy and my profession; this might help explain some of the assertions I'm going to make. I started using computers, and the internet, in 1995. Our family computer, my first computer, ran Windows 3.11 (for Workgroups). Later, in the late 90s, I was given an Atari 400 and reams of books and magazines on BASIC, followed shortly by an Apple IIGS and dozens of disks of software.
Later still, I started collecting computer detritus, assembling Frankensteined Linux boxes, and emulating some of the machines I read about in the magazines I had as a kid.
I loved computers. I loved making weird little programs, silly applications, and games. I'd build things in GW-BASIC or FreeBASIC and distribute them to my friends on floppy disks. Even in the latter half of the 00s, I was passing half-broken games around on floppy disks (or collections on CD-Rs, when I could talk someone into buying some for me.) Computers were, by and large, ubiquitous in my life. Nearly everyone had an old one they didn't want, and a new one they didn't understand.
For a teenager and an aspiring computer programmer, the 00s were a great time to learn.
I collected castoffs from neighbors, from thrift stores, from office upgrades. I rebuilt them, installed useful or fun software on them, and sold them or gave them away. All of my friends had computers of their own, because I had access to these machines, and I cared enough to outfit them with the appropriate software.
(It must be said, at this point, that 'useful' and 'appropriate' are relative terms. In 2009 I gave a good friend a computer that had been built for Windows 98. It was running Puppy Linux from a CD, and saving data to a USB flash drive over USB 1.1. It did word processing, email, and basic web browsing. It had a whopping 64MB of RAM, and was covered in glitter, googly eyes, and carpet samples. But it was free, and it wasn't useless, and that was important.)
I went to school to become a programmer, and discovered that I don't enjoy programming as it exists today. I understand it well enough, and I *can* do it, but I don't *want* to. I make websites, and I build tools to help other people use computers.
I make my living as a systems administrator and support engineer. (and I'm looking for a new gig, if you're hiring.) That's a fancy way of saying that I solve people's computer problems. Professionally, I'm responsible for identifying and mitigating the shortcomings of various computer systems.
There are a lot of these shortcomings. Like, a lot. More than I ever expected.
Some of these shortcomings are legitimate bugs. Some of them are bafflingly short sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow, and cranky. I'm responsible for handling the fallout of this unfortunate situation.
I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.
So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.
Before we go any further, let's talk about The Computer Chronicles.
The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.
The guests were, frequently, people who were proud of the things they made, or the software they represented.
Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers, as seen from the 1980s, was eye-opening.
On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.
Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. 25 years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.
And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.
It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas and brought us the Mac.
Nothing is ever new.
The whole video of Engelbart's oN-Line System (NLS) demo is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.
Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said, "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.
It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.
There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.
The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.
The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our tool shed could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience using them.
That is to say, in the 60s and 70s, computers were weak and slow and computer users were also computer programmers. A small, tight knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.
Computer culture had, by and large, a kind of elitism about it, a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the microcomputer market and CP/M.
Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrase I mentioned earlier was coined. "Human Literate Computers" or "Human Centered Computing."
Ease of Use was the holy grail for a lot of computer companies: a computer so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.
Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.
That, IMO, is the distinction.
I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.
When I was a kid I was brought up with computers that showed you how they worked.
You booted into a command prompt or a programming language, or you could get to one, if you wanted to.
I got to play with GW-BASIC and QBasic and, a little, with HyperCard.
I got to take apart software and put it back together and make things that made people happy.
I got to make things that I needed. I got to make things that make me happy.
Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.
I often wonder why HyperCard had to die.
It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.
Letting people build their own tools means letting people control their own destiny.
If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?
I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.
It's honestly revelatory.
HyperCard, if you're unfamiliar, is PowerPoint + instructions.
Here's a great introduction/example: http://www.loper-os.org/?p=568
The author walks you through building a calculator app in about 5 minutes, step by step.
Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.
You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.
Want a visual interface for your database of client data? Great! Slap together a Rolodex card and drop in a search function.
Go from concept to presentation ready in an hour or two (or less, if you've done this before!)
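To give a flavor of what that scripting looked like, here's a rough HyperTalk sketch of the kind of handler you might attach to a search button on that Rolodex card. (This is illustrative, not from a real stack; the field name "clientName" is made up.)

```hypertalk
-- Attached to a "Find Client" button.
-- "clientName" is a hypothetical field that appears on every card.
on mouseUp
  ask "Which client are you looking for?"
  -- the user's reply lands in the special variable "it"
  if it is empty then exit mouseUp
  -- HyperCard's built-in find jumps to the first card that matches
  find it in field "clientName"
end mouseUp
```

That's the whole program: no build step, no compiler, no project files. You drew the button, typed the handler into its script, and it worked.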
HyperCard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.
Jobs killed it because he couldn't control it.
Microsoft doesn't ship any tools for building programs with their OS anymore, either.
They used to. There was a time when you could sit down at any windows or DOS machine and code up a program that would run on any other Windows or DOS machine.
But we can't have that anymore.
In the name of Ease of Use, they left out the Human aspect.
Use your computer how you're told to use it, and everything is easy.
Do anything new or novel and it's a struggle.
My nephew has an iPad.
He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.
My nephew asked me how to learn to write games.
I gave him a Raspberry Pi and a copy of Pico-8.
Now he writes computer games.
He couldn't do that on his iPad.
HyperCard would be a perfect fit for the iPad and iPhone.
Imagine the things you could build.
But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.
The last 10 years of development in computers were a mistake. Maybe longer.
Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.
In the first episode of Computer Chronicles (https://www.youtube.com/watch?v=wpXnqBfgvPM) the mainframe guy is really adamant about how mainframes are good and micros are bad.
The host, a microcomputer legend, disagrees pretty strongly.
Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSes and peer-to-peer networks.
The mainframe guys are winning.
(This is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! PLATO was wonderful, as were some of the early Unix mainframes.
But IBM-style mainframe culture treats The Computer as a thing you Use but don't Control, and I am very against that.)
I have to step away for a while. I'll continue this later.
@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also all the other half-dead ends of computing history, such as, e.g., Lisp machines.
Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.
Up for the discussion? It could get wordy and over a few days. :)
First, I want to say this: older computer systems - considered as systems - were generally more capable.
But to be clear, they were of limited use to those who didn't take an interest in learning them. I'm talking about things that weren't Windows 3.1+.
@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *Everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who really rather don't like having specialized experts.
When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.
This is exactly how executives like it.
Apple does the same, with their iPhone.
Apple is the sysadmin, metaphorically.
I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.
I take care of them, eventually, when I can.
But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution.)
But the metaphor of Apple as sysadmin I'll accept. I disagree with someone else adminning my phone, but that's another issue.
Hi, I'm probably near the age of @pnathan, and while I'm not a Lisper anymore (my Emacs fluency faded ages ago), I agree with everything he said.
To give some context, I'm a polyglot programmer currently working on a brand new operating system http://jehanne.io
Now, the assumption that you seem to share is that people cannot learn how to program. I used to think this too.
Now, however, I realize that it's as if we were scribes of Ancient Egypt arguing that people cannot write.
sorry for digging up this old thread, but I have one remark that's been on my mind since I saw your post:
I knew how to read and write when I was 4. I don't remember how I learned it, but I guess I wanted to learn it, or found it fun.
Are not all people like that? Do other people only learn to read when forced to at school?
Is there a correlation between programmers and people who learnt to read before school?
Would homeschooling be better? In the best case it probably would, but what about the average case and worst case? Would homeschooling-as-default reinforce the divide between the rich and the poor?
Or maybe we should go for master-and-padawan model, where you learn by helping someone do what you want to learn?
@ajroach42 @Shamar @ciaby @pnathan @Wolf480pl Part of the problem is that bureaucracies are extremely bad at producing high performance when results are difficult to measure. This is how we get bad teachers who can't be fired, because the bureaucracy can only fire based on easily measurable things, and the unions won't allow measurement of even things that can be measured, often for good reasons.
@ajroach42 @Shamar @ciaby @pnathan @Wolf480pl And people who would be really good teachers often end up doing something else because they don't want to work in a system that sucks the life out of them.
There are bureaucracies that do a better job of educating than the average US school district. I'd submit that none of them do a great job of educating. Education really needs to be continuous and ambient.
@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan I am not a fan of "well it works for you but it won't work for us because our requirements are special," but I think that when it comes to things like education and welfare there are qualities of the US that are both special and non-optional. And the diversity angle is one nearly everyone misses.
@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan Ah sorry what I meant was that in general I don't like the class of argument I myself was trying to make, which is that just because solution X works for country Y that doesn't mean it will work for country Z because we're different" without pointing to the exact ways that there are differences. In particular scale is often used as a difference that requires no explanation even though IMO you have to show WHY something won't scale.
@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan I'm sure that's true. One common problem that I think is pretty evident in the US is that people think of education as being something that happens in school, that's the school's/government's responsibility, rather than everyone's responsibility and happening everywhere and all the time.
As for kids that do not like to go to school: I have three daughters, and the eldest is in fourth grade. She is pretty good at everything (evidently she got her mother's genes), including math and science (where I was pretty good too, but to my joy, her math talent beats my own!)
Still, each damn morning she doesn't want to go to school. Each day she has homework to do, and it takes much, much more time for her to start doing it than to do it.
Kids are kids!
Substantially less diverse countries in Europe? Whoever says this should come to Italy and travel it for a year: they will not interact with the same culture two days in a row.
Where I live, while we all speak Italian more or less in the same way, each village has its own dialect (a sort of local language). At work at times we enumerate the different names we would use to say common things like rabbit or bread or unmarried girl or unmarried old man... LONG LIST!
@Shamar @Wolf480pl @ciaby @pnathan Italy is an interesting case being formed from a bunch of separate city-states, but the cultures that are there have typically been in their given regions for a long time, no? Even so, I think Italy has some of the same challenges as the US due to the diversity of cultures, and many of the same problems result.
What you said is true but partial: Italy has been a mix of several cultures for at least three thousand years. We are a deep genetic mix of North African and Indo-European peoples. And our cultures have a comparable complexity. I was not kidding: each Italian village has its own language or its own set of traditions. We can live in peace because we like such differences.
I don't know the US enough to say if your comparison holds. But it's the first time I've read it.
Even in Italy, where you can literally breathe, drink, and eat so many different cultures every few kilometers, politicians disregard and continuously damage culture and schools.
The fact is that people with a good culture are harder to manipulate.
Also, the Italian school system was damaged by the Cold War between the USA and the USSR: we have had a complex set of effects here, including politics damaging the education system to fight communism among teachers.
Homeschooling is neither practical nor possible to do well at scale.
It relies on having at least one highly educated & disciplined parent who throws their career & potential in the trash to teach their children. To be clear, that largely means women.
I had an *excellent* homeschooling education and I have 0 desire to suggest that anyone should pursue that path who isn't wealthy already.
fix the frigging school system.
@Wolf480pl @ajroach42 @seanl @Shamar @ciaby to be even more blunt, most people aren't that unique or that interested in learning what's needed for general life success and achievement. That's the point of a regularized mandated curriculum: to ensure, on average, people know enough to be good citizens.
the legendary lack of care for education produces the antipathy towards education in the USA.
if you want excellent education, you must make rich kids go to public schools.
I pretty much agree.
Up to 13 years ago I would have agreed completely.
However, now I have some doubts: maybe it's just that my conversion to Catholicism gives me a bias, but I see a value in schools that have a religious bias.
IFF the quality of the education is equivalent to that of a public school, the bias presented to students differs from mainstream consumerism, teaching them to take everything with a grain of salt.
@Shamar @pnathan @Wolf480pl @ajroach42 @ciaby From a political standpoint, forcing the rich to send their kids to public schools would certainly improve the quality of the public schools. Or it might just get the rich to move elsewhere.
In Palo Alto, one of the richest cities in the US, the wealthy largely send their kids to the public schools. Reason being that Palo Alto itself is exclusive, so the public schools are essentially private. They have parks that don't allow non-locals.
@Shamar @pnathan @Wolf480pl @ajroach42 @ciaby And that's what you'll get if you ban private schools & home schooling. Many more communities will start looking like Palo Alto, and public schools in poor communities will continue to suck. And then the rich will fight any requirements to bus kids around, funding for said buses, etc.
@ajroach42 @Shamar @pnathan @Wolf480pl @ciaby I'm not sure how it is in other states, but in California the quality of the local schools is one of the top drivers of where upper middle class people choose to live. In my family's case we picked the school first and then moved near the school. Even if people didn't relocate right away, over time the clustering would happen, and it would be worse because there'd be no private school fallback.
@seanl @ajroach42 @pnathan @Wolf480pl @ciaby @Shamar In the city I live in (Newark, OH), the public school system has a perception of being where you send your kids to become drug dealers, whereas the religious schools are where you send your kids to learn. (Most of them are Catholic in my area, AFAIK.)
@Shamar @pnathan @ajroach42 @seanl @ciaby
from what I've heard about Catholic schools in Poland, they're the greatest source of atheists. If you're kinda-religious, and your parents send you to a religious school, you'll see the church's hypocrisy from so close that you won't want to have anything in common with the Catholic Church anymore.
This happens in Italy too.
There is also another explanation for this: when you see that the world outside the school follows certain goals and values, and your teachers propose completely different ones, you realize that you can choose. Or even go your own road, being skeptical about both the world and the religion.
On the other hand Faith is not something you learn.
I was a bad atheist myself and I didn't learn to believe.
@pnathan @ajroach42 @seanl @Shamar @ciaby
so what you're saying is:
- to achieve success in a modern society, you need to have some basic skills that everyone is expected to have, before you reach the age of 18
- most people don't want to learn those skills before the age of 18
- we should force them to learn those skills so that they can be successful
Only on the surface.
When you force a group of kids to do a scientific experiment, and then another, and then another one too, and then you leave them alone in the lab, what do you think they will do?
In my first chemistry lab, the first question one of my classmates asked the teacher was: how can I make nitroglycerine?
The teacher said: oh, you could; we have everything you need here. But I won't teach you the procedure, you'll have to discover it yourself!
Hackers are difficult to manipulate; they respect what you do and ignore what you represent, so in a way they can become annoying to the leaders of any organisation. OTOH, not every annoying person is a hacker. Also, hackers can be pretty well integrated into those same organisations, because the whole group benefits from their original perspectives.
The Hollywood sociopathic hacker is a misrepresentation: more hackers would be a problem for those in power.