Andrew (R.S Admin) is a user in the fediverse. You can follow them or interact with them if you have an account anywhere in the fediverse.

This is a thread about computers.

I have a lot to say.

I might not be able to finish right now.

I'm going to post unlisted.

I make no secret of the fact that I love old computers, and that I think modern computers have lost their way in terms of providing utility to users. To that end, I write about, and think about, computers and the way to move computers forward, without losing sight of the fact that computers should serve their users. I grew up with hand-me-down computers, from Atari to Apple to Dell, and in the process I got to experience a sizable portion of computer history very quickly, in my teen years.

This left me with Opinions.

I write about things that are informed by these opinions often. When I talk about building a World Wide Web analog without the internet, about reviving the BBS or when I lament the fact that Gopher was doomed to obscurity by the modern web, it is in response to my experiences with an array of computers from the dawn of the home computer revolution up through to the modern age. There was a time when computers were magical.

I had come, in recent months, to suspect that I might just be an old fuddy-duddy. I'm approaching 30 years old, and I had begun to feel like I was looking at modern computers and modern software through the lens of someone who was being left behind, shouting at the sky, shaking my fists at the kids on my lawn. I was coming to the conclusion that my love of these computers of my childhood, and of ones that I had never had the chance to interact with, was some kind of rose-tinted nostalgia.

I had not fully subscribed to this theory, but it seemed more likely that I was romanticizing something that was actually Not Great than it was that nearly every modern software and hardware platform had inexplicably gotten shittier.

I am now prepared to say, with conviction, that every modern hardware and software platform has gotten shittier, and that it's not inexplicable. I'm going to try to explain how I came to this conclusion, and give some potential explanations.

First, let me lay out a little bit about my technical literacy and my profession; this might help explain some of the assertions that I'm going to make. I started using computers, and the internet, in 1995. Our family computer, my first computer, ran Windows 3.11 (for Workgroups). Later, in the late 90s, I was given an Atari 400 and reams of books and magazines on BASIC, followed shortly by an Apple IIGS and dozens of disks of software.

Later still, I started collecting computer detritus, and assembling frankensteined Linux boxes, and emulating some of the machines I read about in the magazines I had as a kid.

I loved computers. I loved making weird little programs, and silly applications and games. I'd build things in GW-BASIC or FreeBASIC, and distribute them to my friends on floppy disks. Even in the latter half of the 00s, I was passing half-broken games around on floppy disks (or collections on CD-Rs, when I could talk someone into buying some for me.) Computers were, by and large, ubiquitous in my life. Nearly everyone had an old one they didn't want, and a new one they didn't understand.

For a teenager and an aspiring computer programmer, the 00s were a great time to learn.

I collected cast offs from neighbors, from thrift stores, from office upgrades. I rebuilt them, installed useful or fun software on them, and sold them or gave them away. All of my friends had computers of their own, because I had access to these machines, and I cared enough to outfit them with the appropriate software.

(It must be said, at this point, that 'useful' and 'appropriate' are relative terms. In 2009 I gave a good friend a computer that had been built for Windows 98. It was running Puppy Linux from a CD, and saving data to a USB flash drive over USB 1.1. It did word processing, email, and basic web browsing. It had a whopping 64MB of RAM, and was covered in glitter, googly eyes, and carpet samples. But it was free, and it wasn't useless, and that was important.)

I went to school to become a programmer, and discovered that I don't enjoy programming as it exists today. I understand it well enough, and I *can* do it, but I don't *want* to. I make websites, and I build tools to help other people use computers.

I make my living as a systems administrator and support engineer. (and I'm looking for a new gig, if you're hiring.) That's a fancy way of saying that I solve people's computer problems. Professionally, I'm responsible for identifying and mitigating the shortcomings of various computer systems.

Guess what?
There are a lot of these shortcomings. Like, a lot. More than I ever expected.

Some of these shortcomings are legitimate bugs. Some of them are bafflingly short sighted or poorly considered architectural decisions. Just as many are cases of a divergence between the needs of the user and the abilities of a program. Modern programs are often feature incomplete, poorly supported, and difficult or impossible to customize. Modern computers are often slow, and cranky. I'm responsible for handling the fallout of this unfortunate situation.

I've seen how revolutionary a computer can be, if it is designed with the needs of the user in mind, and how disastrous the same can be when it is not. I've seen computers used to empower people, and used to oppress. I've seen computers be Good, and the consequences of when they are not.

So that's who I am, and my experience with computers so far. Those are my credentials, and my qualifications.

Before we go any further, let's talk about The Computer Chronicles.

The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.

The guests were, frequently, people who were proud of the things they made, or the software they represented.

Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye opening.

On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.


Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.


And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.

It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas, and brought us the Mac.

Nothing is ever new.

The whole video of Engelbart's oN-Line System (NLS) demo is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about word processors, the man they were interviewing said "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our toolbox could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using the tools.

That is to say, in the 60s and 70s, computers were weak and slow and computer users were also computer programmers. A small, tight knit circle of developers and computer scientists were responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the Microcomputer Market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrase I mentioned earlier was coined. "Human Literate Computers" or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies. A computer that was so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with Hypercard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that make me happy.

Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why Hypercard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning hypercard specifically because I've been relearning hypercard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.

Hypercard, if you're unfamiliar, is PowerPoint + instructions.

Here's a great introduction/example:

The author walks you through building a calculator app in about 5 minutes, step by step.

Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.

You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.

Want a visual interface for your database of client data? Great! Slap together a Rolodex card, and drop in a search function.

Go from concept to presentation ready in an hour or two (or less, if you've done this before!)
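
As a rough analogy (in Python rather than HyperTalk, with field names and records invented purely for illustration), the "Rolodex card plus a search function" idea amounts to something this small:

```python
# A stack of "cards" (records) plus a one-line search function,
# mimicking the HyperCard rolodex described above. The data here
# is hypothetical.

cards = [
    {"name": "Ada Lovelace", "phone": "555-0100", "notes": "analytical engines"},
    {"name": "Grace Hopper", "phone": "555-0111", "notes": "compilers"},
    {"name": "Gary Kildall", "phone": "555-0122", "notes": "CP/M"},
]

def find_cards(stack, query):
    """Return every card whose fields contain the query, case-insensitively."""
    q = query.lower()
    return [card for card in stack
            if any(q in str(value).lower() for value in card.values())]

matches = find_cards(cards, "cp/m")
print(matches[0]["name"])  # Gary Kildall
```

HyperCard's trick was letting you attach exactly this kind of tiny script to buttons and fields you drew by hand, so the "program" and the "presentation" were the same object.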

Hypercard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.

Jobs killed it because he couldn't control it.

Microsoft doesn't ship any tools for building programs with their OS anymore, either.

They used to. There was a time when you could sit down at any windows or DOS machine and code up a program that would run on any other Windows or DOS machine.

But we can't have that anymore.

In the name of Ease of Use, they left out the Human aspect.

Use your computer how you're told to use it, and everything is easy.

Do anything new or novel and it's a struggle.

My nephew has an ipad.

He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.

My nephew asked me how to learn to write games.

I gave him a Raspberry Pi and a copy of Pico-8.

Now he writes computer games.

He couldn't do that on his iPad.

Hypercard would be a perfect fit for the iPad and iPhone.

Imagine it!

Imagine the things you could build.

But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

In the first episode of Computer Chronicles, the mainframe guy is real adamant about how mainframes are good and micros are bad.

The host, a microcomputer legend, disagrees pretty strongly.

Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSs, peer to peer networks.

The mainframe guys are winning.

(this is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! Plato was wonderful, as were some of the early unix mainframes.

But IBM style Mainframe culture is The Computer as a thing you Use but don't Control culture, and I am very against that.)

I have to step away for a while. I'll continue this later.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also all the other half-dead ends of computing history such as, e.g., Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

Up for the discussion? It could get wordy and over a few days. :)

@pnathan for sure.

I haven’t gotten into Lisp machines yet, but I’m always down for discussion.

@ajroach42 @pnathan
This thread is going to be gold :)
(I'm replying here so that I won't forget about it...)

@ciaby @pnathan I hope you enjoy! I'm looking forward to the discussion as well.

@ajroach42 @ciaby
OK, so, I'm about a decade older than you, Andrew: I taught myself QBasic in the mid 90s, got online late 90s, never really looked back.

First, I want to say this: older computer systems - considered as systems - were generally more capable.

But to be clear, they were limited in use for those who didn't take an interest in learning them. I'm talking about things that weren't Windows 3.1+.

@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, who really rather don't like having specialized experts.

@ajroach42 @ciaby It is my contention that Windows (& *nix) computer systems are designed to be administrated and managed by sysadmins, and the user experience in this case is great.

When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.

This is exactly how executives like it.

Apple does the same, with their iPhone.

Apple is the sysadmin, metaphorically.

@ajroach42 @ciaby

Here is the fundamental conundrum of computers: to use one at an expert level - to really make the machine work for you - you must become an expert too, and usually a programmer, even if only an ad hoc one.

Efforts to avoid and deny this have occurred for *decades*.


Some of Engelbart's work.

Algol (ish).

Chris Granger's 'Eve'.

FPGA designers with CAD addons.

Embedded system CAD tooling.

Numerous academic papers.

@ajroach42 @ciaby

all these systems collapsed at a point: the point where the fundamental reality of the problem met the fundamental reality of the machine.

programming had to occur.

Apple solved this by making so many programs available on the iThings for so many niche issues, that programmers would code what was needed and the user didn't have to care anymore about surmounting the issue.

Same for businesses & windows, essentially.

@ajroach42 @ciaby

so here's the problem: you're right. computers are easier to use, for some values of 'use'.

but the truth was, back when computers were harder to use, in the 90s... people really hated learning how to use them. there was an immense demand for not having to think (there's a book called "Don't Make Me Think" about this whole problem).

so we have this weird place where no one outside of the 'elite' wanted to care, and they resented being made to care.

so apple won by fulfilling that.

@ajroach42 @ciaby let's talk about lisp machines as I understand them - being born in their heyday.

lisp machines presumed the user and the programmer were the same person. user had root on everything, and everything was in lisp, and was mutable.

this worked GREAT, basically. multiprocessing, security, meh, whatever.

total control in the hands of the user. to be honest, most programmers at that time were not ready for it, didn't want it, and the machines were 10x the cost.

@ajroach42 @ciaby Also, lisp machines were made by hippie engineers who were really bad at business. so that didn't work out.

but you have this enormous tension between Lisp "we expect you to come up to our level, here's the manual, we'll answer all your Qs", and Windows/Java "here's the basics, don't poke yourself with the sharp bits"

@ajroach42 @ciaby

as an example of Lisp-world, for instance - it had debuggers that essentially ran as in-process monitors that could watch for logical conditions and trigger recovery actions - in '92. We don't have that today, and in languages which are compiled, it will never exist.
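
For flavor, here is a loose sketch of that idea in Python (not the '92 Lisp system, which was far more sophisticated): an in-process monitor that watches a logical condition while the program runs and fires a recovery action instead of halting. The condition and recovery action below are invented for illustration.

```python
import sys

def make_monitor(condition, recover):
    """Build a trace function that calls `recover` whenever `condition`
    holds on the local variables of the frame being executed."""
    def monitor(frame, event, arg):
        if event == "line" and condition(frame.f_locals):
            recover(frame.f_locals)
        return monitor
    return monitor

def shaky_computation(values):
    # A stand-in for real work: the running total briefly goes negative.
    total = 0
    for v in values:
        total += v
    return total

observed = []
sys.settrace(make_monitor(
    condition=lambda locs: locs.get("total", 0) < 0,      # the "logical condition"
    recover=lambda locs: observed.append(locs["total"]),  # the "recovery action"
))
result = shaky_computation([5, -10, 20])
sys.settrace(None)

print(result)    # 15
print(observed)  # the negative intermediate totals the monitor saw
```

In a compiled language without a runtime hook like this, the equivalent requires an external debugger process; the Lisp version lived inside the image itself.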

@pnathan @ajroach42 @ciaby I'm not sure, but doesn't IntelliJ allow rule-based breakpoints? And I think one can (instead of halting) execute arbitrary code on a breakpoint.

I'm talking about the JetBrains IntelliJ IDEA IDE for Java.

@upshotknothole @ajroach42 @ciaby
I'll try to remember to test that tomorrow. JetBrains really does great work, it's possible they've done it! I don't have the reference to the '92 implementation handy, but it was ridiculously sophisticated, not just a simple conditional/monitor of a simple slot.

@pnathan @ajroach42 @ciaby 'Don't Make Me Think' is about graphical data organization (mostly on web pages).

I don't remember it touching on the problem of general computer usability (because reading a web page is very passive, similar to reading a newspaper).

@upshotknothole @ajroach42 @ciaby
but that is exactly the problem of computer usability, at the first pass. if you can't be troubled to read a static page, then..... where do you go?

gotta raise the consciousness.

@pnathan @ajroach42 @ciaby we will always need that sort of low-level interactivity.
Overall, value is created with easy-to-read websites that somehow accelerate the reader into thinking "hey, this system is larger than it seems. Reading into it, or at least hiring someone who has read into it, will increase business value."

@pnathan @ciaby @upshotknothole I guess I just have more faith in users than you do?


@pnathan @ajroach42 @ciaby
A sort of side dilemma with this is that, by turning computers into magic boxes for making increasingly complex layers of tasks accessible to average people, this understanding gap just widens. Average users become increasingly disconnected from even a baseline understanding of the processes and design patterns at work in computing, and the knowledge of the "elites" becomes ever more rarified.

@pnathan @ajroach42 @ciaby
How can there ever be reasoned popular discourse about the practical, moral, and political implications of modern computing, if you have to be a developer or programmer to even understand the basic concepts?

@cossimo @ajroach42 @ciaby this is an incredibly important point, and it's part of why I reluctantly support "everyone must code" efforts in schools, despite their attachment to the jobs ideal.

it's analogous to the idea that in a lab at school, you encounter ideas of safety and ideas relevant to the discipline, even if you never do anything with it again.

but, then again, we can describe the effects of computing without being a programmer. This is, I think, the lesson of the environmental movement.

@cossimo @ajroach42 @ciaby You don't have to be a chemist to demand that a paper mill not put outputs into your drinking water.

Likewise, you don't have to be a programmer to note that Facebook & Twitter's algorithms are outrage machines and should be regulated for the good of our society.

@pnathan @ajroach42 @ciaby
Very true. The "outrage machine" is a pretty easily understandable by-product of FB and Twitter, because it is so overt. By contrast, I think average people have much less of an understanding, for example, of the APIs, tracking pixels/widgets, apps, etc., the FB and Twitter algorithms use to collect and aggregate data about them, or how that data gets used to tailor their everyday experience.

@pnathan @ajroach42 @ciaby
I think most people are still fairly ignorant (perhaps willfully so) of how closely they are tracked and how their phones make their every action a data point.

Similarly, to riff on the chemistry example, most people are blissfully ignorant about all the *stuff* that gets put in their food and most of the inhumane or unsustainable processes that are used to create it.

@pnathan @ajroach42 @ciaby
The more seamlessly invisible a technology is, the more people willfully ignore it, no matter how dangerous it is.

Anyway, didn't mean to drag this (awesome) thread off on such a tangent.

@cossimo @ajroach42 @ciaby I don't think it's precisely a tangent though: invisibility and lack of understanding - or lack of desire to understand - helped build the problem we have today.

if all users really cared deeply about understanding and collapsing the user/programmer division, then we'd probably all be using a Linux core with a Lisp machine on top; everyone would intuitively understand algorithms and how the net worked.

but they prioritize other things, WHICH IS FINE.

@pnathan @ciaby @cossimo

Lisp machines also failed economically.

CP/M was better and more widely used than DOS, but IBM took DOS anyway.

It doesn't matter what users want, if it's not offered to the users.

We have software monopolies today.

@pnathan @ciaby I'm with you on how and why this happened.

You seem to be discussing it as if it was inevitable, though. I'm firmly of the opinion that it was not inevitable, and that compromises were possible.

Right now, there is very little space for the users in the middle. It's all concentrated at the edges. You're a coder or a user. There's no middle, and there *could* be.

@ajroach42 @ciaby

Right, the middle. What was the compromise, given the users desperate not to think, though?

The effectual compromise made was Linux - that lets off the pressure from Microsoft & Apple and directs all these maker-types over to a system that fits them.

@ajroach42 @ciaby
I argue that Linux is closer to the old paradigm - users and programmers are much closer and there is a strong pressure to be "some" kind of programmer, even if it's just a scary terminal shell occasionally.

@pnathan @ciaby that’s fair.

I guess I need to add ‘design an ideal software package for Linux, and write documentation’ to my list of future projects. :-D

@pnathan @ciaby the users desperate not to think are not the only users.

My point is that we have tools for programmers, tools for users who just want to do what they are told, and nothing (or very little) for the folks in the middle.

Maybe it’s a smaller group than I think, but I doubt it.

@ajroach42 @ciaby
tools for people in the middle: what would that be?

if it's mathematics/business, that'd be Excel.

if it's programming, then VBA is still a thing, yes?

why don't you home in on what you really want from a tool? what does it do? if it's 'general purpose computing', then beware - a lisp macro & a library might be the right way to go. :)

@ajroach42 @ciaby Speaking of, I'm going to focus on writing code for the next hour to grind on my stupid business idea before bed.

@pnathan @ciaby I fell asleep before we could continue this conversation, so I'm catching up today.

When I say tools for people in the middle I mean tools for development that do a little hand-holding. Hypercard, Pico-8, GW-BASIC.

Right now, we have a culture that tells people that Programming is Hard (because it often is, even with the 'easy' tools)

Some kinds of programming could be much easier, if we'd let them.
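
For a sense of scale, the kind of first program those hand-holding tools invite (the classic times-table exercise from old BASIC books) is only a few lines in any language. Python here, purely as illustration:

```python
# The classic beginner exercise: print a small multiplication table.
# In GW-BASIC this was a short pair of FOR loops; it stays tiny in Python too.
lines = [f"{n} x {m} = {n * m}" for n in range(1, 4) for m in range(1, 4)]
print("\n".join(lines))
```

The code was never the hard part of these environments; the hand-holding was in how little ceremony stood between typing it and seeing the result.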

@ciaby @pnathan
Add MS Access to that list of tools in the middle.

In IT we hate it because it encourages users to have their data scattered and not backed up, and full of misspelled duplicate records. But it sure lets a certain type of user get their real work done.

@gcupc @pnathan @ciaby Access is a great example of the kind of system I think that we need to refine and produce more of.

It's not great, and it's overwhelming, but if it's done right it can be *very* useful to some use cases.

I want more of that, but with the lessons we've learned on version control and the like baked in.

@ajroach42 @ciaby @pnathan @gcupc Maybe the Access frontend on top of something like CouchDB?

@ajroach42 @ciaby I want to also argue that there's an Eternal September problem inherent in the situation.

but here we have a core issue: should a user be a programmer? at all? if so, we are easing them towards the "elite", or so it would be said.

or, alternatively, is this a consciousness raising exercise where this OS - UnicornOS - raises the consciousness of the user to deeply engage with the Computer?

what should Unicorn do, anyway? See the conclusion of:

@ajroach42 @ciaby the more you ask Unicorn to interop with the existing world, the more you constrain to the limitations and expectations of the existing system, which tends to remove agency of the operator.

I frankly think it's time to build a new OS from the assembly on up to empower people, but I'm loath to take that on when I'm dependent on a company to pay mortgage and health insurance.

@pnathan @ciaby This is a good point, but I think it deserves scrutiny.

I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.

I take care of them, eventually, when I can.

But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution.)

but the metaphor of Apple as sysadmin, I'll accept. I disagree with someone else admining my phone, but that's another issue.

@ajroach42 @ciaby your users pay you so they don't have to care about sysadmin issues. their world is great!

@ajroach42 @ciaby I'm glossing over the 1% failures to get at the core point: sysadmins are designed into the windows and unix world so users can focus on their core competency.

@ajroach42 @ciaby

Hi, I'm probably near the age of @pnathan, and while I'm not a Lisper anymore (it's been ages since my Emacs fluency) I agree with all he said.

To give some context, I'm a polyglot programmer currently working on a brand new operating system

Now, the assumption that you seem to share is that people cannot learn how to program. I used to think this too.
Now, however, I've realized that it's like we were scribes of Ancient Egypt arguing that people cannot write.

@Shamar @ajroach42 @ciaby I'll eyeball your work.

people can program. people do program. where there is a will there is a way.

and there are many many ways to program.

arguably most are terrible, and the ones that condescendingly target newbies produce the worst systems overall.

@pnathan @ajroach42 @ciaby

Thanks! 😃

What I mean is that #history can teach us a lot about the present (and the #future) if we are able to interpret it with the right eyes.

Why were peasants unable to write in Ancient Egypt, while they are able to now?

I think the main reasons are:
1. the writing system was too "primitive"
2. writing was functional to the #power structure back then.

What does this mean for us?

That the #complexity of #programming is not necessarily inherent to the matter.

@Shamar @ajroach42 @ciaby

That the #complexity of #programming is not necessarily inherent to the matter.

here is where I disagree.

the complexity of understanding the "web stack" is incidental; the complexity of understanding the concept of distributed computing and comms protocols is fundamental.

or something as simple as rendering bits to the screen. raster? vector? what abstraction do you choose to execute the display mechanism? now you have a model.

@Shamar @ajroach42 @ciaby ... continuing. Next year, maybe you want a different model, so you break off and redo it a bit. Now you have to figure out how to juggle two incompatible models in your code, and you're on your way to inventing an abstract interface system.

even if you're doing assembly!

@Shamar @ajroach42 @ciaby

here's my claim: software is crystallized thought, with all the complexities, ambiguities, and changing over time of thoughts. we can gut the whole shaky tower of modern computing, and we'll still be confronted with the core problem (even assuming a workable and standard bit of hardware for the engineering problems, themselves non-trivial sometimes)

@pnathan @ajroach42 @ciaby

You are in a way quoting my favourite #programmer, my favourite #hacker: Edsger Wybe Dijkstra.

For sure, "computational thinking" is as hard as #math is.
For sure, hardware issues exist.

But the point is that, despite all the progress that we see when we look at our #smartphone after reading about #ENIAC, we are still using #hieroglyphics.

The way we #think is strongly dictated by what we know.

We should get into the habit of #challenging them.

@Shamar @ajroach42 @ciaby

EWD was probably the most astute prophet of software engineering that has lived to date.

let me challenge you: what is the secret knowledge which, knowing, would unlock the door?

@Shamar @ajroach42 @ciaby ah but that doesn't get anywhere until we start digging.

what is simple? is it the ability to point and click a mouse? is it a keyboard key?

both of those have deep wells of complexity and knowledge to make happen, despite surface simplicity.

or is it a transistor, which accumulation of produces unspeakable complexity?

@pnathan @ajroach42 @ciaby

You are confusing #simple with #easy. #Simplicity can be very hard to achieve.

Also, you are assuming I have that knowledge clear in my mind.

I have not.
I just have a natural inclination for finding the orthogonal axes that govern complex problems, so I'm pretty good at moving from one point to another in such multidimensional systems (aka solving problems, or foreseeing and avoiding them).

I'm a hacker from the past, like everybody here.

But even if I don't know the ...

@pnathan @ajroach42 @ciaby

But even if I don't know the #solution, I see the #problem very well. Everyday. Clear.

I fight with it in my own #mind.

The problem is that we do not yet have a #math able to describe #simplicity.

For example, simplicity composes well.
Simplicity can stack.
Simplicity is deep. More: it's fractal.

Simplicity is what I seek in any piece of code I write (see for example another project of mine: ).
We need more #CS and #math #research for it.

@pnathan @ajroach42 @ciaby

I find the current mainstream stack frustrating. Very frustrating.

I recently realized that the web is still a weapon of the USA's DARPA (one that sometimes backfires).
And #Javascript is so far the apex of that military technology: all over the world, we run code under the control of USA companies (that are in turn under the control of the USA government).

But this just scares me.

What frustrate me ...

@pnathan @ajroach42 @ciaby

What frustrates me is the total resignation of people to this state of things, as if the current shit were the best possible stack we could conceive.

And #WebAssembly is coming!

No guys, no... we have to throw all this away and start from scratch from the lesson learned.

We can do it better.

And we CAN.

(sorry for the passionate rant... it's pretty evident I suffer a lot from this state of things)

@Shamar @ciaby @pnathan The web isn't all bad, and it's not all bad technologies, but it isn't all good either.

All I'm asking is that we take a step back and examine our modern software with a more critical eye towards how we could improve it for future generations.

I'm not sure why this has become so controversial.

@ajroach42 @pnathan @ciaby

Sorry, I didn't want to seem controversial.

I have been reflecting on these specific topics for a while now, so I joined the discussion in the hope of sharing an interesting perspective.

@Shamar @ciaby @pnathan

Oh, sorry. I wasn't clear in my post.

I agree with most of what you've said. We disagree on some nuance, but that's fine.

I was saying that other people have found my statements on this subject very controversial, and I'm not sure why.

@pnathan @Shamar @ajroach42 @ciaby
I'm gonna repost my newly relevant diagram.

there isn't a bright line between programming and passive computer use. all UIs are programming languages. most are simply shitty, overly constrained languages that make simple tasks nearly impossible.

@enkiv2 @pnathan @ajroach42 @ciaby

Nice charts.

However, I know Unix quite well, and I would not say that the effort decreases with task complexity. That's pretty unlikely, because you would soon reach negative effort.

Also, I think the ideal UX plot over such axes would be something like a 45°-rotated hyperbola, like this

I don't think such a curve can be beaten.
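One way to read the "45°-rotated hyperbola" claim is as the upper branch of y² − x² = a², whose asymptotes sit at 45°: a small fixed learning cost even for trivial tasks, and for hard tasks an effort that approaches the task's inherent complexity. A tiny Python sketch of that reading (the function name and the `overhead` parameter are my own illustration, not from the thread):

```python
import math


def ideal_effort(complexity, overhead=1.0):
    """Upper branch of y^2 - x^2 = a^2: a hyperbola with 45-degree asymptotes.

    At complexity 0 you still pay a small fixed learning cost
    (`overhead`); for hard tasks the effort approaches the task's
    inherent complexity (the asymptote y = x), i.e. the tool adds
    almost nothing on top of the problem itself.
    """
    return math.sqrt(complexity ** 2 + overhead ** 2)
```

With overhead 4, a task of inherent complexity 3 costs 5 (the 3-4-5 triangle); as complexity grows, the tool's overhead becomes negligible, which is why such a curve is hard to beat.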

@enkiv2 @pnathan @Shamar @ajroach42 @ciaby Those are good charts that depict a useful idea, but I don't think what they describe is what most people know as a “learning curve”—which would be proficiency (as a percent of the tool’s available capability, y-axis) across time spent (x-axis).

@pnathan @ciaby @ajroach42 @joeld @Shamar You're right. They describe the difficulty of solving a problem with a tool vs the inherent difficulty of that problem -- which is essentially an inverse of the learning curve for tools that can address all problems. Tools that fail by making tasks that aren't easy impossible, of course, have a misleadingly good-looking learning curve.

@pnathan @ciaby @ajroach42 @joeld @Shamar (The catch being: they can't actually do anything, so having 100% mastery over them isn't actually valuable.)

@pnathan @ciaby @ajroach42 @joeld @Shamar I cross-posted this thread to, producing a comment thread over there that is at turns useful and infuriating:

@pnathan @ciaby @Shamar

To the user who wants to display bits on the screen, it shouldn't matter unless/until they want to display bits in a way that one format handles over the other.

I can see how and why it matters to someone building more complex systems, but if all I want to do is have a text input box, why do I need to care about anything else you said?

@ajroach42 @ciaby @pnathan

To get #freedom.

It's more or less the difference between grasping at reading words so that you can better serve your Lord with the shopping list, and being able to write a #political article in a newspaper to fight for your #rights.

#Programming today is just like #writing and #reading a couple centuries ago.

It's a matter of #power and freedom.

It's not just about being able to code, it's about #thinking as a #programmer.
Thinking as a #hacker makes you #free.

@Shamar @ajroach42 @ciaby One system that I have been curious about is Oberon OS. Apparently it was extremely successful but external pressures collapsed it.

@Shamar @pnathan @ciaby I never said people can't learn to program.

I'm saying that some people don't want to learn to program, and that what we call "programming" is needlessly difficult for some tasks, in the name of corporate profits.

@Shamar @pnathan @ciaby I feel like you think this was a clever point, but I don't understand what you mean.

Programming is a specialty, and some people have other specialties. Expecting them to also become expert programmers because our current expert programmers can't be arsed to make extensible and understandable tools is unreasonable.

@ajroach42 @ciaby @pnathan

This is the assumption I challenge.

For sure, programming is a speciality right now.

But it's a speciality just like reading, writing and counting.

Not everybody can be a novelist, nor a professional mathematician.

But people should be able to program, just like they are able to read, write, compute a volume, reason about an average speed, a length...

Programming is harder than math because we are still using primitive tools.

It's sad that we are happy with them.

@Shamar @pnathan @ciaby

Some kinds of programming (just like some kinds of math) will remain hard.

But better tools are what I'm after, yeah.

@ajroach42 @ciaby @pnathan

We need a lot of #research.

We should #hack more.

And we need better #math too.

It will take some centuries.

Much more, if each generation keeps being satisfied with the shit it slightly improves (or messes up, as we did with the #web when we gave up #XHTML for #Javascript).

Because, to me, the tools we need are as different from today's mainstream tech as our writing system is from Egyptian #hieroglyphs.

I feel that it's not only a matter of research, but also a matter of throwing away some tech that we take for granted (x86, for example) and rebuilding from scratch with different assumptions in mind. In the current economic system I find that quite hard to do...
@pnathan @ajroach42

@ciaby @pnathan @ajroach42

You are largely undervaluing the #power of #technology.

Most programmers are not well versed in #history, and it's a pity. There's a lot for us to learn from history.

Technology is probably the most powerful and effective way to change the world. Most changes in human organizations have been enabled by technological innovations: from fire to boats, from bronze to iron, through agriculture, writing, counting, roads, from swords to guns...

Technology can ....

@ciaby @pnathan @ajroach42

Technology can change the world for the better or for the worse.

It can disrupt "the current economic system".

And that's why #capitalists are in a hurry to keep #hackers under control.

So, I don't think that the "current economic system" should be a problem for hackers.

We CAN throw away the web.

I really mean it (I work with browsers all day, I know the stack pretty well...).

From scratch, with the lessons learned, it will take a fraction of what it took.

@Shamar @ajroach42 @pnathan
That's possible, and somehow is already happening.
What I'm talking about, however, goes much deeper than that. I'm talking about open hardware infrastructures, where every component is documented, there are no binary blobs or proprietary firmware.
Very important is also the instruction set, because what we have now (x86/amd64) is incredibly bloated and full of backward compatibility shit.
RISC-V is a step in the right direction. If only the hardware wasn't so expensive... ;)

@ciaby @ajroach42 @pnathan

Interesting point. You're right.

We cannot actually trust the hardware, either.

But... I'm a software guy.
I'm more concerned about the way we connect and use the hardware than the hardware itself.

Indeed, everything I've had to learn about hardware while developing my x86_64 OS has been a pain.

So, YES!

We need more research on the open hardware too.

There are only 2 things that I'd like to preserve: little endianness and 64-bit longs.

Please. 😇

@Shamar @ciaby @ajroach42 Bold claim: open source or non-open source hardware doesn't matter when deployed at scale.

the essential problems today are, in a sense, all software, mediated by the scale.

@Shamar @ciaby @ajroach42 that said:

we have these *inter-twingled* issues: the hardware is manky, the software is manky, and the incentives to improve are perverse.

My reckoning is that there is a space today for a sort of New System, a Unicorn OS, where the whole thing is largely rebuilt. Does the web have to exist? does tcp/ip? are there better systems?

here we see we make choices: one prioritizes those who take the time to learn the system, and one... doesn't

@pnathan @ciaby @ajroach42

What is Unicorn OS?

Oh... you cited Oberon some toot ago.

I like it a lot (but I have to admit that I've never tried it on real hardware).

I love that Wirth still works on the Oberon-07 language, and that he keeps REMOVING features.

Oberon inspired Pike for the Plan 9 UI. I started from Plan 9, and frankly I'm not brave enough to throw away TCP/IP as Wirth did.

Still... Wirth's approach (hack, hack, hack, challenge all assumptions, keep it simple!) is what we need.

@Shamar @ciaby @ajroach42 UnicornOS: the magic OS that we're talking about, the one that solves the problem.

with a sparkling dash of rainbow over it, because, you know, it's magic. :)

@pnathan @ciaby @ajroach42

Oh... the funniest definition of vaporware I've ever seen!

UnicornOS: the first #vaporware with a #rainbow! 🤣

(Disclaimer for any actual developer of an OS called Unicorn: I'm just kidding... the joke was simply too good... sorry)

@pnathan @ciaby @ajroach42

I do not know, actually.
I literally know nothing about #hardware.

But my insight is that, probably, #cheap #OpenHardware and #simple #distributed #software #systems could change the world.

I have a dream: one low-power mail server in every house.
End-to-end mail encryption everywhere.

Unfortunately no one seems interested in such a huge business opportunity.


@Shamar @ciaby @ajroach42

ah jeeze man, think of the sysadmin needs.

the mail servers fail. the administration is confusing because docs aren't perfect, so it gets misconfigured. the network goes down. baby pukes on server and it fails to boot. server is overloaded by volume of spam.

then the task is outsourced to a guy interested in managing the emails....... whoop whoop we're recentralizing.

@pnathan @ciaby @ajroach42

Oh no, I can't accept that it's not possible.

It's not easy, but we buy and sell firewalls, routers, wifi access points... we can sell mail servers too.

And with E2E encryption by default, do we really need spam filters?

@Shamar @ciaby @ajroach42 my Inner Young Geek wants to argue that actually-configurable systems are not really used in the home, and that mail servers cross that barrier between appliance and administration-needing machine.

but let's not rabbit trail onto that. ;-)

my contention and question is more: should we expect a member of cyberspace to be knowledgeable in minor sysadmin?

I argue yes! we expect people to be able to refill their oil in cars, right?

@pnathan ... no?

There's a whole industry out there of shops that only exist because people don't change their own oil.

@ajroach42 changing oil isn't refilling oil.

one you just stick a can of oil in, the other requires draining the system, changing the filter, etc. much more specialized tooling & environment to do it right.

@pnathan @ajroach42

We do not learn car hacking at primary school because it's a specific skill set with little effect on a person's growth.

Computational thinking is a completely different thing.

Even if you don't code, the ability to think clearly about a problem, decomposing it into small pieces, debugging an argument... these are all skills useful beyond computer use.

It's literally a matter of power and freedom, because who is able to do that has a strong advantage over everybody else.

@pnathan @ajroach42

As a #hacker myself, I feel it is a #moral duty to spread such #power.

Nobody should be able to exploit #ignorance, because we know that ignorance is the precondition of #curiosity.

@Shamar @pnathan @ajroach42

in the UK they are now starting to teach this at junior school level (this is for children at biological ages 5-7, which is called Key Stage 1 here)

When I grew up in the 1980s it was only taught in high school at age 14+, to those who had opted to take Computer Studies (an introductory CS course)



a better metaphor is cooking. everybody is expected to know enough about cooking to feed themselves. some people cook at a much more expert level, and people who are capable of feeding themselves pay those experts to feed them occasionally. cooking for yourself has benefits over eating out even if you aren't very good, because you can cater to unusual preferences.

@enkiv2 @pnathan @ajroach42
cooking for yourself also keeps the cost of eating out down, because professionals are competing with free. if all professional chefs started doing something (like cooking 'rare' burgers as well-done to avoid liability), home cooking isn't subject to those rules.

It's possible because cookbooks are mostly for the intermediate talented-amateur cook.

@enkiv2 @pnathan

And even still, we have tools (frozen dinners, spice blends, hamburger helper) to help folks that can't cook well still manage to cook what they want.

I want the Hamburger Helper of modern software development.

@pnathan @ciaby @ajroach42


Give us a little cheap fanless server and we will move the world!

@Shamar @ciaby @pnathan Pi 0 W?

$10 + a power supply.

But you still have to deal with NAT, or you have to deal with IPv6, or you have to not deal with the internet.
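The NAT problem above is at least easy to detect: if the address your box gets is in a private (RFC 1918) or carrier-grade NAT (RFC 6598) range, it won't be reachable from outside without port forwarding or a relay. A quick sketch using Python's stdlib `ipaddress` module (the function name is my own):

```python
import ipaddress


def probably_behind_nat(addr: str) -> bool:
    """True if `addr` is in a private (RFC 1918) or carrier-grade NAT
    (RFC 6598) range -- i.e. a home server bound to it won't be
    reachable from the internet without port forwarding or a relay."""
    ip = ipaddress.ip_address(addr)
    cgnat = ipaddress.ip_network("100.64.0.0/10")  # RFC 6598 shared space
    return ip.is_private or (ip.version == 4 and ip in cgnat)
```

Note that CPython's `is_private` covers the RFC 1918 ranges (plus loopback and link-local) but deliberately reports `False` for the 100.64.0.0/10 CGNAT block, hence the extra check.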

At scale, yes.
Although I feel that software is not evolving because:
The effort to develop a new OS is too great, given the amount and complexity of modern hardware (and closed specs).
Without a new OS, you can't develop new paradigms, so we're stuck with ideas from the 70s (mostly Unix, plus VMS-influenced Windows).
Programming languages are built to use the OS, and that's why we're not really progressing...
My proposal: simpler hardware, open and documented. Build on top of that. No backward compatibility. :)
@ajroach42 @Shamar

@ciaby @ajroach42 @Shamar I agree that backward compatibility has to be nixed for real research and change to occur.

now I have to debug a piece of code that is like the reification of all bad backend possibilities combined.

@ciaby @pnathan @ajroach42

What if everything was a #file for real?

What if all you need to support an #OS (and all #hardware it can handle) were 16 syscalls?

Keep this in mind and give a look at
(#Jehanne still needs 26 system calls, but I welcome suggestions to polish it further :-D)
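To make the "everything is a file" idea concrete: in Plan 9 (and in descendants like Jehanne), devices are driven by reading and writing files in a namespace, which is why so few syscalls suffice. A toy Python sketch of the pattern — this is NOT Jehanne's real API; every name below is invented for illustration:

```python
# Toy "everything is a file" namespace: one tiny read/write surface
# drives arbitrary devices, so the syscall set stays small.
class SynthFS:
    """In-memory namespace: path -> (read_fn, write_fn)."""

    def __init__(self):
        self.nodes = {}

    def serve(self, path, read_fn=None, write_fn=None):
        self.nodes[path] = (read_fn, write_fn)

    def read(self, path):
        read_fn, _ = self.nodes[path]
        return read_fn()

    def write(self, path, data):
        _, write_fn = self.nodes[path]
        write_fn(data)


class Speaker:
    """A 'device' exposed as two files: a data file and a ctl file."""

    def __init__(self):
        self.volume, self.log = 50, []

    def mount(self, fs, base):
        fs.serve(base + "/ctl",
                 read_fn=lambda: f"volume {self.volume}",
                 write_fn=self._ctl)
        fs.serve(base + "/data", write_fn=self.log.append)

    def _ctl(self, cmd):
        op, arg = cmd.split()
        if op == "volume":
            self.volume = int(arg)


fs = SynthFS()
spk = Speaker()
spk.mount(fs, "/dev/audio")
fs.write("/dev/audio/ctl", "volume 80")  # controlling hardware = writing a file
print(fs.read("/dev/audio/ctl"))         # -> volume 80
```

The kernel-facing surface is just `read` and `write` on paths; adding a new device adds files to the namespace, not syscalls to the kernel.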

@Shamar @ajroach42 @pnathan
I was actually looking at it before, and I find the concept quite interesting :)
Can I run it in a VM easily?

@ciaby @ajroach42 @pnathan

Follow the readme, and you should get it on QEMU pretty fast.

(The first build takes 30 minutes due to GCC; later builds take 3 minutes at most)

@Shamar @ajroach42 @ciaby @pnathan

The direction this seems to be heading in is that we need to make a LISP (or Scheme) web browser.

I'm planning on starting soon...

@ixn @Shamar @ajroach42 @ciaby why should you make a web browser?

why not a gopher system for browsing files?

and for interactive work, why not dig through one of the old remote windowing & data transfer systems, and use *that* approach?

@ixn @Shamar @ajroach42 @ciaby believe me, the modern web is a *windowing* system with *HTTP* calls as data transfer protocol. kill the HTML/JS/etc side of it, and use a stateful connection

Or, why not just use SFTP, with a FUSE filesystem on the server?
You can use a file manager and text editor, and you can have interaction, as well as authentication through SSH.
I think it even supports FIFOs...

@Shamar @ajroach42 @ciaby @pnathan

sorry for digging up this old thread, but I have one remark that's been on my mind since I saw your post:

I knew how to read and write when I was 4. I don't remember how I learned it, but I guess I wanted to learn it, or found it fun.
Are not all people like that? Do other people only learn to read when forced to at school?
Is there a correlation between programmers and people who learnt to read before school?

@Wolf480pl @Shamar @ajroach42 @ciaby @pnathan I don't remember learning to read either, but different people learn at different paces, and learning later has little correlation to academic performance. The main correlation is being forced to learn to read causing later lack of interest in reading.

@seanl @Shamar @ajroach42 @ciaby @pnathan
Yeah, seems to make sense.
I was forced to learn to calculate integrals, and I hate integrals, and I forgot them already.

@seanl @Shamar @ajroach42 @ciaby @pnathan
btw. doesn't the school system seem to you like it's designed to destroy curiosity in children?

@Wolf480pl @pnathan @ciaby @Shamar @seanl and to turn them into obedient workers who don't ask questions, and accept ridiculous punishments as a matter of course.

@ajroach42 @seanl @Shamar @ciaby @pnathan
IOW, hackers are a danger to the state (or even society) ?

@ajroach42 @seanl @Shamar @ciaby @pnathan
OTOH, I've seen state money being spent to pick the best students and provide them with an individualized education path, so that'd mean the state actually wants to support hackers... weird...

@ajroach42 @seanl @Shamar @ciaby @pnathan
maybe the school system isn't malice, but incompetence?

@ajroach42 @seanl @Shamar @ciaby @pnathan
The incompetence part is IMO self-explanatory, so let's focus on malice.
Any ideas who and why doesn't want there to be many hackers?

@Wolf480pl people whose power would be threatened by people who think for themselves?

The owners of capital.

@ajroach42 @seanl @Shamar @ciaby @pnathan
Or maybe let's consider the incompetence.
It surely is hard for a single person to keep 30 children occupied, let alone teach them something.

Would homeschooling be better? In the best case it probably would, but what about the average case and worst case? Would homeschooling-as-default reinforce the divide between the rich and the poor?

Or maybe we should go for master-and-padawan model, where you learn by helping someone do what you want to learn?

@Wolf480pl @pnathan @ciaby @Shamar @seanl this isn't addressing the issue in any substantive way, unless you're trying to bait someone into saying we should abolish the education system for the good of education.

@seanl @Shamar @ciaby @pnathan @Wolf480pl (I do not mean to imply that that is your goal. Perhaps I should have phrased my standby differently.)

@Wolf480pl @pnathan @ciaby @Shamar @seanl the problem is underfunding and mismanagement.

That’s the problem everywhere, but especially in education.

@ajroach42 @seanl @Shamar @ciaby @pnathan
but assuming we have infinite money and perfect management, how many children do you think should be in a single class? And how do we get enough good teachers to make that happen?

@ajroach42 @Shamar @ciaby @pnathan

Also, as @seanl said, everyone has a different pace of learning. Moreover, we want to promote curiosity. IMO the school model where it's scheduled that "in year X, all children learn Y" doesn't fit those requirements well.

@ajroach42 @Shamar @ciaby @pnathan @Wolf480pl Part of the problem is that bureaucracies are extremely bad at producing high performance when results are difficult to measure. This is how we get bad teachers who can't be fired, because the bureaucracy can only fire based on easily measurable things, and the unions won't allow measurement of even things that can be measured, often for good reasons.

@ajroach42 @Shamar @ciaby @pnathan @Wolf480pl And people who would be really good teachers often end up doing something else because they don't want to work in a system that sucks the life out of them.

There are bureaucracies that do a better job of educating than the average US school district. I'd submit that none of them do a great job of educating. Education really needs to be continuous and ambient.

@ajroach42 @Shamar @ciaby @pnathan @Wolf480pl Actually I think the way you get educated people is by having a culture that values learning. American culture does not, and that's why we have a shitty educational system. I don't see how you can fix the educational system without shifting the culture.

@seanl @ajroach42 @Shamar @ciaby @pnathan
Why do you assume we're talking about American (and by that you probably mean USian) culture and education system?

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan Well, Andrew is in the US. But I'm just talking about what I know. If your solution can't work for everyone what's the point?

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan Most arguments that point to European countries really boil down to "Well just have a smaller, substantially less diverse country that is less anti-intellectual and everything will be fine."

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan I am not a fan of "well it works for you but it won't work for us because our requirements are special," but I think that when it comes to things like education and welfare there are qualities of the US that are both special and non-optional. And the diversity angle is one nearly everyone misses.

@seanl @ajroach42 @Shamar @ciaby @pnathan
well I'm not saying there are solutions that work for EU but not for US. The problem is present in EU too.
I understand that you have experience only with education system of USA. Sorry for the knee-jerk reaction.

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan Ah sorry, what I meant was that in general I don't like the class of argument I myself was trying to make, which is "just because solution X works for country Y, that doesn't mean it will work for country Z, because we're different", without pointing to the exact ways that there are differences. In particular, scale is often used as a difference that requires no explanation, even though IMO you have to show WHY something won't scale.

@seanl @ajroach42 @Shamar @ciaby @pnathan
What I was afraid of is that, aside from the common problems present in the education systems of both the US and EU, there are some other issues in the US that are significantly greater, and therefore the common issues go unnoticed there.

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan I'm sure that's true. One common problem that I think is pretty evident in the US is that people think of education as being something that happens in school, that's the school's/government's responsibility, rather than everyone's responsibility and happening everywhere and all the time.

@Wolf480pl @ajroach42 @Shamar @ciaby @pnathan If anything I think this is less a problem among segments of the US population than it is in a lot of other countries, particularly ones with "better" education systems.

@seanl @Wolf480pl @ajroach42 @ciaby @pnathan

As for kids that do not like to go to school: I have three daughters, and the eldest is in the 4th year of elementary school. She is pretty good at everything (evidently she got her mother's genes) including math and science (where I was pretty good too, but to my joy, her math talent beats my own!)

Still, each damn morning she doesn't want to go to school. Each day she has homework to do, and it takes much, much more time to start doing it than to do it.

Kids are kids!

@seanl @ajroach42 @Shamar @ciaby @pnathan
I think at this point we should consider not just how US culture doesn't value learning, but also how it compares to other cultures.

@seanl @Wolf480pl @ajroach42 @ciaby @pnathan

Substantially less diverse countries in Europe? Whoever says this should come to Italy and travel it for a year: they will not interact with the same culture two days in a row.

Where I live, while we all speak Italian more or less in the same way, each village has its own dialect (a sort of local language). At work we sometimes enumerate the different names we would use for common things like rabbit, or bread, or an unmarried girl, or an unmarried old man... a LONG LIST!

@Shamar @seanl @ajroach42 @ciaby @pnathan
And you should come to Poland and notice that, except the mountains and the seashore, people are more or less the same everywhere.

@Shamar @pnathan @ciaby @Wolf480pl @seanl this has been a fun conversation, but I think I’m done talking about education for the time being.

Thank you to everyone who contributed, and also please drop me from subsequent replies.

@Shamar @Wolf480pl @ciaby @pnathan Italy is an interesting case being formed from a bunch of separate city-states, but the cultures that are there have typically been in their given regions for a long time, no? Even so, I think Italy has some of the same challenges as the US due to the diversity of cultures, and many of the same problems result.

@seanl @Wolf480pl @ciaby @pnathan

What you said is true but partial: Italy was a melting pot of several cultures for at least three thousand years. We are a deep genetic mix of North African and Indo-European peoples. And our cultures have a comparable complexity. I was not kidding: each Italian village has its own language or its own set of traditions. We can live in peace because we like such differences.

I don't know the US enough to say if your comparison holds. But it's the first time I've read it.

@seanl @ajroach42 @ciaby @pnathan @Wolf480pl

Even in Italy, where you can literally breathe, drink and eat so many different cultures every few kilometers, politicians disregard and continuously damage culture and schools.

The fact is that people with a good culture are harder to manipulate.
Also, the Italian school system was damaged by the Cold War between the USA and USSR: we have had a complex set of effects here, including politics damaging the education system to fight communism among teachers.

@Wolf480pl @ajroach42 @seanl @Shamar @ciaby

homeschooling is not practical nor possible to do well at scale.

it relies on having at least one highly educated & disciplined parent who throws their career & potential in the trash to teach their children. to be clear, that largely means women.

I had an *excellent* homeschooling education and I have 0 desire to suggest that anyone should pursue that path who isn't wealthy already.

fix the frigging school system.

@Wolf480pl @ajroach42 @seanl @Shamar @ciaby to be even more blunt, most people aren't that unique or that interested in learning what's needed for general life success and achievement. That's the point of a regularized mandated curriculum: to ensure, on average, people know enough to be good citizens.

the legendary lack of care for education produces the antipathy towards education in the USA.

if you want excellent education, you must make rich kids go to public schools.

@Wolf480pl @ajroach42 @seanl @Shamar @ciaby in areas such as the Puget Sound, where rich people get to siphon their kids off to private schools, they let the public schools go hang.

ban private schools, both secular and religious.

@pnathan @Wolf480pl @ajroach42 @seanl @ciaby

I pretty much agree.

Up to 13 years ago I would have agreed completely.

However, now I have some doubts about religious schools: maybe it's just that my conversion to Catholicism gives me a bias, but I see a value in schools that have a religious bias.

IFF the quality of the education is equivalent to that of public school, the bias proposed to students differs from mainstream consumerism, teaching them to take everything with a grain of salt.

@Shamar @pnathan @Wolf480pl @ajroach42 @ciaby From a political standpoint, forcing the rich to send their kids to public schools would certainly improve the quality of the public schools. Or it might just get the rich to move elsewhere.

In Palo Alto, one of the richest cities in the US, the wealthy largely send their kids to the public schools. Reason being that Palo Alto itself is exclusive, so the public schools are essentially private. They have parks that don't allow non-locals.

@Shamar @pnathan @Wolf480pl @ajroach42 @ciaby And that's what you'll get if you ban private schools & home schooling. Many more communities will start looking like Palo Alto, and public schools in poor communities will continue to suck. And then the rich will fight any requirements to bus kids around, funding for said buses, etc.

@Shamar @pnathan @Wolf480pl @ajroach42 @ciaby If anything, banning private schools would *benefit* the rich because instead of forfeiting the taxes they spend on public education, they'd now be getting them back.

@seanl @ciaby @Wolf480pl @pnathan @Shamar Assuming they were willing to relocate, or able to redistrict.

@ajroach42 @Shamar @pnathan @Wolf480pl @ciaby I'm not sure how it is in other states, but in California the quality of the local schools is one of the top drivers of where upper middle class people choose to live. In my family's case we picked the school first and then moved near the school. Even if people didn't relocate right away, over time the clustering would happen, and it would be worse because there'd be no private school fallback.

@Shamar @ciaby @seanl @Wolf480pl @pnathan having worked in public and religious schools, most religious schools in the US are regressive, and every one I've ever set foot in (many) has been horribly abusive.

I lose so much respect for people that subject their kids to that stuff.

@ajroach42 @pnathan @Wolf480pl @ciaby @Shamar A couple of the most respected schools in my area are religious. A majority of the students in both schools aren't even members of the religion of the school.

@seanl @Shamar @ciaby @Wolf480pl @pnathan my perspective is probably tainted by growing up in the rural south, where ‘religion’ is wielded as a weapon against education.

@ajroach42 @pnathan @Wolf480pl @ciaby @Shamar Yeah "religious school" to me (here) means Catholic or Jesuit, i.e. religions that value education. For many other (Christian) religions "religious school" seems like an oxymoron.

@seanl @ajroach42 @pnathan @Wolf480pl @ciaby @Shamar In the city I live in (Newark, OH), the public school system has a perception of being where you send your kids to become drug dealers, whereas the religious schools are where you send your kids to learn. (Most of them are Catholic in my area, AFAIK.)

@ajroach42 @pnathan @Wolf480pl @ciaby @Shamar @seanl I went to a Christian school in Germany.
I have to say that we have a centralized curriculum, and no matter how religious, the schools can't touch it.

Just my 2ct

@Shamar @pnathan @ajroach42 @seanl @ciaby
from what I've heard about catholic schools in Poland, they're the greatest source of atheists. If you're kinda-religious, and your parents send you to a religious school, you'll see the church's hypocrisy from so close that you won't want to have anything in common with Catholic Church anymore.

@Wolf480pl @pnathan @ajroach42 @seanl @ciaby

This happens in Italy too.

There is also another explanation for this fact: when you see that the world outside the school follows certain goals and values, and your teachers propose completely different ones, you realize that you can choose. Or even go your own road, being skeptical about both the world and the religion.

On the other hand Faith is not something you learn.

I was a bad atheist myself and I didn't learn to believe.

@pnathan @ajroach42 @seanl @Shamar @ciaby
so what you're saying is:
- to achieve success in a modern society, you need to have some basic skills that everyone is expected to have, before you reach the age of 18
- most people don't want to learn those skills before the age of 18
- we should force them to learn those skills so that they can be successful

@Wolf480pl @pnathan @ajroach42 @seanl @ciaby

Success is such a limited concept.

People should be forced to learn what they need to be free.

This includes math, history, geography, computational thinking and hacking.

For their own interest and the interest of everybody else.

We all need free citizens.

@Shamar @pnathan @seanl @ciaby
>force someone to learn hacking
>force someone to be curious
isn't this an oxymoron?

@Wolf480pl @pnathan @seanl @ciaby

Only on the surface.

When you force a group of kids to do a scientific experiment, and then another, and then another one too, and then you leave them alone in the lab, what do you think they will do?

In my first chemistry lab, the first question one of my classmates asked the teacher was: how can I make nitroglycerine?
The teacher said: oh, you could, we have everything you need here. But I won't teach you the procedure, you'll have to discover it yourself!

@Wolf480pl @pnathan @seanl @ciaby

It's worth noting that my teacher was not crazy: the internet was not yet a thing in Italy back then.

@Shamar @Wolf480pl @seanl @ciaby geez a lot of talk in the last 3 hours.

the point of having a nationalized, standardized curriculum is to ensure all parents have a stake in its success.

palo alto, ca and similar communities are pathological and dangerous to the body politic

@pnathan @Shamar @seanl @ciaby
There's a lot more to a school system than just curriculum.
Anyway, what's the point of parents having a stake in the curriculum being good if they have no influence on it?

@Wolf480pl @pnathan @ciaby @Shamar @seanl Having come out of a G&T program, I can attest that it's less about supporting us, and more about keeping us from causing problems for other people (and, honestly, keeping us alive.)

@Wolf480pl @ajroach42 @seanl @ciaby @pnathan

Hackers are difficult to manipulate, they respect what you do and ignore what you represent, so in a way they can become annoying to the leaders of any organisation. OTOH not every annoying person is a hacker. Also, hackers can be pretty well integrated into those same organisations, because the whole group benefits from their original perspectives.

The Hollywood sociopathic hacker is a misrepresentation: more hackers would be a problem for the power.

@Shamar @ajroach42 @seanl @ciaby @pnathan
Ok, so hackers are a danger to those with power.
But would society be able to function if, say, 50% of people were unaffected by any manipulation by those with power? Or would it descend into chaos?

@ajroach42 @seanl @Shamar @ciaby @pnathan @Wolf480pl uh… yes.
That's the sole goal of the current school system.
It was invented during industrialization and never got overhauled. Idk what you even expect from that background

@Wolf480pl @pnathan @ciaby @Shamar I couldn't write when I was young, because of motor control issues, but I could read before I started school.

Everyone I know wanted to learn at some point. Most of them still do, but feel beaten down by the oppressive march of the clock.

@pnathan @ajroach42 @ciaby it’s so bad that many error messages just say “this broke, contact your sysadmin” which isn’t helpful to anyone

@queerhackerwitch @ciaby @pnathan I agree with this a lot.

It's like: I'm the sysadmin, and I don't know why it broke. Now what?

@pnathan @ajroach42 @ciaby

Just dropping in to say that this thread is absolute gold :+)

I haven’t experienced this much lively yet civil engagement on social media... ever? You all (and others who have piped in along the way) have presented a lot of great points and interesting opinions and I look forward to reading through the thread in its entirety when I get the chance this weekend.

I look forward to the blog post!

Alright, I'm back for a bit.

I have a few dozen replies to get through. If I'm still awake after that, I'll continue my thoughts. I have a lot more to say, but I'm not sure if I have the energy tonight. Might have to pick it back up after work tomorrow.

@Famicoman No!

That's super interesting. I'm bookmarking this to go through in my downtime tomorrow.

If it is what it appears to be, then I can't wait to spend all my time on it.

Lots of people in the replies to my original thread had lots of very negative comments about computer users vs computer programmers, and some of them seem to think that every human alive (excepting themselves) is some kind of half creature, incapable and undeserving of tools designed to meet their needs.

This is the technoelitism I mentioned earlier.

I'm done having that conversation. If you wanna talk about how computers should remain complicated, or how people should just learn to use the tools we foist on them, go somewhere else.

I'm not really even interested in talking about programming tools beyond lamenting the loss of programming as a fundamental part of the computing experience, rather than a niche secondary thing.

My core point here is mostly about the ways computing has gotten worse with its most recent evolution, and what we can do about that.

There is no easy answer.

There is no single answer.

Anyone who claims otherwise is either selling something, or misunderstands the fundamental issue that people are unique, and therefore solutions must also be unique.

The computers of my childhood and my early teen years afforded me the same or greater utility than the computers I use today in all but one respect: communication.

I like a lot of parts of the internet. I like that it enables me to download software, to access media, to research, and to talk with people.

The internet does good, valuable things. Wifi and cellular data are revolutionary.

But they come at the cost of massive surveillance, an increased reliance on remote servers for what should be local or peer-to-peer activities, and marginal increases, or outright decreases, in actual utility on nearly every other front.

I don't want to sound as if everything modern is horrible, or even as if things would be better if we went back to the old ways.

The world has changed, our needs have changed. The old ways couldn't keep up, and the modern software gets the job done most of the time.

What I want, what we need, is a fundamental shift in how we approach software, returning to the ideals that saw the computer as a force, ultimately, for good.

@ajroach42 the "internet" as we know it now is a lot bigger, but the bigger problem is, everyone is trying to make money off everything now. Very few spaces on the internet aren't trying to make money off the user in an underhanded way.

Part of that is going to mean exposing more of the underlying complexity of the computer to the users (when they want to see it.)

Part of it is going to mean redefining our networks, our relationship to networks, and our tolerance for surveillance.

Things that should be peer-to-peer must be allowed to be peer-to-peer. Federated systems must rise.

Tangentially related:

A former resident of the USSR draws parallels between the Russian Revolution and the modern internet.

@ajroach42 I've still got a lightning talk (5 or 10 minutes) under my belt, "Users vs. Programmers" to give at Akademy or something similar. Not this year, though, I don't have the spoons. What I want to do is prove that every programmer is also a user unless they make all their own tools.

@ajroach42 I need an animator to make me a tiny clip in which a stick figure kicks the A out of COMPLAINTS, pushes the I to the left, and sits down in the gap to reveal the letters ME on their shirt, making the word into COMPLIMENTS.

@ajroach42 I feel like, fundamentally, the approach to what a computer should be *for* has changed.

I'd argue that, from the 1930s to the early 1970s or so, computers were primarily built to solve some sort of task first and foremost. There was plenty of experimentation with those computers, and there were occasionally purely experimental machines, but the industry was focused on meeting business and governmental needs.

@ajroach42 The mid 1970s personal computer revolution was a fundamental shift, though.

People wanted a computer, not necessarily to do a certain task for them, but *for the sake of having a computer*, so they could tinker with it.

The earlier large computer users were programmers as well in many cases, and some were enthusiastic, but ultimately, they used a computer because they were paid to make the computer owner's job easier.

Not so for the 1970s personal computer owner.

@ajroach42 However, as the tinkerers tried to figure out what to do with their new computers, and actually made them useful, they became useful to people who wanted a tool, not a toy, and that's where the pressure to strip out toy functionality started creeping in, in the 1980s.

That's why it feels like we've returned to the mainframe model of control - because the average 2018 computer owner and the average 1968 computer owner want the same thing.

@ajroach42 that's because the people in power benefit from that economically.

@thegibson More or less.

Except that it's less like the Mainframes I love (university systems and the like) and more like a giant timesharing system where everything you do is tracked.

@ajroach42 Some people thought this was a nice idea. Especially the plan9 folks (I cannot find the right source for the interview I saw)


I think Ease Of Use got in the way of Control Your Own Machine. This was promoted by Microsoft and other large companies in their own interest and then mistakenly chosen by the users.

The truth is computers are not easy to use. Ease Of Use does not make them easy to use. It takes some learning to "get" (grok) computers.

@hairylarry @ajroach42 Moreover, 'ease of use' tends to mean not clear interfaces and mechanisms, but 'usable without training', which has been a lie all along.

@hairylarry @ajroach42 I think that the need to convince people to buy your software over and over, and now to subscribe to updates, means you need to be constantly adding "features". Commercial software can never be "done".

@hairylarry I agree!

And we've settled for ease of use for business reasons, financial reasons, even though it's worse for users in the long run.

Learning to drive a car is also difficult and frustrating and dangerous, but people manage to do that.

There is a better balance available between ease of use and local control, and we gotta find it.

I don't think we've settled for ease of use as much as we've settled on "increased engagement"... How to keep people's eyeballs glued to your product.

Gopher is very easy to use, more so than a typical website today... But a gopher hole gets you where you need, and then you're done.

Most websites have a goal of keeping you there longer.

Disagree. Firstly I'm not persuaded that the mainframe is a bad thing, and secondly pervasive computing infrastructure allows us to do things we couldn't before.

The question is who controls it and to whom are they answerable? Mastodon is like Usenet: control is distributed. There are problems with that, as Usenet found, but they're radically different from the problems created by monopolists like Facebook.

@simon_brooke in the next post in this thread I said that I didn’t think all mainframes were a bad thing, and that the problem was down to who controlled them.

But federation is weird and it’s entirely possible that you didn’t click through to see that one.

I agree that modern ubiquitous computing should allow us to do things we’ve never done before. I lament that those things are mostly ‘spy on people in novel ways’ so far.

Fair enough, I didn't see that. But I'd plead there's far more to pervasive computing than new ways of spying on OTHER people. I really appreciate the many new ways of spying on myself. It allows me to track and map my cycling, correlate my mood with local weather, compute my blood clotting factor and adjust my medication, remind myself when I need to leave home to get to events, and advise me the best way to get there avoiding traffic. It makes my life better.

@simon_brooke I couldn't agree more.

My concern is mostly that all of those things rely on other people, and that those other people can't be trusted.


Finally found time to read this whole thread. Quite a piece of writing. I answered your last post in the thread but all I said was stuff you were already talking about earlier that I hadn't read yet.

@hairylarry I'm still trying to get it turned in to a blog post, but I got distracted by my new toy.

I snagged a Macintosh Plus at an antique store.

@ajroach42 I agree with what you’re saying, but could you explain what you mean by that last line? “We brought back the digital serfdom of the mainframe.”

@Thepunkgeek there was a time when, at least for certain definitions of the word computer, no one could afford home computers, so they had terminals and dialed in to corporate or educational mainframes.

They had no control over these systems. They were serfs, limited by the will of the sysadmin who controlled what programs they could run, how long they could stay logged in, and what files they were allowed to access. They had no privacy.

@ajroach42 this is a very insightful outlook, thank you for sharing your thoughts!

@ajroach42 thanks! I agree with you that users should be able to understand the various components that go into their OS, do you think that the best way to approach this is to have users set up their own OSes? I'm kind of thinking an archlinux type approach, but where the set up would be interactive in the sense that an explanation of each step is provided.

@colelyman Users should never be required to do anything that is not strictly necessary, but should always be given the option and an explanation as to the impact of the actions that they are taking.

So, should users set up their own OS? If they want to, I guess. If it will deliver value to them, rather than just making things harder for them.

@ajroach42 just a heads up; I would love to read these thoughts in a blog, but there's no way I can keep up with them on my home timeline here

@technomancy No worries. I wasn't sure if I was going to do this as a thread, or if I was going to go straight for a blog post.

I decided to start with a thread in order to organize my thoughts. I'll blog it up tomorrow or Friday and post to my regular blog.

@theoutrider @ajroach42 and don't forget mario maker, which was nintendo (somewhat) embracing the mario ROM hacking scene!

@chr @theoutrider Nintendo is super frustrating with this stuff.

They embrace the community, do something novel, get people on board, and then get scared and cut it off.

I love SmileBASIC on 3DS! It truly feels like a little Apple or Commodore and can be surprisingly powerful. If it had external I/O (save/load/print), I think I could use it as a primary computer.

@ajroach42 where there was once a linear learning curve there is a blockade now.
People start from zero, as always, and learn their way up, but once they're writing nested if statements within giant Excel documents they hit a ceiling.
There's no easy path for replacing the Excel code with, say, Python.

The excel community greatly celebrated the introduction of the SWITCH statement.
(I'm not sure about the version but I think it was introduced in Excel 2016 ??)
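For what it's worth, the conceptual jump from a nested IF chain to Python is smaller than the missing migration path makes it feel. Here is a minimal sketch (the grading formula and its thresholds are made up for illustration) of how one such cell formula maps onto a plain function:

```python
# A nested Excel formula like
#   =IF(A1>=90,"A",IF(A1>=80,"B",IF(A1>=70,"C","F")))
# becomes a short Python function: the IF chain turns into an
# ordered list of (threshold, result) pairs checked top to bottom.

def grade(score):
    for threshold, letter in [(90, "A"), (80, "B"), (70, "C")]:
        if score >= threshold:
            return letter
    return "F"  # the final "else" branch of the nested IFs

print(grade(85))  # prints "B", matching the formula above
```

The hard part, of course, isn't translating one formula; it's that nothing in the Excel ecosystem points you toward this equivalence, which is exactly the blockade described above.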

@upshotknothole @ajroach42 Ceilings are everywhere. During the late '90s I solved most of my programming problems using Perl. Then everything had to be Object Oriented, and I just couldn't get it. Even today, I can somewhat understand and debug OO code in various languages, but I'm not able to write anything useful. Everything seems opaque and indirect and unnecessarily complex.

@galaxis @ajroach42 i have to admit that i never thought about OO that way.

I always thought it was just some way to think about memory and instructions on memory.

@ajroach42 Raspberry Pi == the new C64/Beeb.

Lots of IT careers will be started with the Pi, just as they were with the C64/Beeb.

@profoundlynerdy @ajroach42 I'm all for all the little things that spark an interest in coding in young minds, but in all honesty I'd expect the JavaScript console in all non-mobile browsers will introduce two orders of magnitudes more people to coding than Raspberry π?

@22 @ajroach42 Because JS is limited to the browser in a lot of ways. Yes, I know you can run JS from a console if you want, but that's not obvious to the novice.

Python, by contrast is a good general purpose programming language that is syntactically easy to read (like BASIC, unlike Perl) and not so low level that you have to worry about pointers (as in C) directly.

@profoundlynerdy thanks for weighing in! What do you think of James Hague’s gentle arguments (which I agree with and practice myself, as a professional JavaScript and Python dev, full disclosure 😝):

@22 I understand his argument on the graphics front.

1. The lack of a GUI keeps things down to the brass tacks. If you give a new programmer a GUI they'll play with the GUI and not much else.

2. While JS dominates for client-side scripting in the browser, I think it's going to have competition soon. I expect browsers will add Python interpreters soon-ish, as MS Office recently did, if I'm not mistaken.

As for standalone executable portability, yeah, that's a weak point. Learn Python, then learn C.

@profoundlynerdy If I may continue to impose on your time—

1. I shudder to think how many kids we keep away from math and engineering by making them think that they have to be good at grade school arithmetic and calculation first. Similarly, I am ecstatic that visual (Processing.js, Cinder, OpenFrameworks, Scratch) and audible (Sonic Pi) programming draw so many in. As a learner and a teacher, I do not believe beginners should be made to prove themselves with stodgy, boring "basics".

2. No matter what language one uses to make the browser dance (I've used Elm, ClojureScript, TypeScript, and plain JS), you have to learn the "browser way of doing things" (async, DOM, etc.). More than just JavaScript I meant that the browser is my preferred ecosystem to teach beginners and intermediates—nothing I've seen excites as many and as much as making universal webapps.

(JS also rocks server-side. I find myself writing CLI apps/scripts just as often in Node as Python.)

@22 Yeah, there is a lot to learn... other technologies. I hate DOM. Hahaha

@profoundlynerdy Full transparency—I write C, C++, JavaScript, Python, Julia, Matlab for a living, I like all languages and everything that brings people into this rewarding hobby and career. I love Raspberry πs! And yes, each learner is an individual with their own interests, their own hidden eigenvectors that can propel them into a lifelong love of computing, so no blanket statements here. I just look at students' excitement about browsers today and can't see the dark age that @ajroach42 sees.

@22 @profoundlynerdy I didn't say anything about dark ages.

Browsers are neat! The inspector is a great tool.

Browsers have problems of their own, but those problems are solvable, and divorced from the OS and bundled software issues I was specifically talking about here.

We also might be talking about different age groups. I work with my nephew, and I do Code.org Hour of Code with 2nd and 3rd graders.

To them, the inspector is pretty scary, but Code Combat or Pico-8 can go a long way.

@ajroach42 Super-duper thank you for your thoughtful remarks. You've convinced me to keep a Raspberry π+display+io with Pico-8 and Sonic Pi prepared. Yes, I'm mostly familiar with the high schoolers or older crowd (see for a friend's story), and younger kids with only a shared tablet is a whole other level of complexity, I now see. Thank you! @profoundlynerdy

@22 This.

I taught myself BASIC to a limited degree in the 90s on a Commodore 128D and Apple 2 clone. I was pushed away from programming by people who expected I needed to be a math whiz.

Even as a dyslexia sufferer with math issues as a result, I'm still able to "grok" Python just fine. So much for the naysayers.

@profoundlynerdy We are super-happy to have you coding with us <3!

John Taylor #Gatto in "Underground history of American education" railed against that cruel confounding of skills:

"#Pepys could only add and subtract right up to the time of his appointment to the Admiralty, but then quickly learned to multiply and divide to spare himself embarrassment. … You can learn what you need, even the technical stuff, at the moment you need it or shortly before."


@profoundlynerdy You know, this is probably where we get all these stupid people asking math/puzzle questions on coding interviews—this absurd 1990s notion that skill in grade school math (not really math, more like calculation) is correlated with skill in coding. There's no valid reason to ask for Gauss' sum formula tricks to see if you can code—if there was, you might as well ask about KD-trees or FFTs. If you like people who like math, fine, but don't pretend you're interviewing for coders.

@22 I couldn't agree more.

You're much better off asking algorithm questions that are actually relevant:

* What is the purpose of a linked list?
* How might I optimize this to use less memory? Insert [example of a giant left-hand grab of SQL data that uses 2GB data]. Expected answer: stack/queue one 2 MB record at a time [example code].

Yadda Yadda Yadda...
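The expected answer to the second question above can be sketched in a few lines. This is an illustrative mock-up, not a real database client: `fetch_rows` is a hypothetical stand-in for a SQL cursor, using a generator so that only one record is in memory at a time instead of the whole result set:

```python
# Streaming instead of loading everything: a generator yields rows
# one at a time, so memory stays flat regardless of result-set size.
# fetch_rows and the row shape are made up for illustration.

def fetch_rows(n):
    """Stand-in for a DB cursor: yield one row dict at a time."""
    for i in range(n):
        yield {"id": i, "value": i * 2}

def total_value(rows):
    # Consumes the stream lazily; only one row exists in memory
    # at any moment, unlike sum(r["value"] for r in list_of_all_rows).
    return sum(row["value"] for row in rows)

print(total_value(fetch_rows(1_000_000)))
```

Real database libraries expose the same idea (server-side cursors, iterating a cursor instead of calling a fetch-all method); the interview point is recognizing that the aggregation never needed the whole 2GB in memory at once.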

@profoundlynerdy @22 I definitely agree with this sentiment.

JS is hard to read. It's hard to write. I've been doing web development for 10 years, and I still regularly stuff up little JS stuff in dumb ways (partially because I don't actually touch JS often, and partially because it's syntactically dense.)

@22 @profoundlynerdy

I genuinely don't know. Maybe!

But I know a bunch of kids that don't have access to a non-mobile browser. They have ipads and android tablets.

A few of them have chrome books, which is better than nothing.

But developing for the web is non-intuitive, and requires access to a server. It's not great for beginners learning on their own.

@ajroach42 this is very much in the same spirit as pico-8. for iPad and iPhone

@ajroach42 that said, raspberry pi + pico8 is probably the more forward-looking and flexible choice!

@bunnyhero That's really cute! It also addresses what is normally my primary concern with iOS dev environments by giving you a way to share your applications.

I was under the impression that that was very much against apple's rules.

@ajroach42 same. i'm surprised that app lets you share code

@ajroach42 Please forgive me for wading into your thread half-way without reading all of it (I promise I come in good faith), but—"He couldn't do that on his iPad"—I don't think this is true:

- Not to name-drop but John Carmack has talked about writing Lisp (Scheme actually) on iPad:


@22 @nolan Neat!

Still more complicated than I'd want to put in front of an 8 year old, but it's still neat stuff.

@ajroach42 I think, as others have said, this isn't exactly true?

Grasshopper ( is a way to create games on an iPad

Hopscotch is Another.

Heck, Apple even released "Swift Playgrounds"

@PrincessRaspberry I'm not saying it's impossible, but rather that it isn't worth the effort.

I appreciate the links! I hadn't heard of these before, so I'm looking forward to exploring them.

My larger point re:ipad is less "you can't make things" and more "It's harder than it should be to make things, and harder still to share them."

'possible' and 'manageable' are worlds apart, you know? But that's a larger discussion.

@ajroach42 my kids are writing games on their iPhones and a spare iPad using Python (via Pythonista) and dabbling with Lua (via Codea).

@crc How do they like Pythonista?

Is it hard to share code between devices?

What about codea?

I haven't had the chance to explore either in great detail, although I was aware of them peripherally.

I'm glad to hear that they are usable, and I'll make a point to explore them further.

@ajroach42 They *love* Pythonista.

It's not a bad environment for learning; there's a number of examples, auto indention / reformatting, code completion, a visual GUI layout tool, some iOS-specific modules, and the full Python docs included.

For code sharing, they use Air Drop to share projects and Working Copy to store history.

We haven't done much with Codea. The environment is really nice, but I haven't used Lua much, so I'm less comfortable helping them in it.

.@ajroach42 You probably won't see this, but you can indeed code in Pythonista on an iPad. It's a good Python dev environment with graphics libraries.

Quit being an anti-Apple reactionary.

@ajroach42 That “what’s a computer?” ad should be taken as a warning siren, not aspirational motivation.

@ajroach42 The "Post-PC" era really is just the "consumer, not creator" era, which is particularly and darkly ironic considering Apple's success at marketing themselves as the purveyor of hardware and software for 'creatives'.

Splendid idea - the gift that keeps on giving.

@ajroach42 Alan Kay has criticized the iPad for being a consumption-only device. The problem is, I can see kids who wouldn't want to use a raspberry pi to make games because their friends don't have one

@meff In this case, we've largely seen the opposite.

Other kids want one, because he has one. (Of course, then we run into the problem that, in this case, the Pi is a Linux system, and that it really expects someone to know what they are doing in order to configure it.)


Getting the software configured is non-trivial. It's not hard, exactly, but it's beyond the average 8 year old, and also beyond many parents without previous linux experience.

I've been toying with the idea of putting together a distro for the Pi to simplify getting started, or auditing the available distros so that I can recommend one to the parents who, through the grapevine, end up asking me about it.

@ajroach42 That is really cool. Are you envisioning a Pi-based game-making platform that comes out of the box with a text editor and a way to run the game you're making?

@meff Pretty much, yeah.

A text editor, a few different game making environments, a simple UI for jumping between them.

I built something similar and had good success with it, I just haven't had the time.

@ajroach42 Were I in the position of guiding a young person toward programming, writing games, etc., I would consider Firefox as the platform, and show them how to get to the JavaScript console. This would have many of the benefits of Hypercard and the BASICs of the even more distant past, mainly instantaneous gratification, getting visible results with relatively little effort, and the opportunity to play with stuff.

@emdeesee I addressed why I think this is a bad idea elsewhere in this thread, in passing, but I'll be glad to get into it in more detail if you'd like.

First: comparing JS to Hypercard demonstrates to me that you've never used Hypercard. The difference in levels of complexity between the two is massive. In all seriousness, go watch a Hypercard tutorial, or open a stack in the editor and get a feel for how simple it is to work with.

@emdeesee 2: JS is massive and complicated and difficult to read. It can be an introductory programming language, but I don't think it makes a good one, and especially not for an 8 year old. He's doing very well with Lua, and has been considering Python.

@emdeesee 3: In order to share the things you've made in the development console in Firefox, you have to copy them out of the development console and into a text editor, save them, and then either view them locally or upload them. The process is complex. Web hosting is more complicated than swapping files on flash drives or Dropbox (I recognize that you could share the website files on flash drives or via Dropbox, but I'd still argue that this is more complex than swapping .p8 files)

@ajroach42 Presumably, one does not jump directly into metaclasses or even, well, classes when one starts with Python as a learning language. Similarly, one need not embrace all the complexity of JavaScript or web development to make the browser do a little dance or print "haha butts" 1000 times like I would never have done on the display TRS80 at my local Radio Shack.

Pico-8 looks pretty neat; I'm just not familiar with it. It looks like great way to have fun with programming.

@emdeesee There's also TIC-80 which is open source.

The reason I like these platforms is because they reveal the source code for their games, and integrate an editor. This means that you can start by modifying someone else's simple program rather than trying to write your own from scratch.

@ajroach42 I should also say that when I was new to programming, printing "haha butts" 1000 times on the machine at the mall was the height of technical achievement. That probably colors my pedagogical notions, and what a new learner might feel is satisfying.

Pico-8 looks fantastically advanced to me.

@emdeesee Oh no worries. It's a huge thread, and I didn't go into it in a lot of detail.

@ajroach42 I've used Hypercard extensively, making both apps and informational decks, though I was already an adult when Hypercard emerged. It was easy to get to do simple things and difficult to get to do complex things. But you *could* do complex things, which always felt like an accomplishment when it worked out.

It was also limited to the most expensive personal computing platform of the time.

I have nothing against Hypercard; it's fine.

@emdeesee Neat! So then you should be able to recognize the massive difficulty presented by doing simple things in JS in firefox as opposed to doing those simple things in hypercard.

Also, note that I'm not advocating that we bring hypercard back exactly. I'm lamenting that there isn't a modern analog in terms of simplicity of use.

@ajroach42 I think I do understand. My original response was thinking "out loud" about how I'd help a young person into the activity of using a programming language to do fun and cool things today...

@emdeesee If I was working with someone a little older, with a little bit of a technical background, I'd agree with you.

For this kid, at this time, it wasn't a good solution.

If you try it, I'd be curious to know how it goes.

@ajroach42 Unfortunately, my experience with this was with an unmotivated learner, and the results were ... unfulfilling for both of us.

We tried Logo and Python; this was around 2005, with a 12 year old, my daughter. Now she's learning Python on her own, and occasionally comes to me with questions, and to say, "I wish I'd let you teach me to program back in the day." 😀

@emdeesee That mirrors my experiences of trying to teach people before I started using Pico-8 (and games on as the base.

@ajroach42 That could not be more touching. I also say this as a non-programmer parent.

@ajroach42 it's really sad that such a powerful device is… readonly.

@hirojin Apparently it's not entirely read only.

As I mentioned elsewhere in the thread, I've done video, photo, and audio editing on my ipad without too much issue.

There are apps that enable programming environments. Things like Pythonista.

But they are less friendly than I would prefer.

@ajroach42 the audio/video stuff's primary purpose is to allow you to "generate" "content" for "platforms"

@ajroach42 pico 8 seems cool but is there an open source version?

@ajroach42 there are tons of apps on the ipad that are made for kids to let them learn how to code and make simple games. the problem is, that the ipad is made so you can't mess with the underlying system and apps aren't easily portable to other platforms. but then again pico8 has the exact same problem.

@tauli pico-8 apps will run in pico love, which is user modifiable.

There’s also tic-80, which I discovered recently.

The thing that appealed to me about pico-8 is that there was an existing library of thousands of games across an amazing array of complexity, all of which have their source code available and modifiable in app.

So yeah, pico-8 isn’t directly user modifiable, but it’s a stepping stone.

@ajroach42 don't get me wrong, i think setting up something like pico-8 is far better than an ipad-app.

i just wish there was more kid-friendly material on programming real machines. i see stuff like pico-8 and tic-80 more as educational toys.

i held some programming tutorials for kids (age 10 to 15) using #Haskell. if you present it in a beginner-friendly manner, that works really well. i probably should make publicly available material about that.

@tauli On that I can agree, it's a toy.

It's a pretty useful toy, though.

The next project he and I are working on is using an ESP8266 (wifi microcontroller with Lua dev environment.) I think it'll be fun, but I'm not sure what we're going to do with it yet.

@ajroach42 i'm pretty sure that an ESP8266 can drive WS2812 LEDs.

@tauli I was thinking we'd connect it to a motor and have it receive instructions via wifi.

I'll explain what kinds of things it can do, and see what he comes up with.
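The wifi-to-motor idea above can be sketched out in plain Python. This is only a toy of the command-parsing side — the actual ESP8266 would run NodeMCU Lua or MicroPython, and `StubMotor` is a hypothetical stand-in for real GPIO calls, not any real library:

```python
# Sketch: turn simple text commands (the kind a kid could type and
# send over wifi) into motor actions. The motor driver is a stub;
# on real ESP8266 hardware you'd toggle GPIO pins instead.

def parse_command(line):
    """Turn 'forward 3' into ('forward', 3). Anything unrecognized
    becomes ('stop', 0) so bad input can't wedge the motor."""
    parts = line.strip().lower().split()
    if len(parts) == 2 and parts[0] in ("forward", "back", "left", "right"):
        try:
            return (parts[0], int(parts[1]))
        except ValueError:
            pass
    return ("stop", 0)

class StubMotor:
    """Records actions instead of driving real pins."""
    def __init__(self):
        self.log = []

    def run(self, direction, seconds):
        self.log.append((direction, seconds))

motor = StubMotor()
for msg in ["forward 3", "left 1", "dance party"]:
    direction, seconds = parse_command(msg)
    motor.run(direction, seconds)

print(motor.log)  # [('forward', 3), ('left', 1), ('stop', 0)]
```

The appeal of this shape for a learner is that the protocol is just words: once the stub is swapped for real pin-wiggling code, the kid can invent new commands by editing one function.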

@ajroach42 btw. if you are interested in pico-8 and tic-80, you probably also want to check out Pixel Vision 8

@ajroach42 I think there is something sinister in tablets and smartphones that whispers into ears "Don't bother creating anything, it is too much work and you are bad at things anyways, just consume this instead". Open-source communities and hardware on the other hand bellow "MAKE STUFF" like a warcry while brandishing how-tos and guides.

@ajroach42 I didn't realize this was a whole thread, but I just read it and I can't agree more!

@ajroach42 I think org mode (in emacs), especially with its literate programming capabilities, presents a vision of the future you guys want. No other computer interface I have encountered is so powerful, hackable, and easy to understand. Org mode is relatively easy to use while providing an incredible range of capability because it doesn't insert layers of UI that abstract what the user sees and does from what the computer sees and does. It just uses super simple markdown-esque symbols.

@ajroach42 I mean, org isn't a programming language, but it is a powerful tool to enable creation of a wide variety of text based content

@ajroach42 Except that he could totally do that on his iPad. And he could make games that are more responsive and user-friendly with better graphics. And actually put it on App Store for everyone else to play. 😉

@ajroach42 I disagree with the main theme of the thread as well. There are some factually incorrect statements, like your comments about modern programs and modern computers being slow. Even the new iPad Pro is a faster machine than most PCs from just 5-6 years ago. Yes, modern software is complex but that’s because they’re crammed with features, not cos they’re feature incomplete. Those features enable you to go from concept to presentation in just minutes instead of hours, for example

@ajroach42 I’ll tell you an example. Everyday, every time my iPhone X unlocks with facial recognition or animates emojis by reading my gestures, I get lost in awe at how fast the object recognition and classification must be happening behind the scenes and how powerful the hardware is. Would have been impossible to achieve this efficiency even a couple of years ago on a mobile device.

@ajroach42 well of *course* not - he just needs to get a Mac (only $1k) and install XCode and some other bits then join the Apple Developer program (only $99/yr!) and then he can make games for the iPad. Easy!

@elomatreb this is a really great point!

Even though batch files were relatively simple, you could do some moderately impressive things with them.

TBH, I have to assume that modern Windows can still run batch files, but I don't actually know.

And how many modern applications have hooks for batch files the way older apps did?

Time to explore.

@ajroach42 Modern stuff probably less so, as Microsoft favors PowerShell for modern Windows scripting (and for good reasons, IMO). But the principle remains, the interpreter ships with Windows

@elomatreb I haven't thought about trying any scripting on a modern Windows machine.

I'll play with it, and see what I can pull off.

@ajroach42 @elomatreb Didn't they integrate PowerShell with .NET or something? Much better language for Windows scripting vs batch at any rate.

@ajroach42 But still, it's awesome that you can download VS Code on Windows or Xcode on macOS (and VS Code on macOS :P) and get a professional C++ developer environment *for free*.

@22 Oh sure! I'm not saying that it's all gloom and doom these days.

But those are tools for Serious Developers, and I'm talking about learners, kids, and people who have priorities beyond development.

@ajroach42 it's also what Myst was originally built on (as well as Cyan Worlds' previous two things)

@ajroach42 the origin story for Cyan Worlds is legitimately just so lovely

@ajroach42 in grad school we used hypercard to mock up AI concepts.

Because you can't pseudocode AI, as you need to know what *really* happens. And this was a way to pseudocode the output.

Obviously it was not ideal but it was a way to see if the problem was in the concept or in some possibility of coding.

@ajroach42 Jobs has always been about driving people to keep on buying "the new". That is one reason I dislike Apple so much. It isn't the good hardware, it is the hardware that is forcibly obsoleted to force you to keep buying.

@dmoonfire I agree.

The original compact mac lineup had an incredibly long lifespan. An office I worked in in 2004 still had 4 in daily use. (because hypercard.)

@ajroach42 Jobs claimed he wanted to make 'bicycles for the mind' but everything he worked toward was 'putting the mind on the Monorail at Disney World'.

@ajroach42 more broadly, if developing small pieces of software to fill specific, humble needs becomes a thing almost anyone can do, then why would we need a software priesthood of overpaid, infantilized coders to hack through the thicket of competing corporate strategies? we'd need a smaller number of dedicated expert systems bridge-builders but nothing like what we have now.

@drwho How wild is it that Jobs let something like that out of the building?

@ajroach42 Back then? Don't know. Those were the days of "take the case off of your 1541 drive and cut the following traces on the PC board to change the device number."

Strictly for the hardcore, as it were.

@drwho @ajroach42 Was it really for control or did he just want to focus on his own vision of what usability meant?

@seanl @drwho I don’t understand the question within the context of the post it is responding to.

Could you provide additional context?

@ajroach42 @drwho I was trying to respond to "How wild is it that Jobs let something like that out of the building?", or rather his later killing of it. It may have shown up someplace else in the thread since I'm guessing the thread structure itself is nonlinear unlike Masto's display of it.

@ajroach42 @drwho But I now see that that's essentially what you're saying - that he thought the computer should be an appliance. So please disregard ;-)

@seanl @drwho In the Mac days, Jobs trusted his engineers to tell him what the computer needed.

The programmer's switch was almost certainly an example of a thing they used internally that his engineers made a case for keeping.

But I wasn’t there, wasn’t born yet. So I can’t say for sure.

@ajroach42 also, as a final comment, it looks like someone is making an open-source HyperCard called ViperCard


This is how I learned as well... back in the C=64, TI-99/4A, and Apple ][c days..

Ah, childhood!


I hope you are preserving this thread in one place. It is a thoughtful and valuable series of observations.

I am sorry that you felt it was good to remove from public timelines AND I am glad I'm following you so I get to see it.

Thank you.

@Algot I posted unlisted because it's a Lot of posts, and I didn't want to clog the FTL. Feel free to boost.

@Algot I will be posting it as a blog post eventually, once these ideas are a little less raw.

I'll post about that when it's ready.


Thank you.

Much of what you say resonates with my own experiences.

TRS-80 was my beginning point --> Apple II+ --> Mac --> DOS --> Windows --> GNU/Linux --> 3D printing and RPi

I feel like I'm getting back to my beginnings.


Easy to use may actually stand in the way of being a learning tool when understanding the tool itself is the goal.

@ajroach42 I'm kind of sad that one part of that demo never caught on, that being the chording keyboard. Having a one handed chording device makes quite a bit of sense when combined with a mouse in the other hand.

@LilFluff yes! The CyKey or the Microwriter made good strides here and then disappeared.

@ajroach42 my sister used to regularly use a brailler which gave me a bit of a bug for the idea at a young age (if you haven't seen one, there are 8 regular keys (for the 6 and 8 dot varieties of braille), a space bar, and on some models single-character advance/backspace keys or else a clutch&slider to move along the line. If you want to type an "r" you simultaneously press keys 1, 2, 3, and 5 to emboss those dots at the current position.)

@ajroach42 with the six dots of English Braille you can type all 26 letters of the alphabet, numbers, punctuation, and several common two and three letter combinations. Numbers are done using a character that says the following is a number and then a-j are reused for the ten digits. Likewise there's a capital sign that says the following letter is a capital. So despite six dots only having 64 combinations, standardized English Braille has around 250 'characters'.
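The six-dot arithmetic described above can be sketched in Python. This is a toy, not a full Braille table — only three letters are filled in, and the dot numbers for those letters and for the number sign follow standard English Braille, as mentioned in the thread:

```python
from itertools import combinations

# Six dots, numbered 1-6, give 2^6 = 64 possible patterns
# (including the all-blank cell, which reads as a space).
patterns = [frozenset(c) for n in range(7)
            for c in combinations(range(1, 7), n)]
print(len(patterns))  # 64

# A few letters from English Braille; 'r' is dots 1, 2, 3, and 5,
# exactly the keys pressed together on a brailler.
letters = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 2, 3, 5}): "r",
}

# The number sign (dots 3,4,5,6) says "reuse a-j as the digits 1-0".
NUMBER_SIGN = frozenset({3, 4, 5, 6})

def decode(cells):
    """Decode a list of dot-sets, honoring the number sign."""
    out, digits = "", False
    for cell in cells:
        if cell == NUMBER_SIGN:
            digits = True
            continue
        ch = letters.get(cell, "?")
        if digits and ch in "abcdefghij":
            ch = str(("abcdefghij".index(ch) + 1) % 10)
        else:
            digits = False
        out += ch
    return out

print(decode([frozenset({1, 2, 3, 5})]))      # r
print(decode([NUMBER_SIGN, frozenset({1})]))  # 1
```

That mode-shifting trick (a prefix cell changing how the next cells read) is how 64 raw combinations stretch to roughly 250 effective characters.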

@LilFluff The Microwriter used something akin to Braille chording, IIRC.

I *really* wanted one when I was in high school, but I'm less enamored these days. I'd rather see a modern recreation, I think.

@ajroach42 I started using computers in 1982 and had a similar experience to you. I think your experience is rare among folks who started using computers in '95, but not at all rare in someone who started using them 10-20 years earlier.

@ajroach42 I read what you wrote but have not dug through all the responses, so please forgive me if I say something that's already been mentioned.

I think a lot of the problem is that the nature of the people programming computers has changed. Back in the 60s-80s, nearly everyone writing software was a tinkerer. Nowadays "programmers" are mass produced, and by and large they are neither tinkerers nor engineers; they are laborers.

@ajroach42 So we now have a system for churning out software using laborers. Naturally, the vast majority of tools are built for those laborers.

@ajroach42 The tools are also built *by* laborers. The thing you want would have to be a product of an entirely alien evolutionary line. I think Smalltalk is a pretty good example of a tool like that that is still in active use. But it stays small because if you want to actually work in programming (i.e. be a laborer) or build tools for the laborers, Smalltalk isn't particularly useful for it.

@seanl That's an excellent dissection of the problem.

@ajroach42 I've been thinking about the idea of trying to provide something analogous to that experience of booting to the OK prompt on modern hardware without severely restricting functionality, and I think Genode might be on its way to providing a good base for such a thing, because it could run both your "shell" with very little underneath as well as VMs running full-blown OSes that could be started from the shell.

@LilFluff Speaking of chording keyboards, you might find an interest in