Computers could be good, but they aren't.
That's the gist of it.
I guess I mean Good with a capital G, as in "a force for good in the world", but I also mean good with a lowercase g, as in "not super shitty to use, or think about".
I'm not going to waste a lot of bits talking about how computers are bad. I've done this a lot before, and you probably already agree with me. I'll quickly summarize the high points.
What's wrong with (modern) computing?
- Computers spy on us all the time
- Computers are insecure, while pretending not to be.
- Computers enable new modes of rent seeking, further exacerbated by shitty patents and worse laws.
- Computers/the modern internet encourage behaviors which are bad for our mental health as individuals.
- Computers and the modern internet, in concert with modern capitalism have built a world essentially without public spaces.
You know, all that bullshit.
I've spent a lot of time thinking about what the next 30 years in computing might look like, the successes and failures of the last 30 years, and the inflection point at which a computer is Good Enough for most tasks.
I've spent a lot of time thinking about the concept of planned obsolescence as it applies to computing, and what modern computing might look like without the profit motive fucking everything up.
I'm just a dude.
I'm a sysadmin. I spend a lot of time using computers, and specifically I spend a lot of time fixing machines that are failing in some way.
But I'm just some dude who thinks about stuff and imagines futures which are less horrible than present.
I've said that as a way to say: I don't claim to have The Answer, I just have some ideas. I'm going to talk about those ideas.
So how did we get from the gleaming promise of the digital age as imagined in the 70s to the harsh cyberpunk reality of the 20s?
Centralization, rent seeking, planned obsolescence, surveillance, advertising, and copyright.
How do we move forward?
Re-decentralization, a rejection of the profit motive, building for the future/to be repaired, building for privacy, rejecting advertising, and embracing Free software.
Personally, there's another facet to all of this:
I use and maintain computers professionally, and recreationally.
Sometimes, I want to do something that doesn't feel or look like my day job. Unfortunately, most of my hobbies feel and look a lot like my day job.
To that end, I have some weird old computers that I keep around because they're useful and also because they're Vastly Different than the computers I use at work.
My #zinestation, Mac Plus, and Palm Pilots fall into this camp.
I can do about 80% of what I want to use a computer for with my 486-based, non-backlit, black-and-white HP OmniBook.
Add my newly refurbished and upgraded Palm LifeDrive, and I'm closer to 95%.
Let's run through those tasks:
- Listen to music (The Palm is basically a 4GB CF card plus a 2GB SD card.)
- Watch movies (I have to encode them specially for the Palm, but the LifeDrive makes a great video iPod.)
- Read books (plucker is still pretty great)
- RSS (ditto above)
- Email (via some old DOS software, the name of which I'll have to look up, a lot of effort getting my mail server configured correctly, and an ESP32-based modem. This took some doing and I still don't love how I'm doing it. I'll do a blog post about it eventually.)
- Social (mastodon via brutaldon via lynx via telnet over tor to an onion endpoint I run in my home, not ideal, or via BBS)
- Write (via Windows 3.1 notepad)
- Consult reference material (via the internet or gopher over my esp32 modem with the appropriate DOS software and SSL proxy, or more likely, via a real hacky thing I wrote to mirror a bunch of websites to a local web server.)
- Develop (frankly, via GW-BASIC, although I'd love to start doing real programming again.)
- Games (this is the thing the OmniBook is worst at! I consider that a strength most of the time, but I do have a lot of parser-based IF games on it.)
There was a time in the recent past when I used a Pentium MMX laptop as my only computer outside of work for weeks at a time.
It could do basically everything I wanted it to do, including some far more impressive games.
Its batteries finally gave out, but until then it made a decent little computer.
The only real problems I run into with these setups are the hoops I have to jump through because I'm the only one using them, and because (wireless) networking technology has advanced in ways that aren't backwards compatible at the hardware level, leaving older laptops without a clear upgrade path.
No one gets upset that their typewriter can't browse the internet, you know?
But a computer isn't an appliance, it's an everything machine, and as an Everything machine, if it can't do the New Shiny Thing we have to throw it away and get a new one.
That's the mentality I'm trying to break out of.
I want to define a(n extendable!) list of tasks that I feel like will be relevant indefinitely, and build a machine to last 30 years.
Which, finally, brings us back to the ESP32!
Basically, the ESP32 is a simple microcontroller (that is to say, it's a computer! It's just not a computer the way we usually think about it.)
It's really cheap, like $3 or $4 for a simple board. There are folks making software for it already to treat it like a desktop computer.
It's not completely open or completely standardized or capable of everything I want out of my #PersonalComputer but ...
They get most of the way there on every count, and they have built in wifi and are so very cheap.
It would be entirely possible to base a new paradigm of multi-decade computers on the ESP32, built in such a way as to be agnostic to the actual hardware. That is to say: either you follow the write-once-run-anywhere model of Java and use the ESP32 as the host of a virtualized environment, or you build for the ESP32 itself and then emulate the ESP32 on newer hardware in the future.
This is basically the approach that Infocom took in the 80s when they developed text adventure games for every computer on the planet.
They invented a fake computer, compiled all their games for that fake computer, and then wrote emulators for that fake computer for every major machine of the era.
As a result, basically any computer can run an Infocom game.
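The "fake computer" trick is easy to sketch. Below is a toy Python interpreter for a made-up three-instruction stack machine; the opcodes are invented purely for illustration and have nothing to do with the actual Z-machine, but the principle is the same: port this one function to a new platform, and every program ever compiled for the fake machine comes along for free.

```python
# Toy illustration of the Infocom approach: invent a tiny "fake computer,"
# compile programs to its bytecode, then port only the interpreter.
# NOT the Z-machine; this instruction set is made up for the sketch.

def run(program):
    """Interpret bytecode for a made-up 3-instruction stack machine.

    Opcodes: 0x01 n  -> push the literal byte n
             0x02    -> pop two values, push their sum
             0x03    -> pop a value and halt, returning it
    """
    stack = []
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op == 0x01:          # PUSH literal
            stack.append(program[pc + 1])
            pc += 2
        elif op == 0x02:        # ADD top two stack values
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op == 0x03:        # HALT, returning top of stack
            return stack.pop()
        else:
            raise ValueError(f"unknown opcode {op:#x}")

# "Compiled once": this byte string runs anywhere run() has been ported.
program = bytes([0x01, 2, 0x01, 3, 0x02, 0x03])  # 2 + 3
```

The emulator is a few dozen lines; the library of software written against it can be arbitrarily large. That asymmetry is the whole bet behind a multi-decade machine.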
Now, is the ESP32 a good home for a multi-decade computer?
It's a little more limited than I would have picked (I'd have probably stopped in the early multimedia era), but it's also way less power hungry than what I would have picked, and frankly significantly cheaper and easier to understand.
So I'm going to spend the next few months exploring the ESP32 as the basis for a purpose built computer that inherits the legacy of the tinkerers of the microcomputer era.
Principles I plan to adhere to in my ESP32 exploration:
- Offline first, but networked
- Understandable is better than "easy to use"
- Don't make assumptions about available hardware
- Don't re-invent the wheel without a good reason
- Don't try to build a modern computer
- Decide what this machine should do, and make sure it's good at those things.
CHIP-8 - https://en.wikipedia.org/wiki/CHIP-8 - CHIP-8 is a virtual machine from the 70s for making games and software portable. It's part of the reason your graphing calculator plays games.
The 100 year computer project (https://thedorkweb.substack.com/p/the-100-year-computer) that sent me careening back down this path has a lot in common with chip-8 (and the article mentions it by name.)
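To make that concrete, here's a minimal slice of a CHIP-8 interpreter in Python, covering just two real opcodes: 6XNN (set register VX to NN) and 7XNN (add NN to VX, without touching the carry flag). A real implementation needs the remaining few dozen instructions, a display, input, and timers; this sketch only shows the shape of the decode step.

```python
# A minimal slice of a CHIP-8 interpreter: only the 6XNN and 7XNN
# instructions, to show how small the core of this 70s-era VM is.
# CHIP-8 has 16 one-byte registers, V0 through VF.

def step(opcode, v):
    """Execute one 16-bit CHIP-8 opcode against registers v (list of 16 bytes)."""
    x = (opcode >> 8) & 0xF     # register index from the second nibble
    nn = opcode & 0xFF          # 8-bit literal from the low byte
    kind = opcode >> 12         # top nibble selects the instruction family
    if kind == 0x6:             # 6XNN: VX = NN
        v[x] = nn
    elif kind == 0x7:           # 7XNN: VX = (VX + NN) & 0xFF, VF untouched
        v[x] = (v[x] + nn) & 0xFF
    else:
        raise NotImplementedError(f"opcode {opcode:04X} not in this sketch")
    return v

v = [0] * 16
step(0x6A02, v)   # VA = 0x02
step(0x7AFF, v)   # VA wraps: (0x02 + 0xFF) & 0xFF = 0x01
```

Two registers-and-a-literal instructions, a handful of bit shifts: that's the level of complexity a multi-decade VM can afford to standardize on.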
We've talked a little about hardware. We've talked a little about use cases, but we should probably dig deeper into that.
The remaining piece of this puzzle is software, which I think is closely tied to, but ultimately separate from, use cases.
I'll talk about that now, a bit, until I fall asleep.
So first things first, it's late and this might be incoherent.
In order for a decade spanning computer to be remotely useful, it needs software that speaks common protocols and file formats.
These protocols and file formats should be open standards, well defined, and well documented.
In my current use cases, I mostly use plaintext files, CSV, and HTML. When I need more specialized software, I convert between an old file type and a new file type using a piece of open source software on a more modern system.
This takes the form of, for example, antiword or pandoc, running on Linux.
Ultimately, these lowest-common-denominator standards are still out in the world, and they're likely to stick around forever (can you imagine a world without plaintext files?), and converters to get back to these platform-agnostic file types are likely to stick around too.
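One way to keep that converter step from sprawling is a small dispatch table mapping file types to converter command lines. Here's a Python sketch; it assumes you have pandoc and antiword installed on the modern box, and the tool-per-extension choices are just my guess at sane defaults:

```python
# Sketch of the "converter on a modern box" step: build the argv for a
# real-world converter (pandoc, antiword) from the file extension.
# The table below is an assumption about which tools you have installed.
import pathlib

CONVERTERS = {
    ".docx": lambda src, dst: ["pandoc", src, "-t", "plain", "-o", dst],
    ".html": lambda src, dst: ["pandoc", src, "-t", "plain", "-o", dst],
    ".doc":  lambda src, dst: ["antiword", src],  # antiword writes to stdout
}

def convert_command(src, dst):
    """Return the command line that converts src to plaintext at dst."""
    suffix = pathlib.Path(src).suffix.lower()
    return CONVERTERS[suffix](src, dst)

# e.g. convert_command("notes.docx", "notes.txt") builds the pandoc argv;
# passing it to subprocess.run(..., check=True) would actually do the work.
```

Because the old machine only ever sees the plaintext output, the table can be swapped out decades from now without touching anything downstream.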
But filesystems, transfer protocols, etc? These things change, and often with good reason. Our system will need to keep up.
My solution to this problem so far has been intermediary computers.
One example: Fetch the emails or the RSS feeds on my laptop, convert them, shuttle them over serial to the old machine.
Another: use the older machine as a serial terminal to a more modern box. Do my actual work on the more modern box.
This extends the life of the older machines, and lets me access modern conveniences when I want them, but it doesn't actually provide a model for a computer that will remain relevant.
I dunno what the answer is here, but I suspect it's something like
1 - define a native format for networked data that you're willing to support.
2 - provide several common network interfaces including serial/uart and wifi.
3 - be willing to whack another machine on to the serial port and let it translate a new hardware or software protocol when the existing ones are no longer supported. (Such as what I do now with the OmniBook and my ESP32 wifi modem.)
And 4 - be willing to adapt.
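Steps 1 and 3 together might look like this on the intermediary box: parse an RSS feed down to plaintext that can be squirted over serial. This is a stdlib-only Python sketch; it assumes plain RSS 2.0, and in practice you'd fetch the XML with urllib and push the result out the serial port with something like pyserial.

```python
# Sketch of the intermediary-computer model: a modern box reduces an
# RSS feed to plaintext the old machine can accept over a serial link.
import xml.etree.ElementTree as ET

def feed_to_plaintext(rss_xml):
    """Flatten RSS 2.0 XML into a plaintext digest, one block per item."""
    root = ET.fromstring(rss_xml)
    lines = []
    for item in root.iter("item"):
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="")
        lines.append(f"* {title}\n  {link}")
    return "\n".join(lines)

# A hard-coded sample feed stands in for the urllib fetch.
sample = """<rss version="2.0"><channel><title>Example</title>
<item><title>Post one</title><link>http://example.com/1</link></item>
</channel></rss>"""
```

The point is that the "native format for networked data" on the old machine is just plaintext; everything protocol-shaped lives on the intermediary and can be replaced.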
The point of this project, as I see it, is to provide a standard set of tools and protocols and file formats that should be relevant and workable for decades.
If 2.4 and 5GHz wifi stop being supported by new radios in ten years, or WPA2 is replaced with WPA3 or whatever, we write new firmware, install a new radio, or migrate all our data (stored in open formats) and software (open source and largely platform agnostic) somewhere else.
Last bit for tonight: peripherals.
PS/2 is fine. USB2PS2 adapters are cheap and easy and can be made by hand. You could even wire up a couple of USB ports that just convert straight to PS/2.
I don't love the idea of trying to support multipurpose USB 1 or 2, much less USB 3 or C, so I won't.
Printers are good and important. They're also pretty complicated. Serial printers exist, and lots of printers that don't speak serial do at least use postscript, so I'm confident we'll sort printing.
I guess one other thing to talk about is operating systems.
The thing is, I don't care.
As long as the OS itself isn't a hindrance to what I want, I don't care what it is.
Several folks are running CP/M on this hardware. I dunno why you'd want to do that/why you'd standardize around that, but I'm a DR fan and I like CP/M so let's fuckin go.
It's frustrating when a 10 year old computer can't get on facebook or watch 480p on youtube anymore, even though it could five years ago.
The computer hasn't changed.
Facebook and youtube have become more complicated. They didn't need to, but they could get more complicated because the average computer got faster, and the average internet connection got faster over that time span.
So a computer that could do X lost its ability to do X as a result of a third party.
So a portion of the multi-decade computer platform would need to be multi-decade support from network services.
When I was a web developer, we used to call this idea graceful degradation, or progressive enhancement.
The idea is: don't assume JS will be there, but feel free to use it if it is. Specifically, this means Don't Break if CSS or JS or images are missing.
We abandoned that idea in web development, and frankly most web developers never embraced it to begin with.
But if we can embrace decentralization, and specifically embrace more peer-to-peer relationships and open standards for communication, we can ensure that every multi-decade computer can communicate with every other multi-decade computer.
The multi-decade web?
The peer-to-peer web.
A web independent of the internet.
@ajroach42 This thread stirs up so many "yes!" thoughts in me, where to start?
I make a habit of periodically working from a #VT510. A real VT510, says "digital" on the front and everything. It is connected to a Raspberry Pi on a cart, and from there I ssh to my "real" workstation (though the newer Pi4s are getting to be nearly capable enough of that on their own.) One of the biggest hurdles I face is UTF-8, but it is fantastic focus mode and a bit of a step back in time.
@ajroach42 One problem with #UUCP is its security model is, shall we say, from an earlier era. Neither strong cryptography nor strong authentication are built in, and both are a must these days. However, #NNCP is a modern replacement that might be good: http://www.nncpgo.org/index.html
Also I have been re-reading some of @joeyh's blog posts about offgrid living. Can't help but love modern always-on Internet, also can't help but think "I'll go online when the sun says I will" has a deeper purpose too.
@ajroach42 Great ideas. Thanks for sharing, as always.
You mention using other computers as peripherals to provide input to the "brain" ESP32, but since it is so cheap and powerful enough, why not use other ESP32s for this task?
One could group protocols and APIs according to some taxonomy and, theoretically, provide a purpose-built OS whose purpose is to translate one of those groups into the new common 100-year computer format for the "brain" ESP32.
Like a local ESP32 cluster.
@groovestomp certainly, especially today.
I was imagining a time when something that was required of a computer in order to remain moderately relevant outstripped the ability of the ESP32, but that's a long road.
@ajroach42 Hi, even though I write this message on Pinafore, on recent hardware, I'm working most of the time on an Acer AspireOne, which feels both like a needless luxury for vim, pandoc, xelatex, catgirl, and amfora, and like a pain when I need to use the web (which I need to do, because my university restricts email to Microsoft's restricted web client; for example I can't add a reply-to field for obvious reasons).
When you mention an emulator, of course, uxn comes to my mind, for graphical components; there’s a WIP port for ESP32 here: https://github.com/max22-/uxn-esp32
@ajroach42 Then I've been criticized for denaturing everything, so take this with a grain of salt; but I believe the web has been a turning point and that Google has played a prominent role in it.
It isn’t just that the web always requires more resources. It’s made the internet more centralized, and I believe Google saw it and chose to promote it for this reason when they made their search engine.
Most websites have a different interface, the difference may be slight or confusing as hell.
@ajroach42 People don’t like to discover new websites because they expect them to be slow, poorly made, confusing, and so on. Of course on a client/server model, without federation every website has a different database, but because they also have a different interface, which sometimes is even considered as a client, both are mixed up, so this literally creates repression, in psychoanalytic terms.
@ajroach42 Social media make it worse because most Twitter users regret having created their accounts but this is another question.
@ajroach42 And then I love Alpine Linux, I've thought as well it was like the grandfather's axe (I was thinking about a chainsaw), but there's the internationalization problem. Generally nerds who make a successful project think it's good enough, but this is only limited to their countries. My grandma *could* run Alpine Linux. Installing a font has for a long time been much easier on Linux distributions than on Windows. But she can't be independent without internationalization.
@ajroach42 But basically, the web is a shopping mall, and we shouldn’t put critical services on it, is all I’m saying.
@ajroach42 a depressingly large number of people think that computers (mobile devices included) "wear out" with age and use due to this phenomenon. The fact is that, over the past 20 years or so, the reason the same computer is slower at the same tasks now than it was then is almost entirely surveillance and targeted marketing practices.
Even if you're not mining bitcoin literally most of the energy expended by most computers is driven by this BS.
And even if you avoid that, you've got the issue of your computer doing a ton of stuff in the background for security reasons, instead of everything being coded for speed as the main and often only consideration.
@msh @ajroach42 And that's why I have to love iOS devices. Don't get slower with age. Just gifted an iPhone 6 (with a small crack in the screen) to an older relative for free, to replace the buggy, pile of crap android device they were using.
Their Android phone was so buggy and slow they could barely send text messages -- the iPhone 6 is fast and quick as the day it was made. They're apparently loving it.
@mdm @poindexter @ajroach42 Apple comes closer to acceptable software support than any Android vendor ever has. They still get failing grades for actively making their hardware less repairable and more closed than could ever be justified.
Google recently pushed further with 5 years of monthly updates for Pixel 6 & 6pro (including all phone component firmware - which FairPhone can only do for a couple of years on the FP4)
Hopefully other #android devices will follow (Samsung went from 3 to 4 years of updates on top end devices recently)
The Pixel 6 & 6 pro are powerful & expensive. Will be interesting to see if Pixel 6a arrives & pricing
@dazinism @msh @poindexter @ajroach42 I don't trust any of these companies to actually follow through with these promises -- I make my opinions of them based upon their actual past behavior, which, for updates, hasn't been great.
I don't believe Apple is supporting phones for longer than everyone else out of the goodness of their hearts (it's possibly just because their hardware is similar in between models), but the fact remains that nobody else even comes close.
For years Google have consistently delivered on the timelines they guaranteed for device updates.
Generally they do a month or 2 longer than guaranteed
In one case (a tablet a few years back) they did 6 months extra
There's numerous options for keeping devices going longer. You don't get component firmware updates after Google drops support (many laptops & PCs don't get more than 2 years of these), but the kernel & operating system can still be updated.
My favourite for older devices is divestos.org; unlike others they do verified boot where possible
@dazinism I'm glad to see things moving in the right direction, though I can't help but feel the vendors are being dragged kicking and screaming due to overwhelming consumer demand and threat of government regulation rather than wanting to respect their customers.
Considering the maturity of the industry now 5 years is really a bare minimum and 10 years would be a good ultimate goal. I just retired a daily driver notebook that was 9 years old so why not? A bit absent from this discussion is repairability too.
@mdm @poindexter @ajroach42
My point was that laptops (and netbooks) aren't really any better. Like your 10-year-old netbook, there are still folks using the 10-year-old Samsung Galaxy S2 as their main phone.
It's likely neither the S2 nor your netbook has been getting proper updates to component firmware or drivers for most of their lives.
Maybe "cheap" devices do get updates for a reasonable amount of time, and many expensive devices don't. (Juicero et al.)
Cheap is good, because it enables adoption, which can encourage open source support.
Locked firmwares and DRM are at least partially to blame for the situation we are in now, and that happens in every market segment.
Very much this. I think the connection between cheap and supported is pretty weak and that the most premium devices are vulnerable to loss of support due to closed hardware and firmware.
The challenges in supporting consumer devices are often created rather than inherent. If they were designed to be more openly accessible, then community support would be able to better fill the gaps left by vendors in the long term.
I'm all for cheap, it's just that, as far as I've seen (I know Android phones best), cheap devices don't get good support/updates, and some more expensive ones also don't
As far as I'm aware, due to their massive complexity (& the huge number of different phones), community extended support of Android is best-effort (& lacking); e.g. some devices have open drivers, but I don't think anyone's taken up proper maintenance after the vendor stops
…the amount of resources available from the community or what the vendor can or wants to put into the device. Smaller companies like Fairphone may want to do long support, but they haven't been able to do timely or full updates on their phones, I guess because they don't have the resources
Given problems gathering enough resources to properly look after complex devices I think reducing complexity is worth exploring
Liking the path https://betrusted.io is taking @ajroach42 @mdm @poindexter
@abloo Lol! The S7 was actually the phone that broke me. 😆 Went iOS after that.
The fact that I had a flagship, unlocked Android phone that wasn't getting security updates anymore after a single year.... smh.
@dazinism @msh @mdm @poindexter @ajroach42 i see mobile & netbooks very differently. i can go install a fresh 5.15 Linux kernel on my parents netbook. but most mobile phones have a Google kernel with a rats nest of patches, which then the SoC vendor layers on another massive sea of patches, & finally the phone vendor makes another small mess of patches.
i'm sympathetic to both vendors. supporting these devices is hard as heck. it's a huge effort. i won't expect more from them.
the system needs to change. there's still a mess of soc vendor patches, but google at least is trying to get closer to mainline. from last month & two years ago:
there's talk about Project Treble creating a stable intermediary layer for SoC vendors to write drivers to. rather than target Linux. not sure how i feel about this but it could mean devices become supportable, long term, that they can be evergreen. like a netbook is/has been.
@ajroach42 I think you've not mentioned the one motivation behind all that JS and images that companies like that have: tracking users. Most of the JS that's loaded on web pages is for tracking people. That also goes for images that get loaded but which users don't see. JS especially has enabled this sort of digital surveillance.
Load MX or antiX as live media from a flash drive and it will be able to use Facebook or YouTube again...
Without permanently installing the new operating system...
@ajroach42 Yes, that’s pretty annoying.
I would not mind if the web required (free) software updates. But different from vp9 and av1 as codecs, h265 requires much, much more resources to decode than h264, so you lose video support on old computers when the platforms update to it.
@ajroach42 did you happen to see John Carmack (of Doom/Quake fame) talk about his success getting Oculus to unlock the Go headset during sunsetting?
he has some wonderful ways of talking about these sort of concerns- about a desire for someone to, 10 years from now, 20, notice a dusty headset in a closet, pick it up, & be able to have maybe not a current experience, but be able to still update the OS, get apps & games on it, try it out, see what it's good for.
@ajroach42 Necro-toot reply, but...
I was very surprised recently when my dual PIII machine turned out to be _almost completely useless_ for the web. Firefox barely even started at all, much less loaded pages! Not very many years ago (like, inside five?), it was painful but not impossible.
I used that computer from 1998 until 2007 or so, upgrading drives and RAM.
@elb I have had decent luck with early XP era gear, as long as I can get at least 2GB of RAM into it, and as long as I run a 32-bit Linux distro (but those are getting hard to find.)
It's SLOW online, but it only gets unusable when you're on Windows, IME.
I mean, it's slow enough that it is essentially useless, and finding 32 bit builds of stuff is getting harder and harder.
@ajroach42 I'm running Debian 11 on that machine; I haven't run Windows on anything I use regularly in almost 30 years.
I don't think it has that much RAM, I think it's something like 768 MB. That plays a big role, for sure. I don't know how high it can go, but that's the DIMMs I had.
@elb RAM is the biggest limiter for sure.
I struggled to get p3s to play nice on the web with less than 1GB circa 2012 or so.
I mean, I did it. 512 was enough most of the time, but it wasn't fun.
Today, though, 2GB is barely enough.
I have 10 desktops in our computer lab with 2GB, and they can do a dozen things at once, as long as none of those things is the web.
Or they can do one web browser and nothing else.
A social network for the 19A0s.