I haven't talked about my goal for personal computing in a while.

With Sundog nerdsniping me into attempting to turn the LibSSH-ESP32 port into the basis of a full-fledged SSH client for the ESP32, I guess I should spend a few minutes talking about why I bother with this bullshit.

Computers could be good, but they aren't.

That's the gist of it.

I guess I mean Good with a capital G, as in "a force for good in the world", but I also mean good with a lowercase g, as in "not super shitty to use, or think about".

I'm not going to waste a lot of bits talking about how computers are bad. I've done this a lot before, and you probably already agree with me. I'll quickly summarize the high points.

What's wrong with (modern) computing?

- Computers spy on us all the time
- Computers are insecure, while pretending not to be.
- Computers enable new modes of rent seeking, further exacerbated by shitty patents and worse laws.
- Computers/the modern internet encourage behaviors which are bad for our mental health as individuals.
- Computers and the modern internet, in concert with modern capitalism, have built a world essentially without public spaces.

You know, all that bullshit.

As I said, it's a summation. There's nuance. There are more problems.

That list should serve as an okay shorthand for the kind of thing I'm talking about.

Computers? They're bad.

But I'm here, talking to you, through a computer. I derive my living from computers. I spend most of my free time in front of a computer.

In spite of all the ways computers are lowercase b bad, computers enable a lot of Good.

I believe in the potential of computers, in our digital future.

I've spent a lot of time thinking about what the next 30 years in computing might look like, the successes and failures of the last 30 years, and the inflection point at which a computer is Good Enough for most tasks.

I've spent a lot of time thinking about the concept of planned obsolescence as it applies to computing, and what modern computing might look like without the profit motive fucking everything up.

Sidebar:

I'm just a dude.

I'm a sysadmin. I spend a lot of time using computers, and specifically I spend a lot of time fixing machines that are failing in some way.

But I'm just some dude who thinks about stuff and imagines futures which are less horrible than the present.

I've said that as a way to say: I don't claim to have The Answer, I just have some ideas. I'm going to talk about those ideas.

Sidebar over.

So how did we get from the gleaming promise of the digital age as imagined in the 70s to the harsh cyberpunk reality of the 20s?

Centralization, rent seeking, planned obsolescence, surveillance, advertising, and copyright.

How do we move forward?

Re-decentralization, a rejection of the profit motive, building for the future/to be repaired, building for privacy, rejecting advertising, and embracing Free software.

Personally, there's another facet to all of this:

I use and maintain computers professionally and recreationally.

Sometimes, I want to do something that doesn't feel or look like my day job. Unfortunately, most of my hobbies feel and look a lot like my day job.

To that end, I have some weird old computers that I keep around because they're useful and also because they're Vastly Different than the computers I use at work.

My Mac Plus and Palm Pilots fall into this camp.

I can do about 80% of what I want to use a computer for with my 486-based, non-backlit, black-and-white HP Omnibook.

Add my newly refurbished and upgraded Palm LifeDrive, and I'm closer to 95%.

Let's run through those tasks:

Palm:
- Listen to music (the Palm is basically a 4GB CF card with a 2GB SD card.)
- Watch movies (I have to encode them specially for the Palm, but the LifeDrive makes a great video iPod.)
- Read books (Plucker is still pretty great)
- RSS (ditto above)

Omnibook:

- Email (via some old DOS software, the name of which I'll have to look up, a lot of effort getting my mail server configured correctly, and an ESP32-based modem. This took some doing and I still don't love how I'm doing it. I'll do a blog post about it eventually; a rough sketch of the modem idea follows this list.)
- Social (Mastodon via Brutaldon via Lynx via telnet over Tor to an onion endpoint I run in my home, which is not ideal, or via BBS)
- Write (via Windows 3.1 Notepad)
- Consult reference material (via the internet or gopher over my ESP32 modem with the appropriate DOS software and SSL proxy, or, more likely, via a real hacky thing I wrote to mirror a bunch of websites to a local web server.)
- Develop (frankly, via GW-BASIC, although I'd love to start doing real programming again.)
- Games (this is the thing the Omnibook is worst at! I consider that a strength most of the time, but I do have a lot of parser-based IF games on it.)
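
That modem, at its heart, is just an ESP32 shuttling bytes between the laptop's serial port and a TCP socket. A minimal sketch of the idea, assuming the Arduino ESP32 core (the SSID, password, host, and port are placeholders, and real modem firmware also parses AT commands, which this skips):

    // Bridge a retro laptop's serial port to a TCP socket over WiFi.
    #include <WiFi.h>

    const char *WIFI_SSID = "example-ssid";  // placeholder credentials
    const char *WIFI_PASS = "example-pass";
    WiFiClient link;

    void setup() {
      Serial.begin(9600);                    // match the laptop's port speed
      WiFi.begin(WIFI_SSID, WIFI_PASS);
      while (WiFi.status() != WL_CONNECTED) delay(250);
      link.connect("bbs.example.net", 23);   // e.g. a telnet BBS (placeholder)
    }

    void loop() {
      // shuttle bytes in both directions as they arrive
      while (Serial.available()) link.write(Serial.read());
      while (link.available()) Serial.write(link.read());
    }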

There was a time in the recent past when I used a Pentium MMX laptop as my only computer outside of work for weeks at a time.

It could do basically everything I wanted it to do, including some far more impressive games.

Its batteries finally gave out, but until then it made a decent little computer.

The only real problems I run into with these setups are the hoops I have to jump through because I'm the only one using them, and because (wireless) networking technology has advanced in ways that are not backwards compatible at the hardware level, while leaving laptops without a clear upgrade path.

...

This feels like a kind of rambling sidebar, but there's a point:

Most tasks that computers are used for on a daily basis could be completed on much less powerful hardware if there wasn't a profit incentive in the way.

So, circling back to the original point: I'm imagining a world in which computers are different.

Specifically, different in that they are designed to be cheap, easily repaired or replaced, and to just Do Their Job forever.

(This requires defining the job they are supposed to do.)

No one gets upset that their typewriter can't browse the internet, you know?

But a computer isn't an appliance, it's an everything machine, and as an Everything machine, if it can't do the New Shiny Thing we have to throw it away and get a new one.

That's the mentality I'm trying to break out of.

I want to define a(n extendable!) list of tasks that I feel like will be relevant indefinitely, and build a machine to last 30 years.

Which, finally, brings us back to the ESP32!

See thread: retro.social/@ajroach42/105397

Basically, the ESP32 is a simple microcontroller (that is to say, it's a computer! It's just not a computer the way we usually think about it.)

It's really cheap, like $3 or $4 for a simple board. There are already folks writing software to treat it like a desktop computer.

It's not completely open, or completely standardized, or capable of everything I want, but ...

They get most of the way there on every count, and they have built-in WiFi and are so very cheap.

It would be entirely possible to base a new paradigm of multi-decade computers on the ESP32, but built in such a way as to be agnostic to the actual hardware. That is to say: you follow the write-once-run-anywhere model of Java and use the ESP32 as the host of a virtualized environment, OR you build for the ESP32, and then emulate the ESP32 on newer hardware in the future.

This is basically the approach that Infocom took in the 80s when they developed text adventure games for every computer on the planet.

They invented a fake computer, compiled all their games for that fake computer, and then wrote emulators for that fake computer for every major machine of the era.

As a result, basically any computer can run an Infocom game.
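
To make that concrete, here's a toy sketch of the "fake computer" idea in C. The opcodes are invented for illustration, and the Z-machine proper is far richer; the point is that anything that can run this little interpreter can run every program compiled to its bytecode:

    /* A tiny stack machine: the "fake computer". */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_PRINT, OP_HALT };

    static void run(const int *code) {
        int stack[64], sp = 0, pc = 0;
        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++]; break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[--sp]); break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* the "portable program": compute 2 + 2 and print it */
        const int program[] = { OP_PUSH, 2, OP_PUSH, 2,
                                OP_ADD, OP_PRINT, OP_HALT };
        run(program);  /* prints 4 on any host with a C compiler */
        return 0;
    }

Port the interpreter once per platform, and every program written for the fake machine comes along for free.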

Now, is the ESP32 a good home for a multi-decade computer?

I dunno!

It's a little more limited than I would have picked (I'd have probably stopped in the early multimedia era), but it's also way less power hungry than what I would have picked, and frankly significantly cheaper and easier to understand.

So I'm going to spend the next few months exploring the ESP32 as the basis for a purpose built computer that inherits the legacy of the tinkerers of the microcomputer era.

Principles I plan to adhere to in my ESP32 exploration:

- Offline first, but networked
- Understandable is better than "easy to use"
- Don't make assumptions about available hardware
- Don't re-invent the wheel without a good reason
- Don't try to build a modern computer
- Decide what this machine should do, and make sure it's good at those things

@ajroach42 This code has been getting published to GitHub, btw; @enkiv2 likes linking to it!

@alcinnz @enkiv2 It has, yeah. It's not Free Software though. Just something that was archived.

@ajroach42
Support for the Xtensa core is being merged into LLVM, so your language and compiler options are going to be nice, and ESP32s are going into lightbulbs, so there's plenty of potential for salvage.

I've been working with Zig as a front end for C, trying to set up a cross-compiler environment. I've always hated builds, so this project is especially easy to procrastinate on, but if one of us gets that working then it's pretty easy for someone to build that code for RISC-V or another LLVM-supported architecture later.
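
To illustrate: zig bundles clang, so cross-compiling C is already a one-flag affair for the LLVM targets that exist today. The filenames and target triple below are just examples, and Xtensa only joins once that backend actually lands:

    /* hello.c -- a trivial smoke test for a cross toolchain.
     *
     *   zig cc -target riscv64-linux-musl hello.c -o hello-riscv
     *
     * The same command shape should cover the ESP32 once the
     * Xtensa LLVM backend is merged.
     */
    #include <stdio.h>

    int main(void) {
        puts("hello from a cross toolchain");
        return 0;
    }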

@ajroach42

"Most tasks that computers are used for on a daily basis could be completed on much less powerful hardware if there wasn't a profit incentive in the way....

I want to define a(n extendable!) list of tasks that I feel like will be relevant indefinitely, and build a machine to last 30 years."

I mean, to me this only leads one place: getting Emacs to run on the absolute most minimal hardware :P

If you are making do on limited hardware, you are mainly doing text editing at that point.

@ajroach42

Emacs is already over 30 years old; it will be around as long as computers are, and coupled with Org mode it can do anything that involves text within a single holistic workflow.

If a computer was designed around Emacs/Org mode, you certainly wouldn't try to use it like a shiny new computer (and fall into that expectation trap that would keep you from truly interfacing with this computer as a new experience).

Anyway, here's a pamphlet about the one true god that is Emacs.

@Alonealastalovedalongthe Most of what I do, and most of what most people do, with a computer is editing text and then sending that text to places.

Databases, spreadsheets, web pages, emails, IMs, all text at heart.

I imagine emacs has already been ported to the ESP32, but I haven't verified that.

I'm considering a slightly different approach, one that aims to be more prescriptive, but the Emacs life is valid.

Long and a bit rambly, sorry 

@ajroach42 I have a powerful gaming desktop, a slightly less powerful gaming laptop, a good-ish 2012 ThinkPad, and a bunch of Raspberry Pis, and 99% of what I do can be done on the Pis (watching Internet things, and especially with the 4GB models that's more than enough to have one stream plus a few chats, plus Fedi, and probably still play Doom). There are only a few games I can't play on my ThinkPad, which has no GPU (and comes from the first gen of Intel CPUs where they thought maybe we should make the GPU slightly more powerful than enough to display Win7 Aero).

You can do everything, slowly, on a Raspberry Pi, and you've been able to do it since the first Pi came out. 8 years later, and 8x the power per Pi, they're still considered "low power" and yet when I got my first RPi in 2012 the gaming rig I *dreamed* of having had 8GB RAM in it... Chillblast custom PCs could spec a max of 32GB at the time!

If we somehow convince enough people that something like a Pi is enough, then companies will *have* to bring their usage down.

Schools were meant to adopt the things (at least in the UK), but it never really happened, because Microshaft has their Office suite and programming languages so deeply ingrained it'd be like pulling the floor out from under the ICT curriculum...

re: Long and a bit rambly, sorry 

@MxCraven @ajroach42

I’ve found exactly three use cases where a modern Pi has unacceptably low performance:
1. Video games (some more so than others)
2. Running a web browser
3. I/O from the SD card

Everything else is plenty good enough.

@ajroach42 I remember when QVC was a big computer reseller and they were always pimping the latest and greatest hardware to their customers, who were basically all retired old people, telling them they needed the latest PENTIUM (or whatever it was at the time) to check their AOL email and read the latest news on the AOL AARP forum.

So much wasted power for people doing very little.

@ajroach42 I used to think we'd naturally get to Star Trek: The Next Generation as our tech brought us toward post-scarcity. But it takes humans to do it.

@ajroach42 How are you getting Tor on DOS? Or have you somehow built that functionality into the WiFi modem?

@ajroach42 what kind of viable business models can companies adopt so they'll resist the urge to use built-in obsolescence?
(Well other than proper management instead of oligarchic fat cat management)

@vesperto Decentralized, worker-owned, cooperatively managed, much smaller-scale manufacturing, coupled with some changes in consumer habits, is the only path forward I see.

@ajroach42 free software is not enough. we need libre silicon and integrated circuits.

@theruran @ajroach42
So the closest we get to libre hardware with ARM (ignoring how ARM makes money and who's buying it) is FairPhone and Librem, which apparently have some binary blobs but they're disabled?

Then there's a RISC-V system scheduled for release next year. Totally libre hardware and it runs... Which Linux? ChromiumOS?

So maybe we set our sights a little lower, not that ESP-32 is problem free...

hackaday.com/2016/10/27/basic-

(Not all undocumented features could be this pleasant)

@yaaps @ajroach42 I did a class project this semester on this system: betrusted.io/

I don't think everything there is libre hardware, but it's about as close as is feasible today. They go to great lengths to employ evidence-based security and also discuss some of its limitations.

@theruran Finally, such a device has been industrialized. I've been trying to set up such a project for years, but failed to find associates to do so. @yaaps @ajroach42

@yaaps @theruran @ajroach42 be careful with RISC-V stuff. The standard instruction set is open; implementations of that instruction set need not be. Moreover, there's no requirement that extensions to that instruction set architecture be nonproprietary either.

@vertigo @theruran
I think the use cases for @ajroach42 are more concerned with bulk commercial data harvesting and less concerned with state actors. It's an interesting use case for hardware with development sponsored in China. When your activity is technically legal production of culture even though your motive is to disrupt capital, accessibility is more relevant

@drwho @ajroach42 Great write-up!

I was just thinking... I think it's been mentioned before: to move away from silicon because Big Tech has a death grip on it. There may be cleaner and safer processes with other semiconductor materials? Thought I saw experimentation with graphite, too.

@theruran @ajroach42 Thanks! It's a little dated, I haven't updated it in about six years, but I tried to give the general principles legs for the long haul.

There is some experimentation with graphite - graphenes in general, really. Experiments in using carbon nanotubes as transistors are still ongoing. Personally, I think rod logic might be a better application for them but at best I'm a somewhat well read laybeing in that particular field.

@ajroach42 honestly kinda tempted to take one of my open source project/services and offer two ways for schools to exchange value for it. 1) paid subscription and 2) forming a club of students interested in helping maintain it.

@ajroach42 retro computing won't save us, we need a revolution that focuses on the needs of working class people, Black people, people of color

@ajroach42
It's interesting looking at this from the angle of computers; I suppose this is representative of the larger problems that stem from neo-liberalism, which encourages making profits from everything. Vandana Shiva talks about the patenting of seeds by way of GMOs. Seeds are the best example of open-source-ness, and trying to monetize and close-source seeds is leading to horrible effects for crops and farmers. But I bring this up b/c she notes that the closed-sourceness of the web has led

@ajroach42 to hindering the development of software and hardware. Because, like seeds, the web works best when it's decentralized and open. I dunno, I'd rather have solarpunk than cyberpunk... those are some stray thoughts. It's interesting hearing Douglas Rushkoff talk about the dream of the net becoming a nightmare...

@ajroach42 >So how did we get from the gleaming promise of the digital age as imagined in the 70s to the harsh cyberpunk reality of the 20s?

By being collectively naive enough to believe that adding unlimited technology to existing patterns wouldn't dramatically exaggerate existing problems and proclivities.

@ajroach42 Sounds like you'd enjoy World Wide Waste by Gerry McGovern!

@ajroach42 So what you're saying is that the problem with modern computers is that they're behaving more like people. Make no mistake, people are the problem.

@ajroach42 Not computers per se, but the operating systems most people use. Windows and Mac spy on you and sell advertising and personal information.

Simply don't use these advertising and lock-in operating systems.

@comrad "simply"? That's bold phrasing, that I find patronizing. Tread carefully.

It's not just windows and mac. It's also Ubuntu, android, and nearly every major commercial website. The only way to escape surveillance is to radically redefine how you interact with a computer. Nothing simple about it.
