Before we go any further, let's talk about The Computer Chronicles.

The Computer Chronicles was a TV show that ran from the early 80s through the early 00s. Over its nearly 20-year run, The Computer Chronicles covered nearly every facet of the newly developing computer industry. It was hosted by people with Opinions.

The guests were, frequently, people who were proud of the things they made, or the software they represented.

Watching the developer of CP/M and DR DOS talk to a mainframe engineer who worked at IBM in the 50s about the future of computers as seen from the 1980s was eye-opening.

On the one hand, this show serves as an excellent introduction to, or reminder of, the capabilities of computers 35 years ago. It helps us see how far we've come in terms of miniaturization, while also demonstrating again that, in many ways, there is nothing new under the sun.

Before the advent of the internet, reporters were writing their stories on laptops and sending them in over phone lines. Twenty-five years before the release of the iPhone, HP released a computer with a touchscreen. Three years before Microsoft released the first version of Windows, Apple and VisiCorp demonstrated GUIs with features that Windows wouldn't be able to approach for another 9+ years.

And, of course, I'm reminded again of Douglas Engelbart's 1968 "Mother of all Demos", in which he demonstrated the mouse, the GUI, instant messaging, networked gaming, and basically every other important development of the following 50 years.

It took 5 years for Xerox to refine and miniaturize Engelbart's ideas to the point that they thought they could market them, and another 10 years before Apple refined and further miniaturized the same ideas and brought us the Mac.

Nothing is ever new.

The whole video of Engelbart's Online System (NLS) is available on YouTube. Some of it is *really* interesting. Most of it is unfortunately dry. It's easy to forget that this was 50 years ago, and also mind-blowing that it was only 50 years ago.

Anyway, back to Computer Chronicles. In an episode about Word Processors, the man they were interviewing said "There's a lot of talk about making people more computer literate. I'd rather make computers more people literate." That's a phrase that resonated with me in a big way.

It sounds like the kind of semantic buzzword shuffling so common in standard corporate speak, but I got the impression that the guy who said it believed it. He believed that computers had gotten powerful enough that they no longer had to be inscrutable.

There were others working around the same time on similar ideas, or at least from a similar philosophy. Working to make computers, if not intuitive, at least comprehensible. I think this is a noble goal.

The computer is often thought of as a tool, but it is more like a tool shed, in which we store a collection of tools, a source of power, and a workspace.

The tools of the 60s and 70s were primitive, partially because of the limited space and limited power our toolbox could provide for them, but also because our ideas and understanding of how these tools should work were limited by the audience who was using the tools.

That is to say, in the 60s and 70s, computers were weak and slow, and computer users were also computer programmers. A small, tight-knit circle of developers and computer scientists was responsible for the bulk of the progress made in that time, and the idea of designing tools for non-technical users was never considered.

Computer culture had, by and large, a kind of elitism about it as a result of the expense and education required to really spend much time with a computer. This changed, slowly, starting in the mid 70s with the development of the Microcomputer Market and CP/M.

Computers became more affordable, slowly. Affordable computers became more powerful, quickly. Within 10 years, non-technical users were interacting with computers on a daily basis. It was against the beginnings of this backdrop that the phrase I mentioned earlier was coined. "Human Literate Computers" or "Human Centered Computing."

Ease of Use was the holy grail for a lot of computer companies. A computer that was so easy to use that they could sell it to grandma. But, to me at least, Human Literate and Easy to Use are distinct ideas. Many modern applications are Easy to Use. Netflix is Easy to Use. Facebook is, for all its faults, pretty easy to use. The iPhone, the iPad, and ChromeOS are super easy to use.

Well, they are easy to use as long as you use them in the prescribed way. As long as you let them tell you what you want to do, instead of the other way around.

That, IMO, is the distinction.

I think that many of the steps towards demystifying the computer of the 80s and 90s did good work, but ultimately, the computer industry left the whole idea behind, in favor of making some tasks Very Easy while making other tasks Practically Impossible, and turning everything into a surveillance device.

When I was a kid I was brought up with computers that showed you how they worked.

You booted into a command prompt or a programming language, or you could get to one, if you wanted to.

I got to play with GW-BASIC and QBasic and, a little, with HyperCard.

I got to take apart software and put it back together and make things that made people happy.

I got to make things that I needed. I got to make things that make me happy.

Today, the tools to do that are complex to compensate for the vast additional capabilities of a modern computer, but also to reinforce technical elitism.

I often wonder why HyperCard had to die.

It was because Jobs wanted the Computer to be an Appliance. A thing only used in prescribed ways.

Letting people build their own tools means letting people control their own destiny.

If I can make what I want, or if someone else can make what they want, and then I can take it apart and improve it, why would I pay for an upgrade? Why would I pay you to build something that doesn't meet my needs?

I'm mentioning HyperCard specifically because I've been relearning HyperCard recently, and it is *better* and more useful than I remember it being.

It's honestly revelatory.

HyperCard, if you're unfamiliar, is PowerPoint + instructions.

Here's a great introduction/example: loper-os.org/?p=568

The author walks you through building a calculator app in about 5 minutes, step by step.

Warning: There's a bit of ableist language tossed around in the last paragraph. Skip it, there's nothing worth reading there anyway.

You use the same kinds of tools you would use to build a slideshow, but you couple them with links, multimedia, and scripting.

Want a visual interface for your database of client data? Great! Slap together a Rolodex card, and drop in a search function.

Go from concept to presentation-ready in an hour or two (or less, if you've done this before!)
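
To make the Rolodex example concrete, here is a minimal sketch of the same shape in Python/Tkinter rather than HyperTalk. This isn't HyperCard, and the names and sample data are made up; it's just an illustration of "a visible card plus one small script behind a button."

```python
# A toy "Rolodex card with a search box" -- not HyperCard, just the same idea.
# The record names and fields below are hypothetical.
import tkinter as tk

CLIENTS = {  # stand-in for "your database of client data"
    "Ada Lovelace": "ada@example.com / 555-0100",
    "Grace Hopper": "grace@example.com / 555-0101",
}

root = tk.Tk()
root.title("Client Rolodex")

query = tk.Entry(root, width=30)   # the search box
query.pack(padx=10, pady=5)

card = tk.Label(root, text="(no card)", width=40, height=4, relief="ridge")
card.pack(padx=10, pady=5)         # the "card" that shows one record

def do_search():
    # The entire "script": look up the name and put the result on the card.
    name = query.get().strip()
    card.config(text=f"{name}\n{CLIENTS.get(name, 'not found')}")

tk.Button(root, text="Search", command=do_search).pack(pady=5)

root.mainloop()
```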

HyperCard was easy to use. Everyone who used it loved it. It was integral to many businesses' daily operations.

Jobs killed it because he couldn't control it.

Microsoft doesn't ship any tools for building programs with their OS anymore, either.

They used to. There was a time when you could sit down at any Windows or DOS machine and code up a program that would run on any other Windows or DOS machine.

But we can't have that anymore.

In the name of Ease of Use, they left out the Human aspect.

Use your computer how you're told to use it, and everything is easy.

Do anything new or novel and it's a struggle.

My nephew has an iPad.

He asked his dad how to write games. His dad didn't know. His dad asked me how to write games on an iPad. I told him not to bother.

My nephew asked me how to learn to write games.

I gave him a Raspberry Pi and a copy of PICO-8.

Now he writes computer games.

He couldn't do that on his iPad.

HyperCard would be a perfect fit for the iPad and iPhone.

Imagine it!

Imagine the things you could build.

But we aren't allowed to have computers that are fun to use, that are easy to build for, that are human centric, or human literate.

The last 10 years of development in computers were a mistake. Maybe longer.

Instead of making computers Do More, or making them Feel Faster, we've chased benchmarks, made them more reliant on remote servers, and made them less generally useful. We brought back the digital serfdom of the mainframe.

In the first episode of Computer Chronicles (youtube.com/watch?v=wpXnqBfgvP) the mainframe guy is really adamant about how mainframes are good and micros are bad.

The host, a microcomputer legend, disagrees pretty strongly.

Later, when they talk about the future of networking, the mainframe guy talks about it as a return to mainframes. The micro guy talks about BBSs, peer to peer networks.

The mainframe guys are winning.

(this is not to say that I think mainframes are bad. I don't. Mainframes can be really good and interesting! PLATO was wonderful, as were some of the early Unix mainframes.

But IBM style Mainframe culture is The Computer as a thing you Use but don't Control culture, and I am very against that.)

I have to step away for a while. I'll continue this later.

@ajroach42 I want to respond, elaborate, & discuss at length here. I spent about 10 months some years ago immersed in the computing literature around the history of debuggers, during which I went from EDSAC to Visual Studio, but also all the other half-dead ends of computing history, such as Lisp machines.

Naturally, I came out of it a Common Lisper, and also naturally, with Opinions about modern computing.

Up for the discussion? It could get wordy and over a few days. :)

@pnathan for sure.

I haven't gotten into Lisp machines yet, but I'm always down for discussion.

@ajroach42 @pnathan
This thread is going to be gold :)
(I'm replying here so that I won't forget about it...)

@ciaby @pnathan I hope you enjoy! I'm looking forward to the discussion as well.

@ajroach42 @ciaby
OK, so, I'm about a decade older than you, Andrew: I taught myself QBasic in the mid 90s, got online late 90s, never really looked back.

First, I want to say this: older computer systems - considered as systems - were generally more capable.

But to be clear, they were limited in use for those who didn't take an interest in learning them. I'm talking about things that weren't Windows 3.1+.

@ajroach42 @ciaby This was the Great Debate that was largely won by Microsoft. "Everyone can 'use' a computer." That is to say, everyone can operate the appliance with preinstalled software. *everyone*. Apple pioneered the notion, but it turns out to be the preferred mode for businesses, which would really rather not have specialized experts.

@ajroach42 @ciaby It is my contention that Windows (& *nix) computer systems are designed to be administrated and managed by sysadmins, and the user experience in this case is great.

When you have sysadmins, there are no driver problems. There are no printer problems. There are no problems, as a matter of fact: it's all been taken care of by the admins.

This is exactly how executives like it.

Apple does the same, with their iPhone.

Apple is the sysadmin, metaphorically.

@pnathan @ciaby This is a good point, but I think it deserves scrutiny.

I am employed as a support engineer and a sysadmin, and I still run into driver issues, printer issues, etc.

I take care of them, eventually, when I can.

But, even after doing this for 10 years, I still encounter problems that I can't solve (because there isn't a solution.)

but the metaphor of Apple as sysadmin, I'll accept. I disagree with someone else admining my phone, but that's another issue.

@ajroach42 @ciaby your users pay you so they don't have to care about sysadmin issues. their world is great!

@ajroach42 @ciaby I'm glossing over the 1% failures to get at the core point: sysadmins are designed into the windows and unix world so users can focus on their core competency.

@ajroach42 @ciaby

Hi, I'm probably near the age of @pnathan, and while I'm not a Lisper anymore (it's been ages since my Emacs fluency), I agree with all he said.

To give some context, I'm a polyglot programmer currently working on a brand new operating system jehanne.io

Now, the assumption that you seem to share is that people cannot learn how to program. I used to think this too.
Now, however, I realize that it's like we were scribes of Ancient Egypt arguing that people cannot write.

@Shamar @ajroach42 @ciaby I'll eyeball your work.

people can program. people do program. where there is a will there is a way.

and there are many many ways to program.

arguably most are terrible, and the ones that condescendingly target newbies produce the worst systems overall.

@pnathan @ajroach42 @ciaby

Thanks! 😃

What I mean is that #history can teach us a lot about the present (and the #future) if we are able to interpret it with the right eyes.

Why were peasants unable to write in Ancient Egypt, while they are able to now?

I think the main reasons are:
1. the writing system was too "primitive"
2. writing was functional to the #power structure back then.

What does this mean for us?

That the #complexity of #programming is not necessarily inherent to the matter.

@Shamar @ajroach42 @ciaby

That the #complexity of #programming is not necessarily inherent to the matter.

here is where I disagree.

the complexity of understanding the "web stack" is incidental; the complexity of understanding the concept of distributed computing and comms protocols is fundamental.

or something as simple as rendering bits to the screen. raster? vector? what abstraction do you choose to execute the display mechanism. now you have a model.

@Shamar @ajroach42 @ciaby ... continuing. Next year, maybe you want a different model, so you break off and redo it a bit. Now you have to figure out how to juggle two incompatible models in your code, and you're on your way to inventing an abstract interface system.

even if you're doing assembly!
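
(To make that last step concrete: here is a tiny, hypothetical sketch in Python of the "two incompatible drawing models behind one abstract interface" situation. The class and method names are invented for illustration; real graphics stacks are of course much hairier.)

```python
# Hypothetical sketch: two incompatible drawing models (raster vs. vector)
# hidden behind one abstract interface, so application code doesn't care.
from abc import ABC, abstractmethod

class Surface(ABC):
    """The abstract interface both models get squeezed behind."""
    @abstractmethod
    def draw_line(self, x0: int, y0: int, x1: int, y1: int) -> None: ...

class RasterSurface(Surface):
    """Model A: a grid of pixels."""
    def __init__(self, width: int, height: int):
        self.pixels = [[0] * width for _ in range(height)]
    def draw_line(self, x0, y0, x1, y1):
        # Crude on purpose: mark only the endpoints to keep the sketch short.
        self.pixels[y0][x0] = 1
        self.pixels[y1][x1] = 1

class VectorSurface(Surface):
    """Model B: a list of shapes, no pixels at all."""
    def __init__(self):
        self.shapes = []
    def draw_line(self, x0, y0, x1, y1):
        self.shapes.append(("line", x0, y0, x1, y1))

def render_scene(surface: Surface) -> None:
    # Application code talks to the abstraction, not to either model.
    surface.draw_line(0, 0, 3, 3)

for backend in (RasterSurface(4, 4), VectorSurface()):
    render_scene(backend)
```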

@Shamar @ajroach42 @ciaby

here's my claim: software is crystallized thought, with all the complexities, ambiguities, and changing over time of thoughts. we can gut the whole shaky tower of modern computing, and we'll still be confronted with the core problem (even assuming a workable and standard bit of hardware for the engineering problems, themselves non-trivial sometimes)

@pnathan @ajroach42 @ciaby

You are in a way quoting my favourite #programmer, my favourite #hacker: Edsger Wybe Dijkstra.

For sure, "computational thinking" is as hard as #math is.
For sure, hardware issues exist.

But the point is that, despite all the progress that we see when we look at our #smartphone after reading about #ENIAC, we are still using #hieroglyphics.

The way we #think is strongly dictated by what we know.

We should get in the habit of #challenging them.

@Shamar @ajroach42 @ciaby

EWD was probably the most astute prophet of software engineering that has lived to date.

let me challenge you: what is the secret knowledge which, knowing, would unlock the door?

@Shamar @ajroach42 @ciaby ah but that doesn't get anywhere until we start digging.

what is simple? is it the ability to point and click a mouse? is it a keyboard key?

both of those have deep wells of complexity and knowledge to make happen, despite surface simplicity.

or is it a transistor, which accumulation of produces unspeakable complexity?

@pnathan @ajroach42 @ciaby

You are confusing #simple with #easy. #Simplicity can be very hard to achieve.

Also, you are assuming I have that knowledge clear in my mind.

I have not.
I just have a natural inclination for finding the orthogonal axes that govern complex problems, so I'm pretty good at moving from one point to another in such multidimensional systems (aka solving problems, or foreseeing and avoiding them).

I'm a hacker from the past, like everybody here.

But even if I don't know the ...

@pnathan @ajroach42 @ciaby

But even if I don't know the #solution, I see the #problem very well. Every day. Clearly.

I fight with it in my own #mind.

The problem is that we do not yet have a #math able to describe #simplicity.

For example, simplicity composes well.
Simplicity can stack.
Simplicity is deep. Moreover, it's fractal.

Simplicity is what I seek in any piece of code I write (see for example another project of mine: epic.tesio.it/ ).
We need more #CS and #math #research for it.

@pnathan @ajroach42 @ciaby

I find the current mainstream stack frustrating. Very frustrating.

I recently realized that the web is still a weapon of the USA's DARPA (one that sometimes backfires).
And #Javascript is so far the apex of that military technology: all over the world we run code under the control of USA companies (which in turn are under the control of the USA government).

But this just scares me.

What frustrates me ...

@pnathan @ajroach42 @ciaby

What frustrates me is the total resignation of people to this state of things, as if the current shit was the best possible stack that we can conceive.

And #WebAssembly is coming!

No guys, no... we have to throw all this away and start from scratch, with the lessons learned.

We can do it better.

And we CAN.

(sorry for the passionate rant... it's pretty evident I suffer a lot from this state of things)

@Shamar @ciaby @pnathan The web isn't all bad, and it's not all bad technologies, but it isn't all good either.

All I'm asking is that we take a step back and examine our modern software with a more critical eye towards how we could improve it for future generations.

I'm not sure why this has become so controversial.

@ajroach42 @pnathan @ciaby

Sorry, I didn't want to seem controversial.

I have been reflecting on these specific topics for a while now, so I joined the discussion in the hope of sharing an interesting perspective.

@Shamar @ciaby @pnathan

Oh, sorry. I wasn't clear in my post.

I agree with most of what you've said. We disagree on some nuance, but that's fine.

I was saying that other people have found my statements on this subject very controversial, and I'm not sure why.

@pnathan @Shamar @ajroach42 @ciaby
I'm gonna repost my newly relevant diagram.

there isn't a bright line between programming and passive computer use. all UIs are programming languages. most are simply shitty, overly constrained languages that make simple tasks nearly impossible.

niu.moe/media/uHnt_DOdjxWrSE0G

@enkiv2 @pnathan @ajroach42 @ciaby

Nice charts.

However, I know Unix quite well, but I would not say that the effort decreases with task complexity. It's pretty unlikely, because you would reach negative effort soon.

Also I think the ideal UX plot over such axes would be something like a 45° rotated hyperbola, like this wolframalpha.com/input/?i=x%5E

I don't think such a curve can be beaten.

@enkiv2 @pnathan @Shamar @ajroach42 @ciaby Those are good charts that depict a useful idea, but I don't think what they describe is what most people know as a “learning curve”—which would be proficiency (as a percent of the tool’s available capability, y-axis) across time spent (x-axis).

@pnathan @ciaby @ajroach42 @joeld @Shamar You're right. They describe the difficulty of solving a problem with a tool vs the inherent difficulty of that problem -- which is essentially an inverse of the learning curve for tools that can address all problems. Tools that fail by making tasks that aren't easy impossible, of course, have a misleadingly good-looking learning curve.

@pnathan @ciaby @ajroach42 @joeld @Shamar (The catch being: they can't actually do anything, so having 100% mastery over them isn't actually valuable.)

@pnathan @ciaby @ajroach42 @joeld @Shamar I cross-posted this thread to lobste.rs, producing a comment thread over there that is at turns useful and infuriating: lobste.rs/s/q31cqp/thread_abou

@pnathan @ciaby @Shamar

To the user who wants to display bits on the screen, it shouldn't matter unless/until they want to display bits in a way that one format handles over the other.

I can see how and why it matters to someone building more complex systems, but if all I want to do is have a text input box, why do I need to care about anything else you said?

@ajroach42 @ciaby @pnathan

To get #freedom.

It's more or less the difference between grasping at reading words so that you can better serve your Lord with the shopping list, and being able to write a #political article in a newspaper to fight for your #rights.

#Programming today is just like #writing and #reading a couple centuries ago.

It's a matter of #power and freedom.

It's not just about being able to code, it's about #thinking as a #programmer.
Thinking as a #hacker makes you #free.

@Shamar @ajroach42 @ciaby One system that I have been curious about is the Oberon OS (ethoberon.ethz.ch/white.html#S). Apparently it was extremely successful, but external pressures collapsed it.

@Shamar @pnathan @ciaby I never said people can't learn to program.

I'm saying that some people don't want to learn to program, and that what we call "programming" is needlessly difficult for some tasks, in the name of corporate profits.

@Shamar @pnathan @ciaby I feel like you think this was a clever point, but I don't understand what you mean.

Programming is a specialty, and some people have other specialties. Expecting them to also become expert programmers because our current expert programmers can't be arsed to make extensible and understandable tools is unreasonable.

@ajroach42 @ciaby @pnathan

This is the assumption I challenge.

For sure programming is a speciality right now.

But it's a speciality just like reading, writing and counting.

Not everybody can be a novelist, nor a professional mathematician.

But people should be able to program, just like they are able to read, write, compute a volume, reason about an average speed, a length...

Programming is harder than math because we are still using primitive tools.

It's sad that we are happy with them.

@Shamar @pnathan @ciaby

Some kinds of programming (just like some kinds of math) will remain hard.

But better tools are what I'm after, yeah.

@ajroach42 @ciaby @pnathan

We need a lot of #research.

We should #hack more.

And we need better #math too.

It will take some centuries.

Much more, if each generation keeps being satisfied with the shit it slightly improves (or messes up, as we did with the #web when we gave up #XHTML for #Javascript).

Because, to me, the tools we need are as different from today's mainstream tech as our writing system is from Egyptian #hieroglyphs.

@Shamar
I feel that it's not only a matter of research, but also a matter of throwing away some tech that we take for granted (x86, for example) and rebuilding from scratch with different assumptions in mind. In the current economic system I find it quite hard to do...
@pnathan @ajroach42

@ciaby @pnathan @ajroach42

You are largely undervaluing the #power of #technology.

Most programmers are not well versed in #history, and it's a pity. There's a lot for us to learn from history.

Technology is probably the most powerful and effective way to change the world. Most changes in human organizations have been enabled by technological innovations: from fire to boats, from bronze to iron, through agriculture, writing, counting, and roads, from swords to guns...

Technology can ....

@ciaby @pnathan @ajroach42

Technology can change the world for the better or for the worse.

It can disrupt "the current economic system".

And that's why #capitalists are in a hurry to keep #hackers under control.

So, I don't think that the "current economic system" should be a problem for hackers.

We CAN throw away the web.

I really mean it (I work with browsers all day; I know the stack pretty well...).

From scratch, with the lessons learned, it will take a fraction of what it took.

@Shamar @ajroach42 @pnathan
That's possible, and somehow is already happening.
What I'm talking about, however, goes much deeper than that. I'm talking about open hardware infrastructures, where every component is documented and there are no binary blobs or proprietary firmware.
Very important is also the instruction set, because what we have now (x86/amd64) is incredibly bloated and full of backward compatibility shit.
RISC-V is a step in the right direction. If only the hardware wasn't so expensive... ;)

@ciaby @ajroach42 @pnathan

Interesting point. You're right.

We cannot actually trust the hardware, either.

But... I'm a software guy.
I'm more concerned about the way we connect and use the hardware than the hardware itself.

Indeed, everything I've had to learn about hardware while developing my x86_64 OS has been a pain.

So, YES!

We need more research on the open hardware too.

There are only 2 things that I'd like to keep: little endianness and 64-bit longs.

Please. 😇

@Shamar @ciaby @ajroach42 Bold claim: open source or non-open source hardware doesn't matter when deployed at scale.

the essential problems today are, in a sense, all software, mediated by the scale.

@Shamar @ciaby @ajroach42 that said:

we have these *inter-twingled* issues: the hardware is manky, the software is manky, and the incentives to improve are perverse.

nymag.com/selectall/2018/04/da

My reckoning is that there is a space today for a sort of New System, a Unicorn OS, where the whole thing is largely rebuilt. Does the web have to exist? Does TCP/IP? Are there better systems?

Here we see that we make choices, and one choice prioritizes those who take the time to learn the system, and one... doesn't.

@pnathan @ciaby @ajroach42

What is Unicorn OS?

Oh... you cited Oberon some toot ago.

I like it a lot (but I have to admit that I've never tried it on real hardware).

I love that Wirth still works on the Oberon-07 language, and that he keeps REMOVING features.

Oberon inspired Pike for the Plan 9 UI. I started from Plan 9, and frankly I'm not brave enough to throw away TCP/IP as Wirth did.

Still... Wirth's approach (hack, hack, hack, challenge all assumptions, keep it simple!) is what we need.

@Shamar @ciaby @ajroach42 UnicornOS: the magic OS that we're talking about that solves the problem.

with a sparkling dash of rainbow over it, because, you know, it's magic. :)

@pnathan @ciaby @ajroach42

Oh... the funniest definition of vaporware I've ever seen!

UnicornOS: the first #vaporware with a #rainbow! 🤣

(Disclaimer for any actual developer of an OS called Unicorn: I'm just kidding... the joke was simply too good... sorry)

@pnathan @ciaby @ajroach42

I do not know actually.
I literally know nothing about #hardware.

But my insight is that probably, #cheap #OpenHardware and #simple #distributed #software #systems could change the world.

I have a dream: one low power mail server in every house.
End to end mail encryption everywhere.

Unfortunately no one seems interested in such a huge business opportunity.

Sad.

@Shamar @ciaby @ajroach42

ah jeeze man, think of the sysadmin needs.

the mail servers fail. the administration is confusing because docs aren't perfect, so it gets misconfigured. the network goes down. baby pukes on server and it fails to boot. server is overloaded by volume of spam.

then the task is outsourced to a guy interested in managing the emails....... whoop whoop we're recentralizing.

@pnathan @ciaby @ajroach42

Oh no, I can't accept that it's not possible.

It's not easy, but we buy and sell firewalls, routers, wifi hotspots... we can sell mail servers too.

And with E2E encryption by default, do we really need spam filters?

@Shamar @ciaby @ajroach42 my Inner Young Geek wants to argue that actually configurable systems are not really used in the home, and that mail servers cross that barrier between appliance and administration-needing machine.

but let's not rabbit trail onto that. ;-)

more my contention and question is: should we expect a member of cyberspace to be knowledgeable in minor sysadmin?

I argue yes! we expect people to be able to refill their oil in cars, right?

@pnathan ... no?

There's a whole industry out there of shops that only exist because people don't change their own oil.

@pnathan @ciaby @ajroach42

Exactly!

Give us a little cheap fanless server and we will move the world!

@pnathan
At scale, yes.
Although I feel that software is not evolving because:
The effort to develop a new OS is too great, given the amount and complexity of modern hardware (and closed specs).
Without a new OS, you can't develop new paradigms, and so we're stuck with ideas from the 70s (Unix mostly, plus VMS-influenced Windows).
Programming languages have to use the OS, and that's why we're not really progressing...
My proposal: simpler hardware, open and documented. Build on top of that. No backward compatibility. :)
@ajroach42 @Shamar

@ciaby @ajroach42 @Shamar I agree that backward compatibility has to be nixed for real research and change to occur.

now I have to debug a piece of code that is like the reification of all bad backend possibilities combined.

@ciaby @pnathan @ajroach42

What if everything was truly a #file?

What if all you needed to support an #OS (and all the #hardware it can handle) were 16 syscalls?

Keep this in mind and take a look at jehanne.io
(#Jehanne still needs 26 system calls, but I welcome suggestions to polish it further :-D)
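
(To illustrate the philosophy rather than Jehanne's actual interface: below is a toy Python sketch of the "everything is a file" idea, where a pretend device is driven entirely through the ordinary file protocol instead of device-specific calls. The Thermometer class and its reading are made up.)

```python
# Toy illustration of "everything is a file": a pretend sensor that client
# code reads through the ordinary file protocol. This is NOT Jehanne's API.
import io

class Thermometer(io.RawIOBase):
    """A made-up device exposed as a read-only, file-like object."""
    def __init__(self):
        self._buf = b"21.5 C\n"   # hypothetical reading

    def readable(self):
        return True

    def readinto(self, b):
        n = min(len(b), len(self._buf))
        b[:n] = self._buf[:n]
        self._buf = self._buf[n:]  # a drained buffer signals EOF on the next read
        return n

# Client code only knows the file protocol: read() it, close() it.
dev = io.BufferedReader(Thermometer())
print(dev.read().decode().strip())
dev.close()
```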

@Shamar @ajroach42 @pnathan
I was actually looking at it before and found the concept quite interesting :)
Can I run it in a VM easily?

@ciaby @ajroach42 @pnathan

Follow the readme, and you should get it on QEMU pretty fast.

(The first build takes 30 minutes due to GCC; later builds take 3 minutes at most)
