Gopher talk
Gopher talk (goals)
Gopher talk (Stuff other people have mentioned)
Gopher talk (why?)
Gopher talk (w3c)
@ajroach42 time to build some modern OSS gopher clients? I'm in
@remotenemesis :-D That's my hope.
I'm still a few days/weeks away from being able to contribute, but yeah. That's what I want.
I want to see some folks building new libre gopher clients.
Gopher talk (tldr)
@Sci most of the bad choices have been recent.
The web was a good platform until it wasn’t anymore.
Recent bad decisions: EME (rendering browsers essentially permanently insecure to make Netflix happy), allowing CSS to become, essentially, a complete programming language, and stuffing JavaScript into the browser.
There have been other questionable or shortsighted choices (the use of the anchor tag for links, the competing image tags in early HTML, of which the worst one won).
@Sci I fully expect EME to be the worst of these choices, though. It’s going to be bad.
Beyond that, a lot of the bad web decisions were made not by standards bodies but by browser vendors and web developers, and many of those were pretty horrible too: cross-site tracking cookies, Flash, JavaScript.
The web should never have become an application layer. It should have remained a content delivery platform. Apps should be native, and hook into the net via APIs.
@ajroach42 It's been a long time since I was into the deep technicalities, so I'm playing catch-up a bit.
If I understand correctly, EME renders browsers insecure because it allows a remote vendor to install a decryption module into your browser, which could contain anything including malicious code or security vulnerabilities, yes?
And both CSS and JavaScript because they push browsers beyond just displaying static documents, and allow code execution within them?
@Sci EME is a form of DRM that is bundled into web browsers.
Technically it's sandboxed and tested and should be "safe", but because it is a form of DRM it is protected by the DMCA, making the disclosure of security vulnerabilities in the web browser that *might* be related to EME a felony.
The w3c was given the opportunity to stop this, and make members provide an exception for security research and/or accessibility. They refused.
@Sci So now every major web browser has a thing in it that it's illegal for anyone to look at, and we don't have even the most basic assurances that someone who discovers a flaw in EME (and there will be flaws) won't go to jail for disclosing it.
CSS and Javascript I'll address separately.
CSS is supposed to define how a browser displays elements on a page. It's now a programming language. Current CSS takes lots of computing power (which is bad) and can be used to hide/do malicious things.
Some things are easier and more secure because of CSS3. A lot of things are harder, and more complex (and less secure because they are more complex, if not because they are directly less secure.) This means that you've got to update your hardware more often. Modern CSS techniques also frequently wreak havoc with accessibility, because everyone is trying to reinvent the wheel.
@Sci Javascript.
Javascript is complicated. I am of the opinion that Netscape made a mistake including it in browsers to begin with, but that's just me.
All the stuff I said about accessibility and hardware/performance issues goes double for JS.
Except that JS is a full programming language from the ground up. You can run modern applications in it. You can use it to emulate old computers.
It's neat!
It's also a huge performance and security hole. Malicious JS can cause many problems.
@Sci That is not to say that I think Javascript in general is bad!
I think it's great. Having this almost universal platform for application delivery is really neat!
I don't think it should be required to view a news article, or to log in to mastodon, or send an email.
I think JS should be downloaded by your web browser and then run in a separate application.
Browsers shouldn't assume Javascript is available. Browsers shouldn't know about JS.
@Sci You want AJAX features in your web page? Great! What you want is no longer a web page, it's now an application. We'll run it in a separate environment.
You want to mandate AJAX features so that I can read your news article or watch your video? That's probably actually sketchy!
And then you've got shit like: https://www.eff.org/deeplinks/2009/09/online-trackers-and-social-networks which illustrates the tracking problem back in 2009. (it's worse now.)
@Sci This is not a hopeless situation. I'm probably exaggerating the potential hazards for the average end user, but it could also potentially get a lot worse than it already is.
We're basically waiting for one of these things to snap, you know? Things haven't broken yet, but they could without much warning. All the bad things are in place, waiting for a catastrophe.
@ajroach42 Since modern cybercrime is all about finding an exploit and automating it, the hazards for the average end-user would seem just as high.
Thanks for your replies. It's helped frame it a lot better. It's hard to imagine the net as other than an application layer already.
From a utility perspective it makes sense to have the browsers do the heavy processing rather than just using them as UI for remote server apps. But when that code can contain anything, and the DMCA stops it being checked... ugh.
IMO, we need more server side code, and more dedicated applications, and less reliance on JS to replace native browser controls.
Have you read this: https://www.baldurbjarnason.com/notes/under-engineering-websites/
It's not directly about the problems of the modern web that I discussed here, but it goes through a lot of the reasons that native browser functions get re-engineered in worse form by valley companies, which is 100% part of the problem.
You mean, like Dropbox being essentially Gopher but poorer?
Well maybe Gopher 2049 should add some of the Dropbox functionality then, minus the Javascript.
@h @ajroach42 I don't have experience with Gopher, but from what I read it sounds very similar to FTP in intent. A network of file systems linking to each-other, rather than documents linking to each-other with http.
@Sci Yeah, it's basically a menu system to sit on top of FTP.
here, try http://gopher.ofmanytrades.com
That's a web proxy for my gopher server.
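To make "a menu system on top of FTP" concrete: a Gopher menu is just lines of tab-separated fields, so a toy client fits in a few dozen lines. Here's a minimal sketch in Python, assuming a server that speaks plain RFC 1436; the `example.org` host and selectors in the canned sample are made up for illustration:

```python
import socket

def fetch_raw(host, selector="", port=70, timeout=10):
    """One request per TCP connection: send the selector plus CRLF, read until close."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

def parse_menu(text):
    """Turn a raw menu into (item_type, display, selector, host, port) tuples."""
    items = []
    for line in text.split("\r\n"):
        if line == "." or not line:   # a lone "." terminates the menu
            continue
        item_type, rest = line[0], line[1:]
        fields = rest.split("\t")
        display = fields[0]
        selector = fields[1] if len(fields) > 1 else ""
        host = fields[2] if len(fields) > 2 else ""
        port = int(fields[3]) if len(fields) > 3 and fields[3].isdigit() else 70
        items.append((item_type, display, selector, host, port))
    return items

# Canned sample response, so this runs without network access:
raw = "1About\t/about\texample.org\t70\r\n0Hello\t/hello.txt\texample.org\t70\r\n.\r\n"
for item in parse_menu(raw):
    print(item)
```

Swap the canned `raw` for `fetch_raw("gopher.ofmanytrades.com")` to hit a live server.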
@ajroach42 It does seem so essential and useful, and in the current climate the only thing that makes it a hard sell is that all resources are presented as equal. You can't skin their presentation in standard Gopher, can you?
@Sci Nope!
Well, no, with caveats.
We totally could develop a system to let users or servers skin the presentation. Wouldn't even be that difficult.
Clients would have to support it, and we'd need to ensure it was a progressive enhancement, you know?
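A hedged sketch of what "progressive enhancement" could look like here: suppose servers emitted styling hints under an invented item type `S` (not part of RFC 1436, purely hypothetical). Clients that recognize it apply the hints; clients that don't simply drop the line, so plain menus stay plain:

```python
KNOWN_TYPES = {"0", "1", "i"}     # text file, submenu, info line (RFC 1436 / de facto)
SKIN_TYPE = "S"                   # hypothetical skin hint; invented for this sketch

def render(menu_lines, supports_skins=False):
    """Return (displayed entries, collected style hints) for a list of menu lines."""
    style = {}
    out = []
    for line in menu_lines:
        item_type, rest = line[0], line[1:]
        display = rest.split("\t")[0]
        if item_type == SKIN_TYPE:
            if supports_skins:
                key, _, value = display.partition("=")
                style[key] = value          # e.g. "color=green"
            continue                        # older clients silently skip it
        if item_type in KNOWN_TYPES:
            out.append(display)
    return out, style

lines = ["Scolor=green\t\t\t", "1Projects\t/projects\texample.org\t70"]
```

Either way the `Projects` entry renders; only skin-aware clients pick up the hint.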
But I'm not sure it'd be worth it.
@ajroach42 I'm thinking along the lines that if I had two primary forms of content on a Gopher server and wanted to promote one of them over the other, I wouldn't be able to. It's good for library-style access, but not so good for individuals.
Gopher as a supporting layer under http sounds good to me. Automatically display all publicly accessible folders in Gopher-mode. Or go to http mode for more contextual arrangement of resources.
@Sci I guess maybe I don't understand what you mean.
You can control what things display in the menus, and the order that those things are displayed, and the text that is displayed around them. You can also organize your files so that the ones people want are easiest to find.
You can't change the color scheme or bold certain items in the menu, but I don't see what good that would do.
If you want an HTML document to link to the files, write an HTML doc and serve it over gopher.
@Sci Unless I'm misunderstanding what you mean?
@ajroach42 I was thinking if I were to build something like http on top of Gopher, I'd probably treat the filesystem as a list of unique resources and reference them through it rather than as relative file locations.
I've not had to think in a structured way in a long while, so I'm liable to misuse a lot of terms.
@Sci Yeah, I guess I'm still not understanding your goal at all.
Sorry.
@ajroach42 I suppose I'm viewing Gopher as a concept rather than an implementation of that concept.
It's like FTP in the sense that it presents a list of resources available, in a hierarchy that maps almost exactly to a filesystem structure.
It's like HTTP in the sense that it's stateless, and content is served on a per request basis, over one TCP connection.
(the FTP protocol maintains two TCP connections: a control channel, and a data channel)
See gopher RFC1436:
https://www.ietf.org/rfc/rfc1436.txt
gopher URI scheme RFC4266:
https://tools.ietf.org/html/rfc4266
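The whole wire format described in RFC 1436 is small enough to sketch directly: the client sends one selector line over a single TCP connection, and a menu response is tab-separated entries ended by a lone "." line. A minimal sketch of both sides' message formats, with `example.org` as a placeholder host:

```python
def gopher_request(selector=""):
    """The entire client-to-server message: the selector followed by CRLF."""
    return selector + "\r\n"

def menu_line(item_type, display, selector, host, port=70):
    """One menu entry: type character, then tab-separated fields, CRLF-terminated."""
    return f"{item_type}{display}\t{selector}\t{host}\t{port}\r\n"

def menu(entries):
    """A full menu response, terminated by a line containing only '.'."""
    return "".join(menu_line(*e) for e in entries) + ".\r\n"

doc = menu([
    ("0", "Hello, world", "/hello.txt", "example.org", 70),  # '0' = text file
    ("1", "More stuff", "/stuff", "example.org", 70),        # '1' = submenu
])
print(doc)
```

Contrast with FTP: no control channel, no login, no state; one connection per resource, then the server closes it.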
@ajroach42 I can see why they do it. It does mean they, more or less, only have to develop for one platform instead of multiple. Short-sighted but understandable.
I suspect going more server-side in the current climate would make the net even more centralised. Data centres and their connections would be forced to grow, and better cement their positions. Decentralisation first? Which I'd assume would involve improving end-user webserver deployment.
@Sci I dunno if it would actually lead to more centralization.
It depends on if we go with oldschool (PHP and a prayer) server side code or new school (18 containers) server side code.
Deploying apps used to take ten minutes and could be done on super simple shared servers.
See, the problem is scale. https://medium.com/@jkriss/anti-capitalist-human-scale-software-and-why-it-matters-5936a372b9d
Scale is a trap. Build little things, and make them talk to one another.
Decentralisation and more server side code go hand in hand.
Everyone is trying to implement it as safely as possible, but we'll *never* know if it's actually safe, because disclosing vulnerabilities in EME is a felony.
I'm not trying to fearmonger here. Firefox is probably reasonably safe for most users.
But, so long as EME is in the browser, I will trust my browser even less.
@ayy Right.
But untrusted javascript crashes browsers or does malicious stuff all the time.
And then we fix it, because we can do security research on sandboxed javascript.
@ayy Like, you're not going to convince me that any sandbox is safe.
Malicious code can already break out of browser sandboxes into the OS of a virtual machine, and then break out of that virtual machine's OS into the host machine's OS.
https://en.wikipedia.org/wiki/Virtual_machine_escape
Have a recent example: https://www.vmware.com/security/advisories/VMSA-2017-0018.html
But we only know about these things, so that we can patch them, because it's not illegal to do security research on these platforms.
EME is unsafe. Full stop.
@ajroach42 I'm joking about the possibility of an EME equivalent to https://en.wikipedia.org/wiki/DeCSS
@a_breakin_glass oh right.
@a_breakin_glass we'll have that, or something like it, quickly.
Alternatively, browsers still have an analog hole.
Gopher talk (DRM and stuff)
We don't need more #DRM in our lives. We don't need more very complex software that no one understands.
Simple software, when possible, is better software.
the #w3c fucked us when they voted to allow #EME without even basic protection for security research.
But before that, Google and Facebook screwed us over by stuffing more and more javascript into our pages, and normalizing more tracking everywhere.