@kick @enkiv2 @dredmorbius @freakazoid
It's more a matter of: the social problem cannot be fixed by a technical change, so we should employ a social change instead. No matter what we do on a technical level, we can't really move the needle on this.

@enkiv2 @kick @dredmorbius @freakazoid
Changing norms is harder than employing technical systems because power is not as lopsided. To change norms, you need buy-in from most participants; to change tech, you just need to be part of the small privileged group who controls commit access. This is why it's so important, though. Norms aren't set in stone but they'll only change if you can actually convince people that changing their habits is a good idea!

@enkiv2 @kick @dredmorbius @freakazoid
Most people online have had bad experiences with people weaponizing out-of-context information -- that's why technical solutions like the right to be forgotten (RTBF) exist. It's equally obvious to most people that RTBF doesn't actually work, while simultaneously pushing power into the hands of centralized corporate services. Saying "it's impolite to dogpile on somebody without checking whether or not you've been misled first" is way less extreme.

@enkiv2 @kick @dredmorbius @freakazoid
Re: the speed at which norms can change, consider content warnings. In about ten years they went from something almost nobody had heard of -- a technique a handful of folks with PhDs were experimenting with to avoid meltdowns in extreme circumstances -- to something everybody is aware of, and that only jerks believe are never justified. We still argue about when they're justified, but there isn't a serious contingent against using them at all.

@enkiv2 @kick @dredmorbius I'm not arguing for RTBF. I'm arguing for not making it impossible to unpublish content.

CWs are nowhere near universal and the fact that they're not proves my point quite nicely.

@enkiv2 @kick @dredmorbius There's also the fact that people deliberately exploit immutable systems to publish stuff that's damaging. For example, there's kiddie porn in the Bitcoin blockchain.

@freakazoid @enkiv2 @kick @dredmorbius
This is a fair point, though I wouldn't pick CP as a good example of an infohazard. Depending on one's model, CP is contraband either because a market for it incentivizes abuse or because exposure to it incentivizes abuse. Under the former model, having it on the blockchain lowers abuse potential. Obviously a complex & emotionally charged topic (even more so than "if you burn a million dollars does the value of a dollar bill go up or down").

@enkiv2 @freakazoid @kick @dredmorbius
The risk profile of putting contraband or blackmail material on a blockchain is basically the same as the risk profile of keeping a copy on paper in a safety deposit box & periodically mailing out photocopies -- except that this latter *only* works for people with an incentive to store info indefinitely. In other words, it puts the power to select what gets remembered in the hands of whoever thinks they will want to distribute it in the far future.

@enkiv2 @freakazoid @kick @dredmorbius
Really, norm-based solutions can't work unless practically everything is immutable. If everything is immutable, then context can be retrieved in the future even if nobody thought to preserve it at the time. This functionally defangs blackmail, because lies-by-omission are no longer backed up by layers of friction between everybody & whatever information was omitted.


@enkiv2 @kick @dredmorbius I don't see how making things impossible to unpublish could defang blackmail. Blackmail will just apply to information that hasn't been published in the first place.

This goes beyond mere disagreement; this is a system I would kill to stop.

@enkiv2 @kick @dredmorbius This is the argument 4channers make against outlawing revenge porn. "Women just need to learn to stop allowing boyfriends to photograph them naked, or accept that naked pictures of them are going to be on the Internet."

No. We live in a society. You publish shit that hurts someone else, you get hurt yourself.

@freakazoid @enkiv2 @dredmorbius Why is it always 4chan users who get blamed for bad culture on the internet? It's literally the queerest place on the entire network, yet without exception it gets blamed for the things that redditors are primarily responsible for.
@freakazoid @dredmorbius @enkiv2 *No. We live in a society. You publish shit that hurts someone else, you get hurt yourself.*

This is a slippery and stupid slope, and it justifies what's currently happening to people like Snowden, Manning and Assange, despite them not doing anything that was actually morally wrong. I'd accept a claim like this with reduced scope, but as it stands that's way too wide.

@kick @enkiv2 @dredmorbius "sometimes people get punished for things we don't think they should be punished for" is not an argument in favor of not having any limits at all, so I'm not super interested in debating it.

@kick @enkiv2 @dredmorbius Super uninterested in a 4chan vs Reddit debate. I couldn't care less about 4chan getting blamed for terrible shit they aren't actually responsible for given all the terrible shit they (or rather the shitheads they allowed to take over) were responsible for.

@kick @enkiv2 @dredmorbius Actually I'm being too generous. The folks there were plenty comfortable with racist, homophobic, and transphobic language from the very beginning. If Moot had deliberately set out to build a Nazi indoctrination camp, I have no idea what he would have done differently.

@freakazoid @enkiv2 @dredmorbius

You're being kind of ridiculous, which is kind of frustrating to see from someone who otherwise has been mostly at least together, view-wise.

There's a board dedicated to queer people (three of them, if you include boards dedicated to queer anime/manga/etc.), 90% of boards have zero political discussion (I'm not joking about this; some boards even ban it, if I recall correctly), and Moot wasn't "comfortable" with any of that stuff; he's neither a Nazi nor a Nazi sympathizer -- hell, he works at Google now.

He (rightfully) believed that spaces where people can interact without identifying themselves are important -- and that's the correct view to have.

@freakazoid @enkiv2 @kick @dredmorbius
I'm not opposed to ramifications for bad behavior. I'm trying to figure out how to encourage punishment to be equitable. Part of that is preventing motivated misrepresentation (and power asymmetry in misrepresentation). Right now, would-be blackmailers choose what gets to become history, so they can spin anything as a sin.

@freakazoid @enkiv2 @kick @dredmorbius
I'm not sure, in that case, what risk profile you're talking about. Are we talking about a case where someone publishes something about themselves that they later regret? Where someone publishes something about themselves & another party takes it out of context? Or where someone publishes information about someone else without permission?

@enkiv2 @freakazoid @kick @dredmorbius
I can't think of an example of a problem that being able to unpublish only things that you yourself have published will reliably solve, in a world where backups & blackmailers exist. (It solves the pseudo-problem of deciding that a post you've published is potentially risky and undoing it before it has actually caused a problem. I don't think that's what you're talking about, though.)

@enkiv2 @freakazoid @kick @dredmorbius
And, on the other hand, unpublishing what *other people* have published doesn't appear to be on the table. It has a lot of issues and complications, & is generally handled by lawsuits or by corporate simulations of lawsuit-style deliberation. It can be handled by admin fiat in federated systems but scaling to distributed systems means it becomes a per-post version of transitive blocking. (Cancel messages, etc.)
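
A rough sketch of what that per-post transitive blocking could look like, under purely hypothetical names (this is not any existing protocol's API): each node drops a post once anyone in its transitive trust set has issued a cancel message for it.

```python
# Hypothetical sketch of per-post "cancel messages": a node hides a
# post if anyone in its transitive trust set has cancelled it.

def trusted_set(me: str, trust: dict, depth: int = 2) -> set:
    """Moderators I trust, plus whoever they trust, out to `depth` hops."""
    frontier, seen = {me}, {me}
    for _ in range(depth):
        frontier = {t for m in frontier for t in trust.get(m, set())} - seen
        seen |= frontier
    return seen

def visible_posts(me, posts, cancels, trust):
    """cancels maps post id -> set of moderators who cancelled it."""
    trusted = trusted_set(me, trust)
    return [p for p in posts if not (cancels.get(p, set()) & trusted)]

trust = {"alice": {"bob"}, "bob": {"carol"}}
cancels = {"post-1": {"carol"}}
# alice trusts bob, who trusts carol, so carol's cancel propagates.
print(visible_posts("alice", ["post-1", "post-2"], cancels, trust))
# -> ['post-2']
```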

@enkiv2 @kick @dredmorbius My goal is to make it easy to indicate, to people who don't want to publish stuff against the will of the folks impacted by it, that you'd like them to take it down.

@enkiv2 @kick @dredmorbius The archive.org situation is one example: even though they will take stuff down on request, you have to ask them and everyone else separately.

Yes, there will be attempts to abuse such a system, which is why it should not be legislated into place by government but built by people who want to have a robust publication system that at least makes an attempt to minimize harm.

@enkiv2 @kick @dredmorbius I think the big issue here is reachability vs discoverability. This was an issue Mark Zuckerberg did not understand when designing graph search, until Facebook employees practically revolted and told him that it was a bad idea to let people bypass permissions like friends list visibility just because it was possible to construct someone's friends list by scraping others' pages. It's also encountered when public records go online.

@freakazoid @enkiv2 @dredmorbius Graph search lasted for six years with full functionality, and it doesn’t seem like it was that bad of a solution for Facebook.

Also, it wasn’t designed by Zuckerberg, it was designed by Google employees.

(And further, it was a great idea. So much was dug up on politicians because of it that the world was in an undeniably better spot.)

@kick @enkiv2 @dredmorbius The solution they put into place for graph search was that you could only search edges that were accessible to you in both directions. It wasn't a fundamental problem with graph search, just a problem with how Zuck was thinking about the permissions model.
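
As a minimal sketch of that "both directions" rule, with hypothetical names rather than Facebook's actual code: an edge is searchable only when each endpoint exposes its side to the querier, so scraping one side can't reconstruct a hidden friends list.

```python
# Hypothetical sketch of the "bidirectional visibility" rule described
# above: an edge (a, b) is searchable by a querier only if *both*
# endpoints expose their friend lists to the querier.

def can_see_friend_list(viewer: str, owner: str, visibility: dict) -> bool:
    """visibility maps each user to the set of users allowed to see
    their friend list ('*' meaning public)."""
    allowed = visibility.get(owner, set())
    return "*" in allowed or viewer in allowed

def searchable_edges(viewer, edges, visibility):
    """Yield only edges visible to the viewer from BOTH endpoints."""
    for a, b in edges:
        if can_see_friend_list(viewer, a, visibility) and \
           can_see_friend_list(viewer, b, visibility):
            yield (a, b)

edges = [("alice", "bob"), ("bob", "carol")]
visibility = {"alice": {"*"}, "bob": {"alice"}, "carol": {"*"}}
# carol can't search the alice-bob edge: bob only exposes his side to alice.
print(list(searchable_edges("carol", edges, visibility)))  # []
```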

@kick @enkiv2 @dredmorbius Zuck was the product owner. I'm aware former Google employees designed the tech; I worked there for the entire time it was being designed and used the internal versions of the same technology.

Please take your arrogance elsewhere.

@freakazoid @enkiv2 @dredmorbius You worked there? Using your logic from elsewhere in this thread: why were you willingly ruining society?

Facebook was controversial from the outset, and by the year that that product was launched, people knew what it was doing pretty well (and it's not like Facebook employees couldn't get jobs elsewhere).

@kick @enkiv2 @dredmorbius @freakazoid I suspect that probably every competent spy agency is delighted with their chance to blackmail current and future politicians with the data they scraped, or to figure out how to get agents close to them, or who their family members are. Journalists don't seem to be doing much with this data, although I could be wrong about that? I suppose nobody can publicly admit to having it at this point.

@kragen @enkiv2 @dredmorbius @freakazoid They did for years for source hunting, if I remember correctly, though I'll admit I may not remember correctly.

@kick @enkiv2 @dredmorbius @freakazoid Oh, interesting! I'd like to find out more if you find something.

@kragen @enkiv2 @dredmorbius @freakazoid Luckily, the Wikipedia article looks like it mentions an occurrence (I wasn’t aware of this one, actually). Bellingcat (which is a low-volume but very interesting investigative publication) apparently used it pretty heavily.

https://twitter.com/N_Waters89/status/1137379896899067904

(A quote from the article that Wikipedia cites: “Now that Graph Search has gone down, it’s become evident that it’s used by some incredibly important section[s] of society, from human rights investigators and citizens wanting to hold their countries to account, to police investigating people trafficking and sexual slavery, to emergency responders,” Waters told Motherboard in an online chat.)

@freakazoid @enkiv2 @kick @dredmorbius
Absolutely! I've sort of been arguing for this. When I pushed transitive blocking over unpublishing, it was because I think the biggest issue is the flatness of addresses/access: folks outside your group, who do not share your norms, can read your messages and force replies on you.

@freakazoid @enkiv2 @kick @dredmorbius
OK. I'm fine with that, and most mature systems for static content have facilities for it (e.g., IPFS has a hash blacklist for both fetching & forwarding that's basically the same as a killfile, along with mechanisms for folks to share these blacklists with each other).
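
A minimal sketch of that killfile-style blacklist, under assumed names (not IPFS's actual API): the node consults a locally held, shareable set of banned content hashes before it will fetch, store, or forward anything.

```python
# Sketch of a killfile-style hash blacklist for a content-addressed
# store, as described above. Hypothetical structure, not IPFS's API.

class Node:
    def __init__(self):
        self.store = {}           # content hash -> bytes
        self.blacklist = set()    # hashes this node refuses to handle

    def subscribe(self, peer_blacklist: set) -> None:
        # Merge a trusted peer's blacklist, like sharing a killfile.
        self.blacklist |= peer_blacklist

    def fetch(self, content_hash: str):
        # Refuse to retrieve blacklisted content.
        if content_hash in self.blacklist:
            return None
        return self.store.get(content_hash)

    def forward(self, content_hash: str, data: bytes) -> bool:
        # Refuse to host or relay blacklisted content.
        if content_hash in self.blacklist:
            return False
        self.store[content_hash] = data
        return True
```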

@enkiv2 @kick @dredmorbius It's not a question of unpublishing what others have published. It's about supporting the ability to ask that others unpublish things they have published. It need neither be reliable nor perfect in order to reduce harm. But it needs to exist.

@enkiv2 @kick @dredmorbius At any rate I feel that I've given conclusive proof that this needs to exist. If you remain unconvinced then there seems to be little point in my expending additional effort trying to convince you.

@freakazoid @enkiv2 @kick @dredmorbius
OK, yeah, I'm perfectly fine with this as harm reduction. I wouldn't call it 'unpublishing', because on a technical level, on a service that otherwise supported static content, it would be implemented as a blacklist of addresses (whose content would eventually become un-hosted as the number of nodes holding a copy approached zero).
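
As a toy illustration of that decay, with all names hypothetical: once an address is blacklisted, no node accepts a new copy, so ordinary churn alone drives the number of hosts toward zero.

```python
import random

# Toy model of the decay described above: blacklisted content is never
# re-replicated, so node churn alone drives its copy count to zero.

def replicas_over_time(nodes=100, initial_copies=20, churn=0.05, steps=60):
    holders = set(random.sample(range(nodes), initial_copies))
    counts = []
    for _ in range(steps):
        # Each step, some nodes leave (taking their copy with them);
        # because the hash is blacklisted, no node accepts a new copy.
        holders = {n for n in holders if random.random() > churn}
        counts.append(len(holders))
    return counts

print(replicas_over_time())  # monotonically non-increasing, tends to 0
```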
