Federated Republic of Sean is a user on retro.social. You can follow them or interact with them if you have an account anywhere in the fediverse.
Federated Republic of Sean @freakazoid

@freakazoid Back to coercion:

I've adopted the BATNA test.

Who's got the best alternative to a negotiated agreement?

If one party just picks up the next prospect from a stack, and another faces a life- or business-disrupting event, then it's the latter who's subject to coercive pressure.

Capability to pursue disputes, invoke assistance, etc., are other parts of this.
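The BATNA test above can be sketched as a toy comparison function. This is my own illustration, not something from the thread; the numeric "fallback values" are hypothetical stand-ins for each party's best alternative to a negotiated agreement.

```python
# Toy formalisation of the BATNA test: whichever party has the worse
# fallback option faces the coercive pressure in the negotiation.

def coercion_pressure(batna_a, batna_b):
    """Each argument is the (hypothetical) value of that party's best
    alternative to a negotiated agreement. Returns which party, if
    either, is under coercive pressure."""
    if batna_a == batna_b:
        return "neither"
    return "party A" if batna_a < batna_b else "party B"

# An employer who can just hire the next applicant has a strong
# fallback (0.9); a worker facing missed rent has a weak one (0.1).
print(coercion_pressure(0.1, 0.9))  # -> party A
```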

@woozle @o

@freakazoid So, #GreshamsLaw dynamics can turn up in various forms. I've tried (unsuccessfully) to catalogue them in the past.

There's fiat or imposed value, as with coin. Also with transjurisdictional standards, such as divorce law and shipping registries ("flags of convenience"). Whatever the *minimum* acceptable *somewhere* is, is acceptable *everywhere*.

There's effective perceived value -- Mencken's "Brayard", or consumer technologies, or bicycles.

@o @woozle

1/

@freakazoid The limit there isn't one that's externally imposed, it's what the *minimum viable consumer* can effectively sense. It's why, generally, mass-market anything is crap, from a power or elite user standpoint.

That's my "Tyranny of the Minimum Viable User":
old.reddit.com/r/dredmorbius/c

There are a few closely related cases: highly variable goods, and highly tailored goods.

Variable goods, especially skilled labour / high-skill jobs create problems on both sides of ...

@o @woozle

2/

@freakazoid ... the transaction. Employers / clients have difficulty in assessing the quality of applicants / vendors. And applicants have difficulty in assessing the suitability or acceptability of employers or clients. This in a nutshell is why tech hiring sucks, why there's so much value signalling (high-prestige degrees, Big N employment history, personal recommendations). And everyone's unhappy.

Incidentally, *writing detailed technical content* is a signalling tool.

@o @woozle

3/

@freakazoid The "variable quality" issue is largely the "Market for Lemons" instance. Akerlof solves that by providing more information. That works *if you can meaningfully assess that information*, which for highly complex goods becomes a highly questionable proposition.
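The Lemons dynamic can be made concrete with a tiny simulation. This is my own sketch of the standard adverse-selection unravelling, with invented parameters (uniform quality, a 1.5x buyer premium), not anything from the thread.

```python
# Toy sketch of Akerlof's "Market for Lemons" unravelling: buyers only
# see *average* quality, so each price drop drives the best sellers out
# of the market, which lowers the average again, and so on.

def lemons_unravelling(buyer_premium=1.5, steps=10):
    """Sellers' quality q is uniform on [0, 1]; a seller's reservation
    price equals q, and buyers offer buyer_premium times the average
    quality of whoever is still willing to sell."""
    price = buyer_premium * 0.5  # first offer: premium times overall mean
    history = [price]
    for _ in range(steps):
        # Only sellers with q <= price remain; their mean quality is price/2.
        price = buyer_premium * min(price, 1.0) / 2
        history.append(price)
    return history

prices = lemons_unravelling()
# With a 1.5x premium (below the 2x needed to sustain trade), the
# offered price falls every round: the market unravels toward zero.
```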

Overly-tailored goods are "gadgets". Things that do only one thing, or fit only one circumstance. They're like close-fitting clothing. Change any constraint, and they no longer apply.

@o @woozle

4/

@freakazoid The problem here applies if the tailoring is expensive, *or* results in a high price. Either the good isn't sold (because it's perceived as having limited utility) or it's bought but value isn't delivered. Something of a debatable inclusion in the #GreshamsLaw dynamic, but at least similar.

There are a whole set of signalling mechanisms -- shibboleths, cultural myths, and fads, which emerge from a Gresham's dynamic.

@o @woozle

5/

@freakazoid Underlying quality is difficult to communicate, so some *quality indicator* is substituted. Accent. Vocabulary. Cultural myths. Clothing. Food. Table manners. Branding. Musical tastes. Books read. Schools attended. Management fads.

These signal *both* quality *and* group alignment -- and the wrong set can easily get you killed in many cases.

*Changing* signifiers is highly traumatic: culture wars and value shifts.

This also leads to cargo culting.

@o @woozle

6/

@dredmorbius @woozle @o These fall into a few different possibly overlapping categories: implicit bias, laziness or ignorance (because the information is available but people don't bother to look or don't know it's there), and places where it's genuinely hard to know, like interviewing and managing (though there's a lot we do know about management and interviewing so laziness and ignorance applies there).

...

@o @woozle @dredmorbius Volume also contributes to this a lot: for cheap things, the cost of research can be a significant fraction of the cost of actually buying it. This is probably why for many things there's not much of a "middle ground", just super cheap and super expensive things.

You can also get seemingly paradoxical effects where the brand with the better reputation has lower quality at a higher price point. I've noticed in general an inverse correlation between marketing and quality.

@freakazoid Totally agreed on the cost-of-research factor.

The fact that this is very frequently *excluded* from economic analyses is one huge source of the fallacy of the sunk cost fallacy. That is: the *putative* sunk cost excludes a tremendous number of actual, but non-apparent costs. Which provide benefit, and must be re-invested when switching to another option.

Another case of the #manifestations problem: information that's not manifestly evident.

@woozle @o

@dredmorbius @o @woozle Can you give an example of where it's excluded to the detriment of the conclusion? Since the sunk costs fallacy is widely known in econ circles, I would be surprised if it's being excluded in places where there's a significant effect.

One thing I don't generally hear when people are talking about human cognitive biases w.r.t. markets is how government avoids being affected by the same bias. The sunk costs fallacy, for example, is manifest in every large project.

@dredmorbius @o @woozle Oh! I'd misread. I missed that you said "fallacy" twice.

@woozle @o @dredmorbius That's pretty interesting. It seems like the sunk costs fallacy might make more sense as part of public choice theory than behavioral economics. Speaking of which, I wonder to what extent public choice theory explains decisionmaking inside corporations and other organizations where there aren't prices?

@freakazoid @dredmorbius @woozle @o

Mature markets tend to end up with two market leaders and a bunch of also-rans. In that kind of market, the #1 is often complacent and of poor quality, but the #2 tends to be better because it wants to knock the leader off the top spot.

e.g. VHS vs Betamax, Windows vs macOS, VW vs Toyota for cars, etc.

(Obviously there are counterexamples, and I think the trend is becoming less clear as markets fragment.)

@mathew @o @woozle @dredmorbius Two of the three examples you cite have strong network effects, where that's certainly true. But car manufacturers don't have this problem. Globally, in 2014 (the year I can easily find data for), the number 8 automaker by number of cars (Honda) sold almost 43% of the number of cars of the number one (Toyota). In the US, the number 7 manufacturer, Kia, sold 43% as many passenger cars as the top manufacturer, GM. And number 3, Toyota, has almost 83% of GM's sales.

@dredmorbius @woozle @o @mathew Actually, now that I think about it, VHS vs Betamax happened in a market that wasn't remotely mature, and it was a competition among standards, not companies. There were plenty of manufacturers of both tapes and players.

I'm having difficulty coming up with an example in any situation where there aren't strong network effects, at least in the US.

@mathew @o @woozle @dredmorbius During my orientation at Google, when they were talking about the datacenters, someone asked if they ever planned to open source the designs like Facebook had. The speaker replied, "Open source is what the company in second place does." That wasn't the only thing in orientation that made me think about just walking out.

@dredmorbius @woozle @o @mathew Maybe commercial airline manufacture is a good example? Boeing and Airbus are definitely the top two, and Bombardier is the only other manufacturer I can even think of, but they only do regional jets AFAIK. But I'm not sure either Boeing or Airbus ever really acts like they're either especially comfortable or hungry; the competition seems to keep both companies on their toes pretty well.

@freakazoid China and Russia both have indigenous aircraft industries, and there's Embraer of Brazil, though theirs are also largely regional / corporate jets.

There are more small- and mid-sized aircraft manufacturers.

The industry as a whole is *extremely* conservative, almost wholly governed by engineering and aeronautical constraints (there are only so many arrangements of sausages, engines, and lifting surfaces).

Plus insurance risks and regulation.

@mathew @o @woozle

@freakazoid An interesting parallel is actually cargo ship design and use in the 13th / 14th centuries, about the time the lateen rig was adopted by Europeans, a millennium or more after its appearance on the Indian Ocean and Arabia.

The problem was insurers.

Shipping is high-risk, and voyages were insured individually, as separate ventures. Insuring syndicates wouldn't take risks on new-fangled tech like lateen rigs.

As a consequence, European ships could ...

@mathew @o @woozle

@freakazoid ... not sail close to the wind *at all*, often had to wait *weeks* before entering port (for favourable winds), and were limited to sailing between May and October. November through April nothing moved by ship, which is to say: nothing moved.

@mathew @o @woozle

@dredmorbius @woozle @o @mathew Yup, sounds like the aircraft industry. Interesting to see the same conservatism develop without (much?) government regulation. Was there much competition among insurers? Were *they* regulated or otherwise privileged?

@freakazoid There are some interesting exceptions, yes, but most of them show strong evidence of forces encouraging regionalisation.

The film industry is a key case in point. Reels of film, or now, digital streams or recordings, can be transmitted virtually effortlessly worldwide. The *fixed* infrastructure of film production is largely the support industry: carpenters, casting agencies, caterers, coaches, costume & set designers, electricians.

@woozle @o @mathew

1/

@freakazoid So centres of specialisation appear.

But you also have *globalised* centres, especially in India, China, Japan, and multiple European countries.

Most of that is language, though culture also plays a major role, and government programmes specifically encouraging and supporting an indigenous film industry -- a powerful propaganda and cultural tool, kept under local control.

The auto industry is similar, in respects. Not because it's projectable...

@woozle @o @mathew

2/

@freakazoid ... though cars ship easily, factories don't, so it centralises, at least within countries.

(JIT and improved transport networks are changing that somewhat. Factories are more distributed in the US than they were in Detroit's heyday, but still cluster somewhat.)

But: there are both regional taste differences and economics, as well as national interests involved.

Building cars and military vehicles shares much in common, and military manufacture is ...

@woozle @o @mathew

3/

@dredmorbius @mathew @o @woozle Is physical colocation a problem? It's generally centralization of control or coordination that somehow discourages defection (cartels have tended to disintegrate rapidly historically) that is the problem, right? And often when a company does manage to dominate that's because new entrants have to face regulatory barriers to entry it didn't, like Amazon with sales taxes.

@freakazoid Colocation used to be highly important because there was a lot of interplay between the auto manufacturers themselves and supplier pipelines. Sometimes meeting F2F and getting your mitts on metal is the best way to resolve stuff.

That's either not so much the case, or other factors matter more, but you still have forms of clustering which matter, ranging from support industries to education and infrastructure.

Early colo was driven by bulk materials.

@woozle @o @mathew

@freakazoid Detroit was where raw iron ore and steel could be directly offloaded via ship and cars shipped by rail to mostly Eastern markets.

Interestingly, Los Angeles once featured pretty much the largest of every factory plant *outside* the primary core group, within the US. Which is to say: at LA's distance from the Rust Belt, 2ndary localisation made sense.

@woozle @o @mathew

@freakazoid ... of strategic interest. So countries otherwise not particularly vested in car manufacturing sustain it.

Local tastes and regulations vary, so cars get built for specific regulatory and cultural markets, as well as price points -- both inputs and consumers. Hence: much more variance *between* national markets, but typically little *within* them.

Aircraft are somewhat similar, though more constrained.

@woozle @o @mathew

4/

@freakazoid @dredmorbius @woozle @o Cars may not be a two-player market, but I still maintain that VW has gotten lazy (and indeed downright criminal), lets its quality slip and failed to invest in new tech, while Toyota has focused on making better cars, even if they did make a disastrously bad move betting on hydrogen rather than battery storage. (There's probably an interesting case study there on why they went the way they did.)

@mathew @o @woozle @dredmorbius No case study needed: they did it because hydrogen is heavily subsidized in Japan.

VW's failure to invest in new tech is the case with car makers across the board. Their cheating was to try to avoid losing a bunch of car sales as diesel was essentially getting regulated out of business. Which IMO was a stupid move on the government's part since diesel has lower CO2 emissions than gasoline.

@dredmorbius @woozle @o @mathew Actually I should qualify that - it has lower emissions not because its specific CO2 is lower but because diesel engines have higher compression ratios so tend to be more efficient. You can also get more of it from oil without having to resort to cracking. But hybrids are better, so probably not stupid to regulate its emissions, really.

@freakazoid @mathew @o @woozle @dredmorbius You can also decouple compression ratio (which actually increases thermal losses the higher you go) from expansion ratio (what actually improves efficiency) through either crankshaft linkages (true Atkinson-cycle engines) or valve timing ("Atkinson"/Miller-cycle engines with late intake valve closing or an extra valve, Budack-cycle engines with early intake valve closing).

And the compression losses actually mean that, in an engine that has a full compression stroke, optimum compression ratio is about 16:1 - anything more than that, and you start losing more to heat than you get back in expansion, as I understand. (Diesels ran significantly more than that in the past because they needed the excess heat to reliably ignite fuel, but in the 2010s they got down to 16.5:1 for most engines.) There's gasoline engines that run 13:1 on American regular fuel in full Otto cycle operation, though, and 14:1 on American premium/European regular.
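The diminishing returns from raising compression ratio can be checked with the standard air-standard Otto-cycle formula, efficiency = 1 - r^(1 - gamma). This back-of-envelope sketch is my addition; it's the ideal-cycle textbook number and deliberately ignores the real-world heat losses the post says cap practical ratios around 16:1.

```python
# Ideal air-standard Otto-cycle efficiency as a function of
# compression ratio r. Shows why going much past ~16:1 buys little
# even before heat losses are accounted for.

GAMMA = 1.4  # heat-capacity ratio for air

def otto_efficiency(r, gamma=GAMMA):
    """Ideal thermal efficiency of an Otto cycle at compression ratio r."""
    return 1 - r ** (1 - gamma)

for r in (13, 16, 20):
    print(f"r = {r}:1  ideal efficiency = {otto_efficiency(r):.3f}")
```

Going from 13:1 to 16:1 gains under three percentage points of ideal efficiency, so the heat losses described above can easily eat the difference.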
@freakazoid @dredmorbius @mathew @o @woozle And, yeah, as much as TDIClubbers like to go on about "but diesels beat their EPA mileage and hybrids don't!"... they only do that if they're cheating and/or you're driving slower on the freeway than the current EPA freeway cycle. And hybrids can beat it too if driven like that.

My pre-Dieselgate 1999 New Golf TDI (which was fairly heavily modified, but one of those mods cheated constantly, improving thermal efficiency) pretty reliably got 47-51 miles per US gallon on the highway - original EPA highway was 49, 2007 re-rated EPA highway is 44.

By comparison, on road trips, my 2016 Prius gets about 54 MPG, versus a rating of 50 highway. And, that's on a lower carbon per gallon fuel. (Better aero does help.)

And then, in the city it does decently, I typically get 40-60 on my commute (if it's spring/fall, 60, summer, 50-55, winter, 40). The TDI would be more like 30-35 MPG on that commute.

@freakazoid Diesel fuel itself has a slightly higher energy content than petrol/gasoline, the engines run at higher compression ratios, and at higher temperatures (Carnot efficiency), all of which net more mileage and lower CO2 emissions.

Emissions of *particulates* (especially PM2.5, v. bad for lungs and health), and of NOx (nitrogen oxidising at high temps and pressures) are *worse* for diesel than petrol engines.

Also possibly sulfur and other sour crude contaminants.

@mathew @o @woozle

@freakazoid Incidentally, two cases of the dynamics I've been describing:

Toyota's foray into hydrogen fuel cells is based on government policies and incentives, creating a localised specialisation.

Volkswagen's diesel emissions fraud is a #GreshamsLaw dynamic: trying to substitute a lower-value quality for a higher-value one, through fraud.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle It seems like this is also the case with software. People pick software on the basis of features or price, because they have no idea how to measure quality. So there's no market for high-quality software.

A "Consumer Reports for software" might help. It could track historical bugs, usability/accessibility problems, vulnerabilities, attacks, and the maker's response to them, etc.
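The "Consumer Reports for software" idea could be sketched as a per-product quality ledger. All of the field names and the scoring formula below are my own invention for illustration, not an existing system.

```python
# Hypothetical data structure for a software-quality tracker: record
# incidents (bugs, vulnerabilities, usability problems, attacks) plus
# the maker's responsiveness, and roll them up into a toy score.

from dataclasses import dataclass, field

@dataclass
class Incident:
    kind: str         # "bug", "vulnerability", "usability", "attack"
    severity: int     # 1 (minor) .. 5 (critical)
    days_to_fix: int  # proxy for the maker's responsiveness

@dataclass
class ProductRecord:
    name: str
    incidents: list = field(default_factory=list)

    def quality_score(self):
        """Toy score: start at 100 and deduct per incident, weighting
        severity and penalising slow vendor response."""
        score = 100.0
        for i in self.incidents:
            score -= i.severity * (1 + i.days_to_fix / 30)
        return max(score, 0.0)

rec = ProductRecord("ExampleApp")
rec.incidents.append(Incident("vulnerability", 5, 90))  # slow critical fix
rec.incidents.append(Incident("bug", 2, 3))             # quick minor fix
print(rec.quality_score())
```

The point of the sketch is that a shared, historical record like this would give buyers the quality signal the market currently lacks.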

@freakazoid "The Tyranny of the Minimum Viable User"

old.reddit.com/r/dredmorbius/c

Since users' _capabilities_ also vary strongly, the problem goes beyond this.

You see similar types of dynamics in, e.g., "audiophile" gear, much of which seems principally engineered to separate rich idiots from their lucre.

A better comparison might be precision or highly-skilled equipment, also somewhat affected.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle This is why I think that products should lift up the user, not descend to the user's level.

@freakazoid The problem, given the dynamic, is that users don't _want_ to be lifted. They want to be comforted. You can try going against the grain. The market will punish you.

I'm not saying the market is right. The market and I disagree violently.

But the market is bigger than me.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle Will it? I can think of plenty of examples of brands marketing how dumb their products are ("You already know how to use it" being a well-known example), but not of the market punishing products that are self-teaching. Do you know of some?

@freakazoid P.T. Barnum's dictum isn't an absolute universal, but it's close.

You can swim upstream, but you're going to find yourself in niche space. That *may* be a *profitable* niche, but it's still a niche.

The useful thing to do is look for cases of exceptions to the rule -- where is complex, respectful, high-information-density content (or products or services) found?

Quality literature, news, education, music, information gear, etc.

@woozle @o @mathew

@dredmorbius @mathew @o @woozle Sure, but couldn't the reason for that be that our current method of creating new products doesn't tend to incorporate pedagogy as a skill, not that the market doesn't desire pedagogy?

@woozle @o @mathew @dredmorbius To expand on the "You already know how to use it" example, it could be that pedagogical Apple failed because they didn't have much business sense, and dumbed-down Jobs apple succeeded not because of Jobs's dumbing-down of their products but because he understood business and marketing.

@freakazoid The winner-take-all dynamic of many tech-based products (hardware, OS, software, services, social media) makes attribution highly risk prone: success succeeds, failure fails. Survivor bias is manifest.

But having witnessed enough cases directly, and studied numerous others, the general rule of "don't outsmart your market" seems to hold.

Apple's big success is smartphones. Mac is a fairly small share of their market. Though they seem to be catering to it again.

@mathew @o @woozle

@freakazoid I was looking at the specs for the upcoming Mac Pro release. It's mind-boggling.

Base model: $4000. Top of the line is 28 cores and 1.5 TB RAM, 4 TB SSD, 4xGPU. Speculation is that this will run north of $35,000, and I suspect that's low. This is a supercomputer in a mesh cage.

(I'm wondering what Linux or Windows equivalents there might be.)

Definitely drool-worthy: apple.com/mac-pro

@mathew @o @woozle

@freakazoid @dredmorbius @mathew @o

Note that even Mac mice now have 2 buttons (or so I've been told).

@woozle The latest Mac "magic mice" have *zero* buttons, though there are multiple, non-determinable, sensing zones where things may or may not happen.

@freakazoid @mathew @o

@freakazoid Mostly it's cases where one of several conditions is met:

1. The good is a signalling mechanism. I can advertise my own capabilities in a space by using (or producing) the good. Uni education especially.

2. Direct beneficial use. If the good provides a _direct_ and _quantifiable_ or _perceptible_ benefit, it may find a niche. That will by definition be limited, and faces challenges by imitative competitors and measurement difficulty/costs. Examples ...

@woozle @o @mathew

2/

@freakazoid ... include business/financial news, policy news, etc.

3. Quality professional gear: audio, photo, video gear. Linux/BSD vs. Windows/Mac, Mac vs. Windows. Small, focused, niche audiences.

4. Regulatory quality floor on goods *or* users. Commercial and civil aviation vs. automobiles. Any idiot can (and does) drive a car. Pilots are licensed. Commercial pilots are certified to specific aircraft. There is a very strong quality floor.

@woozle @o @mathew

3/

@freakazoid 5. With some limits: self-use. Especially where tools are mutually developed by specialists within a craft. Linux *used* to occupy this space; it's drifting from it. Whether there's a replacement isn't yet clear. The death of the desktop may, paradoxically, save Linux, if the idiots all use smartphones instead.

There are some parameters that may influence this. The scope of network effects especially. If intelligence counters network, then a ...

@woozle @o @mathew

4/

@dredmorbius

Are you saying Linux used to be self-teaching? Because in my experience, it used to be worse about that but has slowly improved (from, like, 0% to maybe 20%).

@freakazoid @o @mathew

@woozle @mathew @o @dredmorbius Linux has dramatically dumbed down over time. I wouldn't really call it "self-teaching" at any point, but it used to be that, *if you used Linux*, you made heavy use of man pages and documentation that was included with Linux. So simply having sufficient interest in using Linux to get you over the hurdles would have left you significantly more competent in using Linux than it does today.

@dredmorbius @o @mathew @woozle Today people's response to some random thing breaking in GNOME 3 or KDE (let's ignore Android and Chrome OS) seems to be about the same as it is if something breaks in Windows: format and reinstall.

Some of that is just an increase in accessibility. But it's specifically an increase in accessibility gained by dumbing down the system instead of by improving the system's self-documentation/self-teaching.

@woozle

Some GNU/Linux distributions (like SuSE) used to have top-notch documentation, so that you could read the manual and actually learn something.

These days, if you want a thoroughly documented free Unix, your best bets are #FreeBSD and #OpenBSD.

@dredmorbius @freakazoid @o @mathew

@starbreaker Debian still makes the effort. Not having manpages remains a bug (though not a release-critical one).

Red Hat has almost always been far less useful -- missing manpages and even /usr/share/doc/<packagename> entries.

Debian's dwww is hugely useful.

FreeBSD / OpenBSD manpage quality is typically higher, though they use the wrong utilities.

GNU's insistence on info is a fucking brain disease.

@woozle @freakazoid @o @mathew

@dredmorbius

Definitely agree with you concerning GNU info pages, but what do you mean by OpenBSD using the wrong utilities for man pages?

What's wrong with mandoc?

@woozle @freakazoid @o @mathew

@starbreaker #ItsAJokeSon: BSD manpages document BSD utilities, not GNU utilities.

The arguments are all wrong ;-)

@woozle @freakazoid @o @mathew

@dredmorbius

It isn't the BSD man page authors' fault that GNU can't stick to POSIX. :)

@woozle @freakazoid @o @mathew

@woozle Not so much that, as "developed principally by its own users", much as early Unix had been (1970 - 1990 or so).

That is, "users" weren't a separate class, they were "us", from the developers' standpoint.

Today, you've got a much larger nontechnical userbase. The total installed base hasn't changed much by _percentage_ but it's vastly greater by _number_ than in the late 1990s.

Self-documentation through code, manpages, info docs, & HOWTOs has varied.

@freakazoid @o @mathew

@freakazoid ... quality product stands a far better chance. If production can be readily distributed and decentralised, similarly. Open source software seems vastly more tractable than open source hardware. Fabrication, logistics, and distribution are far harder for physical commodities.

In particular, if there's no way to impose some kind of effective floor (as with pilots/aircraft, certified industrial equipment, etc.), the market will seek the minimum viable user.

@woozle @o @mathew

5/

@freakazoid To counter that, you've got to raise the bound on that minimum.

You can gatekeep the users (certification). Or you can make sufficient degrees of incompetence nonviable -- a product that harms, or at least does not help, the incompetent user is one route. This will still limit the scope of the market, but at least won't dilute the product. Call it a talent bar.

This also means a noneconomic motivation. You're not profit-maximising, but maximising for individual benefit.

@woozle @o @mathew

6/

@dredmorbius @mathew @o @woozle I don't agree with that, because it assumes that the reason for incompetence is lack of ability or desire to become competent. If it's lack of desire, let's exclude them not just from products, but from the planet, since they're ruining humanity. And I suspect lack of ability represents only a tiny fraction of the population.

But I think the real answer is that our system selects for people who are shitty at teaching.

@freakazoid So ... well, my current use of "idiots" notwithstanding, I try to avoid prejudiced language, and the whole long first part of the Reddit essay goes into detail about why simple tools are often a net win.

The problem is where the dynamic directly impedes development of useful tools, systems, goods, services, etc.

And I really _don't_ think it's something you can chalk up only to pedagogy. Put another way: we're at the end of a phenomenal 300 yr ramp up in literacy.

@woozle @o @mathew

@freakazoid Which includes a hell of a lot of pedagogical reforms (the history's interesting, esp. for trivia, or trivium, fans).

Literacy ~1700 was ~10%. By 1900 in US/Europe, 90%+

HS graduation in the US: 1900: 6%; 1950: 90%. Bachelor's is now 30%+ and PhD > 8%. There are more PhDs in the US now than HS grads 120 years ago.

But: the quality of that HS education is also, in some measures, much lower: lower language/logic skills, better scientific knowledge.

@woozle @o @mathew

2/

@freakazoid And the general informational tools we have *are* better than 300 years ago. (In part: they deliver us those 300 year old works instantly.) But they're far short of their potential.

And I'm trying to suss out *why*.

Teaching/training is _part_ of it, and yes, the education system likely doesn't meet its potential either, but it does a tremendous amount, for a tremendous number. And hasn't _worsened_ appreciably since the 1950s.

Most variation is ...

@woozle @o @mathew

3/

@freakazoid ... actually, if you look at it, changes in either who's included in classes or testing. Increased access => falling test scores. Rising test scores => falling access. That points to some population-level intractability.

(With exceptions. "Stand and Deliver".)

But trying to make all the children above average is a Sisyphean task, and a doomed premise for progress. You've got to work with the talent you've got.

My point is to not get in its way.

@woozle @o @mathew

4/

@dredmorbius @mathew @o @woozle Replying mid-thread because I think a lot of your reasoning farther down hinges on what I believe to be a mistake in this post. The fall in test scores from "increased access" is not necessarily because the larger group is not learning as well, but because the test wasn't actually testing how effectively the students were being taught. Most of our standardized tests are really indirect tests of socioeconomic class, not of how much students are learning.

@woozle @o @mathew @dredmorbius I think that the real problem is that "education systems" are super bad at educating. They can take a subset of students who have the right background and right set of parents and get them to do well on standardized tests, but they cannot take a random person out of a population of, say, English speakers, and on net provide them significant benefit.

@dredmorbius @mathew @o @woozle The reason students at "elite schools" tend to do better is that the school only allows in students who are going to be successful no matter what. They're *filtering*, not teaching. But they're not really filtering for innate skill. They're filtering for what the student has already absorbed from the world, largely due to the circumstances of their birth.

@woozle @o @mathew @dredmorbius And the problem isn't really that teachers are incompetent, though a bureaucracy isn't capable of hiring competent people; it's that it's not possible to be competent at teaching a class of 30+ randomly selected students.

@dredmorbius @mathew @o @woozle The fundamental problem IMO is that almost all societies treat teaching and learning as just one function among many, and something that's confined to particular institutions and particular phases of a person's life.

IOW it's not just Americans who are anti-intellectual but most of human society. And the reason is that we have entrenched groups who have a vested interest in a stupid population.

@woozle @o @mathew @dredmorbius Politicians don't want an educated population because they want people to be swayed by their emotional arguments. Pretty much every skilled profession has a vested interest in everyone else being stupid (or at least not knowing THEIR skill) because that's how they make their money. And the victims of this "uneducation system" want everyone else to be stupid because otherwise THEY feel stupid.

@freakazoid I'd mentioned the long history of pedagogical evolution. This is a huge part of it.

In particular, there's a long-standing divide between "liberal" and "technical" education. Politicians (and employers) want a _skilled_ but _pliant_ public.

The "servile arts" is another term for technical arts. Technical / polytechnic schools specifically excluded much of the liberal education, whose heart is the Seven Liberal Arts, the Trivium + Quadrivium previously hinted.

@mathew @o @woozle

@freakazoid There's a huge (if obscure) literature on this, stretching to mediaeval and ancient times. I've touched on it occasionally, see Hans Jenson on John Stuart Mill, 1860s England:

old.reddit.com/r/dredmorbius/c

Effectively, there are forces working for and against this.

Of late, high-tech skills have been in demand, but also tend to create hugely intelligent people with inconvenient consciences: Einstein, Oppenheimer, Chomsky, Ellsberg.

A dilemma for oligarchs.

@mathew @o @woozle

@dredmorbius @woozle @o @mathew Literally every single person you gave as examples there is/was Jewish. I'm pretty sure they're all Ashkenazi, in fact. They are not the products of society at large but of a specific subculture that values learning, teaching, and thinking.

@freakazoid I hadn't even realised that.

I was temporising and didn't think through the list at length. I did think of adding Edward Snowden (not Jewish). I'm trying to think of other dissident scientists and engineers ... and several of the obvious contenders are _also_ Jewish.

The Jewish religious Talmudic tradition is one that treats many questions as _not_ answered, but as _subject to inquiry_. So there's a call for questioning concepts.

@mathew @o @woozle

1/

@freakazoid
The lack (until lately) of a nation state in which a uniform orthodoxy might be imposed (and in which that seems to be happening) is also notable. Prior to 1948, any Jew anywhere was part of an internal minority, often at least discriminated against, if not actively oppressed. The WWII oppression was only the latest in a very long history of similar such actions.

What effect that has had culturally or otherwise I don't know.

@mathew @o @woozle

2/end/

@dredmorbius @woozle @o @mathew Snowden is not an idiot but he's not in the same category as any of your other examples. He did, for example, misunderstand how the Prism program functioned, thinking it required the cooperation of the companies involved when even just looking at their own slides it was clear it did not - otherwise reverse engineering the protocols involved wouldn't have been necessary.

@mathew @o @woozle @dredmorbius I think discrimination probably had something to do with it, maybe because Jews were often excluded from those "servile professions" you speak of.

I don't know what impact the existence of Israel has on people's general competence, but I think it could as easily be negative as positive.

@dredmorbius @mathew @o @woozle Even parents don't necessarily want their kids to be *smart* but to be *successful*. But parents and the individual themselves are certainly the ones with the most interest in that person's competence.

@freakazoid @dredmorbius @mathew @o @woozle This is the strongest argument I have heard against the long term use of democracy.

@feonixrift @freakazoid @dredmorbius @mathew @o

Democracy-as-we-know-it(-Jim), anyway.

Current implementations are tilted heavily (if not always obviously) towards protecting existing power-structures.

It's pretty easy to imagine small improvements that would significantly undercut this tendency, and not hard to design improvements that would do more than that.

@woozle @o @mathew @dredmorbius @feonixrift I think the big problem with democracy is that we treat it as if it has some magical power. Instead of fighting for human rights and rule of law, we fight for "democracy", even though the only rights democracy can protect by itself are the rights of the majority of the enfranchised, i.e. the people who least need it, and people will almost always vote *against* rule of law as long as it gets them what they want.

@feonixrift @dredmorbius @mathew @o @woozle I see democracy as a tool for implementing rule of law, because it allows succession of governments in a way that's always governed by law, whereas hereditary succession has always ended up with edge cases and conflicts. And of course can produce rulers who are so incompetent or evil that it causes revolutions where none were necessary (cf. the French and American revolutions).

@woozle @o @mathew @dredmorbius @feonixrift Democracy certainly produces incompetent leaders, but as long as there's rule of law and terms of short enough length, people will just wait rather than revolting. And of course lack of revolt is a necessary condition for rule of law.

@woozle @o @mathew @dredmorbius @feonixrift Randomly selecting representatives from among the population may well produce results as good as, or better than, having the population at large vote. You couldn't pick someone with as much power as the US President has that way, but those randomly selected representatives could certainly choose someone. And they'd probably do a better job, since they'd be more visible and have some feeling of responsibility for their choice.

@freakazoid @feonixrift @dredmorbius @mathew @o @woozle

Democracy is obsolete, yo.

(not that we don't use plenty of obsolete things, but we'd be better off not to).

@freakazoid Selection from a reasonably prequalified pool could well work. Sortition.

@feonixrift @mathew @o @woozle
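Just to make the mechanism concrete, a minimal sketch of sortition from a prequalified pool. The pool, the seat count, and the naming scheme here are all made up for the illustration; the only real claim is that every qualified member gets equal odds, with no campaign to run and no majority to court.

```python
import random

def sortition(pool, seats, rng=None):
    """Draw `seats` distinct representatives uniformly at random from a
    prequalified pool. sorted() makes the draw reproducible for a given
    seed regardless of the pool's input ordering."""
    rng = rng or random.Random()
    return rng.sample(sorted(pool), seats)

# Hypothetical example: a 5-seat panel from a 100-person qualified pool.
pool = [f"citizen-{i:03d}" for i in range(100)]
panel = sortition(pool, seats=5, rng=random.Random(42))
print(panel)
```

The prequalification step is doing real work here: the fairness of the draw is only as good as the fairness of whatever filters people into `pool`.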

@freakazoid @feonixrift @dredmorbius @o @woozle
"Randomly selecting representatives from among the population […] You couldn't pick someone with as much power as the US President has that way"

I'd take a randomly selected member of the population over the current US President.

@freakazoid Again, agreed on multiple roles (teaching is often ~= daycare) and anti-intellectualism. I've addressed the latter as urban-rural / complex-simple divide:

old.reddit.com/r/dredmorbius/c

@woozle @o @mathew

@freakazoid I agree.

Teaching works best where:

1. The students are roughly equivalent in skill.

2. The teacher can work 1:1 with students (often described as "the gold standard" -- direct tutoring -- very expensive BTW), to see _how the lesson is being assimilated_.

#2 is a failure of technical teaching "solutions". Teachers aren't merely information delivery systems, they are *guides*, who see where students stumble and can give learning cues, not just repeat things by rote.

@mathew @o @woozle

@freakazoid That's a really good point.

It's also directly testable: _if_ we can agree on a set of non-socially-discriminatory test criteria, we can apply those to a wide sample of subjects and see how well they do.

At the same time, this problem highlights a fundamental dynamic of ToTMVU: that *assessing quality is itself difficult, expensive, and not generally agreed upon*. How do you find what is best *if you cannot even agree on what is "best"?*

@mathew @o @woozle

@freakazoid The other issue is that as more things get more complicated, you've got more to teach, to more people, but only so much time, bandwidth, and effectiveness with which to do it.

Civilisations progress based on intergenerational knowledge transfer. *That* is based on both explicit (linguistic) and tacit (experiential) knowledge. We can bump up explicit knowledge transfer fairly effectively. Tacit not so much. Students need to DO, under direct guidance.

@woozle @o @mathew

5/

@freakazoid So you either stop teaching some things, or you split up what you teach to whom.

The first leads to overall knowledge loss. And that happens anyway.

The second leads to a separation of cultures, and ultimately literally social tribes who do not understand each other. C.P. Snow's Two Cultures, but multiplied.

How easy is it for you to describe what you do professionally to someone who doesn't do it? Your parents, family, friends, strangers?

@woozle @o @mathew

6/

@freakazoid Hell, I've struggled to explain sysadmin + data hat to other (skilled, recognised) tech types. Or even myself.

(A partial realisation is that the knowledge transfer and pioneering I've been describing is actually a large part of the job, though never a part of job descriptions.)

What happens when major, *mutually critical* parts of society and technological workforces cannot understand *or trust* one another?

(This goes beyond shitty products, ICYMI.)

@woozle @o @mathew

7/

@freakazoid But, pulling this train back on the rails: if you have worlds where people don't understand each other fundamentally, they're not going to understand their tools, the work requirements, workflows, habits, etc.

And if you put management in the hands of yet another class or classes (PMs, sales, finance, business), that translation gets all the worse.

There's a type of language that emerges between multiple independent cultures in close contact: pidgin.

@woozle @o @mathew

8/

@freakazoid Pidgin languages share some characteristics:

1. They're not native tongues (those are creoles.)

2. They draw on multiple sources, extracting what they need.

3. They're limited to the interface requirements only. Trading cultures are especially prone to pidgin tongues. Not the production, transit, consumption, or use, but the _exchange_.

4. Of necessity, pidgins are simplifications of their source languages.

And the kicker: ...

@woozle @o @mathew

9/

@freakazoid 5. Pidgin is the linguistic equivalent of a Tyranny of the Minimum Viable User. Pidgin languages evolve under the same requirements as the technical products discussed earlier. They have to be sufficient to the task, but understandable to the least capable beneficially-contributing members of interactions.

Also: I'm not saying pidgins are lesser or not useful. They're shaped by demands and environments. But those influences have consequences.

@woozle @o @mathew

10/

@freakazoid And back to ToTMVU and products: there are only a few ways out.

You can try educating users. If that's _everyone_, you've got a big problem. Advertising is one answer to that challenge. I'm not sure that's a net positive. "High touch" sales are another.

(The prevalence of sales-guy-as-tech-CEO may derive from this, a recent realisation.)

You can limit your market through regulation and/or licensing. That keeps standards up, but size down.

@woozle @o @mathew

11/

@freakazoid You can go mass-market. You'll run square into the ToTMVU here. The firm that caters to this more viably, modulo other influences, has a higher likelihood of winning. Trying to chase the market up-skill generally loses: Compaq, SGI, Sun, IBM. Bicycles. Fine foods. Quality TV programming. Especially with network effects.

You can exit the market. Survive on government grants, patronage, subscribers. Small-scale, may work.

That's what I've got.

@woozle @o @mathew

12/end/

@freakazoid @woozle @o @dredmorbius Oh yes, the "people choose software for features, not quality" thing is something I've been yelling about since the 90s. It's why we have people insist on using Word when they would have an easier time with Markdown, insist on Excel when all they need is a table with no computation, etc.

@mathew @o @woozle @dredmorbius @freakazoid Yeah, I think the whole "VW vs. Toyota" thing works out very differently from how it's stated.

VW tended to run on the bleeding edge of internal combustion technology from the 90s through the 2000s, while also trying to push upmarket. This meant that they had compelling products when new, that were half-baked and unreliable.

Conversely, Toyota had an extremely conservative culture and ran only the most proven tech, which got them a reputation for extreme reliability. And, with the combination of American protectionism and the Japanese bubble economy, they decided to push upmarket with extremely high quality and luxury, which then faded away once the protectionist policies were lifted. (The Prius, OTOH, was them panicking in the face of falling market share in Japan and trying to create a bleeding edge development model. The Prius worked, but the bleeding edge development model AFAICT got directed towards the hydrogen folly.)

@mathew @dredmorbius @freakazoid @o @woozle And, the "boring appliances" that Toyota made worked in Japan for whatever reason, and they worked quite well here in the US where the vast majority of the population views a car as a tool.

They were seen as unsuitable for Europe, though, and sticking to lower-tech naturally aspirated engines didn't work well on the (easily gamed) European fuel economy tests, compared to downsized turbocharged gasoline engines like what VW and other European manufacturers started pushing.

Note that VW's dreadful reliability kept them quite niche in the US in the 2000s and 2010s, while Toyota dominated the car market. In Europe, conversely, Toyota is for old people on death's doorstep.

@freakazoid Largely agreed, though I'll take exception to "laziness".

Case in point: my professional career as a sysadmin / data / ops / dev mutt. I've struggled to answer "what do you do", because much of it is a rather ill-defined "figuring out where the fucking plane is flying", where the plane isn't the system under my control, but the larger technical landscape.

What tools, methods, formats, applications, languages, devices, regulatory changes, market pressures, etc.,...

@o @woozle

1/

@freakazoid ... are coming down the pike, and what *minuscule* fraction of those will actually matter?

It's a case of overwhelming information, both in quantity *and* complexity. And I landed in the spot *not* from Great Skill In Predicting the Next 30 Years of IT Infrastructure, but dumb young luck: I liked Unix.

After your first few years in the field (5, 10, 15, 20, and in my case, closer to the latter), your luck in being where the firehose happened to be squirting...

@o @woozle

2/

@freakazoid ... tends to run out, and you've got to get back in front of it, or bail out.

I've spent most of the past 5 years trying to decide which I want to, or *can*, do -- the process is tremendously exhausting.

Trying to suss out complex consumer options, and not just tech: investments, job/career, medical advice, services, and devices devices devices devices is HARD.

For all intents *NOBODY* can keep up with all of it. It's not laziness, it's overload.

@o @woozle

3/

@freakazoid So you fall back on heuristics.

A heuristic is a cheap (that is, low-cost, low-effort) way of discarding virtually all information in the process of coming to a decision.

Even *totally random* heuristics can be useful, in fact, *more* useful than biased ones, *if they prevent systemic error*.

(In exams, I would take out and flip a coin on questions where I didn't know the specific answer but had ruled some options out. Effective. And intimidating to others.)

@o @woozle

4/
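That coin-flip claim is easy to check numerically. A rough simulation, with question counts and probabilities invented for the sketch: on questions where two of four options have been ruled out, an unbiased coin flip between the two survivors scores about 50%, while a guessing rule systematically biased toward the wrong survivor (the "systemic error" case) does much worse.

```python
import random

def exam_score(n_questions, p_correct_pick, rng):
    """Score on `n_questions` unknown questions where two of four options
    are ruled out and one of the two survivors (one correct) is picked,
    landing on the correct one with probability `p_correct_pick`."""
    return sum(rng.random() < p_correct_pick for _ in range(n_questions))

rng = random.Random(0)
trials, n = 2000, 50
# Coin flip: 50/50 between the survivors. Biased rule: only 20% correct.
coin = sum(exam_score(n, 0.5, rng) for _ in range(trials)) / trials
biased = sum(exam_score(n, 0.2, rng) for _ in range(trials)) / trials
print(coin, biased)
```

The point the thread makes falls straight out: randomness isn't *good*, it's merely *unbiased*, and unbiased beats systematically wrong.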

@freakazoid Cargo-culting is heuristics gone bad.

Though in many cases, with some luck, *it can actually work*, if the facsimile is sufficiently close to the original, and/or you happen to be opportunistically in the cargo stream.

Not always, but just by odds, a decent bet.

So: yes, there's laziness. But there's also too much to know.

And: in a prescientific culture, virtually all human practice was cargo-culting, generally following tradition or myth.

@o @woozle

5/

@freakazoid A #CargoCult is the adoption of the *signifiers* of a quality, process, or mechanism, without understanding the *means* or *causes* of those signifiers. The extreme example being WWII-era Pacific islanders constructing airstrips and craft from jungle vines to invite Western "cargo".

If done deliberately, the word is "fraud".

The issue again is the difficulty in distinguishing *surface* appearances, #manifestations, from non-topical deeper significances.

@o @woozle

6/

@freakazoid I think that's most of the cases I've encountered.

A fun little exercise is running the query "a gresham's law of" or "a kind of gresham's law" in Google Books and seeing what turns up.

Examples: Divorce law, shipping regulations, environmental regulations, morals / ethics, neighbourhoods, legal citations, academia, students. It's a diverse and interesting list.

@o @woozle

7/end/

@dredmorbius @woozle @o I suspect all of those are based on the pop phrasing of the law, "Bad X drives out good." But AFAICT all of the actual literature of the law is specific not just to money but to money that people are prohibited from discounting.

I saw this in India, where people who still had prohibited coins or bills would try to pass them off every chance they got.

@freakazoid *Much* of the discussion of #GreshamsLaw is precisely as you describe it. To an extent that's quite frustrating to me.

*Some* is not, though much of that masquerades as discussions of information asymmetries. Those are *part* but *not all* of the Gresham's dynamic (a proper subset).

A reason I've gone #BorderlineObsessive on this is that I'm convinced this *IS* a highly generalisable, key, fundamental, and problematic economic dynamic.

@o @woozle

@freakazoid More to the point, "all" of the mainstream economics literature on Gresham's Law is *a very small collection*, particularly when compared with that on numerous other (and frankly, less significant) topics.

Which is one of my more general criticisms of economics-as-science: it's obtusely blind to its own problem areas. Gresham's Law being only one instance.

@o @woozle

@freakazoid Though, to take your 2nd 'graph: the actual details of a GL mechanic *in the case of money* are interesting. Bad money *tends* to drive out good, though in specific ways and subject to various limitations.

If there's only a *little* "bad money", it does so to a limited extent.

If there's an alternative quality currency, people will often switch to that -- USD today, Spanish Reals / Pieces of Eight in early America. Good/bad is specific to *A* currency, not *ALL*.

@o @woozle
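Those two limitations can be shown with a toy model (the numbers and the behavioral rule are invented for the sketch): if full-weight and debased coins carry the same face value, holders spend the debased ones first and hoard the full-weight ones, so "bad" coins do the circulating as long as any remain to meet the period's payments.

```python
def circulating_bad_share(good, bad, payments):
    """Share of circulating coin that is debased, when holders spend
    debased ('bad') coins first and hoard full-weight ('good') ones.
    All coins trade at the same official face value."""
    spent_bad = min(bad, payments)
    # Good coins circulate only once the bad stock is exhausted.
    spent_good = min(good, max(0, payments - bad))
    return spent_bad / (spent_bad + spent_good)

# A little bad money displaces good only to a limited extent:
print(circulating_bad_share(good=90, bad=10, payments=50))  # 0.2
# Enough bad money to cover all payments drives good out entirely:
print(circulating_bad_share(good=50, bad=50, payments=50))  # 1.0
```

The second effect the toot mentions (switching to an alternative quality currency) would correspond to payees refusing the official face value, which this single-currency sketch deliberately excludes.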

@freakazoid Also, trying to dispose of prohibited (or counterfeit, or altered) bills, etc., isn't the original notion of GL, where an *officially sanctioned* debased currency drives out good.

It's effectively fraud -- *attempting* to pass a *substandard* token as a *standard* one. Rather than having *standard* tokens of differing intrinsic value (usually: specie content) valued identically by official sanction.

Which gets us to the notion of seignorage...

(Ask if you want.)

@o @woozle
