They also took out all the quality, though in pure business terms one can argue that's a kind of "slack" by itself.
The beancounters have cut all the corners on physical products that they could find. Now even design and manufacturing are outsourced to the lowest bidder, a bunch of monkeys paid peanuts to do a job they're woefully unqualified for.
And the end result is just a market for lemons. Nobody trusts products to be good anymore, so they just buy the cheapest garbage.
Which, inevitably, is the stuff sold directly by Chinese manufacturers. And so the beancounters are hoisted by their own petard.
We've seen it happen to small electronics and general goods.
We're seeing it happen right now to cars: manufacturers clinging to combustion engines and cutting corners. Why spend twice the money on a western brand when its quality is rapidly declining to meet BYD models at half the price?
---
And we're seeing it happen to software. It was already kind of happening before AI; so much of software was enshittifying rapidly. But AI is just taking a sledgehammer to quality. (Setting aside whether this is an AI problem or a "beancounters push everyone into vibecoding" problem.)
E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there. Windows is just going down in flames. People are jumping ship now.
SaaS is quickly going that way as well. If it's all garbage, why pay for it? Either stop using it or just slop something together yourself.
---
And in the background of this, something ominous: companies can't just pivot back to higher quality after they've destroyed all their in-house knowledge. So much manufacturing knowledge is just gone; starting a new manufacturing firm in the west is a staffing nightmare. Same story with cars: China has the EV knowledge. And software's going the same way. These beancounters are all chomping at the bit to fire all their devs and replace them with teenagers in the developing world spitting out prompts. They can't move back upmarket after that's done.
Even when the knowledge still lives, when the people with the skills required have simply moved to other industries and jobs, who's going to come back? Why leave your established job for the former field, when all it takes is the management or executive in charge being replaced by another dipshit beancounter for everyone to be laid off again?
> E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.
Desktop Linux has gotten better, though much of the improvement happened decades ago. I believe the first person to prematurely declare "the year of Linux on the desktop" was Dirk Hohndel in 1999: https://www.linux.com/news/23-years-terrible-linux-predictio...
And speaking as someone who was running desktop Linux in 1999, I remember just how bad it was. Xfce, XFree86 config files, and endless messing around with everything. The most impressive Linux video game of 2000 was Tux Racer.
But over the next 10 years, Gnome and KDE matured, X learned how to auto-detect most hardware, and more-and-more installs started working out of the box.
By the mid-2010s, I could go to Dell's Ubuntu Linux page and buy a Linux laptop that Just Worked, and that came with next day on-site support. I went through a couple of those machines, and they were nearly hassle free over their entire operational life. (I think one needed an afternoon of work after an Ubuntu LTS upgrade.)
The big recent improvement has been largely thanks to Valve, and especially the Steam Deck. Valve has been pushing Proton, and they're encouraging Steam Deck support. So the big change in recent years is that more and more new game releases Just Work on Linux.
Is it perfect? No. Desktop Linux is still kind of shit. For example, Chrome sometimes loses the ability to use hardware acceleration for WebGPU-style features. But I also have a Mac sitting on my desk, and that Mac also has plenty of weird interactions with Chrome, ones where audio or video just stops working. The Mac is slightly less shit, but not magically so.
So yes, Desktop Linux has "gotten better". What it hasn't done is solve any of the systemic problems.
The Open Source development quirks that created the shitshow of 1999 are still here. Gnome is better but still suffers massively from mainstream features being declared stupid by the maintainers. (A power button that turns off the machine? Heretical.)
Valve's recent successes are pretty illustrative here. They used their money to directly hijack the projects their products rely on.
As far as the comparison goes, Windows is not without this "slow" improvement either. 95 and 98 are light-years behind contemporary Windows in so many ways. Until quite recently, it still made about as much sense to use Linux as it did back then; not much.
Take your Linux laptop example. Sure, Linux finally kind of worked on some specific models that were tested for it. Meanwhile, Windows had moved from "it'll work with some mucking about with drivers" to "it works universally, on practically all hardware". Really, by the mid-2010s Windows had finally become quite tolerant of you changing the hardware.
Hence my original point: Desktop Linux hasn't really caught up with Windows in any meaningful sense. Windows has just been nose-diving into the ground these last few years.
> The Open Source development quirks that created the shitshow of 1999 are still here. Gnome is better but still suffers massively from mainstream features being declared stupid by the maintainers. (A power button that turns off the machine? Heretical.)
Gnome has been chopping off its own limbs because it reduces weight, all in the name of simplicity. I think they are not the best example of Open Source development.
KDE, on the other hand, had a hard fall once, basically recovered, and invested long-term in Plasma, and that has paid off handsomely. Today, it is the one desktop that I can say is closest to typical/standard desktop paradigms out of the box while retaining a high degree of flexibility for those who choose to customise it. I have been using KDE on Fedora for a while now and it has been basically solid.
> I think they are not the best example of Open Source development.
They're not. I'm using them as an example of the "bad" in Open Source development.
But it's also not so much the individual OS components that are the problem; their interactions are just as fragile, and usually neither party takes ownership of the problems in between.
I feel like open source is a forest, not a garden. Things grow and die, and those that are strong enough, useful enough, and fit the environment survive. The linkages are there, but they are not perfectly arranged by some curator, because it is not a curated garden. It is not totally haphazard, though, because the ecology has rules, and the better pieces are in harmony with the rest of the system.
We forget that it is this organic nature that makes open source what it is. Nobody charges you an entry fee, there are no hidden fees once you enter, and nobody is trying to sell you crap, but you do need to learn how to survive yourself. Once you do, you understand how to deal with the shortcomings of a package or find another solution to your needs. Sure, it takes time, but when that becomes second nature and you stop fighting the nature of the ecology itself, there is no other system like it.
That's why comparing it with the advantages of other systems is pointless. I didn't choose to walk into the forest because it was a perfectly tailored experience; I chose it because I wanted freedom. It required me to learn how to fit in, and now that I have, I'm at peace with where I am.
For some reference: back in the Ubuntu 6 days, around 2005, I switched. It took me two weeks to get X.Org to run with my Nvidia card at the time. Two weeks of messing with config files. I only persisted because I was so sick of Windows.
> And in the background of this, something ominous: companies can't just pivot back to higher quality after they've destroyed all their in-house knowledge. (...) They can't move back upmarket after that's done.
The knowledge isn't the problem. It can be quickly regained, and the progress of science and technology often offers new paths to even better quality, which limits the need to recover the details of old processes.
The actual problem is, there is no market to go up to anymore. Once everyone is used to garbage being the only thing on offer, and adjusts to cope with it, you cannot compete on quality anymore. Customers won't be able to tell whether you're honest or just trying to charge suckers for the same garbage with a nicer finish, like every other brand that promises quality. It would take years of effort and low sales to convince customers to start believing you're the real deal, which (as beancounters will happily tell you) you cannot afford. And even if you could, how are you going to convince people you're not going to start cutting corners again a few years down the line? In fact, how do you convince yourself? If it happened once, if it keeps happening everywhere across the whole economy, it's bound to happen to your business too.
Wrong on the first point, right on the second. Institutional knowledge can't be easily regained. To build up the knowledge to, say, make a transistor, you need a bunch of people experimenting with a bunch of things. Published scientific papers and patents will get you part of the way there, but the final stretch is still up to you, including things like which equipment to buy, the purity of supplies (and where to get them!), how long the chip needs to be bombarded by each kind of particle, and how much air the cleanroom needs to move. All the tiny details. You have to discover them by trial and error.

Actual chip manufacturing companies have found themselves unable to get good yield until they copied the floor plan of another working fabrication plant, and they still have no idea why that mattered. That's an extreme case, but maybe nobody expected that minuscule air contamination from one process step was affecting another nearby step, and in the original plan they were farther apart.
Yes if you want to wire a neighborhood for internet you can skip DSL and go straight to fiber. That's not the problem. The problem is that nobody in your company knows how deep to put the fiber to minimize problems, how much redundancy is needed, how strong the mechanical armor around the fiber needs to be, how many fibers per cable to meet future capacity needs without excessive costs, which landlords are friendly to you, nobody has the right connections to city hall to get digging permits approved expediently, and so on.
Only if everyone expects the business itself to continue to exist for 10 years.
Back when I was a kid, my dad claimed house construction firms (in the UK) regularly closed down after building homes to get around having to fix things under warranty.
Regardless of whether this was actually true, the perception alone will kill trust.
Sure, hypothetically any western car manufacturer could, e.g., poach a bunch of BYD employees. But it's not really practical for most businesses.
> The actual problem is, there is no market to go up to anymore.
This is the "Market for Lemons" problem, yes.
It's less of a problem than you might think. Convincing the entire wider world that you're legitimate is a problem, one made infinitely worse by store marketplaces like Amazon preferring to push "aqekj;bgrsabhghwjbgawrjwsraG" brand garbage.
So you just don't. The trick is to start small. The smallest you can sustain. (This doesn't work for cars, or anything that's sufficiently complex. You won't be taking on Salesforce.)
But so long as you can find a market niche where there's demand for quality, you can carve out a living, and from there, scale up.
The problem with that is twofold. Problem #1: venture capital has supplanted other forms of investment, and a "small business generating single-digit millions in revenue" is utterly unappealing to VCs, even though the investment required is downsized accordingly.
And problem #2: The cost of starting a business is too high right now. Real estate and cost of living just make it unaffordable to even try. + Healthcare if you're in the US.
> And problem #2: The cost of starting a business is too high right now. Real estate and cost of living just make it unaffordable to even try. + Healthcare if you're in the US.
Fortunate for everyone that Europe exists and counts as "the west". Real estate isn't expensive everywhere, and much of Europe has affordable healthcare. Even in the current economy, several of my former coworkers from recent employers have founded new businesses.
> Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.
Desktop Linux mostly works these days. It does everything most regular people would want of it, with zero fuss. Including playing games. In some respects, it's easier to use than Mac or Windows.
When it has trouble with some things, one must remember neither Mac nor Windows is perfect, and they can be extremely frustrating at times.
You know, I think you're right. I set up Void Linux on my desktop a couple of years ago, using the plainest base image. I chose that path because ZFS is important to me, and systemd is ugly to me.
I determined that this minimal start was the right confluence of those two preferences with the howto documentation available at the time, so that's what I did.
So I'm going along in rolling releases for years, and even playing games fine (except GTA:V online). Things are fine until there's some weird flickering-window bug that rolls down the pipe and apparently only affects nVidia users who use xfce4. It's an awful and particularly jarring bug.
The workaround is to disable vsync in xfce4, which solves that problem but always causes tearing for me. Even in YouTube videos. (Who knew how reliant we'd become on rote frame synchronization? Why, it seems like just yesterday when I was streaming potato-quality RealVideo episodes of South Park over dialup on a K6-2 box while marveling that any of it worked at all.)
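(For anyone else chasing this: the whole workaround is one xfwm4 compositor setting. Here's a minimal sketch, assuming xfwm4 4.14+ where the vsync method lives in the xfconf property /general/vblank_mode; the xfconf-query call is the part that matters, the Python around it is just illustration.)

```python
# Minimal sketch of the vsync workaround, assuming xfwm4 4.14+
# exposes the compositor's vsync method as the xfconf property
# /general/vblank_mode (known values: auto, glx, xpresent, off).
import subprocess

def set_xfwm4_vblank(mode: str = "off") -> None:
    """Set xfwm4's vsync method; "off" trades the flicker bug for tearing."""
    subprocess.run(
        ["xfconf-query", "-c", "xfwm4", "-p", "/general/vblank_mode",
         "-s", mode],
        check=True,
    )

if __name__ == "__main__":
    set_xfwm4_vblank("off")
```

Flipping it back to "auto" once a fixed driver ships should restore vsync.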
I don't know which party is responsible for that problem, but nVidia has supposedly fixed it on their end in recent weeks. Which is cool, but this distro isn't shipping that version yet. And I'm reluctant to go off-script -- I loathe the idea of letting the nVidia installer do whatever it wants, and I'm not too keen on building my own package for xbps to handle (hopefully with better grace), either.
It was time to fix it anyway.
And I wanted to give Wayland a shot because there's a limitation with SDL that causes the clipboard method used by Factorio to fail to work with large blueprints under X11, so change was already in the air.
And so it began.
KDE Plasma: I couldn't get it to work. With both X11 and Wayland, parts of it would just die without leaving any traces I could find. It was unusable. I spent hours troubleshooting it and only a few minutes of actually using it, which is a terrible ratio. It had to go.
Gnome: It actually worked OK with Wayland. But when I say "worked OK", I mean that it acted like a touchscreen interface -- for toddlers. It didn't crash in mysterious and buried ways, which is good, but I found it to be an affront to my sensibilities in ways that I simply could not tolerate. I won't apologize for hating it or for feeling insulted by it. That iteration of Gnome is dead to me.
So, working down the list of non-ancient desktop environments: Cinnamon? It works. It's alright. It took some kicking to get sound to work, because the mixer it comes with makes it impossible to set up default outputs in a way that behaves here, and it reverts the system to its own broken ideas every time it runs. But it responded to my kicks without much of a fight, and it works. The xfce4 volume mixer is on the taskbar instead, and I removed all traces of the Cinnamon mixer like it was a cancerous tumor, but with that done: Sound works. Regular X stuff works. It's good enough.
I haven't tried it with Wayland yet to see if I can work around the SDL+Factorio SNAFU, but it's been behaving itself with X11 for a week or more.
---
Now, that may sound awful. And to be clear, it wasn't fun at all. But at least it's not like my Windows laptop. I've got stories about that, for sure.
One of the things that sticks out right now is when that laptop would deplete its battery just sitting in my bag in the car. That was weird, but it became more urgent to fix when I got the machine out of the bag and it was hot.
The cause for that was an HP printer driver (for a rather old color printer that I don't even own -- I do use it sometimes, but it's 25 miles away) that was periodically waking the machine from hibernation to check that printer's supply status, so it could try to sell me more stuff.
This task was so completely buried in Windows that it took hours to find it, and it was configured by its installer to wake the machine from hibernation -- including, specifically, while on battery. Because that's obviously what every user of any printer needs: Computers that turn themselves on using battery power to sell toner cartridges while hidden unseen inside of a bag in the back of a car.
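(If you want to hunt down the same kind of culprit yourself, the stock powercfg tool will name names. A rough sketch, using only documented powercfg subcommands; /waketimers typically wants an elevated prompt.)

```python
# A rough sketch for auditing wake sources on Windows with the stock
# powercfg tool. /waketimers usually requires an elevated prompt.
import subprocess

QUERIES = [
    ["powercfg", "/lastwake"],                    # what woke the machine last
    ["powercfg", "/waketimers"],                  # timers armed to wake it
    ["powercfg", "/devicequery", "wake_armed"],   # devices allowed to wake it
]

def show_wake_culprits() -> None:
    for args in QUERIES:
        result = subprocess.run(args, capture_output=True, text=True)
        print("$", " ".join(args))
        print(result.stdout or result.stderr)

if __name__ == "__main__":
    show_wake_culprits()
```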
It didn't have to be that way, but it was this way anyway. I consider that kind of thing to be deliberate in a fashion that extends beyond mere maliciousness: It is instead simply fucking evil.
---
So yeah, Linux is a great desktop.
I'd like to propose a new slogan: "Linux. At least it's not deliberately evil. Usually."
I think you're not blaming political leadership enough. NAFTA and other programs were always going to lead to the state of affairs we have now. This was a choice. Blaming greed is like blaming gravity.