Beyond enshittification, why does tech oftentimes suck?

Hammocks4All@lemmy.ml to Asklemmy@lemmy.ml – 205 points –

Sometimes I’ll run into a baffling issue with a tech product — be it headphones, Google apps like maps or its search features, Apple products, Spotify, other apps, and so on — and when I look for solutions online I sometimes discover this has been an issue for years. Sometimes for many many years.

These tech companies are sometimes ENORMOUS. How is it that these issues persist? Why do some things end up being so inefficient, unintuitive, or clunky? Why do I catch myself saying “oh my dear fucking lord” under my breath so often when I use tech?

Are there no employees who check forums? Does the architecture become so huge and messy that something seemingly simple is actually super hard to fix? Do these companies not have teams that test this stuff?

Why is it so pervasive? And why does some of it seem to be ignored for literal years? Sometimes even a decade!

Is it all due to enshittification? Do they trap us in as users and then stop giving a shit? Or is there more to it than that?


I worked at Google for over a decade. The issue isn't that the engineers are unaware or unable. Time and time and time again there would be some new product or feature released for internal testing, it would be a complete disaster, bugs would be filed with tens of thousands of votes begging not to release it, and Memegen would go nuts. And all the feedback would be ignored and it would ship anyway.

Upper management just doesn't care. Reputational damage isn't something they understand. The company is run by professional management consultants whose main expertise is gaslighting. And the layers and layers of people in the middle who don't actually contribute any value have to constantly generate something to go into the constant cycle of performance reviews and promotion attempts, so they mess with everything, re-org, cancel projects, move teams around, duplicate work, compete with each other, and generally make life hell for everyone under them. It's surprising anything gets done at all, but what does moves at a snail's pace compared to the outside world. Not for lack of effort, the whole system is designed so you have to work 100 times harder than necessary and it feels like an accomplishment when you've spent a year adding a single checkbox to a UI.

I may have gone on a slight tangent there.

Reputational damage isn’t something they understand

Is this really the case? I feel like they might, but are deciding that it's "worth the cost of business"

Since companies that get big enough can just buy the promising competition before it becomes a problem, I'd say it's a worthwhile cost to them.

Yeah, lack of competition is driving a lot of this. Fixing bugs doesn’t increase their stock value. It doesn’t make the line go up.

Launching products and bragging about profits makes the line go up (especially just before a quarter or monthly report is due).

AT&T/Bell Telephone was like this for years until they were finally broken up (nominally). When cellphones came out and provided nationwide competition, long distance suddenly became free.

We need to bust up Google, Facebook, etc. They have nothing pushing them to be better, just CEO egos and investors to please.

Facebook as a product is over. It's like 90% ads. I almost never see my friends' posts anymore.

Its marketplace has been really popular in my area. Craigslist has all but dried up for many item types.

But they own Instagram as well don’t forget, and they have bought out many other competitors that we won’t ever get to experience.

I ran into a guy from high school and it turns out he worked for Microsoft back in the Windows Mobile days. He said that changing even a single button on a submenu would take six months of meetings, and if it involved other departments they would actively sabotage any progress due to the way MS internally made departments compete, so you could basically forget it. He said they literally backdoored software so they could sidestep other departments to get features in.

I think about that a lot.

A corporate analogy/strategy is to block your competition from gaining market share.

For example, a company I used to work for would open accounts in non-viable/non-profitable locations so that our competition would not have the chance to get more market share.

Big corps don't give a shit if it works or not; as long as they are the biggest they can squeeze out anyone else, so they will launch whatever is trending (Meta/Threads) and bullshit their way into another piece of the pie.

This sounds like working as a North American public servant hahahahaha

This is what you get when humans try to work beyond our monkeysphere. It's not "capitalism" or "greed" or any other such childish ideas. Groups that large cannot be efficient.

https://www.cracked.com/article_14990_what-monkeysphere.html

The article is so old the formatting is all jacked, but you can get the gist of it easily enough.

But the trick is having layers of monkeyspheres! The CEO monkey has 20 directors below it, and each of those has 20 people leading people, so it all reports up and gets lost but is "good enough".

Except capitalism incentivises top-down organizations that lead to those problems?

The difficulty of keeping something working scales exponentially as its complexity grows. Something of 1x complexity takes 1y effort, but 2x complexity is 10y effort, 3x complexity is 100y, on and on.

Phones/computers/apps are at hilarious levels of complex now, and even 100k people running flat out can barely maintain the illusion that they "just work." Add enshittification heaping its intentionally garbage experience onto the unintentional garbage experience that is modern computing, and it's just gotten stupid.

Seriously. Millions of things have to go right for your consumer electronics or software experience to work seemingly flawlessly. Think about the compounding probabilities of it. It's a monument to human achievement that they work as well as they do.
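To put a rough number on those compounding probabilities (the figures here are made up purely for illustration): if an experience depends on a million independent pieces, each working 99.9999% of the time, a flawless result is roughly a coin flip.

```python
# Toy model of compounding reliability: n independent components,
# each working with probability p, all have to work at once.
def chance_all_work(p: float, n: int) -> float:
    return p ** n

# A 99.9999% reliable component sounds bulletproof, but a million of
# them in series succeed together only about 37% of the time.
print(chance_all_work(0.999999, 1_000_000))  # ~0.3679
```

This is the same math behind "five nines" uptime targets: per-component reliability has to be absurdly high before the whole stack feels like it "just works".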

It’s a monument to human achievement that they work as well as they do at all.

FTFY.

It doesn't help that every new generation adds a new blackbox abstraction layer with little to no end-user benefit, the possibility of duplicated functionality and poor implementation, security concerns, poor support, and requiring a flashy new CPU with system crashing speed tricks to maintain a responsive environment through 12 levels of interpreters.

Is this a complaint about the OSI model?

No, the OSI model is fine.

I'm talking more about sandboxing an interpreted app that runs a container that runs another sandboxed interpreted app, both running their own instances of their interpreter with their own dependencies and accessible through a web interface that is accessible through yet another container running a web server that is running in Python with a virtual environment despite being the only Python app on the container, which is then connected to from another sandboxed tab on a sandboxed browser on your machine.

But hey, at least it isn't, god forbid, a MONOLITH. That would require someone to take the time to understand how the application works.

Ah, yeah, I get that. A Java interpreter so you can virtual-machine your way into having someone else make sure the thing works with all the hardware it can live on.

Blind scalability and flexibility are neat tho; they give a lot of less-knowledgeable people access to do stuff and theoretically free up those who know more for more complicated tasks.

It almost never works like that.

People who don't understand computers will work against it in almost every case.

Been saying that about the internet for 30 years. It's a damned miracle it works at all and people whine and cry about every little hitch.

Yes, people often want things that work. If there are good reasons why there is clunkiness, then, if these reasons are commonly understood, more people will be more patient. Knowledge is power. That’s the point of this entire thread.

People who weren't interested in tech found out they could make a lot of money in the field. The scene went from nerds who were passionate about the field to people who would be just as (un)interested in being doctors and lawyers. The vibrancy is gone.

Source: tech-excited nerd who got into the industry in the late aughts.

I definitely agree about the vibe being different in the mid 90s to the early 00s. Lots of passion and energy about the tech. I don't think it's all gone but it's definitely nowhere near as intense.

Every single new "innovation" is literally locked behind a paywall, sometimes multiple, in tiers. You can't just "buy" anything anymore, you can only lease it, usually at exorbitant prices compared to not that long ago.

Why is it so pervasive? And why does some of it seem to be ignored for literal years?

Considering that you know these problems have not yet been fixed, you must still be using these products despite them, and there's your answer: what would the motivation be to fix problems that aren't severe enough to make you stop using the product?

Programmers aren't given the leeway to make their work good quality if it doesn't directly lead to more profit.

Speaking as a software engineer, it's usually a combination of things.

The root of all evil is that yes, fixing that thing doesn't just take one hour, as it should, but rather a few days. This is mostly preventable by having sufficient automated tests, high code quality and frequent releases, but it's a lot of work to keep up with. And you really need management to not pressure early feature delivery, because then devs will skip doing necessary work to keep up this high feature-delivery velocity.

Well, and as soon as such a small fix has a chance of taking more than a day or so, you kind of need to talk to management about whether it should be done at all.
Which means probably another day or so of just talking about it, and a good chance of them saying we'll do it after we've delivered this extremely important feature, which usually means 'never', because there is always another extremely important feature.
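A toy sketch of what that "sufficient automated tests" safety net looks like (the function and its behaviour are hypothetical, invented just to illustrate the idea):

```python
# Hypothetical helper that has accumulated fixes over time. With a
# regression test pinning down each past bug, the next small fix stays
# a one-hour job instead of days of cautious manual re-testing.
def parse_price(text: str) -> float:
    """Parse a price string like '$1,234.56' into a float."""
    return float(text.strip().lstrip("$").replace(",", ""))

def test_parse_price():
    assert parse_price("$1,234.56") == 1234.56    # original behaviour
    assert parse_price(" $12 ") == 12.0           # past fix: stray whitespace
    assert parse_price("1,000,000") == 1000000.0  # past fix: no dollar sign
```

With a suite like this running in CI on every commit, a dev can change `parse_price` and know within minutes whether anything broke; without it, every one-line change carries the risk (and cost) of re-verifying everything by hand.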

This. Worked at a consulting firm doing e-commerce for a client. The client always pushed making changes on banners or promotional texts rather than fixing bugs.

There was an issue with the address validator in the checkout (why and how is irrelevant) and it was raised by the QAs, but we were told to fix it in the future; they didn't see it as a priority. They preferred a checkout that worked most of the time and to focus on adding a promo banner.

Now I work in a better place, working on a product with stakeholders who don't prioritise new things over fixing stuff, but we still need to fight to have time allocated for technical improvements whose benefits are not directly evident in the final product.

Are there no employees who check forums? Does the architecture become so huge and messy that something seemingly simple is actually super hard to fix?

👆I’m guessing this one is Microsoft. 👆

Apple I cannot explain. They were the gold standard of both brilliant UI and UX, as well as best in class customer support. Now I’m tearing my hair out over seemingly simple things (like their horrendous predictive text in iOS), and I don’t even have any hair.

Apple is a strange beast. I was at their spaceship HQ getting interviewed, and the guy kept pointing out random facts about it. Like, this particular wood was harvested in the winter so that made it better, or that entire segments can be siloed off, or that the full-height glass walls of the cafeteria can be opened on pivots, and there was just so much effort in making sure things worked just right.

Meanwhile [this team] had to test software fixes for their product by provisioning ancient Mac minis in a closet lab because they wanted to test the "full experience", so every patch and update they had to do was painful and horribly tested. They all hated each other (which was obvious to me just from my time in their interviews, so it must have gotten really bad during the workday, I imagine). Everyone seemed on edge all the time. Even the people in the hallways. But they were all super excited that they could order lattes from the iPads tethered to the break room countertops. And they had an apple orchard, I guess. The idea of changing how they do what they do was completely unentertainable.

The whole experience felt surreal, like I had stepped into the world according to The Onion.

Their UX and UI are their bread and butter, but as someone who has done extensive web app development for use on Safari browsers, if I had a nickel for every time their browser just IGNORED a standard, broke one that previously worked, or added new "features" that broke a standard, passing the responsibility of building a workaround down to individual developers... I'd have a few dollars anyway. I don't have much faith their code is all that good compared to average under the hood and the UI, and I think their reputation unjustly leads users to turn a blind eye or give them a pass when their stuff DOESN'T work or works BADLY. "They're Apple... everyone else seems happy. I must be doing something wrong."

Well, I for one experience Apple rage multiple times a week, but I'm so entrenched in their ecosystem, I may never escape. Also, there is no better alternative that would be quick and easy to set up and maintain.

Apple is a victim of always having to build the new thing, so there's never time or resources to fix the old things. They can sometimes do an end run around this by re-releasing the same thing over again and pretending it's new, but then the cycle just begins anew

Half their stuff is just android features they were slow to adopt

People still buy MacBooks, and that's got nothing to do with Android, so they can fish from a few different rivers.

Because you're not paying extra for those problems to get fixed. And no, when you receive millions of forms per day, not every piece of feedback makes it back to someone to actually fix the issue. Especially when half those issues are "when I don't have internet I don't receive new emails".

Software, like hardware, is a balance between supply and demand. People would rather pay less for a phone crammed full of ads than pay for a service. Just look at YouTube for that one.

Also, those clunky interfaces are there for a reason. Maybe the interface element that's a lot better doesn't work in right-to-left languages. Maybe the information overload of too many buttons and labels made the old interface impossible to extend. Maybe the prettier solution doesn't work with screen readers or with the font size and colour cranked up for people with low vision. Maybe the feature redesign worked great but SomeCorp Tweaker Software will bluescreen the machine when it finds the word "checkbox" in a settings page for your mouse. Maybe the design team had a great idea but the feature needs to ship next week, so whatever needs to happen to make that work happens, and the five other features planned for the month already eat up the rest of the dev team's time anyway.

But most of the time, things are suboptimal because there are seven teams of people working on features on the same screen/system/application and they need to make do.

If you have serious issues with some software, many companies will let you partner with them. In exchange for hundreds of thousands or millions, you can directly get support for your use cases, your workflow, and the stuff you need to get done, over the billions of other people that also need to use the software. And sometimes, that means your super duper expensive preference/feature/demand means someone else's workflow is entirely broken.

If you know what you want, there is a way out: going the way of open source and self hosted. Within a few years, you too will grow resentful of dozens of systems made by different people all interpreting standards differently and not working together. You have the power to fix each and every feature, bug, problem, and design flaw, but none of the time or the detailed knowledge. You don't have the money to pay experts, and even if you did, what they do may not entirely suit you either. Trying to fix everything will drive you absolutely mad. And that's why companies and people often don't try for perfection.

Aside from the effort required others have mentioned, there's also an effect of capitalism.

For a lot of their tech, they have a near-monopoly or at least a very large market share. Take Windows from Microsoft. What motivation would they have to fix bugs which impact even 5-10% of their userbase? Their only competition is Linux with its roughly 4(?)% market share and macOS, which requires expensive hardware. Not fixing the bug just makes people annoyed, but 90% won't leave because they can't. As long as it doesn't impact enterprise contracts, it's not worth fixing, because the time spent doing that is a loss for shareholders, while new features that can collect sellable data (like Copilot, for example) generate money.

I'm sure even the devs in most places want to make better products and fight management to give them more time to deliver features at better quality, but it's an exhausting, sharp uphill battle which never ends, and at the end of the day the person who made the broken feature with Data Collector 9000 built in will probably get the promotion while the person who fixed 800 five-plus-year-old bugs gets a shout-out on a Zoom call.

I’m not sure Windows is a good example here since they’re historically well known for backwards compatibility and fixing obscure bugs for specific hardware.

Whereas Linux famously always had driver support issues.

Backwards compatibility - yes I agree, it's quite good at it.

Hardware-specific issues for any OS - disagree. For Windows, that's 80-90% done by the hardware manufacturer's drivers; it's not through any effort from Microsoft that issues get fixed or not. For Linux it's usually an effort of maintainers, and if anything, Linux is famous for supporting old hardware that Windows no longer works with.

But the point I was making is not that Linux or macOS is better than Windows or vice versa; it's that Windows holds by far the largest market share in desktops and neither of the alternatives is really a drop-in replacement. So in the end they have no pressure on them to improve UX, since changing OS is infeasible for the majority of their users at the moment.

In terms of driver development, it's more collaborative than just Windows releasing an API and the manufacturers creating the drivers. Bug reports from large manufacturers are absolutely taken seriously. Customers usually never see this interaction.

I would say maybe 1990-2010 was really a dominant time for Windows and that was also when they were actively improving, but now they have plenty of competition. It’s just that people have moved away from desktops. Mobile platforms are now an existential threat. Fewer and fewer people are buying desktops to begin with. Maybe Microsoft has given up on desktops and that’s why they’re making Windows worse.

"Unless it renders the product completely unusable, why spend money and fix it?"

Corporate mindset in a nutshell!

"Unless it renders the product completely ~~unusable~~ unprofitable, why spend money and fix it?"

Most people tend to buy the imperfect cheap product rather than the better, more expensive product.

If we refused to buy crap, they wouldn't make it. If we refused to buy it, they couldn't make it.

They sell us crap because collectively we prefer it.

But in tech, there's often a lot of overlap in the high-end and crap...at least in terms of issues.

Expensive, high-end products can sometimes just be frustrating, or just lacking features that'd seem obvious.

Capitalism sucks!

continues purchasing crap

See! I told you!

Tech companies only care about making money. If people continue to buy their half-effort products, then they'll keep making them.

On the other hand, open-source (hardware or software) is designed for maximum longevity.

Unfortunately, the wrong people have unlimited resources when it comes to making our tech products.

Arrogance. Their attitude is basically "we built it, so it's golden. If you can't understand why we did it this way, then put the device down and flip burgers".

I saw this starting around the year 2005. I spoke out about it and told people to stop buying/using products that aren't logical and easy to use. If it takes a Google search and a YouTube video to figure out how to use it, then it was built wrong. Return the product and get a better one. No one listened to me. We have what we have.

It sucks and it will only get worse. People will not change. People will keep buying shit products, then bitch that the products suck. Instead of returning the crap, they will keep it. Because they keep it the companies have zero reason to change.

LOL, those last three sentences wrap up lemmy's capitalism hate perfectly.

"We keep spending money on bullshit and kept getting fed worse bullshit!"

"Have you considered not spending money on bullshit?"

"We HAVE to!!!"

There's the compounding issue that something that seems simple on the surface, say, pairing a pair of Bluetooth headphones, is a convoluted mess of super-complicated shit on a technical level.

And even to handle that, the engineer making the pairing app does not know how to sync an L and an R headpiece. And the person who knows about that does not know how to establish contact via Bluetooth. Etc. It's layers upon layers upon layers of tricky technical stuff, each of which has the ability to propagate buggy behavior both up and down the layers. And each engineer probably cannot easily fix the other layers (they're not theirs), so they work around the bugs. Over time this adds an insane amount of complexity to the code as hundreds of these tiny adjustments are spread everywhere.
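A minimal sketch of how that plays out in code (the layers and the bug here are invented for illustration): a lower layer has a defect its owners never fix, so the layer above quietly works around it, and the workaround becomes permanent.

```python
# Hypothetical lower layer, owned by another team. It has a known bug
# (a trailing space in its result) that never gets fixed.
class RadioLayer:
    def connect(self, device: str) -> str:
        return f"connected:{device} "  # the bug: trailing whitespace

# The layer above can't touch RadioLayer's code, so it compensates.
# Multiply this by hundreds of layers and years of releases.
class PairingLayer:
    def __init__(self, radio: RadioLayer):
        self.radio = radio

    def pair(self, device: str) -> str:
        # Workaround for the radio bug, now baked in forever: even if
        # the bug is fixed below, this line has to stay to be safe.
        return self.radio.connect(device).strip()

print(PairingLayer(RadioLayer()).pair("headphones"))  # connected:headphones
```

Each workaround is harmless on its own; the mess comes from hundreds of them interacting across layers nobody fully owns.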

TL;DR software development is hard.

Hard to respond with anything else since you haven't really given examples.

I’d expand your TL;DR just a bit to include:

  • users are stupid
  • software is designed to work for both Tom Technowizard and Paul Pebkac
  • finally, ads ruin everything they touch

It's a young field and we're still entrenched in the consequences of the sort of mistakes that, in a few hundred years, will become "those silly things people used to do because they didn't know better".

Daily reminder that the web is a mess of corpo bullshit piled on top of 90s tech and most OSes currently in use are culturally from the early 80s.

Is that a thing that goes away? I think a lot of fields still have those silly things being done, even closing in on half a millennium since the industrial revolution. You still have tons of screw head sizes and types! Why such diversity?

The screw heads are mainly to prevent people from tampering with stuff they aren't supposed to unscrew. Hard drives, for example, all use the same star-shaped heads that most people don't have screwdrivers for.

I do think that people passionate about information technology – those who love it for the intrinsic awesomeness and not the money it brings – could break away from some of the legacy bullshit that holds back the quality of the software we use, if they were given the opportunity to defy software "tradition" and the profit motive. As of now, there is no systemic path forward, only occasional improvements incited by acute inadequacy of existing conventions for the growth of interested businesses.

While that's true, you have Phillips and flat heads and IKEA hex, which could all be consolidated into the sort of flat and star heads meant for common people, and could be more universal.

With software we're a lot freer, because if it doesn't have hardware and especially infrastructure requirements, such as the whole Internet's layers or new visualisation devices, you're open to change things up a lot.

I mostly agree - however there are physical/mechanical reasons behind the use of some of those. For example, Phillips head screws will 'cam out' (driver will slip out of the screw head) rather than get over-torqued, which is useful in various situations - although TIL this was not actually an intentional design feature!

https://en.wikipedia.org/wiki/Cam_out

Hex keys are better than a Robertson (square head) in tight spaces with something like an Allen key, and, in my experience anyway, a Robertson can take a fair bit of torque, so they're great for sinking into softwood - and also for getting out again, even when they've been painted over.

Flathead screws, on the other hand, should be launched into the sun.

Things like planned obsolescence and software blocks on things like farmers fixing tractors without John Deere’s software permission almost makes me think the bad guys won the Cold War.

Between me and a mechanic friend, we can fix my car but we can’t turn off the (wholly unnecessary) “inspection needed” noise without me spending $1000 on software. Apparently, the inspection needed warning isn’t even related to anything. It just comes on every x miles. The car doesn’t have a detected issue or anything. That beep is radicalizing me.

Have you tried Google keyboard (Gboard) lately? It made me want to break my phone and just not have one at all. It corrects proper words to other words so the sentences don't make sense. It corrects words that are already correct and ignores the misspelled ones. It wants to speak for me. They think they're making us type faster with their predictive text, but I was re-reading everything I put on the internet. I became slower. Thankfully I found a worse keyboard, but it doesn't autocorrect as much and I'm OK with that. Fuck Google.

Go HeliBoard.

I didn't know about this keyboard, actually. Just installed it and it's great. Only issue is that it only supports English. I guess I'll use it for English only. Thank you so much.

EDIT: never mind. It does support other languages. All set now.

I use it with many languages. I even swipe in Spanish.

How do you enable swiping in this keyboard. I can't, for the life of me, find it in the settings.

To enable gesture typing, download Google's gesture typing library at https://github.com/erkserkserks/openboard/tree/master/app/src/main/jniLibs/arm64-v8a and then import it by opening the OpenBoard settings app, going to "Advanced", and then choosing "Load gesture typing library". Note: Google's library is proprietary and not open source.

The management culture emphasizes a workflow that's heavy on low skill junior devs and cheap foreign labor of highly variable quality. You caaaaan do that well with infinite planning and QA and project management and test-driven design, but the reason you're trying to do it that way to begin with is you're an under qualified yes-man careerist dipshit trying to come in under budget and time, so you won't. And these are the wages of low wages.

This is somewhat outside the box but as tech becomes easier, a lot of people tend to become weaker at certain tech skills. An example of this is directory management. A lot of folks don't organize their file structures nowadays, relying heavily on the search bar to find everything.

Something I've noticed in places I've worked that aren't small: whoever has talent gets promoted into spending half their time in meetings at best, and at worst into managing teams and working by Outlook.

Agile has poisoned software development to the point where it's fine to ship shit products that can be fixed post-release, which of course gives stakeholders and execs the reasons to tie performance and bonuses to shipping, as opposed to routine stable operations.

I don't know if going back to Waterfall is the right fix, but something has to change. Shipping crap is the new normal. If programmers organize to fight for better wages and conditions, we absolutely must fight to hold management responsible for code quality. Get us additional hours for unit and behavioral testing, assessing and tackling technical debt, and so on.

Apple

I’ve submitted at least 8 bug reports to them since Oct 2023 (and also many suggestions) through their feedback app. No response to any of them so far. The only closed bugs are ones I closed myself because the problem went away in an update.

I’m pretty sure they don’t have any bug triager whatsoever.

I’ll keep doing it out of spite and because it’s what I do for open-source as well, but I’m really not sure if it has any effect at all.

  1. Monopolization. If you have become the standard, there's no reason to improve.

  2. Technological advancement. If the speed of new processors continues to double every year, why bother optimizing your program? This pisses me off so much; games don't look much better but are 4x harder to run compared to 8 years ago.

  3. Cost. Having many programmers and bug testers on payroll to improve your product is expensive. Massive companies are penny-wise, pound-foolish and will hack and slash at their staff lineup until catastrophe strikes (which usually only occurs long after the layoffs).

Everyone else has great points about complexity, but there is an additional issue which is the constant desire for change keeping products from being refined and perfected.

Any product will have small changes that improve it, like reinforcing points of failure specific to that design. Let's take a kitchen knife, the kind chefs use. Some manufacturers have had the exact same model in production for decades, with ever-so-slight variations on angles, handles, and so on as they refined the design. Now they are high quality if they keep the production going - and that is for something that has no moving parts! These knives continue to sell because they are used constantly, can break or be damaged, and new restaurants open all the time, requiring a constant supply of knives.

The home knife market does not have the same pressure for reliability because people don't use their knives all day every day like a chef. Instead, companies are constantly changing designs to sell new versions to the same oversaturated market that prizes form over function. They change the handles slightly, make a change to the blade, and sometimes these changes make the knife worse, but they can slap a 'new and improved' sticker on the label as long as something changed.

The same thing happens with technology, except complex systems need even more refinement while the companies are also trying to change things just to change them in pursuit of the 'new and improved' market. Moving menus around, changing the order of things, and making things look flashy are all side effects of tech being afraid of selling the same thing for an extended period of time, because people want something new and shiny to replace what they had. Time and effort are spent on changing things, and it is hard to do bug fixes while also creating something new that might make a bunch of old bugs obsolete. Oh, and they will also be spending their time trying to patch critical vulnerabilities, because a breach might keep someone from buying their next thing.

So all the effort goes into changing things, often making them worse if they had already stumbled into a useful design, and all the focus on change and vulnerabilities leaves no time to fix usability issues or do the things that would make the product better - because why bother, as long as people are buying? Anything a knowledgeable user notices needs fixing is unlikely to be a priority, because the regular user probably hasn't even noticed, and they are the ones who are going to buy the next version. That is why things like Bluetooth continue to suck: it works well enough to sell more things, and doing it right would take more effort. The handy feature that you used to like being removed? They felt it needed to change just to change, and whoever provided input or feedback came up with this instead.

Oh, and all of this only covers how the available time gets spent. On top of that, they want to spend as little money as possible, so they buy cheap parts from suppliers who also make products just good enough to win customers, at the lowest possible cost to produce.

TLDR: market pressures favor changing things constantly, which introduces more design flaws; capitalist pressures focus effort on revising designs to sell more units and on patching security flaws. As long as it sells, shitty usability doesn't matter, and minor flaws never get fixed.

Most tech sucks because it's closed source. Closed source products are typically made with "the least amount of work done to sell for the most amount of buck". So standards are only sloppily and partially implemented (or sometimes purposefully badly or differently to ensure incompatibility), and bugs after sale won't be fixed because why would they? They already have your money. Middle managers will work hard to ensure more money goes to advertising and marketing than to actual development.

Then there is the embrace, extend, extinguish mentality (hello Microsoft!) to force customers to stick with their shitty products. Microsoft 365 and Teams are perfect examples. The company I work at currently uses them, and it's beyond garbage that is expensive as hell. Not an hour goes by without me being confronted by bad design, bugs, bugs, bugs, so many bugs... And it's all designed to ensure you stay in their little walled garden. I can't change this today, but I'm planning to be rid of it in about a year from now, fingers crossed.

In my experience, open source software is fucking awesome because people build it because they actually want to build something good. Standards are implemented to the letter, bugs get fixed, and it all works and looks great.

Good enough 90% of the time makes 99.9% of the money so why bother making things perfect for the power users?

I think that manufacturers of tech products test their products only with a few standard configurations - but in reality there are too many possible combinations of different configurations:

Take a Bluetooth mouse, for example. Generally, it connects to a computer and it works. Now imagine you have a different configuration: a logic board in your laptop that has not been tested by the manufacturer of the mouse, or an obscure model of Bluetooth receiver that also hasn't been tested to work with that mouse. Your mouse works well in the beginning, but disconnects at random times. You can't pinpoint the issue, and when you look for help online, nobody seems to have the same problems with that mouse.

In this case, said mouse sucks, because it doesn't function reliably. A different person with a different computer configuration (different logic board, different Bluetooth unit) might have no problems at all with the same mouse.
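The combinatorial explosion behind this is easy to see: even a modest hardware matrix multiplies out to far more configurations than any QA team can cover before launch. A rough back-of-the-envelope sketch (all the counts below are hypothetical, not real QA data):

```python
# Hypothetical test-matrix sizes for a Bluetooth mouse.
# Every number here is illustrative, not from any real vendor.
from math import prod

dimensions = {
    "laptop logic boards": 40,
    "Bluetooth receiver chipsets": 15,
    "OS / driver versions": 12,
    "mouse firmware revisions": 6,
}

total = prod(dimensions.values())
print(f"Distinct configurations: {total:,}")  # 40*15*12*6 = 43,200

# If QA can realistically test, say, 50 configurations before launch:
tested = 50
print(f"Coverage: {tested / total:.3%}")  # 0.116%
```

With those made-up numbers, testing 50 setups covers about a tenth of a percent of the possible combinations, which is why "works for everyone in the lab, fails for you at home" is such a common failure mode.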

The problem is money. The incentive is for individuals to make as much money as they can, not for the company to succeed. Company loyalty has been completely blown up by companies themselves, so now not even the CEO gives a fuck; he'll be running another company with a 10% raise this time next year.

This is a topic that could fill a novel for how much there is to consider, but in the end it comes down to resources and companies trying to choose what is best for the company overall. For a company to do anything, it is giving up many other things it could be doing instead. Whether it's limited budgets, limited personnel, or company priorities, every decision made is a tradeoff that means you aren't doing something else.

Most companies prioritize releasing new products so they can start getting revenue as soon as possible. A new product has the largest potential market, and thus makes shareholders happy to see revenue coming in. The sales for a new product are the easiest ones in most products' lifecycles. Additionally, releasing new products helps keep you ahead of competitors. So ongoing maintenance work gets de-prioritized in favor of working on new things.

The goal of testing is to simulate potential use cases of a product and ensure that it will work as expected when the customer has the product in their hands. It is impossible to fully test a product in a finite amount of time, so tests are created that expose flaws within a reasonable search space of the expected uses. If an issue is found then it needs to be evaluated about whether it is worth fixing and when. There are many factors that affect this, for example:

  • How much would it cost to fix?
  • How much time would it take to fix?
  • Does it need to be fixed for launch or can it be a running change?
  • How many customers are actually going to see the issue? Is it just a small annoyance for them or will it cause returns/RMAs?
  • Is it within the expected use case of the product?
  • Can we mitigate it in software/firmware instead of changing hardware?
  • Is it a compliance/regulatory issue?
  • Would this bring in new customers for the product?
  • Was this done a specific way for a reason?

Unfortunately, after considering all of this, the result is often that it isn't worth the effort to fix something. But it is considered.
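In practice, triage along those factors often gets boiled down to a crude priority score per bug. Here is a toy sketch of how such a scoring pass might look; the fields, weights, and example bugs are all made up for illustration, not anyone's real rubric:

```python
# Toy bug-triage scorer. Fields and weights are hypothetical;
# real teams use trackers with their own (usually fuzzier) rubrics.
from dataclasses import dataclass

@dataclass
class Bug:
    name: str
    users_affected_pct: float  # share of customers expected to hit it
    causes_returns: bool       # will it drive returns/RMAs?
    compliance_issue: bool     # regulatory exposure?
    fix_cost_days: int         # engineering estimate to fix

def triage_score(bug: Bug) -> float:
    score = bug.users_affected_pct * 10
    if bug.causes_returns:
        score += 50            # returns cost real money
    if bug.compliance_issue:
        score += 100           # regulatory issues jump the queue
    score -= bug.fix_cost_days # expensive fixes sink in priority
    return score

bugs = [
    Bug("pairing drops after sleep", 5.0, False, False, 20),
    Bug("battery gauge off by 10%", 60.0, False, False, 3),
    Bug("charger overheats", 0.5, True, True, 15),
]

for bug in sorted(bugs, key=triage_score, reverse=True):
    print(f"{triage_score(bug):7.1f}  {bug.name}")
```

Note how the niche usability bug (the pairing drop that power users complain about for years) lands at the bottom, below the widely noticed cheap fix and the compliance issue, which is roughly the dynamic described above.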

Sometimes it's a solution in search of a problem. Usually that'll be some startup that really wants Google (or somebody) to either buy them out or shovel millions of venture capital money at them. VC money that would be better spent on almost anything else: housing homeless people, feeding the hungry, or, hell, just burning it to stay warm.

We tend to forget that all of that is to support people. Tech shouldn’t be an end goal, merely one of the ways to achieve it. And not always the best one at that.

enshittification is based on the ease of moving profits from users to creators, then from creators to shareholders, in a digital service economy, all the while degrading the service for the users and then for the creators as the profit fulcrum shifts.

so enshittification might be a different thing from the reality of manufacturing items in an international environment, which requires design decisions that later need revising because not all materials are available from every supplier in the way a design calls for. and finding people who can assemble things while receiving a wage they can live on, so that the company can still make a profit, requires compromises. and those are just two tiny points, not even counting shipping and workspaces and insurance et cetera.

it is hard, yo. not in a "one part is inconceivable" kind of hard, but in an "it gets complicated pretty quickly" kind of hard.

I feel that the best tech comes out when the economy is doing great: lots of development and creativity.

CEOs wanna cosplay Steve Jobs and unveil their crazy new features. They've already "trimmed the fat" to appease shareholders.

They just can't make fixing old issues sexy.

A bit like Motorola phones killing Messenger even if you tell them not to, or losing photos if you press home too soon after a night shot.

Technology is a lever of power. Whenever you use technology, you are acting from a submissive position, and that will be used to exploit you.