s1k3s 16 days ago

I want to make native apps but Apple and Microsoft seem to be trying really hard to stop me. I have to buy developer accounts, buy certificates for signing binaries, share 30% of my revenue with them for barely any reason and so on. Not to mention the mess they've introduced in their APIs - especially Microsoft. So of course we choose the much simpler, much cheaper way of the web.

  • Aaron2222 15 days ago

    > I have to buy developer accounts

    The Apple Developer Program is only needed for macOS if you want to sign your binaries or distribute through the Mac App Store. And you only have to pay Microsoft if you want to publish to the Microsoft Store (or use Visual Studio if you're a company that has more than 5 Visual Studio users, more than 250 computers, or more than $1 million USD in annual revenue).

    > buy certificates for signing binaries

    Fair (though both Windows and macOS will run apps that haven't been signed, with more warnings of course).

    > share 30% of my revenue with them for barely any reason

    Only if you use their stores (Mac App Store or Microsoft Store), and it looks like the Microsoft Store won't take any cut if you do your own payments and it's not a game.

    • fragmede 15 days ago

      If you don't sign your binaries on macOS, the friction for the user to run your app is prohibitive, outside of developer-focused communities.

      • Aaron2222 15 days ago

        Yes, it definitely adds quite a bit of friction. Though my other points about not needing to pay for the Apple Developer Program unless you want to codesign (at a much lower price than what you pay for a codesigning certificate suitable for signing Windows programs) and not having to pay Apple 30% (or 15%, or anything) on macOS still stand.

        • stale2002 15 days ago

          Your point doesn't really stand, no.

          Your solution solves a made up problem that nobody cares about, and doesn't solve the one that actually matters, which is to successfully make and distribute good software to users.

          Someone shouldn't have to add the fine-print line "Assume I am talking about things that matter, instead of things that don't" to every statement or opinion that they have.

      • mr_toad 15 days ago

        It didn’t take my kids long to learn how to run unsigned binaries, and neither of them are developers.

        • palijer 15 days ago

          I've walked a couple hundred customers (American small business owners) through installing an unsigned macOS application. There was plenty of friction for enough of them to cause us onboarding problems, and for us to invest in doing it the Apple way.

          A lot of that friction was introduced from 2017 onwards, and I think the dialog now says something akin to "this application will hack your computer and is a virus", and you need to click the smaller, hidden "ignore" buttons a few times to do what you want.

        • Pesthuf 15 days ago

          An actual customer won’t like it when you tell them they have to turn off or bypass a security feature to run your software. Not when other software doesn’t need it.

          • jjnoakes 15 days ago

            Once I get a project to "actual customers" I don't mind paying and signing my binaries.

            • ryandrake 15 days ago

              How about "actual users" rather than "actual customers?" We should not normalize this because it eats away at free software. It is totally unreasonable to have to pay the operating system's manufacturer in order for person A to simply distribute software to person B, outside of manufacturer's infrastructure. The manufacturer has nothing to do with that distribution, and has no business "warning" the user about this software.

              • GnarfGnarf 15 days ago

                As much as I hate submitting to Apple and having to get my software notarized, I have to admit that it’s a useful measure to detect and prevent malware. The end user is protected by Apple’s “Good Housekeeping” seal of approval.

                • dTal 14 days ago

                  Funny, I've never once in all my days installed malware from a Linux package manager, and this "seal of approval" doesn't cost me or the developer any money at all.

                  • GnarfGnarf 14 days ago

                    That’s because your computer is a hobby, and mine is a business. My customers use Windows and macOS. They have happily paid for my house, my car and my retirement. :o)

                    • dTal 13 days ago

                      If you want to justify rent-seeking because it helps you pay for your lifestyle, come out and say so in the first place instead of pretending it's for the benefit of your users. But claiming that Linux is a "hobby" on HN is essentially trolling.

              • rblatz 15 days ago

                You don’t have to pay to do that on MacOS, they can bypass the warning saying it’s unsigned and that the developer can’t be positively identified.

                • Aaron2222 15 days ago

                  Apple should really provide free codesigning for free/open source software.

                  • almostnormal 15 days ago

                    (Almost) everyone has an SSL certificate for the web. An OS could check whether software is signed with one, and maybe display a warning only for domain-validated certificates.

                  • stavros 15 days ago

                    What does software being signed signify? Does it mean it's vetted? Can a malware author pay the $X and have their malware signed?

                    • GnarfGnarf 15 days ago

                      No, Apple will detect and suppress malware as part of the vetting process.

                      • stavros 15 days ago

                        Ah, so they do vet? I didn't realize, thanks.

            • samplatt 15 days ago

              This is something that definitely chafes. Even in a large-company enterprise environment, so many worthy & legitimate projects never end up shipping due to financial or office-politics reasons. Putting up paywalls between devs and work that they have to spend both time and money on is bloody stupid.

        • fragmede 15 days ago

          Yet. Everyone knows kids are good at getting around restrictions on computers, whether put there by their parents or otherwise.

        • cultofmetatron 15 days ago

          kids will learn just about anything with the right motivation. adults who you are trying to get to pay you to use your software, on the other hand...

          Well, as someone who runs a few unsigned binaries myself: it's not hard if you know what to do, but Apple makes a big deal about how it's "unsafe", and this freaks non-tech people out.

        • sydbarrett74 15 days ago

          Yes, but your kids have a technical parent, so chances are they both have significantly above-average intelligence.

          • otteromkram 15 days ago

            I technically have two parents. How far up the smarts pole am I?

            • sydbarrett74 15 days ago

              I should've said 'techie' parent. I assume most people knew what I meant.

      • currency 14 days ago

        I answer a support line for users at my institution installing an unsigned application, and almost every macOS support call happens because the option to open an unsigned app is only shown in a normally hidden system setting.

      • bitwize 15 days ago

        And if you don't sign your binaries on Windows, Windows Defender will assume they're malware and silently delete them.

        • AnonymousPlanet 15 days ago

          That statement is just not true. We don't sign our software and we never had that happen with any customer. It neither happened to any unsigned software on any of my own machines, in spite of running Defender on them.

        • Repulsion9513 15 days ago

          Nah, much more common that "SmartScreen" will assume they're malware and throw up a big warning prompt (which the user will say "can't be bypassed" because they didn't click "More info").

        • thombat 15 days ago

          Nope. Or at least, never happened to me. This comment section is starting to read like a "Bad Times" virus warning

          https://web.archive.org/web/20060925013545/http://www.making...

          • thombat 15 days ago

            And having re-read "Bad Times" for the first time in years, the "screw-up your VHS tracking" is a testament to its age.

            • bitwize 15 days ago

              "...translate your documents into Swahili, make your TV record Gigli, neuter your pets and give your laundry static cling."

              https://www.youtube.com/watch?v=zvfD5rnkTws

              Seriously, though, I've had the Windows Defender thing happen to freshly compiled binaries I made. The only way to prevent it from happening is to sign your binaries, or submit them individually to Microsoft using your Microsoft account for malware analysis.

              It flagged the binary as being some sort of trojan (which name I looked up and found that it was a Windows Defender designation for "I don't know the provenance of this binary so I'm going to assume it's bad") and quarantined it.

      • wredue 15 days ago

        I’m not sure what is “prohibitive” about pressing literally one button.

        • fragmede 15 days ago

          There are a bunch of words around the button. If you read them, they make you not want to press the button.

        • dimask 15 days ago

          If it is a company laptop, it can be impossible (unless you sort of hack it to circumvent the security settings they put in place).

        • alpaca128 15 days ago

          It's often not just one button. It's a button, then opening the settings, manually navigating to the right section, clicking Open Anyway and then entering your password.

        • n8henrie 15 days ago

          On macOS? One button?

  • jgord 15 days ago

    One of the reasons I moved to JavaScript web development after many years as a C/C++ dev, and after the hell of making iPhone apps for Apple's App Store: you don't have to get a licence, get approval, or make an installer if you ship a web 'app'.

    • sumnole 15 days ago

      With the added bonus of being available to users across platforms.

    • wredue 15 days ago

      You don’t have to do any of that for native apps either.

      What on earth is happening in this comment section?

      • wsc981 15 days ago

        I think on macOS it's kind of a requirement, even if you ship outside of the App Store, to be trusted by consumers. The app needs to be signed by Apple in order to start without a warning, and I think that in order for Apple to perform the signing, you'd need a developer account.

        I might be wrong here as I have been focused pretty much only on mobile, so feel free to correct me.

        • thegrim33 15 days ago

          True. Apple devices are a lost cause for me, I don't even consider supporting them in my software. It doesn't even come up as an option in my head, I forget it exists. I'd never willfully have anything to do with their ecosystem, whether desktop or mobile. I wonder if eventually people like me refusing to support them will actually make a difference and force them to change, or if enough people will just continue to bow down to them and do what it takes to be on their devices that they can just keep their horrible practices going.

        • wredue 13 days ago

          It’s a warning once that stops happening when you approve it. It’s not a big deal. Also happens on windows.

      • hoseja 15 days ago

        People aren't ignoring Apple hard enough. Why Americans bother with it is beyond me.

        • Nevermark 15 days ago

          I find it inexplicable when people respond to a particular problem with a suggestion on which large platform/ecosystem someone should use instead, or avoid.

          Switching ecosystems is nowhere near that trivial.

          Ecosystem choices are dependent on content and tool investments, other devices owned, product groups, integrated technologies, network effects between people, between companies, customer relationships, existing phone payments, existing ecosystem familiarity and skills, on and on.

          As for developers, they often need to be on the top 2-3 platforms to be a serious choice for customers.

          Nothing wrong with highlighting different pros and cons of different ecosystems.

          But a suggestion to switch ecosystems, without a very deep understanding of someone's particular situation, just isn't helpful advice.

          • vilunov 15 days ago

            I'd go further and state that "ecosystems" are evil as they erode competition. It should be easy to change products independently of each other, e.g. I should be free to choose between Apple iCloud or Google Photos for storing my photo library. Instead I'm forced to experience what you already mentioned: integration preferences on different platforms, network effects and so on.

            Only direct product properties should drive users' choices, everything else just raises the market entry barrier for potential competitors.

            • JohnFen 15 days ago

              "Ecosystems" certainly are a real problem, although I think calling them "evil" is a bit far. What they are is a way for companies to create an artificial moat, and artificial moats are very bad things.

      • fingerlocks 15 days ago

        Yeah, the only actual hurdle from Apple is the measly 8 bucks a month for a developer account. I would happily pay ten times that amount just to avoid the node_modules dumpster fire

        • worthless-trash 15 days ago

          And the hardware around it, that needs to be updated and managed.

  • StressedDev 15 days ago

    To be blunt, you do not have to create a developer account, sign binaries, or share 30% of your revenue with Microsoft. MS's APIs are not a mess in my opinion. You do have several options (traditional Win32, .NET, UWP, etc.). These options all work fairly well and are very flexible.

    As for Apple, I do not know, but I suspect you can make Mac applications without a developer account. You need a developer account for iPhone. It's $99 a year the last time I looked. This is not a lot of money if you are serious about making an application.

    • sobellian 15 days ago

      If you don't sign your Windows installer, then the first N users to run it will get a scary pop-up message saying that the AV "protected your PC." I think you might also need to do code-signing if you distribute through the MS Store.

      Compare with the web where LetsEncrypt just works without demanding a king's ransom.

      As for the APIs, it is very easy to get into dependency hell between all the different UI technologies, .NET implementations, and target systems. Want to develop a brand new plain-old GUI app? Probably simple (although I've never tried, the web is right there). Need to develop a plugin for an existing application, or a new app for something like Hololens? Have fun.

      • viraptor 15 days ago

        It's a bit worse with Windows. You can get a scary warning, or you can get SmartScreened to death and the app will be prevented from starting. This is random / depends on functionality, and is effectively impossible to test with 100% certainty.

    • krageon 15 days ago

      > This is not a lot of money

      It is a lot of money when you consider it should be free and serves exactly no purpose.

      • siva7 14 days ago

        It serves a purpose: Keeping those out who can't afford it. They just can't say it out loud for obvious reasons.

    • KomoD 14 days ago

      > This is not a lot of money if you are serious about making an application.

      Maybe not for you.

  • zer00eyz 16 days ago

    Ok

    Set up CC processing on the web:

    How much are you going to pay stripe? 2.9% + 30¢ ... that means you have to charge 10 bucks to get down to a 6% transaction fee. Quite the price floor and an interesting cap on your pricing model!

    What does managing chargebacks cost you? The moment you're taking money, you're going to hire in customer service, or spend time dealing with CS yourself. What happens when you get a chargeback, or do a refund? Most of the time you lose money (processing fees, etc.).

    If you're under a million bucks a year, apple is 15%. If you're building a low-price app or a value-add app, odds are that apple is going to be a far better deal for you than doing it on your own.
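A quick sketch of the fee arithmetic in the comment above, using the quoted Stripe-style pricing of 2.9% + 30¢ per card charge (illustrative only; actual rates vary by country and product):

```python
# Effective fee fraction for a card charge of `price` dollars under
# Stripe-style pricing: a percentage plus a fixed per-transaction fee.
def effective_fee(price, pct=0.029, fixed=0.30):
    return (pct * price + fixed) / price

print(round(effective_fee(10.00), 3))  # 0.059 -> ~6% at $10, as stated above
print(round(effective_fee(5.00), 3))   # 0.089 -> ~9% at $5
print(round(effective_fee(1.00), 3))   # 0.329 -> the fixed fee dominates small charges
```

The fixed 30¢ is what creates the "price floor": the cheaper the item, the larger the share the processor takes.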

    • tomhallett 16 days ago

      $10 = 6% fee; $5 = 8% fee. Both of which are far better than apple’s fees, so that point is a bit confusing.

      Chargebacks = customer support. I agree with that, but if you have a B2C business which has any non-trivial revenue (OP is talking about word doc apps, so we’re obviously not talking about indie $2 side project apps), then you would already have CS anyway. I fully understand there is an opportunity cost with any service and where those costs get realized, but your examples don’t seem like a slam dunk in apple’s favor.

      • zer00eyz 15 days ago

        >> then you would already have CS anyway

        Would you? Because I would argue that CC processing is the point where you NEED near-real-time CS. Before that, handling customer issues can be done better through forums, and you're going to get a lot of self-service support from those.

        >> (OP is talking about word doc apps, so we’re obviously not talking about indie $2 side project apps)

        You're competing with free: LibreOffice, Zoho Writer (shockingly popular)... I would not know how to price the product to compete... 2 bucks a month as a trial? Would I pay 10 bucks a year if you were great? If you got said productivity app past 100k users, getting to a million isn't a stretch (you have velocity and popularity).

        Unless you're doing something really slimy, you're going to be able to get a better rate out of apple if you ask your rep.

        • jokethrowaway 15 days ago

          5% of my support has to do with payments and it's all about refunds.

          Everybody pays for stuff online

    • littlestymaar 15 days ago

      Even when using Stripe (which is a premium payment service that's more expensive than most options) you'd be better off than paying the 15% to Apple, as long as you sell for more than about $2.50. And that's not even counting the up-front costs that come with Apple (the subscription plus the need to buy a Mac).

      How are chargebacks managed by Apple? I doubt they are swallowing the cost on their side, so I don't really see the difference from what you'd get with a bank: you're losing the money anyway.
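The "$2.50" figure above can be checked by solving for the price at which a Stripe-style per-transaction cut (2.9% + 30¢, assumed here) equals Apple's 15% small-business rate:

```python
# Solve 0.029 * p + 0.30 = 0.15 * p for the break-even price p.
breakeven = 0.30 / (0.15 - 0.029)
print(round(breakeven, 2))  # 2.48 -> the percentage-plus-fixed fee is cheaper above ~$2.50
```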

      • zer00eyz 15 days ago

        At 5 bucks a customer, you need 200k new ones a year to break a million bucks.

        To break even with Apple, you have about 80k a year in all-in costs to deal with all your refunds and chargebacks... after taxes, insurance and overhead that's 40-60k take home for a CS agent.

        What is the chargeback rate on digital goods? I'm going to tell you that if you're a small player it will be WAY higher than Apple's. Apple will cut a consumer off if they have a high refund rate; your CS agent will have no such insight.

        5-10% of your charges will just turn into refunds. Is that a process where you're killing license keys? Oh, did you forget you now have infrastructure to run to issue and maintain said keys? What is that going to cost you? Don't want to run like that... well, OK, then expect your return rate to go even higher. That discount CC processor is going to look at your refund and chargeback rate and jack your fees up sky high (because that's the name of the game).

        Once you get past a million bucks the open question is "do I do enough business to negotiate with apple". In the case of a dry, business-oriented app that has enough popularity to make that much, you might see Apple willing to negotiate with you much sooner than a game dev who has sneaky buy options and huge chargeback rates.

        • jokethrowaway 15 days ago

          Chargebacks are a pain, but are not that frequent. You need to make an easy-to-discover way to refund your product, because customers face no penalty for going straight to a chargeback.

          You can use chargeback protection on Stripe, or use a different payment provider that absorbs the $15 chargeback fee.

        • littlestymaar 14 days ago

          > At 5 bucks a customer, you need 200k new ones a year to break a million bucks.

          But at $5 per user Apple is already much more expensive below the million threshold. It gets worse after a million, but it's already costing you tens of thousands before that. And again, you are comparing with one of the most expensive options on the market!

          > after taxes, insurance and overhead that's 40-60k take home for a CS agent

          Which, almost anywhere in the world, is more than you need to hire someone full time to work on your customer support! And no, what Apple provides is definitely not superior to a full time consumer support person.

          The “value” that you pay for when dealing with Apple is access to their walled-off user base.

          > the open question is "do I do enough business to negotiate with apple

          This isn't an “open question”, it's a closed one: Apple isn't going to talk to you unless they think not giving you special treatment would get them antitrust issues. In your case or mine, it's not gonna happen.

    • SiVal 15 days ago

      Does Apple charge 15% for each dollar up to a million plus 30% for each dollar above a million, or when you cross a million (in a year), do they suddenly jump to 30% of everything? IOW, if I have earned $999,999 so far this year, I have to pay Apple about $150,000. If I then make another $1 sale, do I owe a few cents more or another $150,000?

      And once your rate goes to 30%, does it stay there the following year, or does the whole system reset to zero each year?

      • zer00eyz 15 days ago

        15 percent on the first million in a year, 30% for everything after.

        Subscriptions are 15% for renewals (and maybe for all subs).

        If you're pulling in more than a few million a year from Apple, and you're not "gaming" or gaming the system, I hear they are fairly open to negotiating. YMMV
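Treating the rates above as marginal tiers also answers the question upthread about what an extra $1 sale costs. This is a sketch of the structure as described in this comment only; the real Small Business Program has eligibility rules (e.g. year-to-year resets) not modeled here:

```python
# Apple's cut under the tiered structure described above: 15% on the
# first $1M of a year's proceeds, 30% on everything past the threshold.
def apple_cut(annual_proceeds, threshold=1_000_000):
    below = min(annual_proceeds, threshold)
    above = max(annual_proceeds - threshold, 0)
    return 0.15 * below + 0.30 * above

print(round(apple_cut(999_999), 2))    # 149999.85 -> ~15% of everything
print(round(apple_cut(1_000_001), 2))  # 150000.3  -> the marginal $1 costs 30 cents, not $150k
```

Under this reading the rate is marginal, not retroactive: crossing the $1M line never triggers a sudden extra $150,000.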

    • palijer 15 days ago

      Dealing with Apple is a tax as well though.

      How do you calculate a price for not being able to release your main product? Usually without clear indications of what exact interpretation of a rule you are breaking...

      We've had delays of a week because of things like mentioning "Android" in an integration setting that had been there for years.

    • 6510 15 days ago

      The cheapest in my country is 7 cent per transaction, the most popular is 25 cents. We also don't do claw-backs.

    • paulddraper 15 days ago

      > apple is going to be a far better deal

      ?

      Your math seems to show the exact opposite.

    • exe34 16 days ago

      Do any of these problems go away when you sell in the walled garden?

  • jcelerier 16 days ago

    whenever I do native (native as in, compiled without going through some bytecode / VM / interpreter ...) apps for mac / windows / linux I don't have to do any of this, I just use Qt

    • 01HNNWZ0MV43FF 16 days ago

      But then you have to use c++ or Python, and figure out a good way to ship 10 dlls

      • jcelerier 16 days ago

        I ship apps that statically link against Qt, but even if I didn't it's not like "shipping DLLs" wasn't a solved problem two decades ago

      • BearOso 16 days ago

        You can static-link in all of Qt. Just build Qt yourself. It can strip out all the things you don't need, even symbols from the libraries you do use, so your binary isn't going to be that big.

        • delduca 15 days ago

          I do not think that is possible for a commercial application; you have to pay for Qt’s license.

          • Maxatar 15 days ago

            You can statically link Qt in compliance with the LGPL. The LGPL only requires that users are able to substitute the LGPL'd portion of an application with an alternative and compatible implementation.

            Using a shared object/DLL is the traditional way of doing so, but you can also accomplish this by providing the object files for your application to allow users to link their own substitutions statically.

            The FSF explicitly permits this as documented here:

            https://www.gnu.org/licenses/gpl-faq.en.html#LGPLStaticVsDyn...

          • Arnt 15 days ago

            Nah. But do it.

            You just have to open the source of the part of your program which depends on Qt. It's not a real problem. But get a commercial license anyway; the cost is small compared to the other costs of developing your program, and you want to be friends with them.

            (There's someone on HN who lives on a single-line modification of an open source program. Trust me, source availability of the source code of your client app won't really make a difference.)

            • ale42 15 days ago

              > There's someone on HN who lives on a single-line modification of an open source program

              Now I want to know more about this :-)

              • Arnt 15 days ago

                He's a nice guy. If you want your company to buy his product, you send your boss a link to the product's home page (which doesn't say "open source") and tell your boss that this product is great. Your boss looks at the pricing and description, and says ok.

    • turrini 15 days ago

      I do as well. I program everything in C++ with Qt 6 (commercial license), compile statically where convenient, and use a single code base for all platforms (mobile, desktop, web). I handle the responsiveness of interfaces, DPI, and other micro-adjustments directly in a simple QML template.

    • GnarfGnarf 15 days ago

      I use Qt, and I have to get my binary notarized by Apple.

    • bobajeff 15 days ago

      What about code signing? Won't people who run your programs want to do so without the OS claiming it will harm their computer?

  • cosmotic 15 days ago

    Devil's advocate: you now get a lot of tooling for 'free', which used to cost hundreds or thousands of dollars.

    • _aavaa_ 15 days ago

      Counter offer: the tooling is ultimately for their benefit. They need the apps to make their platform what it is.

      • eviks 15 days ago

        They ultimately need money, not apps or platforms, so this is exactly how they achieve that ultimate benefit; no top-level logic will just justify free here

        • fauigerzigerk 15 days ago

          >no top-level logic will just justify free here

          Not true. The technical term for "ultimately need money" is discounted future cash flow. It is impossible to know for sure what price you have to charge for any particular item at any given time in order to optimise for this metric.

          Realistically, the answer depends on the state of competition between platforms. We all know what that state is.

          • eviks 15 days ago

            > It is impossible to know

            so it is true, you can't provide any top-level logic to justify 0, you need some facts

            • fauigerzigerk 15 days ago

              If "top level logic" is supposed to mean "analytic statements" then you are right. The optimal price cannot be derived analytically.

              As this is such a pointlessly contrived interpretation of the term "logic" in this context, I chose to use a different one: Is there a set of empirical circumstances under which an optimisation algorithm could conclude that the optimal price is zero? The answer to that is clearly yes.

              • eviks 15 days ago

                Top level logic is supposed to mean the logic the comment I'm responding to uses to justify free. You know, "in this context". And I'm not talking about "optimal", just a single price of 0

                Now, what exactly is the point of you insisting on your wrong interpretation?

        • _aavaa_ 15 days ago

          I mean by that logic, no top level logic will justify a cut of all revenue.

          They want to be feudal lords, requiring us to pay a tithing for the privilege of selling something to customers.

          • eviks 14 days ago

            you're right, no top level logic would help you settle on any specific price, hence you need to engage with actual reality instead of simply noticing one party that benefits while ignoring the other that also "ultimately benefits" and "needs the platform to run apps on".

            You, of course, want to be their feudal lord and get access to all their customers by right while also requiring them to pay a tithing of their hardware sales to you since you advance "their ultimate benefit" (they wouldn't sell any hardware without software)

            • _aavaa_ 14 days ago

              > You, of course, want to be their feudal lord and get access to all their customers by right

              If someone buys an iPhone, Apple does not have the right to interpose themselves between that person and what they want to do with the iPhone they bought. They have no right to a cut of the sales, any more than the power company that provided the electrons to charge the battery.

              What I want is for Apple to get out of the way.

              > while also requiring them to pay a tithing of their hardware sales to you since you advance

              What I want them to do, for a start, is the same thing on iOS that they already do on macOS: there I can already write a piece of software and sell it without forking 30% over to Apple.

              The current situation where they feel entitled to a cut of every software sale that happens on ios, and veto power over it, is a wet dream that even Microsoft in the 90s wouldn't have thought they'd get away with.

    • ChuckMcM 15 days ago

      Yeah that doesn't quite work. I agree that the cost of tooling has gone to nearly zero in most cases, but not giving it away will limit the people willing to even try to develop code for your platform.

      • cosmotic 14 days ago

        Microsoft charged ~$1000 a seat for Visual Studio and at the same time they had an effective monopoly eventually leading to United States v. Microsoft Corp.

    • pjc50 15 days ago

      But for things like Apple notarization, you don't get the choice of not using the tooling. Besides, that transition already happened with the popularization of Free/Open software, somewhere in the early 2000s.

    • modeless 15 days ago

      The problem with this argument is that the tools for proprietary platforms are inferior to the cross-platform ones in many cases. VSCode is better than XCode or Android Studio. GCC and Clang are better than MSVC. We don't need platform lock-in to subsidize good tools because the best tools are unencumbered.

      I'd happily build iOS apps without XCode or any of Apple's frameworks to save the 30% fee. Heck, I'd do it even if I still had to pay the 30%, I hate being forced to use XCode.

      • fingerlocks 15 days ago

        I don’t use Xcode, I develop in vim. You can run the deploy/signing step on the cli without launching Xcode.

      • rTX5CMRXIfFG 15 days ago

        > VSCode is better than XCode or Android Studio

        That’s just your opinion though isn’t it

        • scotty79 15 days ago

          That's just an opinion. Not only his. It's shared by many.

          • HPsquared 15 days ago

            "This is my opinion. There are many like it, but this one is mine.

            My opinion is my best friend. It is my life. I must master it as I must master my life.

            Without me, my opinion is useless. Without my opinion, I am useless."

    • varispeed 15 days ago

      A one-off few hundred or few thousand dollars is nothing in comparison to a 30% tax.

      That said, I don't know about Mac, but you can build apps using free tools, maybe not in as convenient a way, but you certainly can.

      I remember, because I was someone who couldn't afford a Visual Studio licence and had to make do with GNU tools.

      The greed of these companies put me off developing anything.

      • dhosek 15 days ago

        In the early 90s, you could expect to pay anywhere from $200–1000 for a good C/C++ compiler. Now it’s free. The 30% tax, as many people have already pointed out, applies only if you sell through the store. Back in the 90s, when downloading off the net wasn’t really a thing yet, selling software meant giving up 40–60% of the retail price to the channel, and out of what was left you paid for manufacturing of the product, so you’d maybe keep 20–40% of the retail price.

        Which leaves the certificate thing. While it’s an annoyance, as a software user it’s also nice to know that a program I’m running is the program it claims to be, without much friction on my part. And the cost can’t be that prohibitive, since I don’t remember the last time I ended up with an unsigned binary on my Mac, even for free software like TeX and friends or Aquamacs.
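        To put rough numbers on that comparison, here is a back-of-the-envelope sketch. The percentages are the ballpark figures from the comment above, and the manufacturing cost is an assumed illustrative value, not real data:

        ```python
        # Back-of-the-envelope: developer take-home per $100 of sales,
        # 1990s retail vs. today's app stores. Percentages are the rough
        # figures from the comment; manufacturing cost is assumed.

        retail_price = 100.0

        # 1990s boxed software: the channel took 40-60% of retail, and
        # manufacturing (disks, boxes, manuals) came out of the remainder.
        channel_cut = 0.50            # midpoint of the 40-60% range
        manufacturing = 20.0          # assumed cost per unit
        boxed_take = retail_price * (1 - channel_cut) - manufacturing

        # App store today: flat 30% cut (15% under small-business programs).
        store_take_30 = retail_price * (1 - 0.30)
        store_take_15 = retail_price * (1 - 0.15)

        # Roughly 30, 70, and 85 dollars per $100 of sales, respectively.
        print(boxed_take, store_take_30, store_take_15)
        ```

        Even at the full 30% cut, the developer's share comes out above the 20–40% of retail typical of the boxed-software era, which is the comment's point.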

        • saagarjha 15 days ago

          It’s free because of free software, not because of platforms that used to sell these tools.

        • eviks 15 days ago

          > and the cost can’t be that prohibitive since I don’t remember the last time I ended up with an unsigned binary on my Mac, even for free software like TeX and friends or Aquamacs.

          Ok, so your app tastes aren't that varied then (or maybe it's the memory); plenty of devs of various little utilities don't bother paying

      • astrange 15 days ago

        It's 15% for almost all developers, not 30%.

        • sojournerc 15 days ago

          A distinction without a difference

          • astrange 15 days ago

            It's a 50% difference.

            • dcow 15 days ago

              It’s a 100% difference (;
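              For what it's worth, both figures are the same arithmetic viewed from different baselines. A quick sketch (illustrative only; the take-home line is the part that matters in practice):

              ```python
              # Whether 15% vs. 30% is a "50%" or a "100%" difference depends
              # on which number you take as the baseline.
              big_cut, small_cut = 0.30, 0.15

              drop_vs_30 = (big_cut - small_cut) / big_cut    # ~0.5: "50% lower fee"
              jump_vs_15 = (big_cut - small_cut) / small_cut  # ~1.0: "100% higher fee"

              # What matters to a developer: take-home goes from 70 to 85
              # cents per dollar of sales, i.e. about 21% more revenue.
              extra_revenue = (1 - small_cut) / (1 - big_cut) - 1

              print(round(drop_vs_30, 2), round(jump_vs_15, 2), round(extra_revenue, 2))
              ```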

      • samplatt 15 days ago

        ICT departments in many large companies force dev teams to use certain tools, because it's what's on their list of 'approved tools for devs'. Getting new tools onto this list is often stonewalled, usually for office-politics reasons.

        Sometimes devs are locked into the tools they use. This situation is shit, but not uncommon.

    • nox101 15 days ago

      On which platform? On Apple's, the cost is part of the premium you pay for the device. The cheapest Mac is $599; the cheapest Windows machine is $199, or even $99. So arguably some of that $400-500 is for the extra software. Or compare against Linux, where you could get a machine for $25.

      • threeseed 15 days ago

        You don't have to buy a brand new Mac.

        Older M1 devices which are still very fast are available for much cheaper.

        • TheCapeGreek 15 days ago

          "Much cheaper" is still very relative. I got a second hand 8GB M1 MBP last year for $900, as is the standard price in my region. The cheapest M2 Air brand new retails for ~$1200. Meanwhile I've just ordered a new non-mac machine with up to 5GHz boost and 32GB RAM for a whopping... $1000, including extended warranty and delivery fee.

  • xvilka 15 days ago

    Apple certificates are cheap compared to Microsoft's. To get rid of UAC on Windows you have to buy a certificate for thousands of dollars.

    • pquki4 15 days ago

      I don't think you "get rid of" UAC; you just put the author's name on the screen instead of "unknown publisher". (And why do you need elevated privileges? Most applications don't.) Unless you are referring to SmartScreen, which is a very different thing, although quite similar from a user's perspective.

    • kkarakk 15 days ago

      yeah and consequently users have been trained to ignore UAC warnings or just disable them as irritants.

  • tppiotrowski 15 days ago

    > share 30% of my revenue with them for barely any reason

    Does the App Store collect sales tax and remit it on your behalf? If it does, then I think it's worth it; otherwise, registering in both the EU and UK ($0 tax threshold) as well as 50 US states (once you hit the allowed threshold) will take you a long time.

    • d357r0y3r 15 days ago

      30% cut for handling taxes? That's wild.

      • tppiotrowski 15 days ago

        You'd think so until you look into doing it yourself. It's more work than building a simple app.

        • eviks 15 days ago

          And your thinking would reverse again to the common-sense baseline when you realize that alternative providers outside of locked systems don't charge 30%

          • tppiotrowski 15 days ago

            Can you name a provider? I personally use Stripe Tax for my business and while they will calculate the taxes you owe in each municipality it is totally on you to create an account with each country/state's Department of Revenue and fill out a form quarterly to submit your payments.

            This paperwork is what I believe a marketplace like the App Store or Amazon do for you under their own entity that you have to do yourself if you bypass their stores.

            Please correct me if I'm wrong.

            • piyushag 14 days ago

              You could consider using Galvix, which will connect to your billing system (including Stripe), fetch all the invoices, and automatically prepare and file sales tax returns in each of the US states where you are legally required to file, all while giving you full control and visibility over the process. Galvix charges a flat fee of $75 per filed return, which can be considerably more affordable than paying a revenue share to other platforms (if tax compliance is the only reason you want to use them).

              Disclosure: I am a co-founder of Galvix.

            • eviks 15 days ago

              What's wrong with your Amazon example? They don't charge as much (and that's partially because, while they're big and dominant, they're still not as big/dominant in that market)

      • threeseed 15 days ago

        It's 15% for most developers. And it's for a lot more than just handling taxes.

    • pc86 15 days ago

      If you live in the US the only entity you need to collect sales tax for is the state you live in. Despite what they may say you are under no legal obligation to collect sales tax for the other 49 states, nor the EU or UK.

      • bdw5204 15 days ago

        I'm pretty sure that was changed by South Dakota v. Wayfair[0]. Most states seem to require you to collect the tax only if you have 200 shipments into the state or $100k in revenue, because going after a small-time out-of-state e-commerce business over a few dollars of tax probably wouldn't be worth it, but a large firm in Delaware refusing to collect tax on shipments into California would probably be hearing from California's government.

        If you're shipping overseas, you can probably ignore foreign taxes if you don't have a business nexus there. Especially if you have no desire to ever visit those countries. Basically just leave it up to your customers to pay whatever tax they owe.

        [0]: https://en.wikipedia.org/wiki/South_Dakota_v._Wayfair,_Inc.

        • tppiotrowski 15 days ago

          > Every company selling goods and services to European customers needs to collect value-added tax (VAT), even if their business is not established in Europe.

          https://stripe.com/guides/introduction-to-eu-vat-and-vat-oss

          • bdw5204 15 days ago

            > Enforcement of judgments issued by foreign courts in the United States is governed by the laws of the states. Enforcement cannot be accomplished by means of letters rogatory in the United States. Under U.S. law, an individual seeking to enforce a foreign judgment, decree or order in this country must file suit before a competent court. The court will determine whether to recognize and enforce the foreign judgment.

            Obviously, it's not a good idea to bet your business on the courts not enforcing an EU fine when you can just add the VAT and the cost of the handling hassle to the price for EU customers.

            https://travel.state.gov/content/travel/en/legal/travel-lega...

            • TheCapeGreek 15 days ago

              The operating idea from governments is that in the digital age, when you sell something to a customer abroad, you're selling to them on their turf and not yours. That's why you're considered liable for sales tax in the first place. Doesn't matter that your own country of residence may or may not care. For all intents and purposes it's as if you physically flew to the country and hand-delivered your software/product to your customer.

              It's clearly an awful "patch" to outdated concepts on how commerce works compared to pre-internet, but it's what we have right now.

            • tppiotrowski 15 days ago

              > Obviously, its not a good idea to bet your business on the courts not enforcing an EU fine

              Right. Plus it might hinder your ability to travel freely in those jurisdictions as well which I'd like to avoid.

  • silverquiet 16 days ago

    I work at a place that ships an app to both Apple and Microsoft desktops (we could even do Linux if there was ever any demand for it). We use this old thing called Java, which still seems to work. I don't develop it though, so I guess I don't have to worry about too much of my resume getting caught up with unfashionable languages (let's face the facts about what most tech these days is trying to advance: promotions, not the state of the art).

    • Maxatar 15 days ago

      Java apps are not native on either macOS or Windows.

      • foresto 15 days ago

        Nor Linux.

        The only Java desktop app I've ever used (on any platform) without frustration was Slay the Spire, and it only passes because it's a game and doesn't require desktop integration of any kind.

        • astrange 15 days ago

          I hear Minecraft is popular.

          I use JDownloader sometimes, it's totally fine. Weka is bad, but not worse than other academic apps.

        • dexwiz 15 days ago

          Slay the Spire is built using libGDX, which provides a lot of cross-platform support on top of Java. For platforms like the Switch without JVM support, it probably ships an ahead-of-time-compiled version without a JIT.

        • patrick451 15 days ago

          Matlab is a Java app. I used it on both Windows and Linux without any complaints.

    • colecut 15 days ago

      OP is obviously talking about mobile apps.

      • Sardtok 15 days ago

        Yeah, that Microsoft App Store on mobile is a b**h.

        • pjc50 15 days ago

          It's a bit of a shame they abandoned it with Windows Phone, but you can see why developers don't want three completely different mobile targets.

  • api 16 days ago

    The native world also refuses to create a standard UI API, making everyone use either Qt or Electron, because, sorry, writing it over again for each platform is a hard “no.” Not even big companies do that anymore.

    • ryandrake 15 days ago

      Yes. Not only are they refusing (and have been for decades) to create a standard UI API, they are 1. actively making their own UI APIs as different as possible from one another, even down to requiring different programming languages to use them, and 2. killing things that they once supported which eased cross-platform development (both major platforms walking away from OpenGL in favor of their incompatible native APIs).

    • AnonymousPlanet 15 days ago

      Not only that. There are people who go to great lengths to make sure that native apps don't work properly across desktop environments even on the same OS. They also call out anyone who dares to complain about it.

    • jwells89 15 days ago

      Why would platform maintainers want to encourage the lowest common denominator apps that such an API would undoubtedly result in (as a standardized UI API by definition cannot leverage platform strengths)?

      Apps like that get made anyway but as it stands at least there’s a healthy crop of smaller/indie native alternatives which often best the behemoths in UI/UX. That would likely disappear with the addition of a standardized UI API, as it would probably also come with the abandonment of the old specialized APIs.

      • api 15 days ago

        They are already doing that. Everyone uses Electron now. A good common API would lead to much better results.

      • Klonoar 15 days ago

        Platform maintainers (Apple, Microsoft, etc) already do this by being on web standards panels. ;P

  • rini17 16 days ago

    You can't use Qt?

    • injuly 16 days ago

      Qt licensing is its own mess. For commercial software, the pricing is $350-500 per developer, per month. Seriously [1]. The company that now owns the framework doesn't seem to acknowledge the gap between big enterprises and solo developers/smaller teams.

      [1] Yes, one can use Qt for commercial software without buying a license (as long as it is dynamically linked), but their marketing does everything it can to hide that fact. Also, the newer additions to Qt do not fall in this category – for those, you have to pay.

      • turrini 15 days ago

        Mess?

        Here are the most commonly used options:

        - Go LGPL. Sure, you will need to ship binaries and libs, but there are tools within the SDK that do this automatically for you (windeployqt, macdeployqt, etc.). And as others have stated, it is a problem that was solved years ago.

        - Go Commercial to link statically. If you are a single developer, there is an annual license available for $499 (up to $100k yearly revenue).

        • dexwiz 15 days ago

          It always shocks me how much developers complain about Qt licensing. For any other business, an expense that small for so much value seems trivial. Without a decent UI, software is a terrible experience for most users.

      • pjmlp 16 days ago

        Imagine that, having to pay for the tools one has to use for their work, what an abuse.

        • Maxatar 15 days ago

          Having to pay a monthly fee in perpetuity in order to distribute an application is absolutely egregious.

          • readyman 15 days ago

            The fee is for selling someone else's software. I personally despise capitalism, but your complaint about it is among the least convincing ones I have ever heard.

      • Veserv 15 days ago

        That is $4,200-6,000/yr. In the US, a junior developer at a software company costs (all-inclusive, not just salary) around $150,000-200,000/yr. That is 2-4% of yearly cost on tooling. That is not very much.

        It might not be worth the price, but that is hardly ridiculous. It is quite believable to get a 4% productivity improvement from appropriate tooling. You need to do a cost-benefit analysis to determine the answer to that question.

        • guappa 15 days ago

          No they'd rather spend weeks to reimplement scrolling.

          • mardifoufs 15 days ago

            Lol, scrolling in Qt is worse than on the web. I mean, you can use normal scrolling super easily on both (you don't have to do anything, and it just works). But truly custom scrolling is much harder in Qt than on the web. In a way that's a good thing, but again, the default is just as easy on the web as it is in Qt. Plus you don't have to deal with the QtQuick/QtWidgets/etc. split and the non-open-source parts of Qt

            • guappa 15 days ago

              I have to use for work a software that is implemented in electron.

              I think less than 1% of the users use it on mobile, but it's designed as a mobile interface.

              To scroll you need to click and drag, or you need to click 5px buttons. Regular mouse scroll doesn't work.

            • rini17 15 days ago

              Because subverting users' expectations about scrolling is the step 0 of efficient software. /s

              • mardifoufs 15 days ago

                That's why I said it might be a good thing. My main point was that standard scrolling is just as easy on the web. But even if you don't want standard scroll behavior, it's still easier. There's nothing easier to do in Qt than on the web. Compare a QGraphicsView or a Qt 3D canvas to a WebGL canvas and again, it's fighting against the framework versus stuff just working. Now sure, Qt is much better for tons of other stuff, but I just found it weird that the comment I was replying to mentioned wasting time on customizing stuff as the downside of web apps, as if it's not a much more difficult task in Qt.

                • guappa 8 days ago

                  You remind me of when Microsoft was claiming that bash was hard and, as an example, wrote some crazy obfuscated bash scripts rather than just doing them the sane way.

                  If you're doing a GUI, you have no reason to be drawing on a canvas manually.

                  • mardifoufs 7 days ago

                    What? Even with Qt you often have to use a painter and draw what you want, more or less. You also need a canvas to display anything visualization-related. In any case it doesn't matter; as I said, scrolling is just as easy on the web as it is in Qt. My point was more general: if you want to do anything custom, it's easier in JS than in Qt, even using the multiple tools Qt offers to customize the view (the painters, canvases, 3D widgets, etc.)

                    • guappa 7 days ago

                      You're just showing me you've never done as much as a hello world in Qt. Which is completely fine, but don't paint yourself as knowing what you're talking about.

                      • mardifoufs a day ago

                        That isn't true. Are you really saying that Qt is easier to customize than a plain JS/HTML UI? Seriously?

    • a1o 16 days ago

      You will still be in binary-signing hell, and Windows Defender may wake up one day and decide your app is a virus "when it does X", which is exactly its business case. Complaining to MS will do nothing, since their online checker will scan it and find nothing. Boom, entire software business gone for reasons out of your control. It doesn't care about your signed certificate either.

      • hsbauauvhabzb 15 days ago

        I’ve always been curious whether this counts as defamation, espionage, or antitrust.

      • xet7 15 days ago

        That's why the only way to deliver software is to provide a URL for login. MS desktops are usually too locked down to install anything.

  • mattl 16 days ago

    You don’t have to do any of that for a native Mac app. Signing it is a good idea but not required, and you can distribute it from your own website or even from GitHub/GitLab, where you can tell people it’s not notarized and they’ll need to Control-click and choose Open the first time.

    • cmiles74 16 days ago

      In my opinion, this will become harder and harder to do with every release of Windows and macOS. I wouldn't count on the average customer of these vendors being willing to shop outside their platform's app store forever.

      • rblatz 15 days ago

        Does a sizable portion of people shop for apps in the Microsoft or MacOS App Store? I was under the impression that neither are very popular.

    • fragmede 15 days ago

      > tell people it’s not notarized and they’ll need to command click and open it the first time.

      That's not realistic for Apple users who are used to ergonomic software. It's not technically required to notarize, but practically speaking, it is.

      • mattl 15 days ago

        It’s really only practical for dev tools or niche open source desktop apps.

    • EGreg 16 days ago

      The reason that Apple and Microsoft require all this is also that native apps have a lot more access to the system.

      • Nextgrid 15 days ago

        This doesn't matter. Notarization doesn't do anything against a dedicated attacker willing to commit illegal acts.

        Notarization is supposed to deter malware through a combination of static/dynamic analysis and by attaching some real-world legal entity to any signed binary, so law enforcement can follow up if malicious activity happens.

        Analysis is not bulletproof and can be worked around.

        The legal entity requirement is also trivial to nullify. At least in the UK, the company registration authority charges a nominal fee (payable by credit card - stolen if necessary) and puts you on the company register. Dun & Bradstreet scrapes that and that's how you get the DUNS number necessary to register for an Apple dev account. All of this is trivial to get through if you don't mind breaking the law and making up a few fake documents and providing a stolen CC (and assuming you're already planning to break the law by distributing malware, this is not a problem).

        Finally, even if the "legal entity" bit was bulletproof, law enforcement just doesn't give a shit about the vast majority of online crime anyway.

        All of these requirements are just a way to lock down access to the walled garden and to put as many roadblocks as possible in front of laymen trying to make their own software (in favor of big corps), all masquerading as security theatre.

        • mike_hearn 15 days ago

          Notarization does do things against attackers, yes.

          Firstly, stolen CCs tend to get reported especially if you make a big purchase. If you use a stolen CC to buy a developer certificate then it's going to get revoked the moment the real owner notices, and then your apps will be killed remotely by Apple before they've even been detected as malicious.

          Still, the big win of notarization is that Apple can track down variants of your malware once it's identified and take them all out simultaneously. They keep copies of every program running on a Mac, so they can do clustering analysis server side. On Windows there's no equivalent of notarization, but the same task is necessary because otherwise malware authors can just spin endless minor variants that escape hash based detection, so virus scanners have to try and heuristically identify variants client side. This is not only a horrific resource burn but also requires the signatures to be pushed out to the clients where malware authors can observe them and immediately figure out how they're being spotted. Notarization is a far more effective approach. It's like the shift from Thunderbird doing spam filtering all on its own using hard-coded rules, to Gmail style server side spam filtering.
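          The hash-versus-variants point is easy to illustrate. A toy sketch (not any vendor's actual mechanism; the payload bytes are made up):

          ```python
          import hashlib

          # A toy "signature database" that blocks known-bad files by exact digest.
          original = b"...malicious payload..."
          variant = bytearray(original)
          variant[0] ^= 0x01            # flip one bit: a trivially generated "variant"

          blocklist = {hashlib.sha256(original).hexdigest()}

          # The original is caught; the one-bit variant sails straight through,
          # which is why exact-hash blocklists need endless per-variant updates.
          print(hashlib.sha256(original).hexdigest() in blocklist)        # True
          print(hashlib.sha256(bytes(variant)).hexdigest() in blocklist)  # False
          ```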

          > All of these requirements are just a way to lock down access to the walled garden

          I've been hearing this for over a decade now. In the beginning I believed it, but it's been a long time and Apple have never made macOS a walled garden like iOS is. There's no sign they're going to do it either. After all, at least some people have to be able to write new apps!

          • worthless-trash 15 days ago

            > They keep copies of every program running on a Mac, so they can do clustering analysis server side.

            Are you sure about this? I did not give Apple permission to keep a copy of the software that I am writing.

            • mike_hearn 15 days ago

              Yes you did, if you have notarized your app:

              https://developer.apple.com/support/terms/apple-developer-pr...

              Section 5.3: "By uploading Your Application to Apple for this digital notary service, You agree that Apple may perform such security checks on Your Application for purposes of detecting malware or other harmful or suspicious code or components, and You agree that Apple may retain and use Your Application for subsequent security checks for the same purposes."

              • worthless-trash 3 days ago

                I did not notarize my app, so it doesn't have a copy of my program, which I believe disproves "every".

            • saagarjha 15 days ago

              Isn’t that something you agree to when you notarize?

  • matheusmoreira 15 days ago

    Just stop making apps for Apple, Microsoft, Google platforms. Truth is everything except Linux is just somebody else's digital fiefdom where we developers are but serfs and the users are even lower. It's either Linux or the web.

  • blackeyeblitzar 15 days ago

    We need freedom and privacy oriented general computing hardware and software again. Not these locked down operating systems tied to one walled garden.

  • j45 15 days ago

    Choosing the web is great.

    Choosing overly complex web frameworks is still a guilty pleasure of too many projects.

  • tempodox 15 days ago

    > we choose the much simpler, much cheaper way of the web.

    Once the beancounters at the rent-seeking companies (Apple, Microsoft, …) have figured out that web development is where all the money is, this will change rapidly. Google has already started gatekeeping the web via Chrome.

  • raincole 15 days ago

    Either I'm having a reading comprehension issue, or it's surprisingly unclear which certificate you need to buy to publish an app on the Microsoft Store and what the minimum cost is.

    Considering the whole point of having Windows is to use apps, I'd expect them to make the process super smooth.

  • timeon 15 days ago

    How do you host your apps?

  • vb6sp6 15 days ago

    I'm not sharing any revenue with Microsoft for my desktop apps :)

  • Rochus 15 days ago

    [flagged]

    • KittenInABox 15 days ago

      Unfortunately, yes, having one's personal information accessible to large, private companies really doesn't matter to most people. The only people I know who really care about this stuff are tech people, stalking victims, and victims of domestic abuse. [Admittedly, awareness is growing among women trying to get abortions, but they're also a minority, shamed into silence most of the time.] This isn't going to change until there are real, public, personal stakes for the majority of people.

    • hn_version_0023 15 days ago

      The alternative that exists today, that I can buy, and on which all the apps I need for work will actually exist and function correctly, is called…

      • Rochus 15 days ago

        > all the apps I need for work

        The whole thing is like an intentional vicious circle. People buy the systems because certain applications are available on them (or rather because that's what everyone does), and the application manufacturers support the systems where the most customers are expected. But if one takes an impartial look at which applications or functions are really needed for a company, there are certainly alternatives.

        Unfortunately, the open source community sabotages itself, e.g. by constantly changing the ABI of essential functions and thus undermining the portability of non-open source applications (see e.g. https://news.ycombinator.com/item?id=32471624).

        • Rochus 15 days ago

          I find it very regrettable that now also on HN the flagging function is being misused more and more often to suppress other, but completely legitimate views. It is obvious that the majority of people are unaware of this problem or marginalize it, but that does not make it any less critical.

          My statement was: Apparently, people prefer to buy expensive devices that eavesdrop and patronize them. As long as this continues and people don't run away from these manufacturers, they will continue with the trend and patronize people even more.

      • LAC-Tech 15 days ago

        I run desktop linux and I've had no issue with joining zoom or teams calls.

      • worksonmine 15 days ago

        The hardware is difficult but people are working on it. If you really want all firmware to be open old Thinkpads are popular but I've never tried it myself. And Linux/*BSD should be your OS. I've been using Linux for over a decade and don't miss anything.

        If your work mandates something you can't solve with Linux the issue is with your work and you should push to change that.

      • antiframe 15 days ago

        GrapheneOS for mobile. Any Linux distro on desktop.

      • bojan 15 days ago

        Fairphone 4.

        • toastal 15 days ago

          Sustainable brands don’t remove their headphone jacks then start selling wireless headphones that require more batteries & rely on firmware updates

        • bobim 15 days ago

          With which OS, to have both privacy and banking apps running in a jailed, signed container? (Genuine question)

    • dan-robertson 15 days ago

      I think your comment would be better if it would at least humour the idea that people might have legitimate reasons for their preferences, even if they don’t match yours.

    • guappa 15 days ago

      In several countries you can't file your taxes or access your bank without a google/apple smartphone. People need to live too.

      • Rochus 15 days ago

        Such a state would not only be very antisocial (just think of the many elderly and disabled people, not to mention the less well-off, who are unable to use such small screens and controls), but it would also have to face the question of why it is so interested in forcing such a device on every citizen.

      • threeseed 15 days ago

        What country only lets you file your taxes and do banking via a smartphone?

        It's always been an app in addition to a website.

        • guappa 15 days ago

          In Sweden you can't log in to A LOT of stuff without a smartphone.

          Including filing taxes, getting a COVID certificate, and applying to rent an apartment.

          • Rochus 15 days ago

            This seems very unlikely to me, as Sweden is known to be one of the most social countries in Europe, and such a requirement would not only discriminate against the less well-off, but also against the elderly and disabled. It would be very surprising if a majority could be found for such a regulation in Sweden.

            • guappa 8 days ago

              Sweden has one of the highest wealth disparities in Europe, and it's increasing.

              Also, as a person living in Sweden, let me tell you that marketing yourself as inclusive is not the same as being inclusive. Spending money so disabled people can actually get on trains, or checking that accessibility laws are respected (they aren't), are not things that happen in Sweden.

              Just last month I encountered a broken elevator at a train station. Which means no taking the train if you're in a wheelchair (and good luck getting a refund). Even worse, if you actually were on the train, you're now stuck on the platform and can't leave until the next train shows up. Of course, to buy the ticket for the next train you will need a smartphone.

  • greenthrow 16 days ago

    "Barely any reason"... except they created and maintain the entire plarform and tooling that you're building on. And in Apple's case they give it away for free with any mac.

    I'm old enough to remember when buying development tooling for DOS or Windows was $$$$$$

    • cmiles74 16 days ago

      Apple started giving away the development environment because they had such an anemic software ecosystem. They had a handful of OpenSTEP developers and a larger crowd of die-hard Mac people, the successful ones mostly moving away from the platform.

      Today Apple takes a percentage of every dollar made by application developers who participate in its App Store, and it is making that increasingly difficult to avoid with every release. IMHO, they are making far more dollars today than they ever did selling development hardware and SDK licenses.

    • cma 16 days ago

      They had a $100 yearly dev fee for ios.

      • mattl 16 days ago

        Only if you want to distribute via the App Store. There’s also TestFlight and distribution of source code I believe if you want to avoid that.

        • internetter 16 days ago

          Both of these are completely false. TestFlight distribution without a developer license is impossible. Asking users to compile the app themselves is infeasible, as the Xcode toolchain is upwards of 18 GB and they would be required to recompile it once every week to keep it on their device. The developer fee is unavoidable, even with EU intervention.

          • Nextgrid 15 days ago

            Even signing for your own device (if you manage to get your users to do this) requires an Apple ID in good standing.

          • mattl 13 days ago

            My own poor wording. I'm talking about avoiding the App Store, not avoiding the license.

        • cma 16 days ago

          Sounds much less generous than OP.

    • glass-z13 16 days ago

      Can i borrow your compiler for a few days?

claytonwramsey 16 days ago

There is perhaps some irony in the fact that this blog was posted to Medium, which serves 10.88 MB for a 265-word article.

  • apantel 15 days ago

    The ads are the real content from Medium’s perspective. The article is actually the medium by which the real content is delivered, like a train carrying dark passengers. The article is not what Medium cares about delivering to your browser, but the ads. And delivering the ads requires a lot of complexity.

    • Animats 15 days ago

      The article is an ad: "*** provides uptime monitoring and flow-based monitoring for APIs."

      This is an important subject, thus it's one for which clickbait is generated.

      Size is a problem. I look at my Rust compiles scroll by, and wonder "why is that in there?". I managed to get tokio out, which took some effort. The whole "zbus" system was pulled in because the program asks if the user is in "dark mode". That brought in the "event-listener" system.
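
      (A sketch of how to chase that kind of pull-in down; assumes a Cargo project, and `zbus` is just the crate from the example above:)

      ```shell
      # Show the inverse dependency tree: which crates pull zbus in?
      cargo tree -i zbus

      # List duplicate crate versions bloating the build:
      cargo tree -d
      ```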

      Lately, "bash" in a Linux console has become much slower about echoing characters. Did someone stick in spell check, or an LLM for autocomplete, or something?

      • boustrophedon 15 days ago

        I'm not sure if it's related, but I have the git branch in my PS1 and I've noticed that it's much slower to show a new prompt when inside very large repositories now, and I don't think that was the case previously.

      • fragmede 15 days ago

        I'd check your .bashrc because that shouldn't be happening without your say so.

        • Animats 15 days ago

          .bashrc hasn't changed since 2021. But Ubuntu pushed a new /usr/bin/bash in mid-March.

          • NoGravitas 15 days ago

            It's more likely to be system-provided config files like bash-completions.
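
            A rough way to check where the time goes (standard bash flags; a sketch, not specific to any distro's build):

            ```shell
            # Time a full interactive startup vs. one that skips all rc files:
            time bash -i -c exit
            time bash --norc --noprofile -i -c exit

            # If the gap is large, trace what actually runs at startup:
            bash -lixc exit 2>&1 | head -n 50
            ```

            If the stripped-down start is fast, the slowdown is in the rc/completion files rather than the bash binary itself.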

            • Animats 13 days ago

              It's Ubuntu, with their automatic updates. Canonical is now at least as intrusive as Microsoft was in the days of Windows 7.

    • collinrapp 9 days ago

      …what ads? See the last paragraph here [0].

      Obviously your statement is true about most other sites, but I thought it was an odd thing to say about a platform that famously doesn’t serve ads.

      [0] https://medium.com/about

  • Aloisius 15 days ago

    Firefox about:processes reports the article taking 239 MB of memory and 0.06-0.2% of my CPU ten minutes after it finished loading - 45% of the CPU time seems to be spent in Google's reCAPTCHA.

    I wish Mozilla or Google or someone aggregated statistics for cpu/memory/energy usage by domain to shame devs who clearly don't otherwise care.

  • zer00eyz 16 days ago

    And browsers are larger than some operating systems. And talk about a closed-off ecosystem ... WASM is still crippled and JS/HTML/CSS is your only real viable option for web development.

    The web feels like 2005 again. Only thing is, this time the popups are embedded in the page...

    • vnuge 15 days ago

      I think I would prefer 2005 web again. I'd probably be able to see more of the internet. I use heavy DNS filtering, no javascript on untrusted sites, no cookies, no fonts, VPN and so on. With cloudflare blocking me I basically can't see the majority of websites.

      • snoman 15 days ago

        Oh don’t worry. Once dns-over-https becomes standard, you won’t be able to do any dns filtering anymore.

        • vnuge 15 days ago

          Why not? I can still MITM DoH now. I try to use DoH for everything I have. I did recently switch to self-hosted recursive resolution.

        • prmoustache 15 days ago

          I don't know about other browsers, but in Firefox I can decide which DNS server is used for DNS over HTTPS.

  • anthk 15 days ago

    For that I fire up a Gemini browser against gemini://gemi.dev/bin/waffle.cgi and paste the URL. For non Gemini network users, just change medium.com to scribe.rip at the URL.

  • kmstout 15 days ago

    It's fine in a text-mode browser.

stephc_int13 16 days ago

My opinion is that yes, we lost our way, and the reason is very simple: because we could. It was the path of least resistance, so we took it.

Software has been freeriding on hardware improvements for a few decades, especially on web and desktop apps.

Moore's law has been a blessing and a curse.

The software you use today was written by people who learned their craft while this free-ride was still fully ongoing.

  • alerighi 16 days ago

    The thing that makes me crazy is that the things we do on computers are basically the same each year, yet software gets heavier and heavier. For example, back in 2010 a Linux distribution with a DE, freshly started, consumed 100 MB of RAM; an optimized version, 60 MB. I remember it perfectly. I had 2 GB of RAM and didn't even have a swap partition.

    Now, just a decade later, a computer with less than 8 GB of RAM is unusable, and one with 8 GB is barely usable. Every new piece of software uses Electron and consumes roughly 1 GB of RAM minimum! Browsers consume a ton of RAM; basically everything consumes an absurd amount of memory.

    Not to mention Windows; I don't even know how people can use it. Every time I help my mother, her computer is so slow, and we're talking about a recent PC with an i5 and 8 GB of RAM. It takes ages to start up, software takes ages to launch, and it takes an hour if you need to do updates. How can people use these systems and not complain? I would throw my computer out of the window if it takes more than a minute to boot up, even Windows 98 was faster!

    • flenserboy 15 days ago

      Think also about all the finished stand-alone applications which have been discarded because of replacement APIs, or because they were written in assembly. We had near-perfect (limited feature-wise from a 3-decade view, of course) word processors, spreadsheets, and single-user databases in the late 80s/early 90s which were, aside from many specific use-case additions, complete & only in need of regular maintenance & quality-of-life updates, were there a way to keep them current. They were in many cases of far better quality & documentation than almost any similar application you can get your hands on today; so many work-years done in parallel, repeated, & lost. If it weren't for software sourcing & document interchange issues, it would be tempting to do all my actual office-style work on a virtual mid-90s system & move things over to the host system when printing or sending data.

      Addition: consider also how few resources these applications used, & how they, if they were able to run natively on contemporary systems, would have minuscule system demands compared to their present equivalents with only somewhat less capability.

      • anonzzzies 15 days ago

        > limited feature-wise from a 3-decade view

        Outside gaming, AI and big data, aka things my parents for instance don't use at all, what's limited feature-wise? Browsers, sure; however my father prefers Teletext and newsgroups and Viditel (doesn't exist anymore but he mentions it quite a lot) over ad-infested, slow-as-pudding websites. Email hasn't changed since the 90s. Word processors changed, but not in the features most people use (I still miss WP; it was just better imho; I went over to LaTeX because I find Word a complete horror show, and that hasn't changed). Spreadsheets are used by pros and amateurs alike, mostly as a database for making lists; nothing new there. You can go on and on; put an average user behind an 80s/90s PC (arguably after the Win95 release; DOS was an issue for many and 3.1 was horrible; or Mac OS) and they will barely notice the difference. Except for the above list of AI, big data, gaming and, most importantly, browsers. AI is mostly an API so that can be fixed (I saw a C64 OpenAI chat somewhere), big data is used by a very small % of humanity, and gaming, well, depends what you like. I personally hate 3D games; I like 80s shmups, and most people who game are on mobile playing cwazy diamonds or whatnot, which I could implement on an 8-bit MSX machine from the early 80s. Of course the massive multiplayer open-world 3D stuff doesn't work.

        Anyway; as I said here before when responding to what software/hardware to use for one's parents: whenever someone asks me to revive their computer, I install Debian with the i3 wm, Dillo and FF as browsers, LibreOffice and Thunderbird. It takes a few hours to get used to, but people (who are not in IT or any other computer-savvy job) are flabbergasted by the speed, low latency and battery life. I did an x220 (with 9-cell) install last week, from Win XP to the above setup; battery life jumped from 3 to 12 hours and everything is fast.

        I install about 50 of those for people in my town throughout the year; people think they depend on certain software, but they usually really don't. If they do, most things people ask for now work quite well under Wine. I have a simple script which starts an easy 'Home Screen' on i3 with massive buttons for their favourite apps, which open on another screen (1 full-screen app per screen); people keep asking why Microsoft doesn't do that instead of those annoying windows…

      • sydbarrett74 15 days ago

        Your sentiment is probably shared by many dusting off old systems and going back to first principles. SerenityOS is one example.

      • hobs 15 days ago

        It's because a lot of it is fashion: doesn't matter if you have an old working shirt, you need a new shirt.

    • paulryanrogers 15 days ago

      Windows 98 was often running on fragmented disks. I recall it taking minutes before I could do useful work. And having multiple apps open at once was more rare. While possible it often ended in crashes or unusable slowness.

      • whatevaa 15 days ago

        Experienced the same; it was faster not to multitask and to do one thing at a time. You would think launching 2 tasks would take 2x the time with the same resources, but it felt more like 3-4x. Disks were 1 GB back then. I blame it on disk seek times and less advanced IO scheduling.

    • wvenable 15 days ago

      > The thing that makes me crazy is that the things we do on computers are basically the same each year

      I think that is some kind of fallacy. We are doing the same things but the quality of those things is vastly different. I collect vintage computers and I think you'd be surprised how limited we were while doing the same things. I wouldn't want to go back.

      Although I will say your experience with Windows is different from mine. On all my machines, regardless of specs, startup is fast to the point where I don't even think about it.

    • jimmaswell 15 days ago

      I have a Macintosh Plus, SE, 7200, and iMac G3 (System 6, 6, 7, 9) that I've been using for fun lately after fixing many of them up. Even with real SCSI hard drives in the SE, 7200, and iMac, they're such a joy to use compared to a modern OS. Often much more responsive, the UI is always more consistent, not to mention the better aesthetics. They really don't make software like they used to. A web browser or OS should not be slow on any modern hardware, but here we are.

      • NoGravitas 15 days ago

        System 7 runs so fast in BasiliskII on an old Atom netbook. I recently saw a video showing System 6 running in an emulator on an ESP32 microcontroller on an expansion card in an Apple II. It was substantially faster than the Mac Plus it was emulating. It really takes seeing this kind of thing to understand the magnitude of the problem.

    • VelesDude 15 days ago

      My daily runner is a T400 laptop with 4 GB of RAM on a fairly slim Linux distro. But in the last 6-12 months it is starting to feel a little tight when it comes to anything involving web browsing. Even things like Thunderbird are getting very bulky in keeping up with web rendering standards.

      I pulled down an audiobook player the other day; once all dependencies were met, it needed 1.3 GB to function! At least VLC is still slim.

      • prmoustache 15 days ago

        I think there is a case for starting to boycott overly heavy websites.

        There are some useful resources: https://greycoder.com/a-list-of-text-only-new-sites/

        There are also some tricks to have a lighter web browsing by default:

        - try websites with netsurf, links or w3m first

        - use a local web-to-Gemini proxy to browse many websites with a lightweight Gemini browser.

        And you can go a long way by using an adblocker and/or disabling javascript by default using an extension with a toggle.

    • tomsmeding 15 days ago

      Not discounting your lament about memory use, this caught my eye:

      > I would throw my computer out of the window if it takes more than a minute to boot up, even Windows 98 was faster!

      Sure, Windows has grown a lot in size (as have other OSes). But startup is typically bounded by disk random access, not compute power or memory (granted, I don't use Windows; if 8 GB is not enough to boot the OS then things are much worse than I thought). Have you tried putting an SSD in that thing?

      (And yes, I realise the irony of saying "just buy more expensive hardware". But SSDs are actually really cheap these days.)

      • ponector 15 days ago

        But it is true. My laptop with Windows, an i7, NVMe and 32 GB of RAM now feels the same as my old laptop with an i7, an SSD and 16 GB did 7 years ago.

        Bloat ware everywhere, especially browsers.

        • fuzzfactor 15 days ago

          A brand-new mid-range business PC is not as snappy as they were brand new 20 years ago with XP.

          And that was on an IDE HDD, with memory speed, processor speed and core counts a fraction of today's, and 512 MB of graphics memory or less.

        • noahtallen 15 days ago

          This whole thread needs a huge amount of salt and some empirical examples. I think if you compared side-by-side it'd be different. I remember my upgrade from a 2019 MacBook to an M1, when every single task felt about 50% faster. Or swapping a Windows laptop's HDD for an SSD. (Absolutely massive performance improvement!) Waiting forever for older Windows computers to boot, update, index or search files, install software, launch programs, etc. Waiting ages for an older iMac to render an iMovie timeline.

          Others in the thread are talking about the heyday of older spreadsheet and document programs that were just as fast. So? I bet you could write a book on the new features and more advanced tools that MS Excel offers today compared to 1995.

          We went from things taking minutes to taking seconds. So you could improve things by 50% and that could be VERY noticeable. (1min to 30s, for example.) If your app already launches in 500ms, 250ms is not going to make your laptop feel 2x faster even if it is. On top of that, since speed has been good enough for general computing for several years now, new laptops focus more on energy efficiency. I bet that new laptop has meaningfully better battery and thermal performance!

          • NoGravitas 15 days ago

            > I bet you could write a book on the new features and more advanced tools that MS Excel offers today compared to 1995.

            I'm sure you could, but it would be of interest to a relatively small audience. Excel 95 would be fine for about 90% of Excel users.

          • ponector 14 days ago

            How advanced is Excel now compared with the 2016 version?

            A new expensive laptop has the same "fast" feeling, which fades with new iterations of software. Browsers take an insane amount of CPU and memory but aren't faster.

            Maybe some intense CPU tasks like zipping a folder are faster than ever, but I'm not zipping all day. Slack, however, behaves like there is server-side remote rendering for each screen...

          • FeepingCreature 15 days ago

            If you keep your software up to date, every hardware upgrade will feel like a significant improvement. But you're comparing the end of one hardware cycle to the beginning of the next. You regain by upgrading what you previously lost to gradual bloat.

      • anonymoushn 15 days ago

        I think Windows taking 1 minute on SSDs is typical, and it takes like 40 if you want to use a spinning magnet

        • xboxnolifes 15 days ago

          Most of my Windows PC's boot time happens before the computer even starts loading the OS. If I enabled fast boot in the BIOS, I'm pretty sure my PC would boot in around 15 seconds.

    • test6554 15 days ago

      Back in my day websites didn't have "dark mode" and we liked it. We didn't trust the compiler to do our optimizations in the snow (both ways). etc.

      • timeon 15 days ago

        Back in my day there was only "dark mode" and we liked it.

    • sydbarrett74 15 days ago

      Blame surveillance capitalism for a lot of this. All those hundreds (thousands?) of trackers running simultaneously add up.

  • vnuge 15 days ago

    > It was the path of least resistance, so we took it.

    Well said. I believe many of the "hard" issues in software were not "solved" but worked around. IMO containers are a perfect example: polyglot application distribution was not solved, it was bypassed with container engines. There are tools to work AROUND this issue - I ship build scripts that install compilers and tools on users' machines if they want - but that can't be tested well, so containers it is. Redbean and Cosmopolitan libc are the closest I have seen to "solving" this issue.

    It's also a matter of competition: if I want users to deploy my apps easily and reliably, container it is. Then boom, there goes 100 MB+ of disk space, plus the container engine.

    • mike_hearn 15 days ago

      It's very platform specific. MacOS has had "containers" since switching to NeXTStep with OS X in 2001. An .app bundle is essentially a container from the software distribution PoV. Windows was late to the party but they have it now with the MSIX system.

      It's really only Linux where you have to ship a complete copy of the OS (sans kernel) to even reliably boot up a web server. A lot of that is due to coordination problems. Linux is UNIX with extra bits, and UNIX wasn't really designed with software distribution in mind, so it's never moved beyond that legacy. A Docker-style container is a natural approach in such an environment.

      • skydhash 15 days ago

        Is it? I'm using LXC containers, but that's mostly because I don't want to run VMs on my devices (not enough cores). I've noted down the steps to configure them so that if I ever have to redo it I can write a shell script. I don't see the coordination problem if you choose one distro as your base and then provision with shell scripts or Ansible. Shipping a container instead of a build is the same as building Electron apps instead of desktop apps: optimizing for developer time instead of user resources.

        • mike_hearn 15 days ago

          > if you choose one distro as your base

          Yes obviously if you control the whole stack then you don't really need containers. If you're distributing software that is intended to run on Linux and not RHEL/Ubuntu/whatever then you can't rely on the userspace or packaging formats, so that's when people go to containers.

          And of course if part of your infrastructure is on containers, then there's value in consistency, so people go all the way. It introduces a lot of other problems but you can see why it happens.

          Back in around 2005 I wasted a few years of my youth trying to get the Linux community on-board with multi-distro thinking and unified software installation formats. It was called autopackage and developers liked it. It wasn't the same as Docker, it did focus on trying to reuse dependencies from the base system because static linking was badly supported and the kernel didn't have the necessary features to do containers properly back then. Distro makers hated it though, and back then the Linux community was way more ideological than it is today. Most desktops ran Windows, MacOS was a weird upstart thing with a nice GUI that nobody used and nobody was going to use, most servers ran big iron UNIX still. The community was mostly made up of true believers who had convinced themselves (wrongly) that the way the Linux distro landscape had evolved was a competitive advantage and would lead to inevitable victory for GNU style freedom. I tried to convince them that nobody wanted to target Debian or Red Hat, they wanted to target Linux, but people just told me static linking was evil, Linux was just a kernel and I was an idiot.

          Yeah, well, funny how that worked out. Now most software ships upstream, targets Linux-the-kernel and just ships a whole "statically linked" app-specific distro with itself. And nobody really cares anymore. The community became dominated by people who don't care about Linux, it's just a substrate and they just want their stuff to work, so they standardized on Docker. The fight went out of the true believers who pushed against such trends.

          This is a common pattern when people complain about egregious waste in computing. Look closely and you'll find the waste often has a sort of ideological basis to it. Some powerful group of people became subsidized so they could remain committed to a set of technical ideas regardless of the needs of the user base. Eventually people find a way to hack around them, but in an uncoordinated, undesigned and mostly unfunded fashion. The result is a very MVP set of technologies.

      • titzer 15 days ago

        > A lot of that is due to coordination problems.

        The dumpster fire at the bottom of that is libc and the C ABI. Practically everything is built around the assumption that software will be distributed as source code and configured and recompiled on the target machine because ABI compatibility and laying out the filesystem so that .so's could even be found in the right spot was too hard.

        • fch42 15 days ago

          To quote Wolfgang Pauli, this is not just not right, it's not even wrong ...

          The "C ABI" and libc are a rather stable part of Linux. Changing the behaviour of system calls? Linus himself will be after you. And libc interfaces, for the largest part, "are" UNIX - it's what IEEE 1003.1 defines. While Linux's glibc extends that, it doesn't break it. That's not least what symbol versions are for, and glibc is a huge user of those. So that ... things don't break.

          Now "all else on top" ... how ELF works (to some definition of "works"), the fact that stuff like Gnome/Gtk loves to make each rev incompatible with the previous, that "higher" Linux standards (LSB) don't care that much about backwards compat - all true.

          That, though, isn't the fault of either the "C ABI" or libc.

          • mike_hearn 15 days ago

            Things do break sadly, all the time, because the GNU symbol versioning scheme is badly designed, badly documented and has extremely poor usability. I've been doing this stuff for over 20 years now [1] [2], and over that time period have had to help people resolve mysterious errors caused by this stuff over and over and over again.

            Good platforms allow you to build on newer versions whilst targeting older versions. Developers often run newer platform releases than their users, because they want to develop software that optionally uses newer features, because they're power users who like to upgrade, they need toolchain fixes or security patches or many other reasons. So devs need a "--release 12" type flag that lets them say, compile my software so it can run on platform release 12 and verify it will run.

            On any platform designed by people who know what they're doing (literally all of the others) this is possible and easy. On Linux it is nearly impossible because the entire user land just does not care about supporting this feature. You can, technically, force the GNU ld to pick a symbol version that isn't the latest, but:

            • How to do this is documented only in the middle of a dusty ld manual nobody has ever read.

            • It has to be done on a per symbol basis. You can't just say "target glibc 2.25"

            • What versions exist for each symbol isn't documented. You have to discover that using nm.

            • What changes happened between each symbol isn't documented, not even in the glibc source code. The header, for example, may in theory no longer match older versions of the symbols (although in practice they usually do).

            • What versions of glibc are used by each version of each distribution, isn't documented.

            • Weak linking barely works on Linux, it can only be done at the level of whole libraries whereas what you need is symbol level weak linking. Note that Darwin gets this right.
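
            (For the curious, the per-symbol dance looks roughly like this; a sketch assuming an x86_64 glibc system, with -fno-builtin so gcc emits a real memcpy call instead of inlining it, and a libc path that varies by distro:)

            ```shell
            # What versions of memcpy does glibc export? (use nm to discover them)
            nm --dynamic --with-symbol-versions /lib/x86_64-linux-gnu/libc.so.6 | grep ' memcpy@'

            # Pin one symbol to an old version via a .symver directive:
            cat > old_memcpy.c <<'EOF'
            __asm__(".symver memcpy, memcpy@GLIBC_2.2.5");
            #include <string.h>
            int main(void) { char d[4]; memcpy(d, "hi", 3); return d[0] != 'h'; }
            EOF
            gcc -fno-builtin old_memcpy.c -o old_memcpy && ./old_memcpy

            # Confirm the binary now references the pinned version:
            readelf -sW old_memcpy | grep 'memcpy@GLIBC_2.2.5'
            ```

            And you repeat that for every versioned symbol you touch, which is exactly the per-symbol pain described above.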

            And then it used to be that the problems would repeat at higher levels of the stack, e.g. compiling against the headers for newer versions of GTK2 would helpfully give your binary silent dependencies on new versions of the library, even if you thought you didn't use any features from it. Of course everyone gave up on desktop Linux long ago so that hardly matters now. The only parts of the Linux userland that still matter are the C library and a few other low-level libs like OpenSSL (sometimes, depending on your language). Even those are going away. A lot of apps now are statically linked against musl. Go apps make syscalls directly. Increasingly the only API that matters is the Linux syscall API: it's stable in practice and not only in theory, and it's designed to let you fail gracefully if you try to use new features on an old kernel.

            The result is this kind of disconnect: people say "the user land is unstable, I can't make it work" and then people who have presumably never tried to distribute software to Linux users themselves step in to say, well technically it does work. No, it has never worked, not well enough for people to trust it.

            [1] Here's a guide to writing shared libraries for Linux that I wrote in 2004: https://plan99.net/~mike/writing-shared-libraries.html which apparently some people still use!

            [2] Here's a script that used to help people compile binaries that worked on older GNU userspaces: https://github.com/DeaDBeeF-Player/apbuild

            • mattpallissard 13 days ago

              > How to do this is documented only in the middle of a dusty ld manual nobody has ever read.

              This got an audible laugh out of me.

              > Good platforms allow you to build on newer versions whilst targeting older versions.

              I haven't been doing this for 20 years (13), but I've written a fair amount of C. This, among other things, is what made me start dabbling with zig.

                ~  gcc -o foo foo.c
                ~  du -sh foo
                16K foo
                ~  readelf -sW foo | grep 'GLIBC' | sort -h
                     1: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND __libc_start_main@GLIBC_2.34 (2)
                     3: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND puts@GLIBC_2.2.5 (3)
                     6: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND __libc_start_main@GLIBC_2.34
                     6: 0000000000000000     0 FUNC    WEAK   DEFAULT  UND __cxa_finalize@GLIBC_2.2.5 (3)
                     9: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND puts@GLIBC_2.2.5
                    22: 0000000000000000     0 FUNC    WEAK   DEFAULT  UND __cxa_finalize@GLIBC_2.2.5
                ~  ldd foo                                 
                  linux-vdso.so.1 (0x00007ffc1cbac000)
                  libc.so.6 => /usr/lib/libc.so.6 (0x00007f9c3a849000)
                  /lib64/ld-linux-x86-64.so.2 => /usr/lib64/ld-linux-x86-64.so.2 (0x00007f9c3aa72000)
              
              
                ~  zig cc -target x86_64-linux-gnu.2.5 foo.c -o foo
                ~  du -sh foo
                8.0K  foo
                ~  readelf -sW foo | grep 'GLIBC' | sort -h        
                     1: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND __libc_start_main@GLIBC_2.2.5 (2)
                     3: 0000000000000000     0 FUNC    GLOBAL DEFAULT  UND printf@GLIBC_2.2.5 (2)
                ~  ldd foo                                 
                  linux-vdso.so.1 (0x00007ffde2a76000)
                  libc.so.6 => /usr/lib/libc.so.6 (0x0000718e94965000)
                  /lib64/ld-linux-x86-64.so.2 => /usr/lib64/ld-linux-x86-64.so.2 (0x0000718e94b89000)
              
              
              edit: I haven't built anything as complicated with zig as I have with the other C build systems, but so far it seems to have some legit quality-of-life improvements.

              • mike_hearn 13 days ago

                Interesting that zig does this. I wonder what the binaries miss out on by defaulting to such an old symbol version. That's part of the problem of course: finding that out requires reverse engineering the glibc source code.

                • fch42 13 days ago

                  Maybe just nitpicking but he _specified_ the target version for the zig compile.

                  (Haven't tested what it would link against were that not given.)

                  • mattpallissard 12 days ago

                    > Maybe just nitpicking but he _specified_ the target version for the zig compile.

                    Right, but I was able to do it as a whole. I didn't have to do it per symbol.

            • fch42 14 days ago

              Thanks for the extensive examples of "the mess"...

              I'd only like to add one thing here ... on static linking.

              It's not a panacea. For non-local applications (network services), it may isolate you from compatibility issues, but only to a degree.

              First, there are Linux syscalls with "version featuritis" - and by design. Meaning kernel 4.x may support a different feature set for the given syscall than 5.x or 6.x. Nothing wrong with feature flags at all ... but a complication nonetheless. Dynamic linking against libc may take advantage of newer features of the host platform whereas the statically linked binary may need recompilation.

              Second, certain "features" of UNIX are not implemented by the kernel. The biggest one there is "everything names" - whether hostnames/DNS, users/groups, named services ... all that infra has "defined" UNIX interfaces (get...ent, get...name..., ...) yet the implementation is entirely userland. It's libc which ties this together - it makes sure that every app on a given host / in a given container gets the same name/ID mappings. This does not matter for networked applications which do not "have" (or "use") any host-local IDs, and whether the DNS lookup for that app and the rest of the system gives the same result is irrelevant if all-there-is is pid1 of the respective docker container / k8s pod. But it would affect applications that share host state. Heck, the kernel's NFS code _calls out to a userland helper_ for ID mapping because of this. Reimplement it from scratch ... and there is absolutely no way for your app and the system's view to be "identical". glibc's nss code is ... a true abyss.

              Another such example is (another "historical" wart) timezones or localization. glibc abstracts this for you, but language runtime reimplementations exist (like the C++2x date libs) that may or may not use the same underlying state - and may or may not behave the same when statically compiled and the binary run on a different host.

              Static linking "solves" compatibility issues also only to a degree.

          • titzer 15 days ago

            glibc is not stable on Linux. Syscalls are.

            • saagarjha 15 days ago

              glibc is ABI-compatible in the forward direction.

            • fch42 15 days ago

              https://cdn.kernel.org/pub/software/libs/glibc/hjl/compat/

              It provides backwards compatibility (via symbol versioning), and that approach allows behaviour to evolve while retaining the old behaviour for those who need it.

              I would agree it's possibly messy, especially if you're not willing or able to change your code while providing builds for newer distros. That said though... ship the old builds. If all they need is libc, they'll be fine.

              (the "dumpster fire" is really higher up the chain)

        • vnuge 15 days ago

          > Practically everything is built around the assumption that software will be distributed as source code

          Yup, and I vendor a good number of dependencies and distribute source for this reason. That, and because distributing libs via package managers kind of stinks too; it's a lot of work. I'd rather my users just download a tarball from my website and build everything locally.

          • skydhash 15 days ago

            I don't think that users expect developers to maintain packages for every distro. I had to compile ffmpeg recently for a Debian installation and it went without a hitch. Yes, the average user is far from compiling packages, but they're also far from running random distributions.

      • metalspoon 15 days ago

        I think flatpak is closer to .app bundles. So, the argument is a little unfair.

  • EGreg 16 days ago

    Now imagine the same but with AI killer bot swarms. Slaughterbots. Because we could!

    As long as we have COMPETITION as the main principle for all tech development — between countries or corporations etc. — we will not be able to rein in global crises such as climate change, destruction of ecosystems, or killer AI.

    We need “collaboration” and “cooperation” at the highest levels as an organizing principle, instead. Competition causes many huge negative externalities to the rest of the planet.

    • HappMacDonald 16 days ago

      What we really need is some way to force competition to be sportsmanlike. EG: cooperating to compete, just like well adjusted competitors in a friendly tournament who actually care about refining their own skills and facing a challenge from others who feel the same way instead of cutting corners and throats to get ahead.

      Cooperation with no competition subtracts all urgency because one must prioritize not rocking the boat and one never knows what negative consequences any decision one makes might prove to have. You need both forces to be present, but cooperation must also be the background/default touchstone with adversarial competition employed as a tool within that framework.

      • EGreg 15 days ago

        I don’t see any urgency in depleting ecosystems, building AI quickly, or any other innovation besides those that safeguard the environment, including animals.

        Human society has developed far slower throughout all history and prehistory, and that was OK. We’ve solved child mortality and we are doing just fine. But 1/3 of arable farmland is now desertified, insect populations are plummeting etc.

        Urgency is needed the other way — in increasing cooperation. As we did ONE TIME with the Montreal Protocol, when we almost eliminated CFCs worldwide to repair the hole in the ozone layer.

      • saulpw 15 days ago

        I like this viewpoint of "cooperate to compete". It's what we've been doing on a global scale as ~all nations have agreed to property rights, international trade, and abiding by laws they've written down. And in fact some would say that at the largest business scale, there is this cooperation--witness the collusion between AAPL/GOOG/etc not to poach each others' employees. But there doesn't seem to be the same respect for "smaller" businesses, as they are viewed as prey instead of weaker hunters.

    • NoGravitas 15 days ago

      You're right, but it's not just tech development, it's pervasive throughout our civilization. And solving it requires solving it almost everywhere, at close to the same time.

  • guestbest 15 days ago

    I disagree. It’s all the frameworks and security features, like the telemetry of the operating systems and those framework libraries. There are programs written in Lazarus (Free Pascal) that run blazingly fast on Windows, even modern versions like Windows 11. Keeping software written for a specific purpose on the desktop is the best bet for quickness and stability.

    Every modernization (hardware and framework) in software is a tax on the underlying software in its functional entirety

  • asp_hornet 15 days ago

    > path of least resistance

    Great take. It feels like the path of least resistance peppered with obscene amounts of resume driven development.

    Complexity in all the wrong places.

    • fuzzfactor 15 days ago

      >Did we lose our way

      It wasn't supposed to be like this, but it looks like most people still haven't found the way.

      So misguided efforts, wasted resources, and technical debt pile up like never before, at an even faster rate than the efficiency of the software itself declines on the surface.

  • eternityforest 15 days ago

    Moore's law is still going, but we stopped making software slower.

    We use JITs and GPU acceleration and stuff in our mega frameworks, and maybe more importantly, we kind of maxed out the amount of crazy JS powered animations and features people actually want.

    Well, except backdrop filter. That still slows everything down insanely whenever it feels like it.

constantcrying 15 days ago

Again and again people complain about this. But it remains a fact that essentially nobody actually wants this.

Developers certainly like to have their completely integrated, connected and universal computing platform (the web). And users do not seem to particularly care about performance as long as it is good enough. And that is exactly the standard that is set, software is allowed to be so bad that it doesn't really annoy the user too much. Management doesn't care either, certainly creating good software isn't important when good enough software has already been developed.

Sure, I would like things to be different, but until one group decides that a drastic departure is necessary, nothing will change. There are also no real incentives for change, from any perspective.

  • austinjp 15 days ago

    I take your point, but people certainly do complain about performance and download sizes; they just do it indirectly by describing the side effects. Only recently my partner asked me why her laptop got so hot. An iPhone user I know said they hate it when their phone gets "screen freeze", in other words when it becomes unresponsive. They didn't describe it in terms of performance or app size, even though those are the underlying problems. Anyone trying to download a large app on a phone with poor signal will get frustrated, and people who live in areas with unreliable internet experience this daily. People on low incomes or living in developing nations can often only use older devices, which get clogged up with large, slow apps and become frustrating to use.

    If you feel that people don't care about performance and download size, you may be asking the wrong people the wrong questions.

  • II2II 15 days ago

    I suspect a few things are going on here.

    This is not exactly a new phenomenon. People have been complaining about software bloat since at least the mid-1990s. I suspect someone older than myself would gleefully explain that the complaints went back to the mid-1980s, mid-1970s, etc. Eventually it gets to the point where only outliers will complain. Everyone else will simply upgrade, put up with the bloat, or stick with old software.

    Then there is the question of whether the bloat is worth the benefits. If Docs were simply a clone of Word, few people would have adopted it. Some people use it because it is free, others because they want to work on or access their documents from various devices, yet others because they want to collaborate on documents seamlessly. If you're getting something out of the bloat, you're less likely to think of it as bloat.

    We also have to consider that some bloat isn't really bloat. It's easy to point to AppleWorks on the Apple II and bemoan how modern word processors require about five orders of magnitude more resources, while ignoring how resource-intensive the niceties are. Want proportional fonts that look nice at any size and are rendered properly on screen? That's about three orders of magnitude more video memory, plus additional CPU use for rendering the text, etc. I'm using that example since it is something people can actually see. Now consider the things they cannot see (such as working on documents larger than the computer's main memory, the memory required for Unicode fonts, the ability to switch between the working document and research notes, memory protection to prevent an ill-behaved application from wiping out all of your work). Yes, bloat exists. On the other hand, a lot of the increased resource use is actually quality-of-life improvement.

    • Anarch157a 15 days ago

      What people also forget is that if Windows 11 requires 8GB of RAM while Windows 98 only required 8MB, that's 3 orders of magnitude. If we adjust the size of our modern "bloated" applications by the same scale, an app that today takes 100MB on disk would be equivalent to just 100KB in 1998, and an Electron app that uses 1GB of memory would be equivalent to one that took only 1MB back then.

      Apps today are no more bloated than they were last century, while we gained a lot of functionalities that would have been considered witchcraft in the days of Win98.
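      Spelling out that scaling argument (figures taken from the comment above):

```python
# Win98 -> Win11 RAM requirement: 8 MB -> 8 GB, i.e. a 1024x scale factor.
MB, GB = 1024**2, 1024**3

scale = (8 * GB) / (8 * MB)

assert scale == 1024                      # ~3 orders of magnitude
assert (100 * MB) / scale == 100 * 1024   # 100 MB app today ~= 100 KB then
assert (1 * GB) / scale == 1 * MB         # 1 GB Electron app ~= 1 MB then
```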

  • impossiblefork 15 days ago

    Do we really though?

    Web developers do of course, but I've hardly touched web development myself. Web interfaces etc., are a choice, but I think it's driven by commercial needs-- a desire for subscription revenue instead of one-time sales, etc.

    Much of the modern cloud-based or half-online world is quite unnatural from a user perspective, and where there is no need for monetisation -- for example with OpenOffice -- the software can be expected to remain a desktop application.

    • edanm 15 days ago

      > Web developers do of course, but I've hardly touched web development myself. Web interfaces etc., are a choice, but I think it's driven by commercial needs-- a desire for subscription revenue instead of one-time sales, etc.

      This is certainly a big part of it, but it's not the whole story.

      For one thing, there were ways to achieve those business models with native software too - "web-based=subscription" isn't actually a requirement, or the only way to go.

      But users in the early 2000s, many of them more technical than users today, rejected this idea. It felt like native apps had to cost money one time, and doing otherwise would be wrong. But with Web, since users understood that it was hosted elsewhere, it "made sense" for it to be a subscription, so users went with it.

      This also affected the technical things as well. Auto-updating was incredibly frowned upon in the 2000s - you bought it, you got to keep it as-is. So companies had to work very hard to keep multiple versions working all the time.

      Most of these biases by technical users have gone away. We now have auto-updating subscription native apps, e.g. Photoshop works that way today. But these technical biases drove the usage of the web, because it was so much technically easier for businesses, and allowed much better business models.

      (And, and this isn't even getting into the whole "installing software was really hard for users" thing!)

      • skydhash 15 days ago

        If I bought a license in 2004, then run XP for ten years on the same machine, why should I pay for updates when I don’t need the update? If the license said three years, then I’d know I’d be out of support for the last seven years

        Everybody knew that updates weren’t free and they’d buy licenses when it was time to move on. But it’s hard to justify moving on when the software is still running fine even on Windows 10.

    • constantcrying 15 days ago

      Certainly there has been almost no pushback. I don't think most users really care for native applications, what they like to see is clicking on something and having it work instantly, web apps deliver that.

      • jwells89 15 days ago

        I think an overwhelming majority of users aren’t technical enough or well enough versed in UI/UX to be able to put a finger on the frustrations they experience with software, and this is something that’s important to remember when considering complaints coming from a more technically-inclined minority — even if only a small number of techy folks are unhappy, these frustrations likely exist in the larger userbase too even if most users are unable to articulate them. In addition, some percentage undoubtedly perceive these issues but are just too busy to bother with sending in feedback.

        With all that considered, I believe the extent of pushback that is possible is quite limited as long as the app technically works, but this is far from an accurate indicator of user happiness.

        • skydhash 15 days ago

          > I think an overwhelming majority of users aren’t technical enough or well enough versed in UI/UX to be able to put a finger on the frustrations they experience with software

          They're OK with it because they don't know it could be better. A spinner every 2 minutes? 12 minutes to open Slack? They accept it as a fact of life until better software comes along, and then they wonder why they didn't come across something like it sooner.

          • ryandrake 15 days ago

            Sadly, I think people are starting to accept, as a fact of life, that software gets worse every time a developer touches it. People dread "upgrades" because it's going to get slower, buggier, the UI is going to change unnecessarily, and there's nothing they can do about it besides try to stay on the previous version, which is often impossible with web software.

  • SkyPuncher 15 days ago

    Yep. The most successful startup I worked at had an SPA that downloaded a 5MB bundle and preloaded a bunch of data. It took nearly 10 seconds to start up.

    Nobody complained about that. In fact, few people complained even about the portions of the app that had abysmal performance. It often wasn’t until 60-second load times that customers started complaining.

    They still raved and raved about that software because it solved an extremely valuable problem for them. A job that took literally a week could now be done in minutes.

    As the space heated up, we needed to improve some things, but most people literally did not care. It would always be stack ranked last in the list.

    • trgn 15 days ago

      Startup time of an SPA is meaningless when it's the sort of app you open once in the morning and then use for the rest of the day. It's a single startup hit, and the user suffers it in between closing the tabs from the previous day and fiddling with some emails. It doesn't matter if it's 10-20 seconds.

      The problem with the long startup is that it tends to cloud any discussion of performance. Code loading and parsing is basically the biggest bar in the app-perf breakdown of your profiles, and that spins a narrative that this is the thing to optimize for, because it's the biggest bang. Rather than, say, responding to user selections, reducing jitter and sluggishness while scrolling, etc...

      I'm starting to believe that for a large class of apps, developers should look at it as if they were writing video games: the user will tolerate the spinner before the level, but then it needs to be silky smooth after. And the _smooth after_ requires a whole class of other optimizations; it's striving for a flat memory profile, it's small ad-hoc data transfers, it's formatting data into usable layout at lower levels in the stack, it's lazy loading of content in the background, etc... Those are the areas where web devs should be looking.

      (Again, this only applies to that sort of SPA; read-only content, blogs and such should display _fast_.)

      • skydhash 15 days ago

        Most of my current applications have been open since the computer booted (11 days ago). Where I draw the line is wasting resources and time while I'm using them. As you said, it's mostly about user interactions and scrolling, because UI is bound to IO for some reason. I remember all the big software taking time to start up (Adobe's, Autodesk's, even Microsoft Office), but once they did, it was pretty smooth unless you launched some CPU/GPU-intensive operation. But now, things like Slack cause the computer fan to scream.

    • yen223 15 days ago

      "They still raved and raved about that software because it solved an extremely valuable problem for them. A job that took literally a week could now be done in minutes."

      This is a big point isn't it.

      We seem to think that customers are choosing "slow" over "fast", when a lot of times they are really choosing between "slow" vs "manual" (i.e. very very slow)

      • skydhash 15 days ago

        You’d always take a bike instead of walking if you couldn't get a car. No one is looking to waste time when they need to get something done. If a tool is the only thing in town, they'll praise it. Until a competitor comes along with something better in the ways that matter.

        • yen223 15 days ago

          I do not doubt that if customers had a choice between "slow" and "fast", all things being equal customers will pick "fast". Customers aren't stupid.

          But in a surprising number of cases, either customers don't have that choice (because the market hasn't provided a "fast" solution yet), or all things are not equal (say, the fast solution is fast because it's missing features that are crucial).

          And this is why it always looks like customers are content with poorly-performing solutions.

  • VelesDude 15 days ago

    The area where I feel this difference the most is when I use software that didn't fall into this trap. Things like MYOB EXO/CRM and SAP ERP systems have codebases stretching back decades that innovate at a glacial pace. As such, it's basically 2000s tech that we're still forced to use, and that has turned into its big advantage.

    It is always fun to pull up the task manager on these and see them using 20-30MB of RAM, with a large part of that being the current database loaded in.

    VLC & Blender are other examples of this.

  • somenameforme 15 days ago

    You're saying two very different things: "nobody actually wants this" is not the same as "nobody is actually moving to achieve this". Performance is something like a tragedy of the commons, because everybody is just doing what's in their own short-term interest, but the long-term consequence is where we are today: you'd need what, not that long ago, would literally have been a supercomputer to run a word processor.

    It's kind of funny to imagine this parallel world where you send a PC of today back to the 70s. Whichever government got their hands on it would be keeping it ultra classified and hiding it away, like it was some device, too dangerous for the public, that could computationally solve any problem imaginable, create anything imaginable.

  • metalspoon 15 days ago

    The blog is flawed. 33MB is likely no problem for the web. It's just that the Google Docs devs probably haven't cared about that size because hardly anybody creates a huge doc on their platform. Or maybe his dad created a complicated doc file that Google Docs failed to parse.

    That's different from us devs losing efficiency from our deployment platform.

ab8 15 days ago

It is interesting to see most people lay the blame at the feet of developers.

The reality is that these are all business decisions:

1) Move to the cloud because the business likes the steady payout of subscriptions. Business customers love not having to hire IT teams and demand six 9s of uptime because it is someone else’s responsibility. But performance needs to just be acceptable to end users.

2) Customers refusing to upgrade on-premises software, which led to long maintenance cycles and endless patches

3) Developing once for the web vs. multiple times for different platforms – each needing its own developers and testers.

No amount of expertise on the part of developers is going to address these fundamental forces.

  • teeray 15 days ago

    > Customers refusing to upgrade on-premises software

    After a certain period of time, that software worked just fine for those customers. Photoshop is a great example. Sure, you won’t get the flashiest features, but CS4 will still work for you on a Win7 machine without any additional fees paid.

    • ryandrake 15 days ago

      Once I commit to buying a version of Software X, I'm happy with it. As a user I expect Software X to work as-installed for decades to come. I don't want new features. I don't want the UX to change on me all of a sudden. I don't want it to get slower. Bugfixes and security fixes are fine, as long as everything else remains the same. I wish more developers understood and respected this.

      • edanm 15 days ago

        This attitude is why the web won, IMO.

        When it comes to native apps, in the 2000s, this was the common attitude of users. But it's much harder to implement from a business perspective! Both in terms of business models, and in terms of dev time - having a bunch of possibly-incompatible versions lying around is a lot of overhead.

        On the web, where most technical users understood this is technically impossible, they were willing to allow businesses to act differently, keep the software always-updated, and charge per usage. And since that's much easier and more lucrative for companies, they all switched to that.

        (Now everyone kind of accepts that model, which is why today's Photoshop works via subscription, but the "damage" was done and the web won.)

      • mike_hearn 15 days ago

        > Bugfixes and security fixes are fine, as long as everything else remains the same.

        Devs absolutely do not enjoy backporting bug fixes to 5 different LTS versions of their software and then getting user complaints because there's inevitably an important customer who is six versions back. It's inefficient with expensive dev time and it's better for the business to use that time to create new features.

        edanm is correct; a lot of this is historical, caused by very loud and angry tech users around the turn of the millennium. Want to know why Chrome won? When telling that story people tend to focus on performance or security, but that's not really it. Chrome won because Larry Page overrode all the internal screaming about silent, web-style auto-update for desktop apps. Oh boy, a whole lot of people really hated that idea; in fact Google had to develop its own software update engine from scratch to make it happen. Page didn't care. He understood that the ability to release a new version of web apps every week without the user noticing was a huge competitive advantage for the web, IE also updated in the background as part of the OS, and he wanted Google's desktop apps to have that same advantage.

        Meanwhile Firefox stuck with the old model of rare releases and letting users choose whether to upgrade or not. It was a disaster. Old Firefoxes constantly annoyed web devs by preventing them from using new features. Security patches got reverse engineered and exploited. Still, Firefox's passionate fanbase loudly rejected the Chrome approach because they felt it took away their control.

        Eventually the Mozilla guys accepted that they were wrong, their fans were wrong and Larry Page was correct. But it took years and in that time Chrome had built up a huge reputational advantage.

      • ehnto 15 days ago

        This is the real reason I use Linux and open source: I want stability and flexibility, not inevitably-enshittifying SaaS. I am not an OSS or FOSS diehard, and I even advocate for a return to selling software as a deliverable, so that making small applications is a viable small business. But SaaS seems to be the only viable business model.

      • wpm 15 days ago

        Developers probably do.

        Businesses don't. Gotta find a way to sell the same shit every year.

  • worksonmine 15 days ago

    You can have efficient web apps running on the cloud, it's just a server after all. The issue lies with developers developing on machines their users can't afford and not caring about performance and efficient code.

  • ausbah 15 days ago

    I think many developers would make these same decisions. It's painful to have to maintain separate platform versions of the same software, dealing with servers takes away development time, etc.

BobbyTables2 16 days ago

In the early 90s, I believe MS Word came on a few floppy disks and the main executable was 2MB. It ran fine on a 16 MHz 386 with a total of 2MB RAM (let that sink in!)

It did pretty much everything it does now, only lacked a grammar checker. (WordPerfect had one.)

Now we measure things in GB units. 1000X bigger, but what was gained?

We not only lost the way, we don’t even know the destination any more.

  • dale_glass 16 days ago

    > Now we measure things in GB units. 1000X bigger, but what was gained?

    Functionality and graphics.

    For instance 'dict.words' alone on Linux is 4.8MB. Arial Unicode is a 20MB-ish font. The icon for an application I work on is 400K. The Google Crashpad handler for handling crashes is somewhere around several MB.

    A 4K true-color display's framebuffer is roughly 160 times larger than 640x480 at 16 colors.
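    Rough framebuffer arithmetic for that comparison (assuming 24 bits per pixel for "true color" and 4 bits per pixel for 16 colors):

```python
vga_bytes = 640 * 480 * 4 // 8       # 153,600 bytes (~150 KiB)
uhd_bytes = 3840 * 2160 * 24 // 8    # 24,883,200 bytes (~23.7 MiB)

ratio = uhd_bytes / vga_bytes
assert round(ratio) == 162           # ~216x if you assume 32 bpp instead
```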

    • berkes 16 days ago

      Let's ask the question differently: what problems were solved?

      With your examples, it could be:

      - introduce global spell checker.

      - have emoji?

      - fix blurry icons?

      - being able to search through crash logs?

      - not having to switch between windows.

      Do we need GBs instead of MBs for that? Why? Was that problem not fixed already? Could we not fix it in a way that didn't demand magnitudes more resources?

      I'm asking because I highly doubt there's a technical reason that requires an improved piece of software, or a solved problem, to consume magnitudes more resources.

      Sure, Slack is far superior in UX to IRC. But could we really not get that UX without bloatware hogging my CPU, taking hundreds of MBs of installation size, and often biting off significant chunks of my memory? Is that truly, technically impossible?

      • dale_glass 16 days ago

        > Let's ask the question differently: what problems were solved?

        A few more:

        * Seamless internationalization. If you're a native English speaker you probably never experienced the "fun" of dealing with French and Russian in the same text document. Pre-Unicode supported English + one other language, if that other language wasn't too weird.

        * Lots of tiny quality-of-life improvements. E.g., never seeing windows repaint costs a LOT of memory: every window is kept in RAM even when not being looked at, so that when you switch to it you never see it paint.

        * Stability. Windows 9x tried to be frugal by keeping a copy of everything in system32. That was called "DLL hell". So the current standard is that the app just packages every framework, so you may have a half dozen copies of Qt easily.

        > Do we need GBs instead of MBs for that? Why?

        Well, let me look at my AppImage:

        3.8 GB total.

        2.3 GB of dependencies. 2.1 GB is libnode, 128 MB is Qt Webengine.

        1.4 GB application. 126 MB of JavaScript and UI images. The rest is mostly code.

        • FdbkHb 15 days ago

          > * Seamless internationalization. If you're a native English speaker you probably never experienced the "fun" of dealing with French and Russian in the same text document. Pre-Unicode supported English + one other language, if that other language wasn't too weird.

          For some programs, that hasn't changed. I use OneNote heavily to maintain a sort of personal info database that I look up whenever I forget something or need to quickly reproduce a command verbatim. The act of writing and organizing the data also heavily reinforces my ability to remember things in and of itself. So I'm quite fond of that little program.

          When I tried to use it while learning Chinese I ended up having to turn off the spelling/grammar correction. It just can't function with two languages in the same notebook. All the Chinese text had the red squiggly lines warning you of a mistake and I found no way to enable the support for more than one language. You must select /one/ language for the spell checker in that program.

          Or disable the spellchecker, which is what I did in the end.

          • berkes 14 days ago

            I guess that's the worst case: we add gigabytes of data and require magnitudes more CPU and memory, yet reintroduce problems that were long since fixed.

            I see that a lot with JavaScript apps. When they replace native widgets, they often fail in the details. My native text areas, for example, can handle multiple languages when spell checking, but the DIY or spellchuck.js npm version cannot.

        • berkes 15 days ago

          My point is that we are improving the experience.

          But the "cost" of doing so isn't what's technically required to improve it. You could achieve all these improvements and solve all these problems without much more resource usage, or with only negligible added resource usage.

          Therefore my conclusion is that the reason e.g. Slack uses 1000x what my old IRC or Jabber client used isn't technical. It's a deliberate choice made for reasons of budget, time to market, or another trade-off.

          I'm certain that Slack could build a client that does everything Slack does while being hundred(s) of times snappier, smaller, and lighter on CPU and memory. But probably not with their current pace, budget, team, or wages.

          • dale_glass 15 days ago

            > Therefore my conclusion is that the reason e.g. Slack uses 1000x what my old IRC or Jabber client used isn't technical. It's a deliberate choice made for reasons of budget, time to market, or another trade-off.

            That was never not the case.

            Jabber makes heavy use of XML, which back in the day was very much seen as overkill. It requires a fairly complex parser and increases the amount of data considerably.

            They could have gone with a much more compact binary protocol of ID/length/value pairs, with no field names at all, just, say, a 16-bit integer ID allocated from a central registry.

            Even going back to DOS, you could shrink a program with measures like outputting "Error #5" instead of "File not found", and require the user to look up the code in a manual.
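            A minimal sketch of such an ID/length/value wire format (the field IDs and 16-bit header layout are invented for illustration, not any actual Jabber alternative):

```python
import struct

def encode(fields):
    # Each field: 16-bit registry-assigned ID, 16-bit length, raw bytes.
    out = bytearray()
    for fid, value in fields:
        out += struct.pack(">HH", fid, len(value)) + value
    return bytes(out)

def decode(buf):
    fields, off = [], 0
    while off < len(buf):
        fid, length = struct.unpack_from(">HH", buf, off)
        fields.append((fid, buf[off + 4:off + 4 + length]))
        off += 4 + length
    return fields

msg = encode([(1, b"alice"), (2, b"Hello!")])
assert decode(msg) == [(1, b"alice"), (2, b"Hello!")]
assert len(msg) == 19  # versus dozens of bytes for an equivalent XML stanza
```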

        • goalieca 15 days ago

          I don’t know about NeXTSTEP, but macOS had all this stuff when I first used it 20+ years ago. It featured compositing rendering, had the apps, supported PPC and Intel in a single app bundle, and had a microkernel. I even remember it got an emulator for running PPC code on Intel.

          The newest macOS still needs more memory and suffers from bloat, but 8GB is still perfectly usable if you avoid Google Chrome. 8GB is also perfectly usable for Linux.

          • skydhash 15 days ago

            My SO's laptop is an intel MBA with 8GB of ram. Everything's fine until Google Chrome starts (some work tools). Even if the cpu is not as efficient as the m-series, it runs quite well, even with the tropical weather. But launch Chrome and you have a toaster.

          • mike_hearn 15 days ago

            Yes but OS X was widely considered unusably slow for the first few versions. Also OS X was only partly a microkernel. Things like the filesystem and network stack ran and still do run in kernel space.

      • Rury 15 days ago

        The problem is that everyone here is looking at it from a software design standpoint, not a software development standpoint. Once you look from the latter it's obvious why things are the way they are: businesses are trying to cheapen out on software development costs. As a result, software quality cheapens.

        For example, if disk space is abundant and very cheap, and optimizing software to use as little disk space as possible is relatively more expensive than throwing more disk space at the problem, you shouldn't be surprised that software starts using more disk space than necessary, because what's being optimized is software development cost.

    • giantrobot 16 days ago

      While plenty of software is overly fat, you hit the nail on the head.

      A Word document isn't just text and some formatting sigils. Editing isn't just appending bytes to the end of a file descriptor.

      It's a huge in-memory structure that has to hold an edit history so undo and track changes work, the spelling and grammar checker needs to live entirely in RAM since it runs in realtime as you edit, and the application itself has thousands if not millions of allocated objects for everything from UI elements to WordArt. The rendering engine potentially needs to hold a dozen fonts in memory: not just the system fonts but any fonts specified, even if not immediately used, by the base document template.

      It's not like Google Docs is any lighter on RAM than Word. Features require memory. Fast features are usually going to require more memory.

      People can use AbiWord if they want a much slimmer word processor. They could also just use nano and Markdown if they wanted even slimmer. But a lot of people want document sharing over the Internet with track changes, grammar checking, and the ability to drag in and edit an Excel spreadsheet.

      The features used in native software follow a bathtub curve. And not just one, but several: no two groups necessarily use the same sets of advanced/uncommon features.

      • ptx 15 days ago

        > the application itself has thousands of not millions of allocated objects for everything from UI elements to WordArt

        There are ways to optimize those things, though, which developers might not be bothering with anymore. The Design Patterns book used a word processor as the example when explaining the flyweight pattern for efficiently representing lots of objects. OLE objects like WordArt support different states[0] and don't necessarily have to be active at all times.

        [0] https://learn.microsoft.com/en-us/windows/win32/com/object-s...
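        As a rough sketch of the flyweight idea in that word-processor setting (the class and attribute names are hypothetical, not from the book): each character references a shared style object instead of carrying its own copy.

```python
class CharStyle:
    """Flyweight: one shared instance per distinct (font, size, bold) combo."""
    _pool = {}

    def __new__(cls, font, size, bold):
        key = (font, size, bold)
        if key not in cls._pool:
            obj = super().__new__(cls)
            obj.font, obj.size, obj.bold = font, size, bold
            cls._pool[key] = obj
        return cls._pool[key]

# A million-character document that only ever references two style objects:
body = CharStyle("Times", 12, False)
head = CharStyle("Times", 14, True)
doc = [(ch, body) for ch in "x" * 1_000_000] + [("H", head)]

assert CharStyle("Times", 12, False) is body  # same shared instance
assert len(CharStyle._pool) == 2
```

        The per-character cost shrinks to a tuple and a pointer; all the heavyweight formatting state lives in a handful of shared objects.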

    • MonkeyClub 16 days ago

      >> Now we measure things in GB units. 1000X bigger, but what was gained?

      > Functionality and graphics.

      And massive amounts of telemetry.

      But now that functionality is moving to the cloud, we'll just be stuck with gigabytes for graphics and telemetry.

  • ale42 15 days ago

    A few years ago, as a 1st of April joke, I set up a DOS/Windows 3.11 disk image on our PXE network boot server. It included a functional Word 6 for Windows, and the gzipped image fit in 12 MB... PCs back then could still boot without UEFI (and run Windows 3.11 if it was properly configured). It booted almost instantly. Same for opening Word.

    Having used that version of Word when it was the latest one, I can say the current ones have quite some added functionality (lots of very tiny things, and a few bigger ones), but I'm totally sure the same could be done with 10x less memory usage if MS cared about it. But there's no incentive to do it. Computers are faster and have lots of memory, and we don't depend on floppy disks any more; it would just cost them more money. Not saying that this is a good thing (I think the opposite, especially as I'm starting to think that software bloat might have a non-negligible environmental impact), but as long as nobody complains strongly enough (or as long as the EU doesn't come out with an anti-software-bloat law... seems just a dream, but who knows), that won't change. And I clearly remember having the same bloated-software feeling when I tried Office 2000 or XP compared to Office 97, so there's nothing so new here.

    As a final note, I've recently seen the source code of MS Word for Windows 1.0 on a GitHub repo (MS released it, see original release on Computer History Museum: https://computerhistory.org/blog/microsoft-word-for-windows-...). It was pure C, with even very large parts of code written in assembly! But the code is really ugly... totally incomparable to current C or C++ coding standards, patterns and language capabilities.

  • arprocter 16 days ago

    A while ago someone dropped off an old PowerBook Duo for disposal - I had to fire up Word 5.1 just for nostalgia reasons

    I saw it described once that software is like a gas: it expands to fill the space we now have

    You see it with live distros too. They used to be 700MB to fit on a CD-R, but now it's getting rare to find one that'll fit on a 2GB USB stick; although yay for 'minimal' variants gaining ground

  • teaearlgraycold 16 days ago

    Our docker file for running some ML code at work is 6GiB. That does not include the model files. What the fuck, Nvidia? Am I downloading thousands of permutations of generated code for things I’ll never use?

    • stackskipton 16 days ago

      If you look at the build, yeah, it includes everything and the kitchen sink. No one cares to pare it down because in most cases the big GPU servers running this have plenty of disk space, and since it's a long-running container image in most cases, the pull time isn't considered a big enough problem for most people to fix.

      Prime example of "Hardware is cheap and inefficiencies be damned"

      • teaearlgraycold 15 days ago

        If only it was viable to analyze which files get used. Then cut down the image to just what’s needed.

        • dpkirchner 15 days ago

          I can show you some of the big, unnecessary files: all the .a files in /usr/local/cuda* (unless you're building software inside your container). That's, IIRC, at least a gig.
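          If you want to check this against your own image, a quick sketch (the /usr/local/cuda path is the one mentioned above; adjust for your container layout):

```python
import os

def static_lib_bytes(root):
    """Total size of .a static archives found anywhere under `root`."""
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".a"):
                total += os.path.getsize(os.path.join(dirpath, name))
    return total

# Inside the container, e.g.:
#   static_lib_bytes("/usr/local/cuda") / 2**30  -> size in GiB
```

          Anything that shows up here and isn't needed for in-container builds is a candidate for an extra `rm` layer in the Dockerfile.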

    • PeterisP 15 days ago

      As far as I understand, one of the things it includes actually literally is the permutations of code adapted for every different model of supported nVidia hardware; that is a major (and desirable!) part of the driver+CUDA deployment.

    • DrFalkyn 15 days ago

      If using C++, clang and boost are going to take up a substantial portion of that 6 GB

    • baobabKoodaa 16 days ago

      *Docker image, not Dockerfile

      • sgarland 16 days ago

        Spoiler, they’re inlining weights with a heredoc.

  • mihaaly 15 days ago

    Basically what we had in Word 6 is what I use today in the newest Words.

    It only takes longer to use while finding what I want among the bloated set of other things added.

  • jesse__ 15 days ago

    > We not only lost the way, we don’t even know the destination any more.

    Hah! Good one. It's unfortunate, but true.

  • rumad 16 days ago

    I completely agree. I believe it is still not too late to set a destination for the future. But, because of the hardware breakthroughs which are good, no one thinks about software efficiency anymore. They will add another GB/TB to the specs to avoid any bottlenecks.

  • masfuerte 15 days ago

    The exe for Word 2 was about 1MB. I remember it because it was by far the largest exe I had ever seen. It didn't have the red underlines for spell-checking but it did have all the other word-processing features I use now.

rpdillon 15 days ago

It's interesting: minimal software is out there, but folks don't tend to choose it. I spend a fair amount of time thinking about how to be conservative in my dependencies, and this encourages a lightweight stack that tends to perform pretty well. These days, I'm favoring tools like Lua, SQLite, Fennel[0], Althttpd[1], Fossil[2], and the Mako Server[3] and find that great, lightweight, stable, efficient software is to be had, for free, but you have to go a bit off the beaten path. This isn't stuff you hear about on Stack Overflow.

In terms of frontend, which the post focuses on (Google Docs and a 30MB doc), I guess I'm conflicted. While I tend to favor native apps + web pages, I'm also a daily Tiddlywiki user, and I really think web apps have their place (heck, one idea I'm working on is a lightweight local server that lets you run web apps like Tiddlywiki). But without a doubt, Tiddlywiki is more resource intensive than Emacs (my go-to for notetaking when I'm not on TW). My tab for a 6MB Tiddlywiki file uses 155MB of RAM, and my (heavily customized, dozens of open buffers) Emacs session uses 88MB. So I do think the author has a good point.

[0]: https://fennel-lang.org/ [1]: https://sqlite.org/althttpd/doc/trunk/althttpd.md [2]: https://fossil-scm.org/home/doc/trunk/www/index.wiki [3]: https://makoserver.net/

  • thefaux 15 days ago

    Lua is by far the most under-appreciated programming tool that I am aware of. Mastering Lua is one of the best ways to level up your programming skills. Of course it can be misused, but it is almost shocking how small and efficient Lua programs can be compared to most other languages.

a2128 16 days ago

I may be way off-base here but this is what I imagine the problem is:

1. Company executive decides their developers need top-of-the-line hardware to remain competitive in today's market

2. Developers make web apps on their company-provided M5 Ultra Pro Max 128GB RAM powerhouse laptop

3. They never test it on their father's old 2010 family PC, or at least they don't test often/thoroughly enough to realize many parts are broken or unusable

  • fragmede 15 days ago

    Also, network connections. Someone running the app off Wifi 7 backed by symmetric gigabit fiber Internet at the office is going to have a different experience than someone running the app at an apartment complex with a shitty wifi router on top of a shitty consumer Internet connection.

  • ThalesX 15 days ago

    This is easily fixable.

    I develop using Dev Tools set to mobile and a throttled connection. This way mobile-first responsiveness (limited screen estate) and potential problems with bad connections are first-class citizens.

    Now, what usually happens is that I signal problems to the product owner and they wave them away. So I might update your third point that sometimes they do test it on their father's old 2010 family PC but it's not of concern to more relevant stakeholders.

  • mattl 16 days ago

    So this is part of my current job: testing things on older hardware or less powerful hardware plus older but still used browsers, especially on mobile.

  • drewcoo 15 days ago

    You're not off base. The article was about imagined problems.

  • passion__desire 16 days ago

    Related point, do google android engineers dogfood android phones to themselves? I guess most of them would be Apple users.

    • jwells89 15 days ago

      Not a Google engineer, but I do keep around a device that represents “potato” specs that I test against when doing Android development. Not a perfect solution since there’s a bevy of old/low end SoCs with varying performance characteristics, but I figure it’s better than what many are undoubtedly doing which is testing against their newish flagship and calling it a day.

      I used to do the same on iOS, but came to find that performance differences on older devices there generally weren’t nearly as severe and that iOS users as a whole tend to use newer devices. When combined with reasonably well written Swift, performance on old devices generally isn’t a problem.

    • silverquiet 16 days ago

      Android encompasses $80 burner phones to what... like $2K flagships? It's a big target.

      • passion__desire 16 days ago

        Isn't that the point? Make a system usable on low-spec devices.

oellegaard 16 days ago

We recently moved an old page from plain HTML with everything generated by the backend to React, and we had a dropdown take several seconds to open with a thousand or so items in it. It was like 100ms to open the entire page before.

It was suggested to only display the first 100 items and make the user type in 3 characters before it started rendering.

Unfortunately this is the reality for many these days.

Of course instead we just fixed the shitty react code and it rendered instantly.

  • compacct27 16 days ago

    Yup. Common. With all the performance blogs focused on time to first paint and the like, React introduced a whole new category that looks a lot like this

  • werdnapk 15 days ago

    So use a server-side rendering framework such as Turbo. I've tried so many client-side frameworks (what the kids tend to demand these days) and they're all slow with lots of data... except for Turbo.

  • drewcoo 15 days ago

    A select with thousands of options sounds like terrible UX.

    If the new frameworks make the problem blindingly obvious so that someone can actually justify fixing it, all the more reason to use those frameworks.

    • Klonoar 15 days ago

      I can’t believe I’m arguing this, but: it actually might be fine UX?

      As long as they’re sorted and I can jump with the keyboard, that bare-ass drop-down is probably going to “just work” with default behavior. Anything further and we don’t know the intended use case for the element itself, but on the surface… it could be fine.

      • YurgenJurgensen 15 days ago

        There's another condition for that: That you can predict the first few characters of what you want to select. This is rarely true of thousand-item dropdowns; you generally find yourself having to iterate through a bunch of hypothetical naming schemes. Sure, this could be averted with rigorous enforcement of some naming convention, but if you have the discipline to do that, you're probably not making thousand-item dropdowns.

tonymet 16 days ago

Yes because we promote blog posts on “idiomatic ruby” and “premature optimization is the root of all evil”. “Performance is less important than dev time”.

We used to have developers who took less time and wrote better code.

  • dustymcp 16 days ago

    I don't agree; there is way more help in terms of writing efficient code today than there was back then. I've seen horrible code from that era which would not have been produced today.

    • throwaway35777 16 days ago

      What we don't teach or reward today is the behaviors and engineering process to write high quality code.

      A surprising number of inexperienced developers do the following: "once I get any working solution I should immediately open a PR" and let the senior engineers tell them what's wrong with it.

      When the big money leaves this field I hope there will be more pressure for people to adopt good engineering practices. I love to work with folks who put good effort into trying to make high quality changes. Personal initiative and ethics are how high quality software gets written.

      • dotnet00 15 days ago

        Agreed, this has been one of the habits I've had to break during my computer engineering PhD at a scientific research lab. Initially I was just submitting the first solution I came up with without much additional thought.

        My senior developer mentors ended up having to effectively rewrite all of it because while it was technically correct and efficient, it broke all sorts of other good practices (eg didn't fit the existing coding style), or added in additional library dependencies without much thought towards long term maintainability and backwards compatibility.

        It was taking so much time for the handful of already busy developers to go through my work that I had to learn to slow down, properly study the existing code and think about writing high quality code that fits the existing codebase. They didn't have the time to put down all their other work just to spend hours walking me through improving.

        As you mention, it was like with learning art, it's impractical for a teacher to walk you through everything, you have to learn to identify errors and things you need to improve through your own meticulous study, relying on the teacher to give you hints when you're stuck.

      • tonymet 14 days ago

        You’re right and it’s even worse. Much of the content on “engineering” especially on YouTube teaches resource intensive and overly complex practices sold as good programming. Moving more work to runtime. Increasing dependencies. Relying increasingly on blocking RPC. Wasting memory, cpu and storage.

        A rejection of performance and compatibility as the core principles of software engineering in favor of “syntactic sugar” and “idiomatic Haskell”

      • ornornor 15 days ago

        > folks who put good effort into trying to make high quality changes.

        In my 13 years in the industry, I’ve never worked at a place that valued that. More features faster, how many points this sprint is all that mattered. It’s put me off software engineering altogether.

      • giantrobot 16 days ago

        The process you're describing is the exact thing you want to happen: junior developers are trying to learn to write better code. Why should they waste their time researching on their own when they can instead learn the code base from people experienced with it?

        • skydhash 15 days ago

          Because that's what everyone does? You learn by studying other people's work and trying to apply it to your own. Painters, musicians, architects, etc. all do it. Why not developers? Instead everyone's rushing to learn React without even knowing the DOM API, or building a web app with 1000s of dependencies that could be done with a few PHP files. And then they say they need Docker and k8s.

        • tonymet 15 days ago

          The entire debate is over which qualities count as "better". Even "readable" is subjective. Some people love Java with sentence-long variable names. Some love 1-letter vars. Some love 100-deep call stacks, some love flat code. Some love microservices, some love resident call-stacks.

          Aesthetics matters.

    • tonymet 15 days ago

      Which parts are you talking about? On one hand there is more telemetry & tooling to help improve efficiency. On the other hand developers are encouraged to build inefficient applications full of run-time checks, poor data structures and blocking RPC calls.

      When I compare apps from 2000 to now it's a general decline in responsiveness and resource utilization.

empiko 16 days ago

Substack is a good example as well. Literally cannot render simple text and image articles without visible and annoying lags. If there are several hundred comments it can take tens of seconds to finish rendering. With functionality that was figured out in early 90s...

  • Ekaros 16 days ago

    I remember using YouTube with Opera GX and two adblockers not that long ago. When you scrolled deep enough into the comments, writing replies to them had significant input lag. And I mean what felt like seconds before all key presses got processed.

    This is flagship product of one of the largest companies, and even they cannot get UI performance right...

lr4444lr 16 days ago

I think about this a lot, and the conclusion I've come to is that the market (simply meaning the popularity) rewards features and intuitive UI above everything else. I think we've been on this trajectory for some time: getting users with less and less actual computer literacy to do more and more with computers.

  • ip26 16 days ago

    I do wish fast and responsive UI was included in what’s rewarded

  • darylteo 15 days ago

    The market rewards VALUE.

    The rest are simply modifiers on that value. A more intuitive UI allows users to gain value more efficiently. Performance allows users to gain value more efficiently.

    Efficiency is important, but Value more so.

    • lr4444lr 2 days ago

      Of course, that goes without saying. What I'm adding is the increasingly computer-illiterate expectations of the end consumer deriving the value, and the increasing functional demands he has relative to his ability to understand the internals.

  • mihaaly 15 days ago

    I wonder if the market rule of satisfying user needs and demands is still operational here, instead of shoving things down users' throats... actually, no need to shove; they swallow anything shiny or coming with loud fanfare. It is spectacular what users are willing to work with, wrapping their world around the needs of poorly made and sometimes even offensive software...

  • VyseofArcadia 15 days ago

    It could be argued that it is similar to the way the market values Oreos and potato chips over broccoli and lentils.

  • rrr_oh_man 16 days ago

    While eroding that same literacy for the future generation

  • timeon 15 days ago

    Popularity is determined by marketing. There is reason why markets are regulated.

logrot 16 days ago

We aren't making software. We're making features (and bugs) for the next Friday deadline.

  • ornornor 15 days ago

    I think HN audience is a little luckier in that regard, some of the people here work on very interesting and technical problems.

    The rest of us, the vast majority of professional SEs, work for a marketing/sales person and "number of new features released this Friday", as you said.

contextfree 15 days ago

I liked this post by Terry Crowley: https://terrycrowley.medium.com/software-ecology-bb4653046fd...

"The classic response to accusations of bloat is that this growth is an efficient response to the additional resources available with improved hardware. That is, programmers are investing in adding features rather than investing in improving performance, disk footprint or other efforts to reduce bloat because the added hardware resources make the effective cost of bloat minimal. The argument runs that this is in direct response to customer demand.

It is most definitely the case that when you see wide-spread consistent behavior across all these different computing ecosystems, it is almost certainly the case that the behavior is in response to direct signals and feedback rather than moral failures of the participants. The question is whether this is an efficient response.

The more likely underlying process is that we are seeing a system that exhibits significant externalities. That is, the cost of bloat is not directly borne by the those introducing it. Individual efforts to reduce bloat have little effect since there is always another bad actor out there to use up the resource and the improvements do not accrue to those making the investments. The final result is sub-optimal but there is no obvious path to improving things."

Web pages/applications are probably even worse in this regard because I'm not sure users even conceptualize them as using resources on their local computers, so they don't get blamed for it (people seem to attribute resource usage only to the web browsers, not the sites themselves)

lispisok 16 days ago

The real problem this guy is encountering is making interactive applications for a web browser is hammering a square peg through a round hole at its core. That's why performance is bad, that's why there is a new framework every other week trying to find a better workflow with better abstractions. There is so much inertia, such a huge ecosystem, many billions of dollars invested in it trying to make it less bad so it's not going away.

prmoustache 15 days ago

The big problem with the software industry is that it works with a developer-first, not customer-first, approach. Every technology or process is chosen for the benefit of the developers, to save them time, not for the end user.

Some will argue that it is indirectly benefitting the users who can get more features quicker. But most people care more about stability and not having to upgrade their computer yet again than features.

  • franciscop 15 days ago

    > "But most people care more about stability and not having to upgrade their computer yet again than features."

    Gonna need some data on that assertion, since there is surely some "balance point" that probably depends on the industry/software; it's not all one or the other.

    As a personal example, these days I've been using Apple Pages (or whatever it's called) and it crashes about once an hour. But it has some features that allow me to quickly iterate on a document, so I am back to cmd+s as in the old days; using the LibreOffice interface would take me an estimated 2x-5x more time.

miyuru 15 days ago

Google Docs is mentioned in the article, but Google Drive is way worse.

Try to upload a folder with 20+ small files (say, a picture gallery): it takes a lot of time to process and upload them. And if you add a new file to the folder and try to upload the folder again, it will need to upload the whole thing again.
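What you'd want instead is incremental sync. A toy sketch (the file names and hashes are made up, and a real client would also handle deletions and moves) of how a client could detect which files actually need uploading by hashing contents:

```python
import hashlib
import pathlib

def snapshot(folder):
    """Map each file's relative path to a hash of its contents."""
    root = pathlib.Path(folder)
    return {str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in root.rglob("*") if p.is_file()}

def files_to_upload(local, remote):
    """Only the files that are new or changed since the last sync."""
    return [path for path, digest in local.items() if remote.get(path) != digest]

# Made-up hashes standing in for a previous sync and the current folder state:
remote = {"a.jpg": "abc", "b.jpg": "def"}
local = {"a.jpg": "abc", "b.jpg": "xyz", "c.jpg": "new"}
assert files_to_upload(local, remote) == ["b.jpg", "c.jpg"]
```

Adding one photo to a 20-file gallery would then cost one upload, not twenty.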

  • TrackerFF 15 days ago

    Not sure how bad it is these days, but a couple of years ago I was training a CNN and decided to just upload the data (images for training and testing) to my google drive folder - as I used google colab for development.

    It was slow as molasses - unusable, really, and we're not talking about huge amounts of data either.

    At least then, it seemed to be the google drive i/o that was the bottleneck, and the solution was to upload the training files to the colab session / VM.

can16358p 15 days ago

I think convenience is the keyword. Users (not us HN crowd, the rest of the world) don't want to struggle with installing software, most of them don't know the difference between Internet, Browser, or "app".

They just open up "internet" and work on docs, and for 99% of the cases Google Docs works fine despite running in a browser that is much less efficient than a native "app". For most cases it's more than enough for the regular user who is used to "computers being slow" anyway.

  • rumad 15 days ago

    I also love to use and prefer native apps. But, in my case, I had to offer Google Docs to my dad. Because he wanted to transfer his doc file to his work computer to continue to work on it when he is at the office. So, I thought a web app might be good for him.

    • can16358p 15 days ago

      Sure it's a perfectly normal thing. I'm talking about Google side of things: they're probably not caring about performance when dealing with 30MB large docs as it's not what people use in their docs anyway.

cmiles74 16 days ago

I wonder if it's the big software vendors who have lost their way, rather than the average software developer. We have companies like Microsoft and Adobe moving to Web based applications not because they are better but because it's easier for them to enforce licensing restrictions and push people into subscriptions. As the various App stores erode the profits of software developers any tool that will make it faster to add features, even if they are buggy, becomes compelling.

vnuge 15 days ago

I hope to keep seeing posts like these. I believe software "bloat" is a serious issue that should be addressed; however, if you look at SWE job listings, it's not even remotely a concern for employers IMO. You're encouraged to understand complex and heavy frameworks, and performance/optimization is not even a consideration.

  • skydhash 15 days ago

    Because those frameworks are easy at first glance. Adding some interactions with React is easier than with jQuery. Even better if you make the whole page a React app; then you can add those 100s of libraries to do... stuff. Optimizing a React app is hard and would probably require some deep thinking about global state and its modification, and we don't have time for that /s.

    By then, the app is built and running, even though the code is a mess because the developer only knows React, and nothing about the DOM or software architecture.

    • vnuge 15 days ago

      To your last point, I like to think of modern professional software development as a trade, it's not much of a science anymore imo. For me it's outside looking in.

shpx 15 days ago

Web apps are much faster to install and update.

1) Your browser is always open, whereas you need to close your current app and open the app store app

2) Google Search is better at giving you what you want in the fewest number of keystrokes than any search from any other company including app store search boxes

3) Installing a "web app" is one click after the Google search results, or if someone posts a link. As you might know from social media, even one click is a lot, and most people won't click links in a comment. In the Apple App Store I have to tap on the app in the search results, tap install, double-click the lock button, scan my face, then wait many seconds. Some websites take many seconds to load, but that's not that common and it's considered a bad website

4) Web software is effectively reinstalled on each use, meaning it's always up to date. Native apps update randomly with a big delay, and sometimes I have to check for updates manually

5) With native software there's a risk it will not support your device. Risk is commonly expressed as a cost, so the fact that in my life I've searched for an app and found out that it's not supported on iPad or not supported in my country can be counted as every single app store search since then taking me 0.1 seconds longer, because I now know there's a risk I will waste my time looking for the app. Web apps also don't work sometimes, but it's more predictable since it's usually tied to real limitations like the screen size of your device. Another risk is malware: I feel nervous and powerless when installing native software because I don't know who I'm giving access to and to what, whereas I understand what web apps can track. And the fact that it's HTML and JavaScript instead of opaque assembly instructions makes ad blockers possible and cheap

Installation is an integral part of using software, and a big reason the web won and continues to win.

  • ornornor 15 days ago

    > 2) Google Search is better at giving you what you want in the fewest number of keystrokes than any search from any other company including app store search boxes

    That seems to be contrary to the general opinion (at least on HN): google has become utterly irrelevant, serving mostly content farm AI generated junk type of blogspam, and google is more concerned about ad revenue than anything else (including results quality)

    • edanm 15 days ago

      I mean, this is a very HN-bubble thought. Even if people here actually think that, just look at Google's revenue or usage numbers. Clearly a "few" people are still using it!

      People here might say that Google has gotten terrible, but I would bet 99% of HNers still use Google dozens of times a day, just like everybody else in the world.

      • ornornor 15 days ago

        The HN crowd tends to be more demanding and sometimes ahead of the curve, but it's a general trend. I also wonder how many people enjoy using Google vs suffer through it because they don't know any alternatives.

        I remember reading that even Google is worried about how crappy their results are becoming and at least some of the company sees it as a threat to itself.

        But yes, you're right, Google is still very much mainstream and the #1 search engine. I'm just not certain it's because it's an amazing experience rather than "no other options".

        • edanm 15 days ago

          Eh. I've been hearing of this "general trend" for 15 years. I'm pretty sure if you look at HN in 2010, you'd find plenty of people speaking of the decline of Google search results and how they spell the downfall of Google.

          (It might even be true! I'm just saying it's not new.)

robinhood 16 days ago

Interesting fact: this article has been posted on Medium, resulting in a 7.32 MB page.

  • canucker2016 15 days ago

    I got 8.3MB on page load.

    Then I got uBlock Origin to turn off JavaScript, remote fonts, and large media items.

    Result: 116KB

    So 98.61% of the page is extraneous...

AdrianB1 16 days ago

We can write efficient software, but many times we decide not to.

1. Why bother optimizing when the developer's time is more expensive than RAM and CPU power? I see this a lot.

2. From the times that I can remember (mid '80s) till now, only top developers write software that is efficient. Most developers are average (this is not bad, it is just an observation) and for the average developer software optimization is too expensive in terms of time invested. Some don't know how to do it, some are not proficient enough to do it within the constraints of the projects given to them by bean-counting managers. "Good enough" quality in software management is much safer than "good enough" Boeing planes, so when Boeing is cutting corners, managers of developers cut even more.

  • RetroTechie 15 days ago

    > Why bother optimizing when the developer's time is more expensive than RAM and CPU power?

    The comparison should be between developer's time, and time spent (wasted) by all users combined. This depends on # of users, and how often they run the software.

    For a one-off, with a few dozen users running it occasionally, yes developer's time is expensive.

    For popular software with 100M+ or billions of daily users, developer time is practically irrelevant, and spending weeks/months to shave off 1/10th of a second for each user's run, would be a no-brainer.

    Most software sits somewhere in between.

    But... the developer is paid by the company, not by end users. And the company cares about things other than the interests of society at large.

    So it's mostly a case of bad incentives. Companies don't care about / aren't rewarded (enough, anyway) for saving end users' time. Open source developers might, but often they are not rewarded, period.

  • contextfree 15 days ago

    Also, RAM, CPU and disk space are shared resources on PCs. If your computer is slow you don't necessarily attribute it to any particular program, let alone website. As Terry Crowley says here https://terrycrowley.medium.com/software-ecology-bb4653046fd...

    "... the cost of bloat is not directly borne by the those introducing it. Individual efforts to reduce bloat have little effect since there is always another bad actor out there to use up the resource and the improvements do not accrue to those making the investments."

michelb 15 days ago

In my macOS Passwords I have ±3400 entries, a roughly 341KB CSV. Searching that list in the preference panel or Keychain app is dog-slow on my M1 Pro Studio or on my M3 MacBook Air. How??

  • constantcrying 15 days ago

    It seems obvious how you could make it slow. Just naively iterate over the entire series of objects looking up the name for each.

    This is the obvious low effort, low complexity solution. Of course you could make it fast, but that would take time and effort for a feature most people won't notice.

    • michelb 10 days ago

      I agree on the priority here, but this looks like an OS component that appears in other apps. Filtering history in Safari is equally slow.

    • anonymoushn 15 days ago

      Your proposed solution would run in less than 1ms though.
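      For scale, here is a hypothetical sketch (made-up entries, not Apple's actual implementation) showing that even the naive per-keystroke scan is fast at this size:

```python
import time

# Hypothetical stand-in for ~3400 password entries (names only).
entries = [f"site-{i}.example.com account{i}" for i in range(3400)]

def naive_search(query: str) -> list[str]:
    """The 'obvious low effort' approach: rescan every entry on each keystroke."""
    q = query.lower()
    return [e for e in entries if q in e.lower()]

start = time.perf_counter()
matches = naive_search("site-123")
elapsed_ms = (time.perf_counter() - start) * 1000
# On typical hardware this is well under a millisecond, far below
# anything a user could perceive as lag.
print(f"{len(matches)} matches in {elapsed_ms:.3f} ms")
```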

  • bouke 14 days ago

    When the first MacBooks came out with a Secure Enclave, Keychain was unworkably slow. You had to type your search query elsewhere and then paste it in, or you'd be looking at a beachball for minutes. It took a major OS X release before it became somewhat bearable again.

    The new Passwords is a joke. UX errors all over the place, modal based view with a toggle to start editing. If you need to enter a password in another area of System Preferences, you need to back out of the auth flow, switch to Passwords and copy the credentials over to a temporary file.

  • skydhash 15 days ago

    SwiftUI and Catalyst. Somebody has probably implemented that list in a way that tries to rebuild the app UI at every keystroke (I think the UI is single-threaded on macOS). There are optimizations, but they require being aware that beneath the declarative world there's an imperative foundation.

    • eviks 15 days ago

      No, it's been slow long before that, that's just one of the many abandoned apps from the golden age of efficiency and superior UI

Johnny555 16 days ago

One problem is that we use a full word processor that's powerful enough to typeset a book to distribute basic text documents when simple markup would be just as readable, and more easily usable on a wide variety of devices (and easier to make accessible for disabled people)

So when my doctor sends me a one-page checklist of how to prepare for a procedure, I have to open it in a powerful word processor, and since I'm not using MS Word, the fonts and formatting aren't as expected.

  • rahen 15 days ago

    Not so long ago, we had RTF, a simple and universal rich text format that was free and available on all platforms.

    On Windows, Wordpad was plenty enough for most needs, came preinstalled for free and barely consumed any resources, but I understand it's no longer shipping with Windows. Office 365 is now where the money is, even for basic needs.

    • anthk 15 days ago

      Get ReactOS' ISO. Mount it or extract it with 7zip.

      Find the biggest CAB file; it weighs many MB. Extract it. Copy Wordpad.exe anywhere else. Delete the rest if you want, or enjoy Sol.exe and friends.

jedisct1 16 days ago

Remember the QNX demo floppy?

  • nappy-doo 16 days ago

    I recreated that in grad school, also using QNX. We set it up to run a paging (POCSAG) system: you could put the floppy into any computer, hook up a DSP board and modulator, reboot the machine, and you'd have a paging system. My advisor showed it to Andrew Viterbi, and got us a grant.

    When I went to recreate it, I contacted QNX and asked if I could speak to the guy who did the work, and he had died the year before. So, I just took apart his floppy, and figured out how he did it.

    The things you can do when you invest your time 100% in something.

  • seanparsons 16 days ago

    Yup, that felt incredible even at the time, let alone now.

bitwize 15 days ago

Even John Carmack has come around to saying that the primary focus of a software developer should be delivering value to the customer. If that can be done while using a gigabyte of memory, and using a gigabyte costs less to deliver than using 50 MiB, so be it. Software is a business. Development = cost, happy customers = profit. Anything that maximizes the latter and minimizes the former is a win. Deal with it.

KayL 15 days ago

Wait, is it true? Did you encounter a bug, or a browser extension issue? It may not be as efficient as native, but I downloaded some sample files and repeated them until I hit Google Docs' limits. That's about 450 pages with some images. It is smooth and usable. There's zero delay. (And only 1000 MB of RAM usage with many browser extensions enabled. I think it's fair...)

  • constantcrying 15 days ago

    Who knows what exactly that Word file looks like. It might use features from Word which Google Docs is badly emulating.

  • rumad 15 days ago

    The doc file has images and some simple tables, but mostly text.

pylua 16 days ago

There are a lot of nuanced features that explode in complexity when joined with other features. The number of permutations is massive, and handling those combinations of requirements is what modern day software is about.

Today’s software systems are more generalized, though they are solving the same business problems, just with more details / functionality than before.

jongjong 15 days ago

The answer is yes. Software developers themselves have lost the ability to recognize or value efficient architecture. Software architects have been wiped off the face of the earth and replaced by a combination of script kiddies and scrum masters. Fast forward 10 years, we can see the results. Now we just have to wait to see what happens 10 years from now.

Sad for me who wanted to be a software architect. I had to watch all this unfold in real time from inside various companies and never had the ability to fix the problems. Last time I tried to prevent major architectural flaws from being implemented in software during the design phase, I couldn't convince management and had to quit the company... Then 2 years later, from the outside, I witnessed the project turn into a complete failure. They literally abandoned the whole thing and started using a competitor's platform... Which, to rub salt into the wound, is almost just as awful.

  • nyarlathotep_ 15 days ago

    Non-sarcastic question--how do you define "software architect"?

    Typically the "architect" title I see thrown around today pertains to gluing together some abominations of "systems", typically cloud services to do web stuff at "scale"

    Architect today seems == "cloud infrastructure decision maker" and has no bearing on the code written, libs/frameworks/whatever used

    • jongjong 14 days ago

      Infrastructure could be considered part of architecture, but architecture should also include other aspects: how the code is organized on a per-project basis, what frameworks are used, how data should flow through the system, and how the different modules are organized to form a functioning product that is resilient to requirement changes.

jeffbee 16 days ago

No, but apparently the arts of rhetoric and reason are completely dead. There is a big gap in this blog post somewhere between "a given program is not optimized for a specific use-case" and "we lost our way of developing optimized, efficient, and performance-wise applications".

egypturnash 15 days ago

“My father told me he wanted Microsoft Word on his laptop. So I told him to use Google Docs instead. When that turned out to suck, I installed Libre Office instead.”

Why didn’t you just, like, get him Word? Why did you make him try to use a shitty web app that assumes everyone’s computer is brand new, then install an open-source program that’s going to be constantly playing catch-up with Word’s updates and may cause problems down the line when Dad wants to work with someone else’s Word docs?

Maybe there was a perfectly good reason for this choice. I can think of a few. Maybe you helped Dad enter The Year Of Linux On The Desktop recently. Maybe Dad didn’t want to pay for Office. Who knows. Whatever the reason, you didn’t put it in this post. And you ended it with a plug for your completely unrelated SAAS, too.

  • rumad 15 days ago

    My dad didn't want to pay for Office. I might have ended my story badly. Sorry about that. But I don't agree that the SaaS is unrelated. It is related. API optimization is also important, and it can directly affect the performance of web apps and even native apps.

    Update: I updated the article.

  • YurgenJurgensen 15 days ago

    Aren't Microsoft trying their best to turn Word (and the rest of Office) into a shitty web app?

    • rqtwteye 15 days ago

      Totally. It's getting difficult in Outlook to figure out where window boundaries are. A lot of menus also look pretty amateurish compared to native menus.

zrn900 15 days ago

Yes: endless amounts of investment and inflated stock prices enabled by the zero-interest economy caused the cash-awash tech organizations to become more like research institutions, churning out interesting, promising, but esoteric and complicated concepts and software with scarce application in the real world and in business. They became a cross between actual companies that do business and extensions of grad schools, like a continuation of the college you went to. Some companies even openly pursued that mix and advertised themselves as such to attract talent.

Now that the zero-interest economy is over, the entire tech sector is readjusting.

codingdave 15 days ago

> I think we lost our way of developing optimized, efficient, and performance-wise applications.

I'm not going to deny that we could do better, but it is more nuanced than that:

OP uses Word as an anecdotal example, but Word is not designed with a goal of being optimized. It is designed with a goal of being backwards-compatible to decades of history.

We cannot assume that all software shares the same goals because they simply do not. When we look at the problem any given software is trying to solve, performance optimization is almost always important, but almost never #1... #1 is "solve the problem". Doing it fast is always secondary to doing it at all.

  • eviks 15 days ago

    So why doesn't Excel solve the famous problem of data loss on importing dates?

    Also, Word (despite its general fame) isn't compatible with decades of history (remember the post that found LibreOffice was more compatible with historical documents, though not with current designs).

vardump 15 days ago

A 33 MB file is probably docx, and thus actually a ZIP'ed collection of XML files. It could well be 200 MB uncompressed.

Once all that XML is naively loaded as nodes, we might be talking about more than 1000 MB of RAM usage.

geor9e 15 days ago

Google Docs might be able to directly edit .docx files now, but that is a pretty new feature, and it doesn't surprise me that it's slow. It's a proprietary format owned by Microsoft. The argument that software ought to be efficient is valid, but the example of editing a non-native format seems unfair. Any time you add emulation layers, you should expect things to slow down.

Did you consider trying Microsoft's own browser-based Word editor? It's free too, and .docx is its native format.

Or, consider doing a conversion to Google Docs native format first (you'll lose some formatting though, possibly a lot of it).

manquer 15 days ago

We have always been making more efficient software than ever before.

It is just a different kind of efficiency: there is economic incentive in making the software development process more efficient, and not that much incentive for making the software itself efficient.

Software development is more accessible to millions and millions of new developers thanks to the work on higher-level languages, frameworks, IDEs, libraries, low code, copilots and so on.

Each of these innovations made software development more efficient (not necessarily faster).

Nobody buys or uses software because it is faster, only because it is cheaper.

quonn 15 days ago

The problem is really the business side, or pressure from management. 30 years ago, if the software was not efficient, it would be either dog slow or just not work at all.

Now it is merely not elegant or not as fast as it could be or at least fast enough initially during development and so no effort is spent on making it better.

I believe this could only be fixed through regulations which either make the engineers liable (thereby empowering them to make decisions) or by regulating energy use and user experience.

If we don't want this, then things will just be the way they are now.

fennecbutt 15 days ago

I don't mind many things being written in higher level languages and being less efficient, makes turnaround time cheaper.

But what are companies doing with this fast turnaround time? Features suck and are largely incomplete in modern software. For example: Sonos speakers, if the WiFi goes down they don't reconnect. Why? Why is basically every device and every bit of software chock full of obvious stuff like that? Do we really need AI or something to tell us how to build something properly?

ags1905 13 days ago

IMO the author of this article is right about what he is saying. And I am glad I am not the only one who sees the increasing inefficiency of software and thus computers. Different times have different fashions, and programming is also very influenced by fashion. But it seems like Free Pascal survived, and Lazarus and fpc look like a sane alternative worth taking a look at.

fireflash38 16 days ago

Where in that stack is the biggest loss of performance? DOM? Javascript? Browser? Or is it because everything needs to sync to the cloud as you edit it?

  • jakub_g 16 days ago

    IMO the biggest problem is simply that nothing is tested anymore on big inputs, and that frameworks and the modern way of writing code hide the complexity.

    Also, in the past you had to care explicitly about how much memory you allocated, which forced you to stop and think. Now you can pretend you have infinite resources because everything happens implicitly.

    Compounded with this [0]:

    > O(n^2) is the sweet spot of badly scaling algorithms: fast enough to make it into production, but slow enough to make things fall down once it gets there

    you get what you get. Ever opened a GitHub pull request with 2000+ files changed? It hangs the M1 MBP. The solution is probably not rocket science, if someone really prioritized the fix.

    [0] https://twitter.com/BruceDawson0xB/status/112038140670042931...
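    A toy illustration of that sweet spot (duplicate detection, not GitHub's actual code): the O(n^2) version sails through small test inputs and degrades badly at production size, while the O(n) version stays flat:

```python
import time

def has_duplicates_quadratic(items) -> bool:
    # O(n^2): compare every pair. Fast enough on test-sized inputs
    # to make it into production, slow enough to fall down there.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items) -> bool:
    # O(n): remember what we've already seen.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

for n in (100, 5_000):
    data = list(range(n))  # worst case: no duplicates at all
    t0 = time.perf_counter()
    has_duplicates_quadratic(data)
    t1 = time.perf_counter()
    has_duplicates_linear(data)
    t2 = time.perf_counter()
    print(f"n={n:>5}: quadratic {t1 - t0:.4f}s, linear {t2 - t1:.4f}s")
```

    Going from 100 to 5000 items multiplies the quadratic version's work by 2500x but the linear version's by only 50x, which is exactly why the bug only shows up on a 2000+ file pull request.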

  • mike_hearn 15 days ago

    It's going to be a mix of:

    1. JS doesn't support multithreading, nor many other features that are useful for performance (e.g. mmap). This severely limits what you can do and makes it hard to scale up by parallelizing.

    2. JS is a very pointer heavy language that was never designed for performance, so the CPU finds it harder to execute than old-school C++ of the type you'd find in Word. It's hard to design tight data structures of the kind you'd find at the core of Word.

    3. The browser's one-size-fits-all security model sacrifices a lot of performance for what is essentially a mix of moral, legal and philosophical reasons. The sandbox is high overhead, but Docs is made by the same company as Chrome so they know it isn't malicious. They could just run the whole thing outside of the sandbox and win some perf back. But they never will, because giving themselves that kind of leg up would be an antitrust violation, and they don't want to get into the game of paying big review teams to hand out special Officially Legit™ cards in the same way that desktop vendors are willing to do.

    4. The DOM is a highly generic, page oriented structure, that isn't well suited for app-like UIs. As a concrete example Chrome's rendering pipeline contains an absolute ton of complexity to try and handle very long static pages, like tiled rendering, but if the page takes over rendering itself like Docs does then all this just gets in the way and slows things down. But you can't opt out (see point 3).

    • skydhash 15 days ago

      Or it's just using the wrong tool for the job. With Google's resources, they could build a UI framework that runs on top of OpenGL and have people download an app built with it instead. Next, we'd have people building interfaces on top of latex.

      • mike_hearn 15 days ago

        They already built several UI frameworks that run on top of OpenGL. Most obviously, Blink, Chrome's HTML renderer, runs on top of OpenGL :)

        Also, Docs uses a custom UI framework already. It implements all the UI controls itself, the browser is only really used for rendering text and styled boxes. I remember the first version of Docs when it was called Writely used the browser's built in editing support but they had to abandon it because it was too buggy, so they moved to using JS to lay out every character and draw their own cursor. It was considered wild and crazy at the time but Chrome was getting fast enough to make it work. Of course, it's more efficient to have editing be implemented fully in C++ but browser makers never managed to make that work properly, so, slow path it is.

  • brabel 16 days ago

    I think it's just the way web apps are architected. If you use vanilla JS you can make the browser render stuff really fast even if you change the DOM, as long as you do it efficiently (don't update ALL of the DOM on every change!). But I bet Google uses some reactive framework that keeps a bloody copy of the whole DOM in memory, so when you throw 30MB of content into the DOM, the copy that was meant to make things faster starts being the bottleneck.

  • constantcrying 15 days ago

    >Where in that stack is the biggest loss of performance

    Exclusively the minds of developers and the stance of management.

    It is of course possible to build responsive, high-quality, performant websites. But that is hard, much, much harder than making something that merely works, which maybe takes a few seconds to load, sometimes doesn't work quite right and can be a bit tedious to use.

mif 15 days ago

Reminds me of some of the remarks Joe Armstrong made a while ago [1], and which I came across via another submission a couple of weeks ago (which escapes me). It’s a great talk about the physical limits of computers and computation.

[1] https://youtu.be/lKXe3HUG2l4?si=nMkDTMfvBK1AKCYI

jillesvangurp 15 days ago

Making software efficiently is more valuable than making more efficient software. Making software inefficiently takes a lot of time and people's time is expensive. A good example of an expensive use of time is optimizing something that doesn't really need optimizing. I know a lot of people go OCD on this stuff and I've been there and done that myself. But it mostly does not matter a lot in terms of value of the software.

Anyone that can't be bothered to update their ten year old laptop because it's slow is also not going to spend a lot on faster software. There's just not a lot of money in optimizing stuff. And if you have a modern laptop, it doesn't really benefit a lot from the type of optimizations you'd do to make things run smooth on a ten year old laptop. Especially when optimizations are simply about turning stuff off that aren't really in the way on the faster laptop. Like having some cheap 3D effects, pretty colors and animations, etc.

Anyway, I'm old enough to know that this is not a new debate. We never lost our way on this front. It was always like this even when computers were several orders of magnitude slower.

  • p0w3n3d 15 days ago

    the decision to shift towards web apps and Electron apps is mostly because of the cheapness of the development, and the binding to service-oriented architecture and thus monthly payments... in other words: money

    • jillesvangurp 15 days ago

      That kind of is the point. That's just the latest iteration of this. You could have made the same argument about writing a business application in visual basic vs. writing something in C/C++. The latter required more time and skills. Both of which cost money. Which is why visual basic was so popular. It's also why cobol caught on in the sixties even though there were more optimal things around even then. And hardware was still really expensive.

      • p0w3n3d 15 days ago

        However, I could see a paradigm shift towards standalone apps if there were a proper language or a C++ mod improving its usability and ease of use, AND there were no pressure to do so much cloud-based.

        Also, interoperability I guess is a problem (HTML really IS a consistent UI for every platform)

asvitkine 16 days ago

This just sounds like an inefficiency in Google Docs. Native software can also be inefficient, even if it's written in C or asm, if the data structures or algorithms used can't handle certain types of data well. Just in this case, the native software seems to be able to handle that file better.

graboid 16 days ago

The same techniques that native apps use to process large documents should be possible in JS/WASM land, no? Given, you probably won't reach native performance, but it seems to me a snappy text editing experience even for large documents should be perfectly doable using web tech.

nwah1 16 days ago

Office 365 probably would have worked. Microsoft's proprietary format is handled best by Microsoft.

Also, cloud-based synchronization using CRDTs is a complex problem that is significantly more complex than just loading the document.

Can't claim we are going backwards when comparing apples and oranges.

  • nottorp 15 days ago

    That's clutching at straws tbh. I have a "native" google docs spreadsheet that has atm 4 sheets, all at 50 lines or less. No fancy formulas except sums. It just tracks hours for billing.

    It uses 500 MB of RAM fresh, and over a couple of weeks it goes up to 1.5 to 2 GB and I have to kill the tab and reopen it.

    This is the modern javascript world...

    • nottorp 14 days ago

      Hey I just killed a GitHub tab that was at 1.2 GB :)

  • eviks 15 days ago

    It's not a proprietary format, nor should sync consume that many resources on the UI thread

troyvit 15 days ago

Seems like a post on Medium is a little bit overkill when the answer is simply "Absolutely. Yes. No question. How could you think otherwise?" Is there some sort of micro-twitter they could've used instead?

j45 15 days ago

The way was lost a long time ago, when too many projects selected a framework that was way more complex and heavy than what they were solving.

There are some new simplified approaches that are starting to be interesting again.

clircle 15 days ago

Dad wants to work on a doc file, so why tf didn't OP install MS Word?

p0w3n3d 15 days ago

In my opinion, the reason is purely economic. Writing software in web frameworks (even a phone app, which is often written in JavaScript and runs in a separate browser taking up to 300MB of RAM at the very start) is much faster, has a less steep learning curve, a shorter test loop, and a faster and easier release cycle. Additionally, recently every company has tried to move to a SaaS business model because it gives a steadier income (pay $60/month, or $10/month but with a promise to pay for the whole year, which vendor-locks the user)

While I've been involved in releasing both types of software (a ye olde Windows standalone app written in the Qt framework, and new software that's released every 2-3 days or so), I find the new way much less painful. I couldn't imagine releasing a Qt app on such a cycle (how would you do updates? maybe like Minecraft: every other launch there's a new 64MB package to download)...

...but as a user, I feel much more comfortable when code is on my computer, available for me to run. While extensively using Google Docs to share documents with my wife (calendar, spreadsheets) and band (songlist, lyrics), I'm scared that if I moved everything to on-cloud (paid mail, photo enhancing (thankfully I'm a standalone Lightroom buyer), vector graphics (inkscape user), photoshop (gimp user), etc.) one day I would hit the roof in payments of my per-month-but-bound-to-year plans. In 2030 it would be also netflix, shmetflix, televisionix, phonix, car-as-a-service, heated-seats-as-a-service, air-conditioning-as-a-service, toilet-as-a-service, smartwatch-as-a-service etc. building up to an unbearable rent for something that used to be free or paid once. I don't feel like my duty is to provide a constant supply of money to my beloved corporations

darepublic 15 days ago

I never cared for Google Docs. I wonder, though, how one would go about achieving good word-editor performance on a 30MB doc file in the browser.

anthk 16 days ago

A modernized Inferno for the web (with unicode) would have been much better than... this.

It looks like a joke to see an i5 choking on files opened by PCs from 2003.

ultra_nick 16 days ago

We learned that efficiency is just one feature of many.

qingcharles 15 days ago

Anyone have good recommendation for a Web-based Google Sheets alternative that isn't killed by a spreadsheet with 50K rows?

drewcoo 15 days ago

Did we lose our way in arriving at the file size non-goal?

No.

xvilka 15 days ago

Yes. And Electron is the pinnacle of this.

osigurdson 15 days ago

Unexamined phrase-based development ("root of all evil") is likely responsible for a lot of it.

emorning3 15 days ago

If AI was a real thing, instead of phony baloney probabilistic bullsh!t, then we'd already have AI-driven tools that take crappy, bloated, slow, but correctly working systems built by humans and turn them into smaller, faster, correctly working AI-enhanced systems.

Right?

  • patrulek 15 days ago

    What if current AI is also crappy, bloated and slow?

matricaria 12 days ago

A 30 MB doc file is a problem per se.

xg15 15 days ago

Yes.

high_na_euv 16 days ago

People shit on web devs, but lower-level things aren't better either - try compiling LLVM

  • vinyl7 16 days ago

    LLVM is overly complex for what it does

nrvn 15 days ago

TLDR; the metrics that we are missing as an industry are:

- how much computing power is needed to present a single useful bit of information to the user;

- how much computing power is needed to process a single useful bit of information;

- how much total data transfer is needed to transfer a single useful bit of information.

of course, to answer the above questions you need to define the term "useful single bit". And the hint here is: if we agree that - say - a rainbow has 7 colors, then identifying any one of those 7 colors would take just 3 bits of data, wouldn't it?..
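A quick check of that arithmetic: distinguishing one of 7 values needs ceil(log2(7)) whole bits:

```python
import math

def bits_needed(num_values: int) -> int:
    """Minimum number of whole bits needed to distinguish num_values distinct values."""
    return math.ceil(math.log2(num_values))

print(bits_needed(7))  # -> 3: any one of the rainbow's 7 colors fits in 3 bits
```

Compare that 3-bit payload with the megabytes a typical page spends delivering it, and the "useful bit" metric above becomes concrete.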

long story below:

this question pops up at least twice every year, raised by someone completely frustrated by the current state of things in the computer industry. And if you think it is limited only to the software side of things, then well... ignorance is bliss.

Think of it from the incentives and rationale perspective.

Whenever you encounter a bloated piece of software or an over-engineered hardware box, put yourself in the shoes of its author. Once you delve into the details of why and how any specific tool or technology was created, you can understand why it looks so bad.

Some of the most notoriously hated programming languages were created in a very short period of time, without any strategic thinking, by people who had no prior experience designing programming languages.

And people continue with this pattern in all types of software, driven most of the time by business requirements rather than by their engineering and scientific aspirations and talent (or lack thereof).

In other words, bloated software is the result of time limitations imposed on developers. Efficiency, size, quality, stability and security go out of the window when you need to pursue other, more "important" goals.

"We need to go ahead of the competition and the time to market is our priority. We'll cut corners and burn cash. No thorough think through, just do it."

Another perspective is resource limitation. You as a developer have access to virtually unlimited computing, networking and storage resources. Remember this "memory is cheaper than developer's time" mantra?

Now put yourself in the shoes of a NES game developer. You need to squeeze the whole universe into 32KB, with graphics, music and gameplay that look attractive and responsive, running on a 1.8MHz single-core 8-bit CPU.

Or put yourself in the shoes of the Voyager 1/2 team, whose objective is to keep a small piece of metal afloat in a hostile environment for the next 50 years. With remote debugging capabilities, over-the-air software updates and continuous telemetry transmission back to Earth.

If Brendan Eich had not been given just 10 days to draft the JavaScript spec, would we see something different in the frontend world today? Or would we still see 10 MB of garbage being downloaded by every other website just for the sake of keeping the cables busy?

And here is one of my favorite quotes by Alan Kay:

"Think about it. HTML and the Internet has gone back to the dark ages because it presupposes that there should be a browser that should understand its formats. This has to be one of the worst ideas since ms-dos, this is really a shame. it's maybe what happens when physicists decide to play with computers."

tracer4201 16 days ago

I’m in a leadership IC role at a big tech company. We pride ourselves on engineering.

The hiring bar was dropped. Expecting a mid-level engineer to work with a byte buffer is considered “too complex” and non-differentiated work.

The literal goal is to pump out features written up by the MBA/product team. None of these MBAs use the product, mind you. They’re chasing stupid features they think vice presidents want, because the thinking is that it will drive promotions.

This is a cynical post and I will stop here. My org has a problem of incentives, nothing else. If you incentivize the wrong things, then this happens.

Zpalmtree 15 days ago

Dude just increase your development costs 10x because apps sometimes run slow

  • mihaaly 15 days ago

    Perhaps we should compare the cost of development - and maintenance - of an efficient classic piece of software and a bloated modern one... ; ))

    (Probably the cost of making an ancient Word vs. the newest one... : )) But there could be lots of other examples of 'modern' ones built along current trends vs. classic ones with a similar feature set. I wonder how this cost comparison would play out... : ))

eezing 15 days ago

Google Docs is pretty efficient. It loads incredibly fast and you never have to manually update it. But it clearly has limits… 30MB Word files.

eternityforest 15 days ago

We did lose the way, and then we found it again. Not everything has caught up yet but a lot has.

Moore's law kept going, and software started getting a little bit faster, which was enough to stop undoing the gains made by hardware, and now things are back to mostly snappy.

Occasionally you'll get a 30MB file that's slow... but subjectively things sure seem better than 10 years ago, when you couldn't even think about optimization without someone beating you over the head with a "premature optimization is the root of all evil" quote.

  • nocman 15 days ago

    > and now things are back to mostly snappy.

    This is the inverse of my experience. There are few applications that have a UI that I would refer to as 'snappy'. In fact, I am trying to come up with a single example, and atm I can't even think of one.

    • eternityforest 15 days ago

      All mainstream apps are fast on recent Android phones. Most everything on recent Ubuntu is fast.

      Most mobile sites in general are fast, maybe not "2kb of HTML" fast, but fast enough I don't notice or think about performance when browsing.

      Although, I suspect a lot of programmers can think faster than average and seem to be bothered more by small delays than the rest of us, since they're able to do things like figure out a Vim keystroke sequence instantly.

      • nocman 10 days ago

        I disagree. I use Linux (including Ubuntu), Windows and Android apps regularly, on fairly decent hardware. Most apps I would describe as having "OK" to "tolerable" performance. Some I would call "draggy", and a few I would call outright pigs.

        Again, off the top of my head I can't think of a single application I would describe as "snappy". Or, put another way, I can't think of an application that, in recent memory, really impressed me by how well it performed.

fnordpiglet 15 days ago

This seems to take a moral stance that efficient is better. However, that’s not always practically true. Efficiency often means being inflexible and brittle with respect to change and time. Abstraction generally trades efficiency for ease of change over time, or parsimony for the developer in exchange for expense at runtime.

If it doesn’t matter - it doesn’t matter. If the goal is making a document format that is flexible enough to accommodate history, concurrent editing, various layouts and embedding, etc., all of this comes with abstractions that add inefficiency. The trade-off is ease of adding to and changing the format and the software that consumes and produces it. If the consequence in the real world is effectively unobservable in any material way, who cares?

Maybe as a moralistic measure it’s offensive that something lacking parsimony is practical. But by any meaningful measure - the user’s perspective, the developer’s, even the company’s paying for the processing - if it doesn’t matter, it literally doesn’t matter.

Comparing Google Docs to a program hosted on an Apollo-era flight computer is obtuse to an extreme, and I would rather write my collaboratively edited documents with Google Docs than an Apollo-era flight computer any day, no matter whether one is less parsimonious than the other.

  • rpdillon 15 days ago

    > Maybe as a moralistic measure it’s offensive that something lacking parsimony is practical. But by any meaningful measure - the user’s perspective, the developer’s, even the company’s paying for the processing - if it doesn’t matter, it literally doesn’t matter.

    Except the post you're responding to was literally in response to a user problem trying to edit a 30MB document in Google Docs. So it very much does matter, from the user perspective.

    > Comparing Google Docs to a program hosted on an Apollo-era flight computer is obtuse to an extreme, and I would rather write my collaboratively edited documents with Google Docs than an Apollo-era flight computer any day, no matter whether one is less parsimonious than the other.

    Straw man. The post compares Google Docs to LibreOffice (a competing product), and points out that LibreOffice solves the user's problem (editing a 30MB document) and Google Docs cannot.