ploxiln 7 years ago

Reminds me of how Windows Vista's "Multimedia Class Scheduler Service" would put a low cap on network throughput if any sound was playing:

http://www.zdnet.com/article/follow-up-playing-music-severel...

Mark Russinovich justified it by explaining that the network interrupt routine was just too expensive to be able to guarantee no glitches in media playback, so it was limited to 10 packets per millisecond when any media was playing:

https://blogs.technet.microsoft.com/markrussinovich/2007/08/...

but obviously this is a pretty crappy one-size-fits-all prioritization scheme for something marketed as the most sophisticated, best-ever OS of its time:

https://blogs.technet.microsoft.com/markrussinovich/2008/02/...

Many people got perfectly consistent mp3 playback while copying files over the network 10 times as fast on other OSes (including Win XP!).
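
For scale, a quick back-of-the-envelope (my own numbers, assuming full 1500-byte Ethernet frames) shows what that 10-packets-per-millisecond cap works out to:

  # 10 packets per millisecond, assuming ~1500-byte Ethernet frames
  packets_per_sec = 10 * 1000
  bytes_per_packet = 1500
  throughput = packets_per_sec * bytes_per_packet
  print(throughput / 1e6, "MB/s")        # 15.0 MB/s
  print(throughput * 8 / 1e6, "Mbit/s")  # 120.0 Mbit/s -- a hard ceiling on gigabit LANs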

Often a company will have a "sophisticated best-ever algorithm" and then put in a hacky, lazy work-around for some problem, and of course not tell anyone about it. Sometimes the simpler, less sophisticated solution just works better in practice.

  • cm2187 7 years ago

    Reminds me of Apple's Jim Reekes and his "engineers are retarded" story.

    https://m.youtube.com/watch?v=C5d151lqJsA

    (The anecdote really starts at 2:30.)

    • bunchspoiler 7 years ago

      Well, he would know about engineers being "retarded". I found him extremely difficult to work with, unwilling to accept that any behavior of his code was a bug, and an all around obnoxious guy.

    • Systemic33 7 years ago

      Haha, what a great video.

      To some extent he's onto something. Sometimes you're in this zone where everything will work if you just do this one quick little hack, and then the hack never gets replaced with a proper solution, and in the end you've got a hard drive right next to a speaker >.<

      • btschaegg 7 years ago

        There actually seem to be people whose default mode of operation lies entirely in that zone, though. I'm pretty sure everyone arrives at a similar conclusion if they realise that attempts to fix a certain person's code always end in an express trip to the 6th stage of debugging [1]...

        [1]: http://plasmasturm.org/log/6debug/

        • jzwinck 7 years ago

          Definitely. And that "quick hack by default" person will often end up being the favorite of management because they get so many things "done."

          So it's important to figure out what kind of manager you have, and try not to be the slow guy on a team full of quick-hack people. You won't look good and nobody will be happy.

          • richmarr 7 years ago

            > So it's important to figure out what kind of manager you have, and try not to be the slow guy on a team full of quick-hack people. You won't look good and nobody will be happy.

            I'd also add that it's important to understand the context in which your team is working.

            If I were managing this hypothetical team in a pre-product-market-fit startup I could see the 'slow' person as a potentially greater risk than the others. Not because of speed per se, but because he may be investing too much time in directions that don't help the company learn about their market, and he may be building grand architectural visions that are hard to delete when we realise the product is going in a direction that person didn't anticipate.

            On the other hand if I were managing the same hypothetical team in a highly defined context, for example a mature product or an open source library with a large userbase, the 'quick hack' people would need to change their ways.

            Obviously this analysis is incredibly shallow and would need a ton of conversation and observation; I'm just making the point that different phases of product market fit require very different approaches & it's worth being aware.

          • chii 7 years ago

            > the slow guy on a team full of quick-hack people

            isn't that called 'cultural fit'?

            • lostlogin 7 years ago

              That assessment is best made by the People and Culture team, not you.

        • whipoodle 7 years ago

          We do ourselves a kindness when we skip to step 4.

          • khedoros1 7 years ago

            Except step 5 needs two variants, then: "Oh, I see why it happens" and "Oh, I see that it doesn't, so let's find a way to communicate better with the user".

            • whipoodle 7 years ago

              There are a lot of ways that things can go, but starting out with the disposition that either the bug report is wrong or the other machine is somehow made of worse silicon than our own is rarely helpful.

              • RugnirViking 7 years ago

                However, investigating every bug report, while a worthy cause, is not practical for most shops, especially when the underlying cause of these 'only sometimes and only on my machine' issues is some bizarre bug in an underlying dependency, or even in Windows.

                Now, fixing these issues can often lead to really lean, mean software that flies, but if you're in the all-too-typical situation of overambitious deadlines, then you're already in triage mode and long, hard fixes are just about the bottom of the pile.

              • btschaegg 7 years ago

                > starting out with the disposition [...] is rarely helpful.

                Of course, I fully agree. And yet I have witnessed enough developers stating the first stage almost verbatim as a response to a complaint that I grew really quite fond of the link.

                It is, actually, quite useful in that regard - it lets you point out the absurdity of the statement in a humorous way.

                That, and it wouldn't be a good parody of the Kübler-Ross model if it didn't start with Denial :-)

              • khedoros1 7 years ago

                Agreed. The direction I was going is this: If there's a bug report, then there is a problem. The problem might be the bug that was reported, or it might be that you surprised the user/tester in a way that they perceived as being a bug. Obviously something needs to be improved, even if it's just communication with the user, rather than changing the behavior of the program.

    • alexc05 7 years ago

      That was amazing. I love crotchety grizzled engineers more than anyone I think. This was classic and hilarious. I can't wait to share with everyone at work.

  • amluto 7 years ago

    I thought Windows had a fancy interrupt priority system that should, in principle, allow sound playback to preempt network interrupts? AMD64 added a fancy feature (CR8 access to the task priority register) just to accelerate interrupt prioritization.

    (This is all very vague memory. I know how this stuff works on Linux. Linux does not have interrupt priorities.)

    • sharkbot 7 years ago

      I could be wrong, but my understanding of some sound cards is that they have essentially a single memory buffer that they are reading from when instructed to play a sound. Most sound cards let the OS split the buffer into two halves and raise an interrupt when one half of the buffer completes playing.

      Interrupt prioritization doesn't help much, because the sound data is likely being generated from user mode, while the playback-complete interrupt is handled in kernel mode, which requires a transition back to user mode for further processing. When receiving data from a network card, no transition back to user mode is required, so network handling has implicit priority over sound generation. Therefore, network processing is likely to starve out sound generation, barring choices like those described.

      (This is somewhat hearsay, corrections welcome :))
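
      In rough pseudo-Python, the half-buffer dance would look something like this (purely a sketch; all the names are invented):

        HALF = 4096                      # bytes per half-buffer
        buf = bytearray(2 * HALF)        # the card plays from this continuously

        def next_samples(n):             # stand-in for the real mixer
            return bytes(n)              # silence

        def on_half_played(half_index):  # runs when the card interrupts
            start = half_index * HALF
            # Refill the half that just finished playing. If this runs late
            # (e.g. starved by other interrupt work), the card reaches stale
            # data and you hear a glitch.
            buf[start:start + HALF] = next_samples(HALF)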

      • xorblurb 7 years ago

        Needing to wake up usermode is not necessarily an insurmountable obstacle; under Linux you can use a PREEMPT config. Storming a network card having unwanted consequences can also be mitigated: under Linux some network device drivers use a hybrid interrupt/polling approach with NAPI. Combining both should theoretically allow a system to generate sound smoothly in the described conditions.

        • amluto 7 years ago

          IMO you shouldn't really need to wake user mode that often. Just let user code buffer the output samples well ahead of the hardware needing them and have the kernel do the copies when needed.

          • xorblurb 7 years ago

            Of course if you can do that, it's really good. However, sometimes you need low latency.

    • xorblurb 7 years ago

      Windows does not really have interrupt priorities IIRC. It kind of sort of has some, but application code (device drivers included) can't choose at which level their ISRs run, so it's quite random whether or not the ISR you prefer will preempt others.

      Plus, general-purpose OSes typically don't map hardware interrupt priority mechanisms to their own priorities (and very soon the hardware interrupt controllers don't even know whether an ISR is running or not). I believe neither Linux nor Windows can benefit from hardware assistance for interrupt priorities (and they mostly don't care at the software level anyway).

      • barrkel 7 years ago

        Windows interrupt priorities are called IRQLs, see https://blogs.msdn.microsoft.com/doronh/2010/02/02/what-is-i... for a discussion.

        Mostly PnP decides the levels, unless you have multiple IRQs and you need to specify the priority.

        • xorblurb 7 years ago

          That very page makes me think device drivers don't choose their DIRQL. Any more precise pointer to how they can do that?

  • adsfqwop 7 years ago

    Yeah, thanks Microsoft. They apparently never thought of someone playing mp3s from a network drive!! It drove me nuts for months, until I figured out exactly what you just posted.

  • ShabbosGoy 7 years ago

    Thanks for the links and reading material.

    It's amazing how important the scheduler is in an OS. I'm no expert on OS-level programming, but couldn't they have just used a simple round robin scheduler?

    • zeta0134 7 years ago

      In this case the scheduler isn't as much of a problem as the interrupt service routine, which by its very nature interrupts the normal thread-level scheduler to do some work with the hardware.

      Both the Network Interface Card (or wireless chip) and the Audio hardware generate interrupts for a variety of reasons. Network cards generate interrupts when data is finished sending or receiving. Audio hardware generally generates its own interrupts when its buffers have emptied out, to signal the OS to provide it more audio to play.

      In this case, it sounds like Vista's network interrupt was running for so long that it was taking priority over other interrupts (audio) and causing them to not be serviced quickly enough. This is definitely a problem, but the fix should not have been to limit the amount of work done per interrupt, but instead to delegate that work to some other process with lower priority.

      It's weird to think of audio as higher priority than networking traffic, but in this case that actually makes a ton of sense. Audio generation only requires ~88.2 KB of data per second, but its latency tolerance is much lower: a human listener notices every glitch. Modern network connections are moving multiple MB per second and are (relatively) more tolerant of slight delays. Same with mouse and keyboard input; it's why your mouse tends to move smoothly on most Windows machines even when the rest of the OS is obviously struggling with some heavy task.
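
      To show where that ~88.2 KB/s figure would come from (assuming CD-quality 16-bit samples, one channel):

        sample_rate = 44100        # CD-quality samples per second
        bytes_per_sample = 2       # 16-bit audio
        print(sample_rate * bytes_per_sample)      # 88200 bytes/s per channel
        print(sample_rate * bytes_per_sample * 2)  # 176400 bytes/s for stereo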

      • tinus_hn 7 years ago

        On Windows, like other modern operating systems, a driver is supposed to do only the minimum amount of work possible in the interrupt handler, to avoid this problem. The real work is supposed to happen later when it can be scheduled.
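
        As a toy sketch of that split (not the actual Windows mechanism): the interrupt handler only records that work exists, and a normal schedulable thread does the heavy lifting later.

          import queue, threading

          pending = queue.Queue()

          def isr(packet):              # keep this as short as possible
              pending.put(packet)       # just hand off and return

          def worker():                 # runs under the ordinary scheduler
              while True:
                  packet = pending.get()
                  ...                   # the expensive protocol processing goes here

          threading.Thread(target=worker, daemon=True).start()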

        • AndrewGaspar 7 years ago

          While it's true that networking device drivers on Windows defer all meaningful work to post-ISR, it only moves the goal posts a little bit. On Windows, a lot of network traffic processing (e.g. tcp/ip acknowledgment, virtual switch packet forwarding) is performed in a DPC[1] queued in response to the device's interrupt. A DPC generally runs at dispatch level, which gives it more-or-less exclusive access to the CPU until completion. This preempts the thread scheduler and can result in user-visible stutters in audio and video when the network load outpaces the CPU's ability to quickly service the network traffic.

          [1] https://docs.microsoft.com/en-us/windows-hardware/drivers/ke...

          • tinus_hn 7 years ago

            These are hardly complex tasks, they could be done by computers from the 70s. It's quite possible that Windows makes a mess of it anyway, of course.

            But the top comment was about how interrupts from the network card interfere with interrupts from the sound card so it can't keep its buffer filled. That shouldn't happen.

            Unless the CPU just isn't fast enough to do networking and decode or generate the audio at the same time, but then it's never going to work properly.

    • radicalbyte 7 years ago

      Just look at the processes you have running: 90% of them are doing nothing. Round robin is a waste for an OS.

      • DSMan195276 7 years ago

        Processes that are waiting on some event (like polling some file descriptors) will never be scheduled until they're woken up. It doesn't matter what type of scheduler you're using; the algorithm just decides which to run among the processes that are actually runnable.

andrewstuart 7 years ago

It's bizarre, because I bought something from the PlayStation store on my PS4 and it took DAYS to download.

The strange part of the story is that it took so long to download that the next day I went and bought the game (Battlefield 4) from the shop and brought it back home and installed it and started playing it, all whilst the original purchase from the PlayStation store was still downloading.

I asked Sony if they would refund the game that I bought from the PlayStation store, given that I had gone and bought it elsewhere from a physical store during the download, and they said "no".

So I never want to buy from the PlayStation store again.

Why would Sony not care about this above just about everything else?

  • ripdog 7 years ago

    Because they are in a dominant market position. Honestly, it's your fault for buying a console - you knew you'd be at Sony's mercy due to the monopolistic nature of consoles. The cost of switching to the one comparable console option is so great, most people will not bother unless Sony's issues become too great - and network speed is presumably not a deal-breaker for enough people for Sony to care.

    So you will continue to suffer abominable download speeds on your PS4, because the cost of switching is too great for most. Basically, suck it up.

    I hate to get all PCMR on you, but this is why everyone should game on PC. If my steam purchase downloaded super slow, I could simply get a refund on steam (an automated process) and purchase the same game on GOG, or on Humble Store. Sometimes Origin or Uplay. PC means freedom, or not being at the mercy of a single uncaring corporate entity.

    • andrewstuart 7 years ago

      I left PC gaming behind years ago.

      I like to sit on the couch and I want my gaming device to be a black box that does its job and is dedicated to couch based entertainment (I use it for TV too).

      PC gaming requires a...... PC... usually sitting at a desk, much more powerful than I need, lots of configuration, etc etc... too hard. I'm a couch gamer in the evening after the programming is done.

      I don't care much about the PlayStation download thing. I generally prefer to buy my games on DVD anyway.

      Sony should care, however.... despite not wanting to spend money on digital downloads, I find I spend ever more with the Apple App Store, because despite my hesitations they make it easy and it works, whereas with Sony I don't want to buy digital and they make it easy not to.

      >>>PC means freedom, or not being at the mercy of a single uncaring corporate entity

      I just don't care about that stuff any more.. life's too short to worry about who our corporate overlords are.

      • mciancia 7 years ago

        > I don't care much about the PlayStation download thing. I generally prefer to buy my games on DVD anyway.

        Yeah, wait until you have a 30GB update to download for a game.

        • WillPostForFood 7 years ago

          I don't think you deserve downvotes here. I only buy physical copies of games, and the standard experience is to come home, insert the disc, and wait for a multi gigabyte patch to download and install before playing.

        • NTripleOne 7 years ago

          Yup, this was my first experience with my brief stint owning a bloodborne machine.

          Put in game, 20GB update. Cool, guess I'm not playing that today then.

          Sold that thing a couple of months later, what a waste of money the whole rigmarole was. Bloodborne wasn't even that good imo either.

      • jdormit 7 years ago

        I have a Steam Link, and I get all the benefits of PC gaming from my couch. It's a pretty seamless best-of-both-worlds situation.

        • MBCook 7 years ago

          You also get the downside of having to maintain/upgrade the PC, which is one of the things I was happy to leave behind when I became a console-only gamer.

          • ekianjo 7 years ago

            > You also get the downside of having to maintain/upgrade the PC, which is one of the things I was happy to leave behind when I became a console-only gamer.

            Oh, you mean the 15 mins swap of a graphics card every 3 years?

            • MBCook 7 years ago

              Plus researching WHICH graphics card to buy, plus constant driver upgrades, plus the driver upgrades may cause issues with some of your games...

              It's not as bad as it used to be… but it's still not like an appliance.

              • ekianjo 7 years ago

                On Linux, driver upgrades are pretty easy and I haven't recently experienced any games breaking because of drivers. It's getting much better than before, that's for sure.

          • theWatcher37 7 years ago

            I have less free time than most here and somehow I find a way.

            "Maintaining" a PC is super easy...

            • ew 7 years ago

              Go watch some of the major streamers on Twitch. They're constantly trying to figure out why they're dropping frames or why SLI is suddenly fucking up. The illusion that the PC-Master-Race folks like to push is that maintaining and building PCs is easy. Having built many, and owned a computer repair shop, I know that it simply isn't that easy for your average consumer to troubleshoot a computer.

              • ric2b 7 years ago

                > They're constantly trying to figure out why they're dropping frames or why SLI is suddenly fucking up.

                But consoles drop frames as well, people just don't care. You can also not care on PC, like lots of people do.

                And hardly anyone uses SLI or even recommends it, precisely because it's annoying.

              • kogepathic 7 years ago

                > I know that it simply isn't that easy for your average consumer to troubleshoot a computer.

                I don't see how you can state that it's any easier to troubleshoot a console. Usually your only feedback is a generic error code which if you Google refers to 20 different problems all with the same error, or an LED blink code which is equally vague.

                Sure, PCs have their own unique problems that consoles don't suffer from (e.g. malware on Windows) but I find that PCs typically have much more descriptive error codes, and usually lack the kind of design flaws present in consoles (e.g. Xbox 360 RRoD due to insufficient cooling).

                • MBCook 7 years ago

                  See that's the thing. I've been a console gamer since the NES and I don't remember ever really having to troubleshoot one.

                  And since the hardware is fixed games don't have incompatibilities. None of the "this version of the driver causes this but that version of the driver causes that" issues. No worrying about system requirements. If it sold for the system, it will play.

                  • Aweorih 7 years ago

                    I can't remember ever having the problem of not having the right driver since Windows 7 or so. Since Windows 10, drivers are also "automatically downloaded and installed through Windows Update" ("for many devices").

            • govg 7 years ago

              Upgrading a PC is something you only need to do once every few years. You'd be "upgrading" your console by putting it in the bin and buying a new one, so I'm not sure what the argument is against PC gaming here...

              • HumbleGamer 7 years ago

                There is a convenience factor. You don't have to think about whether you will be able to play 'xxxx' game on your PS4, if it's a PS4 game. That plays heavily in keeping me away from PC gaming. I just don't want to have to even consider whether I can play the game or not.

                • mikewhy 7 years ago

                  Until the Pro and One X have been out for a bit, then the question might come up. Or "Pro only games", which they've said won't happen, but I can't see any way around that other than "well you'll just have to accept the awful performance".

                • MBCook 7 years ago

                  Right. If I was using a PC all the time at home for something else it wouldn't be such a big deal, but given that I was only using it for gaming and most games I wanted to play were on the console it was just easier to give up.

                  Like you said, it's an appliance and I never have to think about it. You buy a game for the PS4 and it works on your PS4 and that's all there is to it.

            • notyourwork 7 years ago

              It is about where you place your priorities.

          • nabla9 7 years ago

            > downside of having to maintain/upgrade the PC

            Easy solution:

            You can treat your PC like a console. Just glue everything together so that you can't maintain or upgrade it.

            Now you have a PC that is like a console. Buy a new one when it's no longer good enough.

      • Drdrdrq 7 years ago

        > ...life's too short to worry about who our corporate overlords are

        This is not what freedom of choice is about (to me). What I care about is the amount of power they wield over my experience. On Linux their power is low. Windows? Mac? Too high for my taste.

        • Pulcinella 7 years ago

          I don't disagree that this is important to you.

          Right now my infant child wields "absolute power" over my life so the convenience and time savings of a console is worth it to me.

      • ekianjo 7 years ago

        > I just don't care about that stuff any more.. life's too short to worry about who our corporate overlords are.

        Oh, this state of things can change well within your lifetime, so you should care.

    • ekianjo 7 years ago

      > If my steam purchase downloaded super slow, I could simply get a refund on steam (an automated process) and purchase the same game on GOG, or on Humble Store

      I'm with you in general (PC Gamer here) but your example does not really work well. There are many games only available on Steam and not GOG, and while you get most Steam games (but not all) on Humble Bundle, Humble Bundle only gives you a Steam key so you are back to Steam and its potential issues.

      Unfortunately, because of Steam's predominance in PC gaming, you are more or less stuck with the same problems as on the PS4, as in "if it does not work, suck it up". There is no other store platform on PC that has the breadth of choice of Steam.

      • ric2b 7 years ago

        > Humble Bundle only gives you a Steam key so you are back to Steam and its potential issues.

        No, you can download the game directly from Humble Bundle but sometimes (often) they also give you a steam key.

        • ekianjo 7 years ago

          No, only DRM-free purchases are like that, but most of what you buy on the HB store are just Steam keys.

  • colechristensen 7 years ago

    When I read stories like this I'm bothered by the fact that the author didn't go to their credit card company and charge back the unwanted purchase. Your credit card is a powerful tool to mitigate retailers not giving you what you expect, and using it motivates bad actors quite well. Chargebacks are expensive, and too many chargebacks lead to unfavorable terms with the credit card companies, which are inescapable to the retailers. (Just like PlayStation has a captive market, so does Visa.)

    • benwilber0 7 years ago

      On the other hand be careful with chargebacks especially against a retailer (Sony PlayStation store) that you would like to keep your account with. They could very easily just close your account and then your PlayStation is mostly worthless.

    • cuckcuckspruce 7 years ago

      Sony bans your PlayStation Store account and you lose all your games when you do a chargeback[1]. Yet another reason to never buy digital downloads from Sony.

      Edit: Steam puts your account in a restricted mode when you issue a chargeback[2].

      [1] https://www.reddit.com/r/PS4/comments/2ivsz3/beware_if_you_s...

      [2] https://support.steampowered.com/kb_article.php?ref=6687-HJV...

      • colejohnson66 7 years ago

        Why is that even legal? Consumer protection laws are supposed to protect you from terrible retailers. But if I issue a chargeback against Sony for recourse, and still lose everything in the end, that defeats the purpose of a chargeback.

      • criddell 7 years ago

        I bought Katamari on the PS4 store and had trouble getting it to work. I called them up and they just refunded my money instantly. I was hoping for tech support to help me get it working.

    • stevenwoo 7 years ago

      There is a problem with doing a chargeback to a console/platform exclusive service - they can give you your one refund then terminate your service and you lose access to the service. IIRC one of the big game services does this.

    • ew 7 years ago

      You try issuing a chargeback against Blizzard or Valve and let us know how long you keep your account with them.

  • badthingfactory 7 years ago

    I got a PSTV a few years ago and was itching to play some old Metal Gear games one morning. The PSTV is advertised as being able to play Vita games, so I purchased them through the PS Store on my PSTV. The game downloaded and was ready to install, and then I got an error that said the game was unplayable in my region on that device. I was annoyed, but figured I would get my money back. I contacted customer service and they said games were not refundable. They allowed me to purchase and download a game on a device it was unplayable on, and then refused to give me my money back. I was never able to play the games I bought.

    A few months ago, my account was hacked and someone charged $50 to my card. I contacted support and they told me "normally all sales are final, but they would do me a one time courtesy and refund the money since my account was indeed hacked." WOW THANKS FOR BEING SO KIND SINCE THIS WAS CLEARLY MY FAULT.

    Sony no longer receives money from me.

  • hd4 7 years ago

    If you're in the EU, it's worth checking consumer laws. Valve and Sony and others have been allowed to ignore local laws too much until now.

erikrothoff 7 years ago

Totally unrelated, but: dang, it must be awesome to have a service that people dissect at this level. This analysis is more in-depth and knowledgeable than anything I've ever seen while employed at large companies, where people are literally paid to spend time on the product.

  • PaulHoule 7 years ago

    It is astonishing how oblivious people who are "paid to spend time on the product" can be about the products they ship.

    Case in point: Microsoft shipped an EXPLORER.EXE with Windows 10 that is 20% whitespace, because of an Adobe-invented metadata format that I bet nobody involved with Windows ever thought about. See

    http://ontology2.com/essays/LookingForMetadataInAllTheWrongP...

    • peterwwillis 7 years ago

      Wasting space is a byproduct of having too much space. Just because it's bloated does not mean they didn't know it was bloated. They probably didn't care.

      As time goes by, minimum hardware requirements for operating systems increase. This isn't because it is more difficult to write software, or the hardware stopped being capable of doing the same thing it used to do. It's because software design wastes more resources as those resources become increasingly available.

      Today, people ship statically compiled binaries of tens to hundreds of megabytes in size because it is convenient to do so. That would have been absolutely insane before increased bandwidth and storage space made it practical. I have an internet connection so fast my wifi card can't use all the bandwidth.

      What a time to be alive.

      • Drdrdrq 7 years ago

        This is one of the reasons Docker containers are so popular. They are basically statically linked apps, carrying everything they need within themselves.

        But - I like that. Not having to worry about some library version, because I know the one that is supplied is the same one the developer put there. Nice.

        • kiallmacinnes 7 years ago

          I believe this is a myth! You still very much have to care - for security updates.

          However, more often than not, you've just lost visibility into what libraries and versions you're actually running..

          • peterwwillis 7 years ago

            Yep. The production box no longer needs to have the exact same library packages installed as the dev box, but now it needs the same docker images installed. It's just moving the goal posts - with respect to "dependency problems".

            The actual benefit to containers is abstracted and unified software components in a complex system. But as we create new, different cloud computing tech, this becomes more difficult, and defeats the purpose somewhat.

          • Drdrdrq 7 years ago

            Sure, but that should be part of updates. Even if the developer doesn't take care of it, you can rebuild the image to update the libraries. Of course the app might stop working, but at least other apps/containers still work and you can revert the change easily.

      • thanksgiving 7 years ago

        I don't know, I like static linking. I like things that just work™. I mean, look at PHP for Windows: I can't run php --version because I don't have Visual C++ installed. Why? Wouldn't it be easier to include everything you need with your binaries? How much extra stuff are we talking? To me, RAM and processor are much scarcer resources.

        • colejohnson66 7 years ago

          To me, that just speaks to Microsoft's failure to make the C++ redistributable API-compatible across versions. If it was, they could just bundle it with the OS.

    • bitwize 7 years ago

      The PC release of Sonic Mania was delayed because of slowdown issues.

      Slowdown issues. On a game that is literally nothing but 90s graphics.

      • jmkni 7 years ago

        Unrelated, but about a year ago I downloaded Sonic the Hedgehog 2 from the Xbox Store onto my Xbox one (I think it cost £0.99).

        Anyway, it was about 500MB in size and it runs Sonic 2 on the Xbox One's Xbox 360 emulator, which is then emulating a Sega Mega Drive.

        That just blows my mind! 8 x86 cores to emulate 3 PPC cores to emulate a mega drive.

      • coolsunglasses 7 years ago

        The game is more graphically intensive than you might think and the people that care about this kind of game very much value a solid 60 FPS frame rate.

        • djhworld 7 years ago

          Works at 60fps on Nintendo Switch though, which isn't exactly the most powerful machine in the world.

          • coolsunglasses 7 years ago

            Not just works, it's 100% solid on the Switch outside of the mode-7'ish bonus game with no stuttering or frame drops.

          • Pulcinella 7 years ago

            I imagine it is easier to get working on a known hardware configuration.

        • bitwize 7 years ago

          It's about Sega Saturn levels of graphically intensive, and should be well within the range of capability of even pleb-tier GPUs from the modern era.

        • quazeekotl 7 years ago

          yet somehow it ran at 60fps 20 years ago on hardware that was 10000 times slower

          hmm

          • jhasse 7 years ago

            The game includes graphics which wouldn't be possible on the Mega Drive / Genesis, for example alpha blending or lots of sprite rotation.

            Also, Sonic on the Mega Drive had multiple framerate drops, e.g. when you lose lots of rings.

            • TulliusCicero 7 years ago

              It would've been possible on the Saturn, and on the Dreamcast at 60 fps, easily. How much more powerful are even mediocre modern PCs compared to the Dreamcast?

              • Lorkki 7 years ago

                Saturn would most likely choke on memory requirements, given how much is going on in the levels, as well as lack of transparency support. Symphony of the Night was the state of the art 2D platformer for that hardware generation and it has frequent loading pauses, despite being much more static.

                Dreamcast would probably work, given enough effort by people who know the platform inside and out. But then, a much lower barrier to entry is probably a significant reason as to why Sega was willing to fund another 2D Sonic game in the first place.

          • anmorgan 7 years ago

            Though we do have 4K TVs now.

          • gambiting 7 years ago

            That's an incredibly unfair comparison. Old hardware had built-in support for sprites, and as long as you kept within the limit the game would run smoothly. It had nothing to do with CPU speed - the machine had hardware support to draw, say, 8 sprites at the same time, so if you were only drawing 8 it was fine, but try drawing 9 and you wouldn't render even a frame a second. Nowadays we're doing an extremely brute-force approach, because everything is its own object that has to be refreshed every frame - and in a game like Sonic Mania there is so much animation going on it's insane. In a way it would be easier to do a 3D game than a very complex 2D game like this.

            • w0utert 7 years ago

              Uhm, no? Even a PC video card from 10 years back can literally render millions of textured polygons (=basically sprites) at a steady 60 fps. Any modern card with programmable shaders would be able to do all the transforms and effects on the GPU with virtually zero CPU overhead at the same time.

              There really is no excuse whatsoever for a 2D game like sonic mania to run badly on even low-end PC hardware even with built-in graphics hardware.

              • Lorkki 7 years ago

                Note that they actually did set the minimum requirement to "a PC video card from 10 years back": http://store.steampowered.com/app/584400/Sonic_Mania/

                Memory bandwidth is the main concern when blitting several layers of pixels 1:1, unless you have some kind of crafty system in place for avoiding overdraw. Although Sonic Mania seems to genuinely use a 320x240-ish internal framebuffer, so my wild guess is that the bottlenecks relate to scaling and post-processing, or possibly some funky driver use causing sync issues.

      • voltagex_ 7 years ago

        There's slowdown in some sections on the Xbox One version, too.

    • ship_it 7 years ago

      That is hardcore dedication, props.

    • Sukera 7 years ago

      > Quick note: The situation is not different on Linux, MacOS, or other operating systems because all executable formats have some way to embed images. What publishers should do is remove unncessary metadata before publishing, which is easy to do in the case of the PNG because we can simply omit the iTXt chunk.

      From your link - so I guess the situation on linux isn't different? Or am I missing something, other than that it depends on the embedded PNG?

      • jwilk 7 years ago

        I'm not quite sure what exactly the author's point is.

        Yes, you could embed images in ELFs, too, if you really wanted, but why? There's no standardized way to do it, and putting the images in separate files is easier.

        • jzwinck 7 years ago

          There is a standard way to embed any file into an ELF. The program "objcopy" creates object files from arbitrary input files. You then simply link the objects into your executable. The program can then access those arbitrary embedded data chunks by their symbol names, just as if they were literal strings in the source code.

          I explain how to do this in the context of embedding a version file here: https://stackoverflow.com/questions/16349557/does-gcc-have-a...
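
          As a sketch (assuming GNU binutils on x86-64; the file names are arbitrary):

            # Wrap version.txt into a linkable object file with objcopy.
            import subprocess

            subprocess.run(["objcopy", "-I", "binary",
                            "-O", "elf64-x86-64", "-B", "i386:x86-64",
                            "version.txt", "version.o"], check=True)

            # The object now exports _binary_version_txt_start/_end/_size;
            # link it into your program and read the data as a byte range.
            subprocess.run(["nm", "version.o"], check=True)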

      • kalleboo 7 years ago

        If we're comparing platforms, Xcode (Apple's IDE for iOS/macOS apps) runs pngcrush on all PNG assets by default.

      • PaulHoule 7 years ago

        I don't expect it to be different on Linux or any other operating system.

  • pjc50 7 years ago

    > people are literally paid to spend time on the product

    Well, that's just a job. It's the people spending their time voluntarily on a project who are the few that really care.

    Also, a project team often develops blind spots; I'm on an internal initiative at work helping out another team with some long-standing performance problems, and it's remarkable how much a fresh pair of eyes helps. It challenges assumptions and asks basic questions like "how does this work exactly" and "why are you doing that".

g09980 7 years ago

Want to see something like this for (Apple's) App Store. Downloads are fast, but the App Store experience itself is so, so slow. It takes maybe five seconds to load search results or reviews, even on a Wi-Fi connection.

  • matthewbauer 7 years ago

    I've noticed this too, but it seems to be the same on both macOS and iOS - maybe it's on the server end?

  • vegardx 7 years ago

    Not just slow; the search is horrible in languages other than English. The results are all over the place. I often get something completely unrelated, in another foreign language, as the first match, and with some luck the application I was looking for under it.

  • MBCook 7 years ago

    Reportedly the redesigned App Store in iOS 11 is much faster, hopefully they'll bring that change over to the Mac soon.

  • dfee 7 years ago

    I’m glad I’m not the only one who’s noticed this.

    • wingworks 7 years ago

      Me too, it's painfully slow

  • Tanegashima 7 years ago

    It’s their backend that’s just slow.

    Thankfully, iOS 11 AppStore is much faster for everything.

  • ex3ndr 7 years ago

    Same with Google Play since day 0.

    • pier25 7 years ago

      I use both iOS and Android on a daily basis. In my experience the Play store is consistently faster.

      I'm not surprised; Google is better at these kinds of things.

    • dingo_bat 7 years ago

      Have you used the Apple App Store? Google Play is much faster. There is no comparison.

    • ripdog 7 years ago

      Really? On my galaxy note 4, Google Play pages open within ~400ms consistently.

cdevs 7 years ago

As a developer, people seem surprised I don't have some massive gaming rig at home, but there's something about it that feels like work. I don't want to sit up and be fully alert - I did that all day at work. I want 30 mins to veg out on a console, jumping between Netflix and some quick multiplayer game with fewer hackers glitching out in the game. It seems impressive what the PS4 attempts to accomplish while you're playing a game: try to download a 40-gig game and somehow tiptoe in the background without screwing up the gaming experience. I couldn't imagine trying to deal with cranking up the speed here and there while keeping the game experience playable in an online game. Chrome is slow? Close your 50 tabs. Want faster PS4 downloads? Close your games/apps. Got it.

  • dcow 7 years ago

    But it's just one app and boom 7kb receive window. The author goes into how terrible the impl is. I'm sure it might seem crazy if you're not familiar with kernel networking. Also I hope you get more than 30 min of free time each day (=

    • cdevs 7 years ago

      I know my comment came off as putting the article down, but I always have praise for anyone on the hunt for issues like why Firefox does this or why the PS4 does that. I shoot for 30 mins of game time, but we all know how that turns into 2-3 hours when I meant to start some programming side project. Games are the devil.

ckorhonen 7 years ago

Interesting - definitely a problem I've encountered, though I had assumed the issues fell more on the CDN side of things.

Anecdotally, when I switched DNS servers to Google vs. my ISP, PS4 download speeds improved significantly (20 minutes vs. 20 hours to download a typical game).

  • jsnell 7 years ago

    Yeah, there are definitely other possible reasons. E.g. one workaround I found had Australian users block a specific IP in their firewall. Turns out the DNS configuration was bad, and they were getting an overseas CDN server every now and then. Due to the way the PS4 schedules the downloads (chunks downloaded serially, chunk boundaries at static byte offsets rather than based on time) having even one bad server in the rotation can really ruin performance.

    But CDN configuration errors can happen to anyone. I think this client-side mess is a much more creative way to screw up :)

    • FridgeSeal 7 years ago

      I don't suppose you'd know what that IP is, and whether this fix still works, do you?

  • menacingly 7 years ago

    I have slow downloads too (with google DNS), but I would think this wouldn't impact it much. It's not like it's performing a DNS lookup for each chunk, right?

    • philsnow 7 years ago

      Well no, the issue is that the ps4 is constraining the tcp receive window (which is how much data is in flight at once), so if your ISP DNS gives you a CDN node that's 200ms away but google's DNS gives you a CDN node that's 10ms away, you get receive_window bytes every 10ms instead of every 200ms.

      There's a similar thing happening (different DNS giving vastly different experience) with Netflix as well: Netflix has local nodes _inside_ many ISP's networks, close to ISP subscribers. If you use your ISP's DNS (and if they're doing their job properly), they'll send you to one of the local nodes and it'll take much less time to buffer, your stream will start faster, you'll drop less etc. Whereas if you use Google's DNS, they don't (necessarily) know the inner workings of your ISP so they send you to a generic Netflix CDN node that's near-ish but still probably 10-100x "farther" as the photon flies.

      If you want to have a mix of DNS settings per domain name, you can set up a dnsmasq inside your network and configure it to recurse to google's DNS normally but recurse to your ISP's DNS for just netflix.com, nflxvideo.com etc. see for instance [0] for one way to set that up. they have it running on a dd-wrt router but I have it running just on some rando linux machine somewhere in the house and configure my DHCP to tell everything to use that machine's IP as the primary DNS.

      [0] https://github.com/ab77/netflix-proxy/wiki/setting-up-netfli...
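
          The relevant dnsmasq config boils down to a few lines like these (a sketch; the resolver IPs are placeholders, and the domain list is just the one mentioned above):

            # default upstream: Google DNS
            server=8.8.8.8
            # send Netflix lookups to the ISP's resolver instead
            server=/netflix.com/203.0.113.53
            server=/nflxvideo.com/203.0.113.53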

      • stephen_g 7 years ago

        Hmmm, does that actually improve speeds for Netflix, or was that somebody just doing it because they thought it would?

        It's been a while since I've looked into it, but I thought that Netflix's devices peer with the provider's routers and intercept traffic that way, so the DNS server shouldn't matter (because they would share the same IPs as the publicly accessible CDN load balancers).

        I thought most CDNs worked that way now - using DNS never really worked well.

        • hug 7 years ago

          Anecdata: Office 365 will put you on an APAC CDN instead of an AU CDN if you use Google Public DNS from Australia, but not so if you use local ISP DNS (like, say, Telstra's).

          • voltagex_ 7 years ago

            I'm using Internode's DNS here and I get servers from Hong Kong and Singapore mostly.

        • philsnow 7 years ago

          I have only my own anecdata, which strongly supports the hypothesis.

      • menacingly 7 years ago

        pretty helpful comment, thanks

  • tyfon 7 years ago

    Personally I've never had any issues but I see people complaining about it.

    I guess it's location more than anything. I have a fiber connection in Norway and it always downloads games at maximum speed.

lossolo 7 years ago

DNS-based geo load balancing/CDNs are the wrong idea today. For example, if you use a DNS server that has a bad configuration, or one that is not supplied by your ISP, you could be routed to servers thousands of km/miles from your location. Last time I checked, Akamai used that flawed DNS-based system. What you want now is what Cloudflare, for example, uses: anycast IP. You just announce the same IP block on multiple routers/locations, and all traffic is routed to the nearest location thanks to how BGP routing works.

  • skuhn 7 years ago

    A DNS based GSLB system makes a lot of sense when you don't control (or trust) the client and you need to make smaller transfers where latency is more important than throughput -- i.e. web pages being loaded by any random web browser.

    It's definitely not an ideal method when throughput is more important than latency, as in the case of 40GB game downloads (or streaming video, for that matter).

    Sony could easily implement an alternate system where DNS based GSLB is used to first locate a nearish node that then issues a redirect to an actually nearby node after doing some basic lookups (to resolve the "I am not where my DNS resolver is" problem).

    Sony could also implement additional logic on the client (since they control the client after all), where it obtains a list of possible sources and runs races against them before deciding. Some server control is still desirable, in the event of a business reason to prefer a less optimal path or an overloaded source. But again, since you control the client, you can have it respond to your "chill out" packets and move to its next nearest target.

    That Sony doesn't appear to do any of this probably speaks more to a lack of expertise than anything else. They run a pretty decent online service with PSN, but I get the sense that they aren't really hiring people with a ton of Internet content distribution experience.

    • jsnell 7 years ago

      Sony control the client, but not the servers. They don't run their own CDN, but use at least half a dozen different commodity CDNs. So extra server side logic won't really be an option, it's just static files over HTTP in bulk.

      But yes, there's a lot of client side smarts they could benefit from.

  • toast0 7 years ago

    As with all things, there are tradeoffs. As you say, geo DNS doesn't work very well when the authoritative server doesn't get enough information to give a good answer; for example, if all of a nationwide ISP uses the same pool of recursive DNS servers and doesn't provide EDNS Client Subnet, the authoritative server will not get any signal about where in the country the users are. Or, if you use DNS servers in another country for some reason, you may be assumed to be near that country.

    Anycast isn't perfect either though. It may be more likely to get you a route with fewer hops, but that's not necessarily the lowest latency link. And if there's a congested link in the path, it's much harder to steer a portion of users, but not all to alternate servers. You also have to deal with the potential of routing changing from packet to packet, and potentially different locations getting different parts of the same connection. It's not 'just announce the same IPs from multiple locations' and take a nap.

    Anyway, unless you have thousands of datacenters well distributed throughout the globe, a lot of people are going to have high latency to whatever server is chosen, and high latency and very small windows == poor throughput, regardless of how you get to a server.

    • mjevans 7 years ago

      The DNS server is the wrong place to make that decision.

      If there is a set of choices and a decision to make, letting the client choose would be the best option.

      I think I'd start with a client shuffling the list of results and asking for small fragments of the dataset from each. Based on the latency, bandwidth detected, and the size of the requested data it would then pick the most optimal set of servers. This also allows for fallback in case servers are unresponsive.
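
      A sketch of that idea (the URLs are invented, and error handling is minimal):

        import time, urllib.request

        MIRRORS = ["http://cdn-a.example.com/game.pkg",
                   "http://cdn-b.example.com/game.pkg"]

        def probe(url, nbytes=256 * 1024):
            # Pull a small ranged chunk and measure observed throughput.
            req = urllib.request.Request(
                url, headers={"Range": "bytes=0-%d" % (nbytes - 1)})
            t0 = time.monotonic()
            try:
                with urllib.request.urlopen(req, timeout=5) as resp:
                    data = resp.read()
            except OSError:
                return 0.0              # unresponsive server: rank it last
            return len(data) / (time.monotonic() - t0)   # bytes per second

        best_first = sorted(MIRRORS, key=probe, reverse=True)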

    • lossolo 7 years ago

      > Anycast isn't perfect either though.

      Sure, but it's better than DNS based solution.

      > It may be more likely to get you a route with fewer hops, but that's not necessarily the lowest latency link. And if there's a congested link in the path, it's much harder to steer a portion of users, but not all to alternate servers.

      This is also true for a DNS-based solution, because of DNS caching. Smart networks will change routes before your DNS changes get propagated.

      > Anyway, unless you have thousands of datacenters well distributed throughout the globe, a lot of people are going to have high latency to whatever server is chosen, and high latency and very small windows == poor throughput, regardless of how you get to a server.

      Depends on your definition of high latency. If you live in a desert in Africa or some forgotten village in India, then you will always have high latency, because no one will build a POP there.

    • londons_explore 7 years ago

      Do any big networks do anycast tcp stream reassembly when a connection from a user gets some packets split between different datacenters?

      One could imagine a system where one datacenter forwards the packets (encapsulated) to the other so that the stream isn't broken.

Reedx 7 years ago

PS3 was even worse in my experience - PS4 was a big improvement, although still a lot slower than Xbox.

However, with both the PS4 and Xbox One it's amazingly slow to browse the stores and much of the dashboard. Anyone else experience that? It's so bad I feel like it must just be me... I avoid it as much as possible, and it definitely decreases the number of games I buy.

  • s_kilk 7 years ago

    > it's amazingly slow to browse the stores and much of the dashboard

    Same, my PS4 can take up to a few minutes to load store pages. And sometimes it just hangs, or times out, and I need to quit the app and try again. It's really not a good experience at all.

  • spike021 7 years ago

    The PS4 PSN store's search function is terrifically awful. Why on earth do they think "typing" is easier using a spin dial for each character of the search query? Not only that, but the dial moves so slowly.

  • rado 7 years ago

    Not network related (?), but the slow and choppy PS4 UI/dashboard is one of my "favourite" features. What were they thinking?

    • jhasse 7 years ago

      "JavaScript is just as fast as C++ nowadays."

foobarbazetc 7 years ago

The CDN thing is an issue too.

Using a local DNS resolver instead of Google DNS helped my PS4 speeds.

The other "trick" if a download is getting slow is to run the in built "network test". This seems to reset all the windows back even if other things are running.

  • jsnell 7 years ago

    > The other "trick" if a download is getting slow is to run the in built "network test". This seems to reset all the windows back even if other things are running.

    I did not see that in my testing. The built in speedtest runs with a large receive window, but the store downloads are not affected. (You can see an example in the last graph; there's a speedtest early on that has a receive window 100x larger than the PSN downloads that are crawling along).

    It's probably just a placebo.

    • matwood 7 years ago

      > It's probably just a placebo.

      Possibly, but I think the OP above was onto something about it 'resetting' the windows. I just tested downloading a large game while I had another game running. The PS4 showed it was going to take 6 hours. I closed the other game I had running, and it still showed 6 hours. I ran the speed test, and now it shows it's going to take 50 minutes.

      BTW, thanks for running these tests as I have some stuff to try instead of waiting hours for a game to download :)

      • jsnell 7 years ago

        My experience is that the finish-time prediction isn't reliable. It's based on the average speed over the whole connection, not the current speed, so the projection will lag behind if the speed changes dramatically. Maybe the speedtest somehow resets the projection?

        The measurement really needs to be done on the wire.

      • mlonkibjuyhv 7 years ago

        But how long did it end up actually taking?

  • lathiat 7 years ago

    > Using a local DNS resolver instead of Google DNS helped my PS4 speeds.

    That's not a surprise: when you use Google DNS, the CDN has much less information about you to try and route you to a good choice of CDN endpoint. It will vary based on where the Google DNS endpoint is and where you are (plus the connectivity of a given CDN to both of those places).

    More information about that here, including "EDNS Client Subnet", a solution they deployed to send a partial client IP to the server, but only for specific CDNs they have set it up with: https://developers.google.com/speed/public-dns/faq "I've read claims that Google Public DNS can slow down certain multimedia applications or websites. Are these true?"

Tloewald 7 years ago

It's not just a four-years-into-launch problem, since the PS3 was at least as bad.

tgb 7 years ago

Sorry for the newbie question, but can someone explain why the round-trip time is so important for transfer speeds? From the formula I'm guessing something like this happens: the server sends DATA to the client, the client receives DATA then sends an ACK to the server, and the server receives the ACK and then finally goes ahead and sends DATA2 to the client. But TCP numbers its packets, so I would expect the server to continue sending new packets while waiting for ACKs of old ones, and my reading of Wikipedia agrees. So what causes the RTT dependence in the transfer rate?

  • colanderman 7 years ago

    In a network protocol you generally want to find out as quickly as possible when the receiver has run out of buffer space. Else you're just sending packets it will drop on the floor, because it's unable to process them at the rate you're sending them. (And the buffer space should be small, to minimize latency under load.)

    But also, you want to keep the pipe full (as explained in the article). So, to balance these opposing requirements, the receiver generally has a buffer which is only somewhat larger than the capacity of the pipe (the bandwidth-latency product).

    In TCP, the receiver reports the size of this buffer as the receive window. This way the sender knows exactly how much it can send before the receiver's buffers are full. And the receiver's buffer must be large enough to hold at least an RTT's worth of packets, since that is how long it takes (at a minimum) after the sender started sending the data for the sender to hear back from the receiver that it has processed some data and made room in its buffer. Any less and the sender has to stop and wait (like in the article).
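
    A couple of lines of arithmetic make the resulting ceiling concrete (using the ~7 KB window mentioned elsewhere in the thread versus a more typical 1 MB one):

      # Throughput ceiling imposed by the receive window: window / RTT.
      def ceiling_mbit(rwnd_bytes, rtt_ms):
          return rwnd_bytes * 8 / (rtt_ms / 1000.0) / 1e6

      for rwnd in (7 * 1024, 1024 * 1024):   # throttled vs. typical window
          for rtt_ms in (10, 50, 200):
              print(rwnd, rtt_ms, round(ceiling_mbit(rwnd, rtt_ms), 2), "Mbit/s")
      # A 7 KB window at 200 ms RTT caps out near 0.3 Mbit/s,
      # no matter how fast the physical link is.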

  • preinheimer 7 years ago

    I think this is a good question, I did some reading myself!

    The blog post talks about "receive windows" which comes into play here. The server will send up to the number of bytes specified in the receive window before needing to start seeing some ACKs.

    So the shorter the round-trip time, the less likely the server is to spend time twiddling its thumbs waiting for an ACK because it has exhausted the receive window. Of course, increasing the size of the receive window would also help.

    (I read this answer to get here, I could be way off: https://stackoverflow.com/questions/9613197/what-determines-... )

    • syncsynchalt 7 years ago

      That's right. In the case of a download, the receive window can also be visualized as a limit on the number of packets that are allowed to be "in flight" at once.

      Shrinking the window is the wrong tool for the job here, like hammering in a screw. It's sensitive to round-trip-time in a way that makes it impossible to equate a given window with a desired speed. A better method might be to monitor download speed and delay ACKs when it exceeds the limit, but traditionally Unix doesn't have a standard API to do this.
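
      One application-level approximation is to pace your reads instead: if you drain the socket slower than the target rate, the kernel's receive buffer fills up and TCP flow control throttles the sender for you. A minimal sketch of that idea in Python (a hypothetical helper, not what Sony actually does):

          import time

          def paced_recv(sock, rate_bytes_per_s, chunk_size=16384):
              # Read at roughly `rate_bytes_per_s` by sleeping after each recv().
              # Once the kernel buffer fills, the advertised window shrinks
              # and the sender slows down.
              data = bytearray()
              while True:
                  chunk = sock.recv(chunk_size)
                  if not chunk:
                      break
                  data.extend(chunk)
                  time.sleep(len(chunk) / rate_bytes_per_s)
              return bytes(data)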

  • syncsynchalt 7 years ago

    Because Sony has (naïvely) limited download speeds by limiting how many packets can be "in flight" at a given time.

    Say for example that they clamped it down to a single packet in flight. In that case, if the server was 30ms (round trip time) away then you could only get a packet every 30ms, and if the server was 60ms away then it would cut your speed in half.
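
    The arithmetic for that single-packet case, assuming ~1500-byte packets:

        # Throughput ceiling with one 1500-byte packet in flight per RTT.
        for rtt_ms in (30, 60):
            rate = 1500 / (rtt_ms / 1000)  # bytes per second
            print(f"{rtt_ms} ms RTT -> {rate / 1024:.1f} KiB/s")
        # 30 ms -> ~48.8 KiB/s; 60 ms -> ~24.4 KiB/s (half the speed)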

    In normal situations this doesn't happen: your max-packets-in-flight limit (aka your receive window) will eventually scale large enough that it doesn't matter how far you are from a server, the speed will eventually reach the bandwidth limit of the slowest part of the route.

  • motoboi 7 years ago

    Think about the path between sender and receiver as a pipe.

    You will have data inside the pipe all the time; the amount of it is the product of bandwidth and latency (pipe length), which is measured as RTT (round-trip time).

    The sender puts data on the pipe and must wait for acknowledgment from the receiver. How much data should it put on the pipe before the first acknowledgement arrives?

    If the pipe is long, the sender will have a lot of data in transit, and this amount will grow over time according to the congestion-control algorithm.

  • londons_explore 7 years ago

    There is something called the receive window. That is the amount of data that can be sent before getting more ACKs back.

    If the delay is longer, and the receive window is the same size, then total throughput is lower.

Companion 7 years ago

I actually dread downloading patches and whatnot from PSN for this reason. I have a 500 Mbit connection that works perfectly well on all my other devices, but my PS4 Pro is incredibly fickle. There'll be days where download speeds are good, and then there'll be days where even downloading 200 MB is a challenge. It's all wired, so it's not a wifi-related problem. I went through different routers and even changed ISPs once and the problem still persisted, so I think I have ruled out it being on my end. It seems to be some weird QoS feature of the PS4, or possibly the PSN not being up to scratch - I don't know. Stuff like closing all background apps or changing DNS doesn't really seem to do anything for me. Sometimes pausing/unpausing does help, though.

tenryuu 7 years ago

I remember someone hacking at this issue a while ago. They blocked Sony Japan's server, which the download was coming from. The PlayStation then fetched the file from a more local server, which was considerably faster.

Really strange

  • motoboi 7 years ago

    A closer server means a lower RTT. All other things being equal, that translates to a faster transmission.

lokedhs 7 years ago

As one piece of information I offer my own experience with PSN downloads on the PS4.

I'm in Singapore and my normal download speed is around 250 Mb/s, sometimes getting closer to 300.

However, I sometimes download from the Swedish store as well, and those download speeds are always very slow. I don't think I've ever gone above one tenth of what I get with local downloads.

That said, bandwidth between Europe and Singapore is naturally more unpredictable, so I don't know if I can blame Sony here. My point is that PS4 downloads can be very fast, and the Singapore example is evidence of that.

  • rohmish 7 years ago

    I've experienced a similar thing with the PS3. That said, unlike the author, I am too lazy to inspect the traffic on that.

jumpkickhit 7 years ago

I normally warm-boot mine; I saw the speed increase with nothing running before, so I guess I was on the right track.

I hope this is addressed by Sony in the future, or that they at least let us select whether a download is high priority or not.

deafcalculus 7 years ago

Why doesn't PS4 use LEDBAT for background downloads? Wouldn't this address the latency problem without sacrificing download speeds? AFAIK, Macs do this at least for OS updates.

hgdsraj 7 years ago

What download speeds do you get? I usually average 8-10 MB/s

  • lqdc13 7 years ago

    I've gotten anywhere between 100KB/s and 60MB/s on the same ISP.

ambar123 7 years ago

I don't want to know...

Cryptoboss 7 years ago

I rented Final Fantasy XV last night to try it out. I put it in and had a 13 GB update. So of course I had to download it all night to even get a chance to play it. Then I finally open it and it takes another hour to install.

Now I'm not even sure I'll get to play it much before I even have to return it. It's a problem.

galonk 7 years ago

I always assumed the answer was "because Sony is a hardware company that has never understood the first thing about software."

Turns out I was right.

bitwize 7 years ago

This is so that there's plenty of bandwidth available for networked play.

The Switch firmware even states that it will halt downloads if a game attempts to connect to the network.

  • jonny_eh 7 years ago

    Even if the game is suspended? It seems reasonable that you're right, but it also seems like an oversight to not lift the throttling when games are suspended, or if the game's not being played online.

frik 7 years ago

The PS4 and Switch at least have no peer-to-peer downloads.

Win10 and Xbox One have peer-to-peer downloads - who would want that? It's bad for users, wasting upload bandwidth and counting against your monthly internet consumption. https://www.reddit.com/r/xboxone/comments/3rhs4s/xbox_update...

  • tortasaur 7 years ago

    I want that, as it normalizes the need for large upload bandwidth. The only reason it's such a scarce commodity now is that households aren't perceived as needing that bandwidth.

    The proliferation of legal peer-to-peer systems will hopefully drive the spread of symmetric connections.

    • MBCook 7 years ago

      One of the reasons I switched off of AT&T uVerse is they seemed to give you the absolute minimum upload bandwidth needed to achieve the advertised download bandwidth.

      So if you were downloading or streaming and uploaded anything more than a small image your download speed would plummet as you ran out of upstream bandwidth.

      Online backups/syncing? Hope you're not trying to do anything else that requires bandwidth.

      • apenwarr 7 years ago

        That's not quite right. The most common problem with (especially, but not only) DSL is bufferbloat, where you fill up your upload buffers, and this greatly increases RTT. Then your ACKs in response to downloaded content will take a long time to get through the queue and tell the server it's okay to send more data. You would have to have a truly tiny upload speed limit (like 0.01 Mbps) for the actual ACK throughput to be the limiting factor.

        You can work around bufferbloat if you have a DSL modem with fq_codel (rare), or if you insert a router with a rate limiter and fq_codel, so basically you never fill up your DSL modem's queue.
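
        To see how much delay a full upload buffer adds, a quick made-up example (the buffer size and uplink speed are illustrative, not measured):

            # Queueing delay added by a full upstream buffer.
            uplink_bps = 1e6            # 1 Mbit/s upstream
            buffer_bytes = 256 * 1024   # 256 KiB modem buffer, completely full
            delay_s = buffer_bytes * 8 / uplink_bps
            print(f"every ACK waits ~{delay_s:.1f} s in the queue")  # ~2.1 s

        A couple of seconds of extra RTT on every ACK is more than enough to stall a fast download, even though the ACKs themselves use almost no bandwidth.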

        • MBCook 7 years ago

          Ah. What I posted is my assumption of what was going wrong. Buffer bloat makes sense. Thanks.

          I 'solved' it by going to a provider with more upstream (cable), and later fiber. Even without this issue, the download speed wasn't very good for the price, so leaving was an easy decision.

          • namibj 7 years ago

            You could also use μTP for transferring the data to a rented server and then from there to wherever you need it. That protocol is designed to fill the pipe as much as possible, i.e. never let it run dry, while bloating the buffer as little as possible: it takes minimum one-way delay measurements and then aims for a target delay chosen so that the minimum delay is never actually hit. It does slow down internet browsing though, so you have to be careful if there are other users who like to complain. Bandwidth is not affected, and it does yield to TCP.
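
            The core of that delay-based control loop looks roughly like this (a simplified sketch of the LEDBAT idea from RFC 6817, not the actual μTP code):

                TARGET_DELAY = 0.100  # target queuing delay in seconds (LEDBAT uses 100 ms)
                GAIN = 1.0

                def adjust_window(cwnd, one_way_delay, base_delay, mss=1500):
                    # Queuing delay = measured one-way delay minus the lowest delay seen.
                    queuing_delay = one_way_delay - base_delay
                    # Grow while under target, shrink when over, so the queue
                    # (and hence bufferbloat) stays bounded and TCP wins contention.
                    off_target = (TARGET_DELAY - queuing_delay) / TARGET_DELAY
                    return max(mss, cwnd + GAIN * off_target * mss)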

      • stordoff 7 years ago

        > Hope you're not trying to do anything else that requires bandwidth.

        I get the same on Sky (UK) ADSL (fibre [FTTC] seems to be unaffected). When I'm uploading e.g. a video to YouTube, even something like Hacker News will often time out.

        • ripdog 7 years ago

          You get ADSL backed by FTTC? Not even VDSL? Who thought that was a good idea?

  • derefr 7 years ago

    It works well if both devices are on the same LAN. Like inside a corporate office.

    What organization has multiple XBOs, though? Game dev studios? Hospitals? It must be much rarer than the Win10 case.

    • kec 7 years ago

      college dorms

      • ygra 7 years ago

        Don't those put everyone on a different VLAN? At least that was the case when I was in uni.

  • deelowe 7 years ago

    Do people really have issues with bandwidth limits on uploads? I've always heard the opposite.

    Is it plain P2P, or more like a torrent where there are multiple peers? That would make more sense.

    At the end of the day, aren't the Switch and Xbox faster at downloading? That's what really matters.

    • seabrookmx 7 years ago

      Yes definitely.

      For example, my home connection (in Western Canada) is 150 Mbit down and 15 Mbit up. For general home usage (streaming Netflix and whatnot) you definitely need more download than upload, but if I'm working from home over VPN, syncing pictures to Dropbox, etc., you can definitely saturate 15 Mbit. And most North American households have much less bandwidth than I do.

    • Spooky23 7 years ago

      My network has multiple iOS devices, two computers, and a bunch of appliance devices. I also use VoIP.

      Maxing out the standard 3 Mbit uplink is easy when photos sync, among any number of other use cases.

      • kuschku 7 years ago

        A 3 Mbit uplink? That's... I'm sorry for you.

        Maybe you can find another ISP. I'm in northern Germany, and my ISP provides 100 Mbit down, 40 Mbit up via DSL.

        • Spooky23 7 years ago

          Now I pay extra for "enhanced" service. 50/5!

          Here in America, we believe that competition involves one cable company and one legacy phone company that offers 2/0.5 DSL for $70/mo and will not actually install it if you try!

    • MBCook 7 years ago

      Many companies offer asymmetric bandwidth, especially on DSL. So depending on what kind of broadband you have you may not have much upload bandwidth at all.

      See my other comment in this post for an example.

    • rubatuga 7 years ago

      I'm lucky to get 2 Mbit upload speeds. However, my download speeds are around 25 Mbit/s.