voidwtf 13 days ago

I had the benefit of growing up during the transition period from CRT to LCD. I had a friend who got an awesome Sony Trinitron WEGA, and I had the opportunity to play PS1 and PS2 games on it. I struggle to see the appeal of the CRT for modern-day anything. For games designed on a CRT, the allure is obvious. Games designed for CRT often looked their best on a CRT because of the subtle blending and screen pattern the graphics were designed around. The games weren't meant to look like pixelated blocks with clearly defined edges like they do on a modern screen, and they didn't look that way on a CRT.

However, modern content on modern, decent LCD panels, and especially on OLED panels, blows CRTs out of the water. The vibrancy of the colors, the overall quality of the picture, not having that CRT 'glow', the readability of text: all of it is improved in my opinion. CRTs also had a number of maintenance items: the screens attracted dust like a magnet, and you had to adjust vertical and horizontal alignment and the color settings (both of which often ended up out of whack for some reason).

I'm sure there are benefits, for older games I see it, but for modern games I sometimes wonder if this is people waxing poetic or being nostalgic. I'm sure there are some people who will make claims about gaining an edge in online shooters, but I'm curious how much of that is real considering other losses in the pipeline like digital to analog conversion and how low the refresh rate is compared to modern gaming panels.

  • kmacleod 13 days ago

    I worked for a video titling company in the 80s during their transition from analog titling to digital titling. One key thing I learned there is that CRTs have about three times the horizontal resolution folks give them credit for. Most folks measure horizontal resolution by how many clear pixels you can go from light to dark and back. What they don't account for is pixel-shift: you can position those start and end transitions at three or more times that resolution.

    Nowadays we call that sub-pixel rendering or antialiasing. But in the 80s so many people were convinced TVs could only do 640x480. Our titling systems were typically doing 2400x480 to get good quality character aliasing and image shading on CRTs. This is still somewhat true today analog-wise but common sub-pixel rendering or antialiasing does achieve the same effect on LCDs.
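
    A toy illustration of what that pixel-shift buys you, in modern anti-aliasing terms (my own sketch of box-filter edge coverage, not our titler's actual algorithm):

      # Toy 1-D sub-pixel edge: a fractional edge position is encoded by giving
      # the boundary pixel partial intensity (its coverage under a box filter).
      def render_edge(edge_pos, width=8):
          return [round(min(max(edge_pos - i, 0.0), 1.0), 2) for i in range(width)]

      print(render_edge(3.00))  # [1.0, 1.0, 1.0, 0.0, ...]  edge lands on a pixel boundary
      print(render_edge(3.33))  # [1.0, 1.0, 1.0, 0.33, ...] edge shifted by a third of a pixel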

    • grishka 13 days ago

      Strictly speaking, because the signal is continuous, analog video doesn't really have horizontal resolution, does it?

      • kmeisthax 12 days ago

        No, resolution is not a digital concept. Pixels are a digital concept[0], but in an analog signal there is still a limit on how much frequency content you can resolve.

        CRTs additionally impose their own limit on how sharp the picture can get horizontally. Confusingly, these are called "television lines", even though you'd think that refers to the horizontal scanlines that do discretely divide up the picture vertically. Color CRTs further limit the amount of color resolution independent of luminance resolution, and how they do that depends on what specific phosphor pattern they use, how they separate electrons from each color gun, etc.

        [0] And one that's just as misleading as sample-and-hold diagrams of digital audio that make people go out and buy really expensive DACs for reproducing ultrasonics they can't hear

      • kmacleod 13 days ago

        Exactly. Black and white TVs worked very much like oscilloscopes and had almost no defined horizontal resolution. Very few color broadcast TVs were made with three separate beams and three separate color filters (like rear-projection sets). Your typical color TV effectively created "pixels" by its use of shadow masks and three separate beams. The analog signal, however, could still be in transition along the entire horizontal sweep. The biggest difference between high-end studio monitors and your average TV was how tightly packed the shadow mask was.

        • abstractbeliefs 12 days ago

          I don't think that's quite how it worked.

          The colour was created by different phosphors on the inside of the screen; there was nothing different about the electron beams. The number and pitch of these phosphor dots, i.e. the dot resolution, determined the resolution of the screen.

          The shadow mask was there to prevent the beam for, say, the red phosphors from sweeping over blue/green subpixels when moving from one pixel to another, since it wasn't practical to turn the beam off and back on while the steering coils were changing. Steering was continuous, so without a shadow mask you would illuminate the neighbouring subpixels you pass along the way.

          You could have done it all with a single beam, if you really wanted, but it's not very practical - you'd need to sweep slower since you could only illuminate one subpixel at a time, and it'd take much longer to steer, illuminate at the right level, and move to the next with the right beam power selected.

          • mlyle 12 days ago

            > I don't think that's quite how it worked.

            I think he described exactly what you did (in fewer words) in the second half of his post. ("Your typical color TV ...")

            The first half discusses that some projection screens had 3 different tubes. ("Very few broadcast TVs ...")

            Either that or he edited his post.

            • abstractbeliefs 12 days ago

              My objection is:

              > Your typical color TV effectively created "pixels" by their use of shadow masks and three separate beams.

              It's true that almost all colour TVs are pixelated and have shadow masks and three separate beams, but it's got the cause and consequence the wrong way around.

              The pixels are created by the phosphor dots, not the mask/tube design. It's possible to use this design without three tubes or the screen, strictly, but a consequence of this design suggests the optional addition of a shadow mask and additional tubes as a natural, later progression for improved performance.

              • mlyle 12 days ago

                > It's possible to use this design without three tubes or the screen, strictly,

                Without shadow masks or an aperture grille seems pretty dang hard. Even ignoring the alignment concerns, it requires a much, much higher bandwidth.

                The shadow mask and aperture grille approaches were the only approaches that ever really worked for color CRTs.

                Phosphors and some kind of masking is how you make different colors on a single tube. Indeed, the shadow mask or aperture grille was offset 3 times and used to etch for the phosphor coatings in most manufacturing processes.

                > ...the optional addition of a shadow mask and additional tubes as a natural, later progression for improved performance.

                There's just one tube-- multiple guns/beams, though.

        • butlike 11 days ago

          That really opened my eyes. I played GTAIII a long time ago on a portable black and white TV, and I vividly remember it looking so much better than our main TV (which was still a tube at the time)

      • cesarb 12 days ago

        AFAIK, you're still limited by the bandwidth of your analog signal. If you tried to treat it as if it had unlimited horizontal resolution, you'd use so much bandwidth that you'd end up stomping over the color subcarrier (and probably the audio subcarrier too).

      • Dwedit 12 days ago

        It has the NTSC subcarrier at ~3.58 MHz, and that limits color resolution.

        Let's say you're displaying a 640x480 screen on a TV. Your chroma information is sampled from an area about 2.1 pixels wide. So even if the signal is continuous, your color information is the result of sampling a boxed area centered on that pixel.

        And because you don't want the wave pattern to be displayed as luma, you also take the average luma of that area. Now you have a blurred fat color pixel, but it can be positioned anywhere continuously.
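
        For a rough back-of-the-envelope check (assumed numbers: 640 visible pixels over ~52.6 us of active line, ~1.3 MHz chroma bandwidth; the exact figure depends on those assumptions, but it lands in the same few-pixel ballpark):

          # Back-of-envelope NTSC chroma resolution with assumed figures.
          ACTIVE_LINE_S = 52.6e-6       # visible portion of one scan line
          PIXELS = 640
          SUBCARRIER_HZ = 3.579545e6    # NTSC color subcarrier
          CHROMA_BW_HZ = 1.3e6          # rough bandwidth of the wider (I) chroma channel

          pixel_s = ACTIVE_LINE_S / PIXELS                    # ~82 ns per pixel
          cycle_px = (1 / SUBCARRIER_HZ) / pixel_s            # ~3.4 px per subcarrier cycle
          chroma_samples = 2 * CHROMA_BW_HZ * ACTIVE_LINE_S   # Nyquist: ~137 chroma samples/line
          print(f"{cycle_px:.1f} px per subcarrier cycle, "
                f"{PIXELS / chroma_samples:.1f} px per resolvable chroma sample")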

        • epcoa 12 days ago

          NTSC is just one specific implementation of analog color video transmission; it has nothing to do with CRT functionality outside of North America, Japan, and a few other places. And when speaking about CRT monitors for computers, arcades and the like, they generally operate on discrete RGB signals with a wide variety of timings (https://en.wikipedia.org/wiki/Coordinated_Video_Timings), so NTSC isn't a thing there.

          Furthermore, even when using NTSC video timings, the 3.58 MHz subcarrier is specific to transmission of composite video (carrying brightness and color combined in the same signal). A DVD player with component output, for instance, produces NTSC timing but there is no subcarrier because the color signals are carried discretely.

      • jgalt212 12 days ago

        No, I think it's the other way. The horizontal lines are digital, the meridians are continuous.

        • tverbeure 12 days ago

          Think about it this way: lines are discrete, because they are separated by an HSYNC pulse that is used to fly back the electron beam from right to left. But the 'pixels' within a horizontal line are continuous: there is nothing in the video signal that delineates them.

          So the number of horizontal pixels is essentially limitless (if you ignore the bandwidth limit and the CRT scan raster) while the number of vertical pixels/lines is always discrete.

          • jgalt212 12 days ago

            There are a fixed number of horizontal lines. The number of implied vertical lines is determined by Nyquist's theorem.

            • saltcured 12 days ago

              To realign your decoder to interpret the earlier poster: their "horizontal resolution" is the frequency along the horizontal axis, i.e. the density of signal changes within one scan line.
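
              To put a rough number on that density, a Nyquist-style estimate with assumed broadcast NTSC figures (not from the thread):

                # Rough Nyquist estimate of NTSC horizontal luma detail (assumed figures).
                LUMA_BW_HZ = 4.2e6        # broadcast NTSC luma bandwidth
                ACTIVE_LINE_S = 52.6e-6   # visible portion of one scan line

                cycles = LUMA_BW_HZ * ACTIVE_LINE_S   # ~221 light/dark cycles per line
                samples = 2 * cycles                  # ~442 resolvable "pixels" (Nyquist)
                tv_lines = samples * 3 / 4            # "TV lines" are counted per picture height (4:3)
                print(f"~{cycles:.0f} cycles, ~{samples:.0f} samples, ~{tv_lines:.0f} TV lines")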

  • jrmg 12 days ago

    > the screens attracted dust like a magnet

    You just brought back vivid memory of something I haven’t thought about in years!

    For those who never experienced it: the constant firing of electrons into the screen caused it to become statically charged over time.

    It would, yes, attract dust - and if you ran your hand over a TV that had been on for a while, you could feel the ‘fuzz’ of the electric charge directly with your fingers. I can remember the feeling so vividly now!

    • sh1mmer 12 days ago

      And the great big thunk if you had a model that supported degaussing.

      • cpuguy83 12 days ago

        Loved doing a degaussing.

    • rightbyte 12 days ago

      Fingers? Wasn't it the hair on the top of your hand that felt it? I might remember wrong!

      • brirec 11 days ago

        You could definitely feel little pops/tingles on your fingertips if you touched the screen. A little bit while it's on, but even more strongly immediately after it turns off and the fields all collapse.

        But also, hair itself has no living tissue or nerves within. You can feel hair standing on end because it’s anchored in your skin, which does have nerves.

  • Workaccount2 13 days ago

    The core argument for CRTs is their low latency and high refresh rate. Nobody is using a CRT for picture quality (outside the aforementioned blurred retro gaming look).

    However this article is from 2019, and I know they have some pretty snazzy gaming monitors today that might well be better than old top-end CRTs.

    • jonhohle 13 days ago

      The low latency only works when the source is sending an analog signal which is essentially controlling the electron gun. An NES, for example, could decide between scan lines what the next scanline to emit would look like.

      Most modern systems (and computers) are sending digital signals where entire frames need to be constructed and reconstructed in a buffer before they can be displayed. Many HD CRT televisions have terrible latency because they’ll take the analog scan line signal, buffer it into a frame, scale it to the native CRT resolution, then scan it back out to the screen. A high end PVM might allow a straight path, but there is maybe one Toshiba HD CRT that doesn’t introduce several frames of latency (iirc).

      That said, from 1999 to 2008 I ran 1600x1200 on 19in CRTs, and except for professional LCDs, nothing had resolution, pitch, and color that came close. For me, 2008 was the inflection point where the cost and quality of LCDs exceeded CRTs.

      • toast0 13 days ago

        > Most modern systems (and computers) are sending digital signals where entire frames need to be constructed and reconstructed in a buffer before they can be displayed.

        HDMI and DVI-D are a digital representation of a TV signal --- there's all the blanking and porches and stuff (audio data is often interleaved into the blanking periods). You could process that in realtime, even though most displays don't (including CRT HDTVs; I trust that the one Toshiba you mention doesn't).
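
        As a concrete illustration of how much raster structure survives in the digital signal, here are the stock CEA-861 1080p60 timing numbers:

          # Standard CEA-861 1080p60 timing: the digital signal still carries
          # sync and blanking intervals ("porches"), just like an analog raster.
          h_active, h_front, h_sync, h_back = 1920, 88, 44, 148
          v_active, v_front, v_sync, v_back = 1080, 4, 5, 36

          h_total = h_active + h_front + h_sync + h_back   # 2200 pixel clocks per line
          v_total = v_active + v_front + v_sync + v_back   # 1125 lines per frame
          pixel_clock = h_total * v_total * 60             # 148.5 MHz
          print(h_total, v_total, pixel_clock / 1e6, "MHz")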

      • cmrdporcupine 12 days ago

        What you're describing is fully a product of the display controller architecture, not the digital signal. You can "chase the beam" with HDMI/DVI just as well (or better) as you can with analog. There's no need to buffer the whole screen and be even one frame latent.

        I've done it in Verilog on FPGA, for example.

        But we do so on our machines because that's how the GPU/display controller pipeline works easiest from a software POV. That latency would be present on a CRT as well. What would be missing is the internal latency present in some monitors.

        • mabster 12 days ago

          I was doing the "chase the beam" approach, but never got it off the ground (home project, mostly curiosity). But always in the back of my mind was: even if you're doing everything in low latency HDMI signals, there's no guarantee that the display isn't still buffering it, even if you have "game mode" enabled on your TV.

          • cmrdporcupine 12 days ago

            yes, absolutely, the monitor vendor can do whatever they want

            still, it's usually only 2 or 3 frames in game mode on a good display

            • jonhohle 11 days ago

              > on a good display

              This is a thing in the last 5-10 years, but was absolutely not a concern to the majority of display manufacturers before gamers complained. 100ms is significant in twitch games where network latencies are 20ms or lower.

      • tverbeure 12 days ago

        It makes little sense to buffer a full frame before up scaling. Why would you do that? It’s a total waste of DRAM bandwidth too.

        The latency incurred for upscaling depends on the number of vertical filter taps and the horizontal scan time. We're talking on the order of 10 microseconds.

        The only exception is if you’re using a super fancy machine learning whole-frame upscaling algorithm, but that’s not something you’ll find in an old CRT.
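
        A rough sketch of the scale involved, with made-up but plausible numbers (4 vertical taps, 1080p60 input; real designs vary):

          # Rough line-buffer latency for a vertical polyphase upscaler.
          taps = 4
          line_time_s = 1 / (60 * 1125)     # one 1080p60 input line, ~14.8 us
          latency_s = taps * line_time_s    # roughly `taps` lines buffered before output can start
          print(f"~{latency_s * 1e6:.0f} us of scaler latency, vs ~{1e6 / 60:.0f} us for a full frame")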

    • Aurornis 12 days ago

      > The core argument for CRT's is their low latency and high refresh rate

      The article is outdated (like you said) because LCD/OLED displays have long since surpassed CRTs in latency and refresh rate.

      A modern gaming LCD can refresh the entire screen multiple times over before an old CRT scans the entire frame and returns to the top to redraw the next one.

      • KennyBlanken 12 days ago

        Yeah, and 90% of the commenters in this discussion clearly haven't looked at a TV or computer monitor made in the last few years; they're at least 5 years behind the market.

        There are Samsung TVs that have around 6ms of input delay, and many, many gaming monitors have 1-2ms of input delay.

        If you spend $400 you can get a 1440p monitor with 1ms input delay and 2-3ms grey-to-grey timing that will do 240Hz.

        All with more contrast, color gamut, and resolution.

      • remlov 12 days ago

        Point being? Your modern LCD still uses buffers and sample-and-hold. They will always have inherent input latency compared to a CRT, regardless of how fast they can refresh.

        • Brananarchy 12 days ago

          They will have inherent _output_ latency.

    • WHYLEE1991 12 days ago

      As someone who almost never plays video games but owns several CRTs for the sake of media, I can attest that at least some of us own CRTs purely for their picture quality, or in my case because I think pre-2000s 4:3 media was also kind of intended to be watched on screens like that (much in the same way I see video gamers arguing).

      I'll take a wild guess, though, that my group (CRT media watchers) is slightly harder to take advantage of with actual hardware than video gamers, which I'd guess is the reason there are few articles or HOLY GRAIL CRTs like this FW900 widescreen in the media-watching community. Not that we aren't often suckers for things like ORIGINAL VINTAGE POSTERS and SEALED MEDIA lol.

    • butlike 11 days ago

      I mean, a decidedly smaller argument (but still an argument!) for CRTs is you could put doodads and action figures on top of the box.

  • LocalH 13 days ago

    Rhythm gaming is a big one for me. All other display devices commonly used for gaming have at least a small amount of lag between input and final display. I have a midrange LED set that gives me something like 12ms lag calibration when I play Rock Band 3 on it. It's a very slight difference, but it is noticeable, and I'd much rather have the 0 calibration that a CRT could provide.

    • comex 13 days ago

      > and I'd much rather have the 0 calibration that a CRT could provide.

      Would it really be 0 though? Assuming 60Hz, the bottom of each frame is scanned out 16ms after the top. Assuming that Rock Band 3 renders an entire frame before displaying it (definitely true) and that it actually renders at 60FPS as opposed to rendering at a higher resolution and tearing (definitely true on console, might not be true on an emulator?), the video latency will range from 0ms for the top of the frame to 16ms for the bottom, for an average of 8ms.

      Admittedly, I don't know what Rock Band 3 calibration numbers actually measure, e.g. whether they already take into account factors like this.

      If you can manage to render at a high frame rate then you could reduce latency by tearing, but at that point, I feel like you're leaving a lot on the table by not using a 240Hz OLED or something, which can show the entirety of every frame.

      Supposedly OLED has comparable response times to CRT anyway. The article says that OLED gaming monitors are unattainable, but it's 5 years old and things have changed.
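
      Spelling out that scanout math for a few refresh rates (raster geometry only, ignoring render and input latency):

        # Average raster-scan latency for a frame that finishes rendering right
        # before scanout: anywhere from 0 to one refresh period, half on average.
        for hz in (60, 120, 240):
            period_ms = 1000 / hz
            print(f"{hz} Hz: 0 to {period_ms:.1f} ms, ~{period_ms / 2:.1f} ms on average")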

      • LocalH 12 days ago

        The game is pretty good about taking things like that into account. Even aside from that, a CRT generally adds no lag of its own, displaying the scanline as soon as it begins receiving it, assuming a stable set of sync pulses.

        The company that made Rock Band, Harmonix Music Systems, has amazing beatmatching technology that has been used in countless games and is even available in the newest versions of Unreal Engine, thanks to being currently owned by Epic Games. They've been developing this technology since the mid 90s.

        The main difference between CRTs and other display types, is that the other display types are all largely sample-and-hold - they display the whole frame line by line, hold it for an amount of time, then begin the process again. They generally lack the decay that happens with a CRT, where the image fades by the time the next frame is to be drawn. Some displays can do black frame insertion, but this is a poor substitute. It requires double the frame rate of the intended output, and it reduces effective brightness. I'd like to see an OLED display that could simulate the phosphor decay on a per-pixel basis, to better simulate the scanning process of a CRT.
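
        A toy sketch of what per-pixel "rolling scan plus phosphor decay" would mean conceptually (made-up decay constant, nothing like a real emulator shader):

          import numpy as np

          # Toy "rolling scan + phosphor decay": each scanned line is lit at full
          # brightness while everything lit earlier keeps fading, unlike sample-and-hold.
          DECAY_PER_LINE = 0.97    # made-up persistence factor per scanline time
          HEIGHT, WIDTH = 8, 4     # tiny raster just for illustration

          screen = np.zeros((HEIGHT, WIDTH))
          frame = np.ones((HEIGHT, WIDTH))   # incoming frame: all white
          for y in range(HEIGHT):            # "scan out" one line at a time
              screen *= DECAY_PER_LINE       # previously lit lines fade a little
              screen[y] = frame[y]           # the freshly scanned line is at full brightness
          print(screen[:, 0].round(2))       # earlier lines have faded the most by end of scan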

      • toast0 12 days ago

        I agree that Rockband almost certainly does render a whole frame before scanout. So, yes, there's necessary latency there between the input processing and game logic and everything and the output. Higher refresh mitigates that, but that's not really an option on the kinds of gaming machines Rockband runs on; hopefully it's all locked at 60Hz, because that's what TVs run on (unless maybe ew PAL, but ew), and all the machines have a 240p mode, if they're not cool enough to run 480p or better.

        But modern displays add unnecessary latency on top of that. HDTV CRTs often did too. Input devices sometimes add latency too. It's really not too awful as long as it's consistent; real musical instruments have latency too (especially pipe organs! but organists manage to figure it out). It's not as much fun if you need to hit the note when it's visibly far from the 'goal' section, or if the judging becomes visible a long time after the note --- shortening the feedback loop is good.

        • LocalH 12 days ago

          Rock Band's core game logic is fairly well decoupled from frame rate for the most part, actually. I usually run Rock Band 3 (with a fan mod called RB3 Deluxe installed) on an emulated PS3 at 75Hz and it runs perfectly. The only real issue is certain venue post processing effects behave weirdly at high frame rates (like the handheld camera shake effect) but with the aforementioned mod we can disable that.

      • ssl-3 12 days ago

        An analog CRT is as close to zero latency as the source (in this example, Rock Band 3) can provide[0].

        More-modern displays (obviously including LCD and OLED, but also CRT displays that themselves have a framebuffer, as was also a thing) always add additional latency.

        0: It's not like we can just somehow convert an existing closed-source thing like Rock Band 3 into a 240Hz source.

      • sjm 13 days ago

        CRTs far exceed 60Hz though. This FW900 for example goes up to 160Hz.

    • gamepsys 12 days ago

      Look at a Beatmania IIDX Lightning Cab or a Sound Voltex Valkyrie cab. Modern flat panels are providing better rhythm gaming experiences than CRTs ever did.

      • iamevn 12 days ago

        I don't think this is as simple as that (and it's not entirely the display's fault). 120hz iidx is a big upgrade over the previous lcds but I'm not sure it is over the games that ran on custom hardware that wasn't just a Windows PC.

        I've got a Firebeat at home that I play Pop'n on, and it feels magical to play, with absolutely minimal difference between where the audio and visuals are and where the game expects you to hit notes. Same cabinet but with a Windows PC plugged in for modern games feels worse thanks to all the extra latency you get going through Windows (it's especially bad with the audio).

        I know modern iidx and sdvx use newer audio apis that are lower latency but it's still more delay than the pre-PC systems imo.

        • gamepsys 12 days ago

          > I'm not sure it is over the games that ran on custom hardware that wasn't just a Windows PC.

          It is. I've played on both modern Lightning cabs and on a variety of old official hardware setups in the CRT and Twinkle era. I'd rather play on a Lightning.

          Specifically, there is a source of lag even greater than the display lag introduced on almost all of the official LCDs. The input lag, or lag between a button being pressed and the game engine registering the input, is critically important in keysounded rhythm games. The bio2 board introduced in the IIDX 25 hardware upgrade reduced this lag by more than the lag that the official LCD monitors introduced.

          The "custom hardware" or Twinkle system used by Konami for IIDX 1-8 is essentially modified PS1 hardware. It's really not much different than building on top of Windows Embedded with the amount of custom hardware inside the cabinets.

      • mbilker 12 days ago

        I can definitely agree on this point. Same for the newer Taiko versions utilizing a 120 Hz panel as well.

        High refresh rate displays have been gaining a lot of traction in the past few years with 1080p or 1440p displays running at 120 or 144 Hz being affordable.

    • timw4mail 13 days ago

      I've played modern games on CRTs through HDMI->VGA convertors, and still felt lower latency than LCD. OLEDs will eventually catch up, I think, but LCDs are always going to have some lag.

      And it's not the refresh rate, it's the time from input -> display picture updates. With a CRT that can happen during the current field being displayed, but it will take at least one frame for any LCD.

      • tverbeure 12 days ago

        This is just not true.

        You can update an LCD in the middle of the screen just the way you can for CRT.

        The only hard limit to the latency of an LCD panel is the finite amount of time that’s needed to flip the fluid in the LCD cells. There is nothing that requires a full frame delay.

        Most LCD monitors have a frame buffer to look back in time for things like overdrive compensation, but you can easily do without. In fact, some of the monitor scaler prototypes that I have worked on were initially direct drive because we hadn't gotten the DRAM interface up and running yet.

        Desktop LCD panels themselves typically don't have memory; laptop panels do, but that's for power-saving reasons (to avoid the power cost of transferring the data over a high-speed link and to allow the source to power down).
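
        For context on the overdrive point, a toy sketch of the idea: the panel is driven past the target in proportion to how far the pixel has to move, which is why you want last frame's value (real scalers use calibrated lookup tables, not this linear guess):

          # Toy overdrive: push the drive level past the target based on how far the
          # pixel has to move, using last frame's value (hence the frame buffer), then clamp.
          def overdrive(target, previous, gain=0.4):   # gain is a made-up tuning constant
              boosted = target + gain * (target - previous)
              return min(max(boosted, 0), 255)

          print(overdrive(200, 50))   # rising transition: 260 requested, clamped to 255
          print(overdrive(50, 200))   # falling transition: -10 requested, clamped to 0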

        • SketchySeaBeast 12 days ago

          > You can update an LCD in the middle of the screen just the way you can for CRT.

          And in fact this is such an annoyance that there are multiple different ways people try to deal with the screen tearing, between V-Sync, G-Sync, and FreeSync.

          • tverbeure 12 days ago

            That monitor scaler without DRAM was the prototype of the first G version. ;-)

    • KennyBlanken 12 days ago

      Sub-6ms TVs are common, and many, many gaming monitors have 1-2, maybe 3ms of delay. If you only need 1080p, it doesn't even cost much.

  • autoexec 12 days ago

    > However modern content on modern, decent, LCD panels and especially on OLED panels, blows CRTs out of the water.

    You can find some modern screens that do better at certain things than others, but compared with what most people have, CRTs are still likely to have better color accuracy, better contrast, zero motion blur, no dead/stuck pixels, far better viewing angles, no fixed resolutions, higher refresh rates, and zero backlight bleed. It's not all nostalgia (although I really do miss the degauss button), but CRTs were also from before DRM, data collection, and ads being pushed at you, and it's hard not to be nostalgic about that.

    • buran77 12 days ago

      The best CRTs may have been OK, but most CRTs in the history of their kind were not what you describe. I've seen and used my share of them over many decades. Even in the 2000s, any of the hundreds of CRTs in an office would have been better characterized by: some CRT glow ruining the contrast particularly around text, constantly needing focus and geometry adjustments if yours was fancy enough to offer the option, commonly 60Hz but somehow always flickering, a bulging screen, in time weird color distortions or fading in some corner of the screen that never went away, the inevitable scratch in the antiglare coating, peaking at 1080i resolution if you were willing to sell your aunt for it, the fine stabilizing wires from the aperture grille creating OCD-triggering shadows, huge but still realistically limited to a 24-32" screen (in reality most were still huge but only 15-17"), hernia-inducingly heavy (and above 32", 300lbs worth, it gave your dolly a hernia), hard to adjust ergonomically, likely some buzzing noise, dust magnets, the smell of overheated plastic and dust being burned off the tube, and I'll stop here before it starts looking like CRTs stole my girlfriend and spit in my coffee.

      Don't get me wrong, that smell will always take me back to a time when I had hair and it will without a doubt put a happy smile on my face any day. But it's all nostalgia anchored by a couple of technical advantages that pale in the face of overall tech today.

      • olyjohn 12 days ago

        Limited to 1080i? That's a 16x9 aspect ratio... I had a CRT that did 1600x1200 back in like 1995. Are you thinking of just TVs? Because most of them did suck. There were lots of excellent computer monitors however...

        • WWLink 12 days ago

          One thing that does make me chuckle when I see "retro" setups on reddit is the use of a crappy 13"-14" monitor from 1993 surrounded by hardware from 1998 or 2002 or so.

          This doesn't surprise me though. The high end Sony monitors always had trouble with flyback transformers and people that had those were quick to replace them with whatever the latest tech at the time was. The crappy 14" monitors were cockroaches that never broke. Now they get to torture zoomers with 60hz 800x600 lol.

        • hellotheretoday 12 days ago

          The limit was mainly due to lack of HDMI support. There are obviously some CRTs with HDMI but they're fairly uncommon, as display manufacturers basically abandoned CRTs quickly in favor of significantly lighter yet larger display tech that was much more popular with average consumers. Obviously CRTs can do much higher resolutions over DVI or even VGA. With home AV stuff they abandoned those connectors, as well as component, early on in favor of HDMI.

          One of the other niceties associated with CRTs as a result: a monitor that you could just plug into a source and it would work, from a time when manufacturers didn't have the ability to kneecap your setup with handshake bullshit. Now hooking up stuff is just a bit more likely to be frustrating, and nonstandard setups are much more likely to simply not work without pricey extra hardware. Want to split your Blu-ray player or PlayStation 5 between an OLED television and a projector? It may just work with a basic HDMI splitter, or it may not. It may only work with certain HDMI splitters. It may only work if one display is on at a time. With DVI you could've gotten a basic splitter and it would've worked guaranteed regardless, but HDCP means that HDMI splitting will give you headaches, because god forbid you hook up a secondary source to record a copy of content you purchased. Despite that, it's still easily defeated with relatively cheap hardware, making it completely and utterly pointless as well.

    • majormajor 12 days ago

      > compared with what most people have CRTs are still likely to have better color accuracy, better contrast, zero motion blur, no dead/stuck pixels, far better viewing angles, no fixed resolutions, higher refresh rates, and zero backlight bleed

      You're comparing the best CRTs ever made with the "average" modern LCD.

      The run of the mill CRT that "most people had" was dim, blurry, flickery, and bloom-y.

    • Aurornis 12 days ago

      > CRTs are still likely to have ... better contrast,

      This isn't true, and hasn't been true for decades. CRTs have relatively poor display contrast due to all of the light bleed. CRT contrast in real scenes (that is, not just comparing all-black to all-white screens) can be around 100:1 or 200:1. LCDs have been better than that for a very long time. Even the cheap ones.

      > zero motion blur,

      Also not true. CRT phosphors have some persistence. Even the best CRT monitors back in the day had several milliseconds of persistence, depending on what you're measuring. Definitely not zero!

      Modern LCD monitors have motion blur in a similar range, perhaps less if you're getting a gaming panel.

      > no fixed resolutions

      At the cost of some very blurry text, of course.

      > You can find some modern screens that do better at certain things than others but compared with what most people have

      It's almost certainly easier to find a high performing LCD than a quality CRT. A working FW900 CRT can easily fetch $1000-2000, which will buy a modern OLED display that completely blows it out of the water.

      The only reason to buy a good CRT monitor is for the nostalgia/retro effects.

    • kllrnohj 12 days ago

      Most people have IPS or OLED phones with accuracy, gamut, brightness, contrast, density, and responsiveness that the best CRTs don't come close to.

      You're absolutely deep in nostalgia land lol

    • gloryjulio 12 days ago

      Current OLED panels with total blackness + HDR are way beyond CRTs now. The only drawback is that they can suffer burn-in.

      • pezezin 12 days ago

        CRTs can also suffer from burn-in. I have seen some retro arcades with text like "insert coin" burnt in.

        • oyashirochama 12 days ago

          I mean, OLEDs have the reverse issue: they burn out, specifically on blue, and it mostly depends on brightness and screen-on time; heat can affect it too.

  • theshackleford 13 days ago

    > However modern content on modern, decent, LCD panels and especially on OLED panels, blows CRTs out of the water.

    Sure, if you like motion blur that makes your content look like a slideshow. I personally don’t. It’s embarrassing that I’m still faced with worse motion qualities than I had 30 years ago.

    The sample and hold motion blur of LCD/OLED ruined gaming for me for a long time. 240FPS OLED panels have begun to just make it bearable again when such rates can be achieved.

    • kllrnohj 12 days ago

      OLED has near instant pixel response times. No idea what display you are looking at, but if you're getting smearing of any kind it wasn't an OLED

      • simoncion 12 days ago

        > ...if you're getting smearing of any kind it wasn't an OLED

        I have a Pixel 5a.

        It has _comically_ slow switching times between dark-ish colors and blacks, which is most obvious with dark blue-family colors.

        If you have such a phone, go check out the "Gunnerkrigg Court" webcomic (<https://gunnerkrigg.com>), scroll down to the border between the comic and the blog part, and jostle the viewport up and down.

        If you go check out today's episode (comic 2929), you get an especially long trail of slow-transitioning pixels in the art that is particularly noticeable in the bottom panels of the comic, itself. (Examine the black borders of the people's hair, as well as the black frame of the comic panels. If you're not noticing it, scroll at a rate of travel of roughly one-quarter of the screen's height per second.)

        • gknoy 11 days ago

          I realize that this is not even on topic, but _what a jolt of nostalgia_. I haven't even thought of that comic in what feels like a decade. Perhaps that's the optimal way of reading a webcomic (waiting until it's binge-able)? It's tripled in length since I last read it, and now my phone actually will display it :D (now if only the web interface tracked what page I was on, so that coming back months later didn't show Today's page, and instead showed where I last read ...)

          On-topic, I have tried jostling the page on my Pixel 6a (also a 60hz OLED screen), and I don't notice _at all_ the display latency or issues that you mention. The only time I've noticed any kind of delay or ghosting like that is when I have it in extremely low brightness settings (reading at night in the dark), with "extra dim" enabled.

          • simoncion 11 days ago

            > Perhaps that's the optimal way of reading a webcomic (waiting until it's binge-able)?

            Man, I tend to think so. (Though... I say that, and I've been reading Gunnerkrigg thrice a week for what seems like five years now, so sometimes I think one thing and do another. ;))

            It's a good comic, IMO.

            > The only time I've noticed any kind of delay or ghosting like that is when I have it in extremely low brightness settings (reading at night in the dark), with "extra dim" enabled.

            I would have never thought to play with the brightness settings.

            So, I nearly always have my screen on "8->10% brightness, automatic brightness adjust, 'extra dim' disabled". I played with the brightness settings just now and discovered that I have to get the brightness up to 100% before the smearing is _nearly_ gone. At 50%, the trails are much, MUCH smaller than at ~10%, but still obviously there. 75% is not _that_ much better than 50%.

            > [I have a] Pixel 6a

            Hmm! Maybe in the year between the 5a and the 6a's release, they got on display manufacturers' collective asses to substantially reduce the amount of color smearing in the display that they selected for their next phone.

        • mrguyorama 12 days ago

          This is a limitation/downside/flaw of "VA" LCD panels. OLED panels will not experience this, and neither will several other LCD technologies.

          • simoncion 11 days ago

            > This is a limitation/downside/flaw of "VA" LCD panels. OLED panels will not experience this...

            I don't know what to tell you, man. I'm seeing this behavior on a Pixel 5a. Purchased _directly_ from Google's store. Google says this model of phone has a OLED display... an HDR one, at that. [0]

            Not even the worst color transitions on my VA monitor (a BenQ EWE3270U) streak this badly.

            [0] <https://blog.google/products/pixel/pixel5a-with-5g-new-googl...>

      • theshackleford 12 days ago

        > OLED has near instant pixel response times.

        Pixel response times are one small part of the equation. The vast majority of modern display blur comes from sample and hold motion blur, not pixel transition times.

        https://blurbusters.com/faq/oled-motion-blur/

        > No idea what display you are looking at, but if you're getting smearing of any kind it wasn't an OLED

        I am looking at a 27" LG 27GR95QE, a 240Hz WOLED panel produced by LG. It's just one of multiple OLED panels I currently own, in addition to a few high-end LCD FALD panels, a couple of 21" Trinitrons and a handful of JVC broadcast CRTs.
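
        The rule of thumb behind that link, as a quick sketch (example pan speed; hold times are approximate):

          # Rule of thumb for eye-tracking blur on a sample-and-hold display:
          # blur width ~= how long each frame stays lit * how fast the eye tracks.
          SPEED_PX_PER_S = 960   # example pan speed (a common test-pattern speed)
          for name, persistence_ms in [("60 Hz hold", 16.7),
                                       ("240 Hz hold", 4.2),
                                       ("CRT-like 1 ms flash", 1.0)]:
              blur_px = SPEED_PX_PER_S * persistence_ms / 1000
              print(f"{name}: ~{blur_px:.0f} px of smear")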

    • Sohcahtoa82 12 days ago

      > Sure, if you like motion blur that makes your content look like a slideshow. I personally don’t. It’s embarrassing that I’m still faced with worse motion qualities than I had 30 years ago.

      What crappy panels are you buying?

      I know 10 years ago, IPS panels were known for having terrible response times, but there are other types of LCD that have pixel response times in the 1 ms range.

      • simoncion 12 days ago

        Manufacturers quote those response times, but worst-case response times are often at least a full frame, and sometimes multiple frames.

        Unlike with CRTs, response time is a function of what colors you're switching to and from, and their intensity. It's highly variable, and it makes me so sad whenever an otherwise-lovely game happens to put colors together that very obviously smear on my monitors. (Also, see above for my report on the laughably bad color smearing on my Pixel 5a.)

        RTINGS usually has more detailed breakdowns of best- and worst-case color transition times for the monitors they test, as well as over- and undershoot reports... which is a phenomenon that's just as bad as slow transitions between colors.

      • theshackleford 12 days ago

        > What crappy panels are you buying?

        Exactly zero. I own the highest end panels you can buy, regardless of whether it's OLED or LCD. I have a variety of CRT displays, desktop and broadcast on top of this.

        > I know 10 years ago, IPS panels were known for having terrible response times, but there are other types of LCD that have pixel response times in the 1 ms range.

        The vast majority of motion blur on today's displays is due to sample and hold, not pixel response times.

        https://blurbusters.com/faq/oled-motion-blur/

        • Sohcahtoa82 11 days ago

          Ahhh...I understand now.

          You're talking about motion blur created from eye tracking, rather than slow pixel response like I had assumed.

          Meh, I don't see it as a problem, really. And as frame rates get higher, it becomes even less of a problem, as the distance an object moves across the screen between each frame is less. I'm already running at 144 fps, and I'll probably be looking for 240 fps in a few years.

          • theshackleford 11 days ago

            > Meh, I don't see it as a problem, really

            You personally might not see it as one, but it objectively is. Whether or not you personally are bothered by it is irrelevant to the reality of sample-and-hold displays' poor motion handling. I am happy to hear it does not bother you, and I wish I could be you, however I am not. Probably a side effect of having spent my entire life, until not too long ago, with displays that were not complete garbage in this respect.

            > And as frame rates get higher, it becomes even less of a problem

            I have the best hardware money can buy today, but I can't reach my 240Hz capability in any relatively modern title. It mostly only helps in esports/far older content. It also ignores the fact that we have decades' worth of content that will never exceed 60FPS, which looks terrible on modern displays today and will continue to do so.

            > I'll probably be looking for 240 fps in a few years.

            240fps has made my usage of modern displays far more tolerable; it still can't touch a CRT though. I'll be moving to a 480Hz OLED soon enough to further improve that. However, all of these increased-refresh-rate displays suffer from the same issue.

            Not everything can or will run at the refresh rates required to take advantage of the reduced motion blur, not even with top-of-the-line hardware. This will probably eventually be resolved with software/hardware-based framerate amplification, but we ain't there today and probably won't be for at least another half decade or more.

    • foresto 12 days ago

      Have you tried a DLP projector with a high-speed color wheel? When I still owned one, the sample-and-hold slide show effect was far less pronounced than it is on any LCD display I've used (assuming the same frame rate).

      • theshackleford 12 days ago

        Yup, until just a few years ago I gamed almost exclusively on a DLP projector on a 120" projection screen when using displays outside of my CRT's. I miss it and have considered going that route again, that or picking up a nice late generation 1080P plasma.

  • majormajor 12 days ago

    > Games designed for CRT often looked their best on the CRT because of the subtle blending and screen pattern which the graphics were designed around. The games weren't meant to look like pixelated blocks with clearly defined edges like they do on a modern screen, and they didn't look that way on a CRT.

    This seems to suppose a world where, had LCDs been common, people wouldn't have used low-resolution bitmaps. But... there were not a lot of alternative technologies available! There are a few vector-art CRT arcade games out there, but generally very black-and-white/virtual-boy-style, and I believe bitmap graphics even without super-high-resolution would've won out anyway.

    This also suggests that you couldn't tell the difference as much between NES/SNES/Playstation resolutions of bitmap-art games, when you DEFINITELY could. The old games never looked "great", it was just the best we had.

    • bluefirebrand 12 days ago

      > This seems to suppose a world where, had LCDs been common, people wouldn't have used low-resolution bitmaps. But... there were not a lot of alternative technologies available

      It's less about "They would not have used low res bitmaps for displaying graphics" and more about "The artists would likely have arranged the pixels in the bitmaps in different ways"

      There are some really interesting articles out there about how the artists for old NES and SNES games designed the sprites to look better with scanlines, and without scanlines (like when playing on modern emulators) everything kind of looks worse than it did when it was rendered on CRTs

      • majormajor 12 days ago

        It's not the impact that's claimed at all. No sprite magic in the world will get around trying to display 200 pixels of image on a 65" modern TV. Play an NES game on a super-high-quality 13" flat panel and it looks a lot better than on a 75" one even without any fancy scanline filters or such.

        Which pixels in Mario would you move around, exactly, to prevent big square pixels from being really obvious?

        Hell, those games didn't really look great on 60" RPTV CRTs either. (For anything other than 4x split-screen, my friends and I always preferred a smaller screen since the big screen exposed all the limitations of the graphics and by the N64 era PC games had advanced to the point that we knew what we were missing.)

        And of course anything Mode 7 or full-3D era suffers all the same problems of looking bad blown up to modern display sizes without "CRT optimized sprites" being a factor anymore since the models are seen from all sorts of perspectives and sizes.

        They looked better back then because it was new and magical and there were no 4k games to compare against, not because 256 pixels looked more like 1000. Like with VHS vs DVD - it was absolutely easy to tell even on a pretty small CRT.

        Look at this video - https://www.youtube.com/watch?v=gNx89J0MBHA - the pixels are perfectly apparent even just viewing it on a laptop screen in a browser on a video from a fairly big CRT.
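
        To put numbers on the screen-size point, a quick sketch assuming a 256-pixel-wide image scaled to fill the height of a 16:9 panel:

          import math

          # Physical width of one 256-wide game pixel on different panels, assuming
          # the 4:3 image is scaled (pillarboxed) to fill the height of a 16:9 screen.
          def game_pixel_mm(diag_in, game_width_px=256):
              height_in = diag_in * 9 / math.hypot(16, 9)   # panel height from the diagonal
              content_width_in = height_in * 4 / 3          # pillarboxed 4:3 content width
              return content_width_in * 25.4 / game_width_px

          for size in (13, 27, 65):
              print(f'{size}": each game pixel is ~{game_pixel_mm(size):.1f} mm wide')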

  • justinpombrio 12 days ago

    That's all nice, but you can't draw on an LCD screen with a whiteboard marker.

  • bernawil 12 days ago

    I just want to clear up the myth that the pixel art style wasn't real and is just an artifact of running games meant for CRT displays on modern screens: portables had LCD screens from the beginning and their art style was absolutely intended to look blocky. I'd say the biggest influence on pixel art today is the Game Boy generation, probably the Pokemon games.

  • Yeul 12 days ago

    CRT versus modern screens is a lot like vinyl versus FLAC. Hipsters gonna hipster.

  • hulitu 12 days ago

    > Games designed for CRT often looked their best on the CRT because of the subtle blending and screen pattern which the graphics were designed around

    No. They looked their best because CRTs were able to reproduce more colors than an LCD.

    Only now, after more than 10 years, do LCDs with HDR come close to CRTs.

    I vividly remember my first LCDs, which were marketed as 24-bit, but you could see banding in colour gradients like in 16-bit mode on CRTs.

    • epcoa 12 days ago

      > No. They looked their best because CRT were able to reproduce more colors than an a LCD.

      > LCDs with HDR come close to CRTs.

      This is pretty meaningless and conflates gamut with dynamic range. The vast majority of CRTs back in the day would be driven with 8-bit-per-channel RGB DACs, so not HDR, and most CRTs would have an sRGB or similar gamut (so not a wide gamut). It is true that both the dynamic range and gamut of cheaper LCD panels are pretty poor (~5 bits per channel) and not even complete sRGB, and this set the tone for many low-cost TN displays of the early 2000s (and still adorns the low end of laptops and even some ThinkPads to this day).

      However, affordable LCD monitors have been around for YEARS with wide gamuts (e.g. Adobe RGB or DCI-P3), superior to all but the most expensive reference CRT monitors that virtually no one owned, and long before HDR became commonplace. I bought a 98% Adobe RGB monitor about 14 years ago for less than $800; in color reproduction and contrast it completely blew any CRT I ever owned out of the water. But even a cheap <$300 IPS display on sale at any point in the past 15 years, including all MacBooks, will exceed most CRTs as well. In practice CRTs also have a middling contrast ratio unless you work in a pitch-dark room, which almost no one does.

      > I remember vividly my first LCDs who were marketed as 24 bit

      IPS and true 8-bit TN panels have been mainstream for a long time now. Nothing to do with recent uptake of HDR.
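
      For reference, the bit-depth difference in plain numbers (levels per channel are what determine visible banding):

        # Levels per channel and total colors for a few panel bit depths; fewer
        # levels per channel is what shows up as banding in smooth gradients.
        for bits in (5, 6, 8, 10):
            levels = 2 ** bits
            print(f"{bits}-bit/channel: {levels:4d} levels, {levels ** 3:,} colors")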

      • guenthert 12 days ago

        > unless you work in a pitch dark room, which almost no one does.

        Is that actually true? In the very early 2000s I worked at a start-up (with a not-quite-well-defined target; it ended up in the game industry) with a darkened office. At that time CRTs were still common and IIRC we all used them, and I don't recall having minded. Twenty years later I worked again at a small software shop which had the main office darkened for unclear reasons (I suspect a personal quirk of the fellow running that part of the company), as we now all had LCDs (of course) and there was little to no visual media produced. In between, I visited the earlier company, which had grown and moved a few times since, but still kept the office dark (not sure if to benefit content creators or as a fashion statement).

        Personally, I rather not sit in the dark during the day for prolonged times; at the very least, it messes with the sleep rhythm.

        Oh, flashback to having visited the lab of a physics PhD student who was working on ultra-short pulse lasers in the early nineties. The underground lab was totally dark all day, only every five minutes or so briefly flooded with light meant to pump the lasers. That was the time I decided that physics isn't for me ...

        • epcoa 11 days ago

          > CRTs were still common and iirc we all used them and I don't recall to have minded.

          I am addressing the way some CRT enthusiasts claim, or pine for, near-infinite contrast ratios, which of course would be much better than any LCD, because the beam can theoretically be completely off or full on, unlike an LCD, which is a filter. But the reality is that that amazing contrast ratio only holds with no ambient light and very little bright material displayed. Look up on/off CR and ANSI CR. These figures of merit are also distinct from black level.

          At the end of the day it’s a box with a huge glass window.

  • theodric 12 days ago

    Here's my €0.02:

    In 1998, I liked the active matrix LCD on my Gateway laptop a lot more than the GDM-17E11 Trinitron on my SGI because the Trinitron had these 2 fucking wires across the picture, which annoyed me, and also the RGB convergence was off on the edges, and the geometry was poor and heavily bowed by about 0.5" on the bottom. Gross.

    In 2020, I bought a cheap new-old-stock CRT monitor for retro nonsense, and threw it on an underpowered Linux system in my office for funzies, and I was like HOLY FUCK the response time on this is INSANE, and the blacks are UNREAL. I felt a Responsive Computer-Feeling I hadn't felt since using my CRT'd dual PPro Debian system in college. Blew away every aspect of every LCD in the house, even my overpriced top-of-the-range Sony LCD TV-- apart from the abysmal, headache-inducing max 60Hz refresh rate at a low 1024x768 resolution, distracting flyback transformer whine, and high-ish power consumption, that is.

    Conclusions: every monitor sucks. Always have, always will. CRTs flicker, LCDs muddle, OLEDs over-contrast and burn in. With apologies to Dorothy Parker: you might as well live.

    But the glow is real! Gorgeous! I want it on my 70s terminals, but don't need it on my workstation.

  • detourdog 12 days ago

    I had the opportunity in the early 90s to work with Sony analog HDTV products. The picture was the best I have ever seen, and it immediately demonstrated how bad digital HDTV is compared to analog HDTV.

  • ktosobcy 12 days ago

    Eh... I don't understand the current fashion for "pixel art" games... It was done on purpose in the past to look good on a CRT, but nowadays it just looks terrible...

    • hakfoo 12 days ago

      I suspect the appeal is the constraints. That you can tell a story and express emotion with 16x16 figures is an interesting counterargument to "we need a trillion polygons per second to render a few hairs".

      • ktosobcy 12 days ago

        Erm, but do we always have to end up with extremes?

        I think that pixelart games became somewhat of a cult at this point...

        And no, you don't need a gazillion polygons and a GPU that requires a private power plant to tell a good story; a while back I played Limbo and it was amazing.

tombert 13 days ago

You know, zero latency is cool, but I gotta admit that I do not miss cathode ray tube TVs. They're really heavy, the picture was fuzzy, I never personally liked the scanlines, they're bulky, and I never liked that high-pitched squeal they make. I'll confess a little nostalgia for the CRTs sometimes; my first TV was a hand-me-down RCA from my parents and that served as my only TV until I was 20, but the second that 1080p (or better) LCD TVs got cheap, I never looked back.

Even in regards to the latency, I'm kind of convinced that those claims are a little overblown. LCDs do increase latency, but some of the more modern LCD TVs have a "low latency mode" that claims to get the latency below 15 milliseconds; assuming most games are 60FPS, that's below a single frame, and I don't think that a vast majority of humans can even detect that. And for the few that can, OLEDs have you covered with latency on the order of like 2ms.

  • morsch 13 days ago

    If you were around when CRTs were popular, the high pitched squeal is probably not much of an issue for you anymore :P

    • tombert 13 days ago

      No, I don't think it is, a friend of mine got one recently and I could only just barely hear the squeal, and I suspect that in another year or two I won't hear it at all.

      Still, I don't miss it; I never really liked it. People love to crap on LCD TVs, but honestly I'm an unapologetic fan of them. Even pretty cheap LCD TVs nowadays are really decent [1], and give a really sharp, nice picture with very few downsides. I have a MiSTer plugged into my $400 Vizio in my bedroom via HDMI, and SNES games just look so much better on it than they ever did on my CRT as a kid.

      [1] Except for the speakers. Somehow built in speakers have gotten way worse than they were in the 90s, and TVs are borderline unusable without a soundbar or something.

      • toast0 12 days ago

        Most people were playing SNES via composite video or RF out. MiSTer is going to be using RGB. RGB cables for SNES didn't really happen in the US (YPbPr component video seemed to come out with DVDs, and there weren't contemporaneous cables for that for SNES, although they exist now), but S-Video was available and is much better than composite.

        On speakers, it's pretty simple. Physical depth is very useful for simple speaker designs, and today's screens are very thin and try not to have a bezel. It's pretty common for speakers to be downward or rear facing to make the front of the tv beautiful. This provides sound, but it's not very good. And you can't get much bass out of a small speaker anyway. A CRT tv was pretty big, adding a sizable speaker wasn't a big deal. Even early flat screens had room for an OK speaker, usually oval to use the width of the screen without adding much to the height.

        • tombert 12 days ago

          Yeah, I'm not claiming that my HDMI setup for the MiSTer is the most accurate to the original experience, I'm claiming that I just like the raw HDMI output better than I liked the composite/S-Video signal. It's just a much clearer picture, I like the sharp, chunky pixels.

          Yeah, I figured it had something to do with how thin they are.

  • gamepsys 12 days ago

    My current main display (Odyssey Ark) weighs 91 pounds, so I think I'm at the point where "small size" is no longer a benefit of modern displays.

    • zamadatix 12 days ago

      Until you consider that a CRT with 1/4 the screen dimensions would have weighed the same, and now you don't have to lug the equivalent of a large oven onto your desk :p.

      I remember with my first multi-monitor setup the desk heavily bowed in the middle from all the weight. Now I have a 3x2 monitor setup, all larger than the largest of those, supported much more easily.

    • tombert 12 days ago

      I couldn't find any generally-available 55" CRT that ever sold; the largest I could find was the 40" Sony KV-40XBR800, which some quick Googling seems to indicate weighed upwards of 300lbs.

      So your 91 pound TV has a significantly larger screen, and still weighs less than a third. This still seems like a win to me.

    • enoch_r 12 days ago

      Do you use the Ark for software development? How is it? I've been trying to decide whether to take that plunge since it came out...

      • gamepsys 12 days ago

        I do. It's great, basically unlimited screen real estate on a single screen. No concerns about the burn-in you would have on an OLED. I'm a huge fan of the 1000R curve as well. My biggest complaint is that my webcam now sits at an odd angle compared to on top of a normal monitor.

        Edit: Monitor prices can vary significantly based on the time and place you buy from. I paid about $800 less than MSRP. It was still super expensive.

        • tombert 12 days ago

          Just out of curiosity, how much better do you think this is than a decent 4K TV? The reason I ask is that my main "monitor" is a 55" Samsung UHD display. I could find the model number, but it's not important because it's decidedly "not fancy", something that cost like $450 in 2020. I plug in my MacBook via HDMI with a 4K@60Hz Thunderbolt adapter, and I have a decent soundbar plugged into the TV and listen to audio via HDMI ARC.

          I really like it; 4K is enough pixels to be pretty sharp, even for high-contrast stuff like text, as well as enough room to be able to cram a bunch of stuff on screen. Also, my brain is kind of bad, and as a result I'm better able to comprehend what I'm reading when the text is huge. I haven't measured the latency on this screen yet, but FWIW I was able to beat Donkey Kong Country 2 (a pretty challenging game) with the MiSTer plugged into it.

          People have told me that dedicated monitors are better though, both in refresh rate and just in general for more desktopey stuff...do you think that's true? How much more worth it do you think that is? I've debated buying a dedicated monitor, but getting something of a comparable size and resolution is pretty pricey.

          • gamepsys 11 days ago

            I justify expensive displays because I sit at my desk 12-16 hours a day, 6.5 days a week. I usually buy a top-of-the-line display (or array of displays) every 3-5 years. The difference between a great flat panel and a good flat panel is huge in terms of the picture it outputs. However, this translates to marginal productivity gains. The 1000R curve is much better ergonomically; it means much smaller head/eye movements to focus on different parts of the display. The 120 Hz refresh, especially with moving text, reduces my eye strain. Display lag, image settings, color quality, and HDR are big for gaming. The KVM switch can be a good productivity boost, depending on your situation.

            • tombert 11 days ago

              Yeah, that's why I was asking; I work from home 3 days a week, and do other work at my desk the remaining days, so I could pretty easily justify something better. I would very much like a 120hz refresh rate, because that's really the only thing in fancier displays that I think I'd actually notice; I appear to not be nearly as sensitive to "darker blacks" and "brighter whites" as other people, at least in my limited friend group, but I definitely do see a difference in a higher refresh rate.

  • agumonkey 12 days ago

    I think there's a human sense of global experience. I do miss some stuff that, by all measures, is bad. I still miss them; they make for a strange blend of noise, curves, sensations and limitations. It's another plane of judgement entirely, one that is a bit less reductionist.

jsheard 13 days ago

This article is a few years old, so the mention of OLED monitors is behind the times. They're available in desktop sizes (27-32") for around $1000 now, still expensive compared to LCDs, but getting more accessible.

https://www.rtings.com/monitor/reviews/best/oled

  • TylerE 13 days ago

    I’d never own an OLED for desktop use due to burn-in.

    • jsheard 13 days ago

      They haven't solved it completely but it's getting better, we're at the point now where some manufacturers like Dell are confident enough to cover burn-in under warranty for 3 years. If they can get that up to 5 years I'd probably be okay with just upgrading after that point when it starts showing burn-in.

      • TylerE 12 days ago

        3 years isn’t nearly enough. My average screen lasts about a decade.

        • kube-system 12 days ago

          Your LCD is out of warranty by then too.

          • o11c 12 days ago

            Warranties are irrelevant if the LCDs don't actually fail though.

          • shiroiushi 12 days ago

            Do you throw away everything you own when its warranty expires?

    • KerrAvon 13 days ago

      This is a completely reasonable take, though do note that they’ve come a long way over the past decade, in both improvements to materials and active management to reduce burn-in risk. I use a newish OLED TV for gaming; it's glorious, and I'm not actually worried about burn-in.

      I’m not sure I’d risk full-time desktop use, but in a couple of years? Maybe.

      • TylerE 12 days ago

        One man’s “active management” is another’s “blatantly obvious pulsing”. My barely-a-year-old OLED TV reduces brightness so aggressively that it makes bright white menus look like they have an animated background. They don't; it's just the white dropping to almost gray after a second or two as the whole display dims.

        You can reduce this effect by dropping the peak nits down… but then you lose much of the contrast ratio that’s the main point of OLED in the first place.

        I’ll also add that there have been several TV manufacturers that swear they’ve solved burn-in, but the RTINGS long-term testing is… rather less kind.

    • neRok 12 days ago

      My Gigabyte OLED monitor is 2 years old now, gets used for many hours on most days, and shows no changes. I bugged out the time counter for a few months by putting the panel into service mode, but it's still up to 4040 hours, or 5.5 hours per day for 2 years.

      Your other post mentions the screen brightness dropping with white menus open, but that isn't a burn-in protection feature; it's because of limits on the total power consumption of the panel (either because of power-efficiency laws, or too much heat).

      • TylerE 11 days ago

        It's the same thing. Heat is the major cause of burn in.

    • denkmoon 12 days ago

      That's a tradeoff like everything else in life. You're trading visual quality for longevity.

    • cryptonector 11 days ago

      For me the pulse width modulation is just too annoying: it causes terrible headaches.

  • tigen 12 days ago

    OLED itself doesn't solve motion blur. But at least the Alienware one they recommend has a high enough refresh rate to mostly solve it. Getting such high refresh rates at 4K is sort of unrealistic without some other visual tradeoffs at least. Though of course motion blur only "really" matters in certain games. (To my eyes it becomes "acceptable" at around the 144 Hz level.)

    The BlurBusters site has explored various other approaches over the years. Black frame insertion to reduce persistence has been tried, but requires high brightness and can create flicker. AI upscaling or interpolation stuff could help with frame rates.

  • smrtinsert 12 days ago

    I tried the Samsung OLED, a recent one. It was absolutely burning my eyes. Headache, eye strain, the whole nine yards. Switched back to my ancient Samsung LCD and it all went away. My working theory was something about PWM in the backlighting. Either way, I don't need it.

    • MichaelMug 12 days ago

      I use LG C2 OLEDs and I experience eye strain that I don't get on LCD monitors. I haven't had the time to look into the cause.

      • neRok 11 days ago

        First I would make sure the brightness isn't too high, and then I would consider whether the problem is text on screen. The LG OLED panels have a WRGB pixel layout, so they don't work quite right with the likes of ClearType on Windows. Instead I use MacType on my PC with a modified profile and 117% zoom in Windows, and now it's fine to me (the fonts were an issue when I first got my OLED).

    • kanbankaren 12 days ago

      Yeah. PWM is used to control the brightness of OLEDs. Some people are sensitive to PWM frequencies as high as 200 Hz.

yardie 13 days ago

Hacker tip:

I had one of these FW900s I got for free from work. The screen was extremely dim and it was assumed the flyback had gone bad. Since I am a card-carrying member of the "program the damn VCR" generation, I knew that it was probably a bad capacitor. Throwing complete caution to the wind, I removed the case and shields around the multiple circuit boards. And with nothing more than a multimeter and a few cans of Redbull (gives you wings, and you're going to need them) I found the offending cap on the back of the tube (IIRC, the D-board).

With nothing more than time, a 10-cent capacitor, and the immortality of a young 20-year-old, I was able to wire in a working capacitor and got a free 150 lb, $2000 monitor for my effort.

If you're comfortable with working on HV electronics and know how to solder and use a multimeter, getting one of these for cheap is completely doable.

  • akira2501 12 days ago

    You would expect a bleeder resistor to be part of the system. The real danger is, that resistor may stop functioning, and there's likely going to be no visual clue.

  • jve 13 days ago

    > If you're comfortable with working on HV electronics and know how to solder and use a multimeter, getting one of these for cheap is completely doable.

    Err, you missed the most important one: the knowledge to find the bad cap.

    • buildsjets 12 days ago

      It's usually the exploded looking one, but not always. I have replaced a great number of blowed up RIFA film caps in ancient Apple hardware. https://www.crackedthecode.co/replacing-failed-capacitors-in...

      • hakfoo 12 days ago

        The fun thing about Rifas is that they're mostly there as filters; half the time the equipment still works fine, even AS it's emitting clouds of smoke from the final act of the capacitor's life.

    • yardie 12 days ago

      This was during the bad caps era [0] of computing. Visual inspection and then testing with a multimeter was more than enough. The 10k volts was my bigger worry! But I was fearless for a free, large monitor.

      [0] https://en.wikipedia.org/wiki/Capacitor_plague

sjm 13 days ago

CRTs are still unmatched for gaming. Aside from input lag, they don't degrade low resolutions the same way that LCDs do. For the longest time I stuck with a 19" flatscreen 4:3 CRT that did 170Hz at 800x600, an absolute beast. As a kid I'd lug it around to QuakeWorld and Counter-Strike LAN tournaments.

I've gone through a bunch of high refresh rate LCDs since, but nothing has matched the perfect hand-eye sync of a CRT running at high FPS and high refresh rate with a 1000 Hz mouse, zooming around dm3 in a complete state of flow.

  • rightbyte 12 days ago

    You kinda forget what low latency feels like until you relive it and think 'oh yeah, this is how it should feel'. Like the responsiveness of a PS/2 keyboard.

  • _carbyau_ 12 days ago

    I'd append: with the reflexes of youth.

    Seriously, I'm just throwing a ball with my kid and I notice that shit now.

bloopernova 13 days ago

I stupidly gave myself a hernia by trying to lift one of the big CRT widescreen TVs in the late 90s. It had a really nice picture though; PAL was decent resolution back in the day.

The hidef 1080p CRTs used by Sony Broadcast R&D in the 90s were absolute beasts. Gorgeous displays; I remember so many people really were blown away by the hidef content, and it wasn't even close to 4K! (I haven't yet seen any 8K content on an 8K screen; I prefer high refresh rates for 4K, at least 120 Hz, rather than more pixels.)

  • bombcar 13 days ago

    4K is nice, but even that is a bit much at "TV" sizes; I swear the main reason it sells so well is that the line of TVs at Best Buy or Walmart is right next to you, so you're a few inches away from the pixels.

    • albrewer 12 days ago

      I have a 4k@120 Hz projector and it's downright amazing on my 120" "screen" :)

      They're not that expensive, relative to some of the larger TV screens[0]. Plus, when it's off, there's no bulky appliance to navigate around.

      [0]: https://www.benq.com/en-us/projector/gaming/tk700/buy.html

      • bombcar 11 days ago

        Those sizes are where the resolution starts to matter again.

    • _carbyau_ 12 days ago

      I'm a big believer that if the story isn't good enough to "suck you in", then pixels aren't going to help.

      • bombcar 12 days ago

        Once you go from VHS to DVD, everything else is just gravy (assuming the DVD isn't a very bad VHS rip).

        1080p, 4K, and 8K are nice to have but they're not groundbreaking the way VHS to DVD was.

TheCleric 13 days ago

My back hurts just thinking about picking up my Trinitron. Those things felt like they were made out of lead.

  • FuriouslyAdrift 13 days ago

    I worked at a high end stereo store in the 90's. When new models of things came out, sometimes the older ones would get sold (or given) to employees as a perk.

    I ended up with a 34" 16:9 Sony hi def CRT. No one wanted it. When I went to pick it up, I found out why... the thing weighed over 200 lbs!

    I remember getting 3 of my friends to help me move that thing into my second story walk up apartment. I wouldn't be surprised if it's still there.

    • Arrath 12 days ago

      There was a period of time where a friend of mine, a real estate agent, was complaining about big-ass rear-projection big screens which had been overtaken by modern flat panel TVs, often living in dens, rec rooms, or basement man caves. Per the sellers' wishes, they were bequeathed to stay there, because the sellers did not at all want to go to the effort of removing the unwieldy beasts. Included with the house along with other bulky stuff like a washer and dryer or a pool table.

      Naturally, the buyers often didn't want them either and it could become a point of contention between the parties! He even lost a couple sales because neither side would budge.

      • aleph_minus_one 12 days ago

        > Naturally, the buyers often didn't want them either and it could become a point of contention between the parties!

        Where is the problem?

        After the sale is done, you invite some strong friends (either one of them has a big van or pickup truck, or you rent one for half a day), and together you haul the unwanted devices to the junkyard. After that, you empty crates of beer together.

  • jsheard 13 days ago

    They kind of were, there was a substantial amount of lead mixed into the glass for radiation shielding. Bigger ones had several kilograms of lead in them IIRC.

  • tombert 13 days ago

    I still kind of have PTSD from having to lug my old RCA TV up four flights of stairs in my dorm in college. I'm not in spectacular shape now, but I was a scrawny guy who never exercised when I was 19, and so I was sore for like a week after that endeavor.

    Fortunately my roommate at the time was very nice, and also a huge gym bro and was able to help me lug it back to my car after the semester was over.

    • nottorp 12 days ago

      I've jumped from a small ish CRT directly to LCDs, but I've helped a friend move the same 16:9 huge diagonal CRT TV twice. We still reminisce about those two back breaking experiences.

  • qingcharles 13 days ago

    I saw one for free on Craigslist, but I happily paid two guys $100 to come and help me lift that beast. Even with three of us it was a job.

    I have no idea how I moved my 36" WEGA back in the 90s. I've blanked the pain out.

  • convolvatron 13 days ago

    It's also hard to imagine now exactly how much space they took up. Even when you put one back in the corner of an angled desk, it still reached most of the way towards the front.

KptMarchewa 13 days ago

A lot of people are saying there are visual quality advantages to CRT monitors. I don't disagree, but most of what I remember from my gaming-obsessed childhood is strong headaches from prolonged sessions, which disappeared after the move to LCD.

I don't play games anywhere close to what I used to do, but I can't imagine working for 8-10 hours on CRT monitor and not losing health.

  • nottorp 12 days ago

    Interesting, because I still think lcd monitors BLIND ME a lot more than the old CRTs.

    But then when you're programming with a black background with the brightness turned way way down on a crt, there is very little light shining into your face.

    Maybe oled will fix that, but it still has some growing up to do.

    • Sohcahtoa82 12 days ago

      > I still think lcd monitors BLIND ME

      User error, IMO.

      I'm of the opinion that if your monitor showing a pure white screen is painful, then either your brightness is too high or you don't have adequate ambient light.

      • nottorp 12 days ago

        They’re always turned down as much as I can without losing contrast and I do have light in my office.

        It’s still not the same as a black screen that’s not emitting light.

        You said it yourself actually: lcd monitors do require ambient light to not be painful. CRTs could do without.

        • Sohcahtoa82 12 days ago

          I've always felt the opposite.

          When I was a kid, if I was playing NES/SNES games on my TV with no lights on, after an hour or so, my eyes were burning.

          • nottorp 12 days ago

            Hmm when I was a kid kid I was only playing on my TV in the living room during the day with natural ambient light :)

            Anyway I've never been burned burned by tv or monitor lighting, but I've started to see less well in the dark just a few years after switching to lcds. Being under 30, maybe closer to 25 than 30. To the point that my wife asks me why I keep turning lights on at dusk in rooms where she sees perfectly with what's left of the sunlight.

      • amlib 12 days ago

        Even though LCDs sample and hold (so, no refresh flicker like a CRT), many use PWM to modulate the backlight brightness, which causes it to flicker at rates above your refresh rate, with the pattern varying according to the set brightness.

        Some PWM flicker might be at a frequency low enough to bother you; I've read about people who bought a brand-new LCD TV and had to return it due to headaches, supposedly attributed to the PWM flicker rate.

        Some monitors are even reviewed as not employing PWM, and thus being easier on the eyes.
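
        As a rough illustration of the kind of flicker being described (a minimal sketch with made-up values; the 240 Hz PWM rate and 60% duty cycle are purely illustrative):

            # Hypothetical PWM-dimmed backlight: how the flicker relates to the refresh rate.
            refresh_hz = 60
            pwm_hz = 240            # illustrative low-end PWM frequency
            duty = 0.6              # fraction of each PWM period the backlight is lit

            cycles_per_frame = pwm_hz / refresh_hz      # 4 on/off cycles per displayed frame
            period_ms = 1000 / pwm_hz                   # ~4.2 ms per PWM period
            lit_ms = duty * period_ms                   # ~2.5 ms lit, ~1.7 ms dark each period
            print(cycles_per_frame, period_ms, lit_ms)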

        • Sohcahtoa82 12 days ago

          I don't think I've ever noticed flicker on an LCD screen, even when I first started using LCD some time in the mid '00s.

          And I definitely notice the flicker in a CRT, especially when I'm not looking straight at it (oddly). Heck, I see the flicker of LED Christmas lights.

          • amlib 12 days ago

            You will never visually see flicker on an LCD unless it's malfunctioning or you have activated black frame insertion. But the flicker is still there on PWM-equipped ones, only visible with special equipment or slow-motion cameras.

            The problem then is that your brain might be subconsciously sensitive to some specific flicker patterns, resulting in the aforementioned headaches.

        • kanbankaren 12 days ago

          From what I have read, the PWM frequency for LCD backlights is on the order of several kHz, which is unlikely to be a nuisance.

          • amlib 12 days ago

            Yes, but some implementations or edge cases (like putting some TVs/monitors in their max brightness/dynamic mode) can have the PWM frequency drop to as low as 120 Hz, with god knows what duty cycle.

  • pezezin 12 days ago

    I love visiting retro arcades every time I get the chance, but after an hour of playing I get a headache.

    And let's be honest, many CRTs don't look that great; they tend to suffer from bad geometry and color aberrations.

  • xeeeeeeeeeeenu 12 days ago

    This was likely due to flicker from the low refresh rate. On CRTs, 60 Hz feels terrible; 85 Hz is much more comfortable.

    • pezezin 12 days ago

      Back in the day, 85 Hz was the absolute minimum that I could tolerate. Less than that and I would get serious eye strain very quickly.

mbreese 13 days ago

(2019)

Things have changed substantially, especially regarding the availability of OLED gaming oriented monitors. CRTs were great (especially the Trinitrons), but they were stupid heavy. I’d much rather have an OLED today.

  • Yeul 12 days ago

    CRTs were ridiculously tiny. I wouldn't want to play games on anything smaller than 55 inches.

    • Sohcahtoa82 12 days ago

      Meanwhile, I like my 27" monitor and don't want anything bigger and am annoyed by the trend of bigger and bigger monitors.

      But I think my problem is I sit VERY close to my monitor. If I'm sitting up straight, my monitor is only ~20 inches from my face.

      My eyes are just too accustomed to focusing at that distance.

physicsguy 13 days ago

My overwhelming memory of CRT monitors is struggling to see what was on the screen when there's daylight shining on them at all. Computer rooms at school and university always used to be dark caves (even more so than they are now) because they had to have heavy blinds so that you could do anything...

  • kiwijamo 12 days ago

    That's still an issue with LCD as well though.

    • jwells89 12 days ago

      Less than it used to be, though. There are several 600+ nit panels that are plenty usable even in sunny rooms.

      These aren’t going to be cheap, though. Most LCD monitors with good brightness are midrange and up.

      • asdff 12 days ago

        I can use my laptop in direct sunlight now. Brightness has reached its end game.

squarefoot 12 days ago

Not a gamer, but have vivid memories of playing on CRTs and I don't miss them at all. They're like vinyl records: they bring wonderful memories, but technically speaking there's no reason to go back to them. That said, SED technology could have been employed to create products with the best of two worlds, if only it wasn't killed by patent and licensing issues. https://en.wikipedia.org/wiki/Surface-conduction_electron-em...

maxrecursion 12 days ago

I play retro games regularly, and as good as emulation is, there are tons of old fast paced games that are unplayable for me on modern TVs due to input lag.

Recently I was playing Zelda 1 on the NES when the game crashed and deleted my save halfway through the 2nd quest. Tried to play it on an emulator and on the Switch, and both felt clunky compared to playing on the NES. I could probably get through it, but it wouldn't have been fun.

nutshell89 13 days ago

Unrelated to anything in the article, but I'm curious as to why there haven't been OLED or LCD panels manufactured specifically for general-purpose CRT emulation in hardware + software (via curved panels, artifacts unique to CRTs such as pixel blurring, phosphor persistence, scanlines, etc) for retro video games, film preservation, and artwork.

  • Night_Thastus 12 days ago

    Filters for scanlines, blur, etc have existed for years. No need to bake that into the panel - it can be done in software easily.
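
    As a rough sketch of how simple the basic effects are to fake in software (illustrative only; real CRT shaders also model the phosphor mask, bloom, curvature, and gamma):

        import numpy as np

        def crude_crt_filter(frame: np.ndarray) -> np.ndarray:
            """frame: HxWx3 float image in [0, 1]. Darken every other row to fake
            scanlines and smear horizontally a little to fake CRT softness."""
            out = frame.copy()
            out[1::2] *= 0.6                                   # scanline darkening on odd rows
            out = (0.5 * out
                   + 0.25 * np.roll(out, 1, axis=1)
                   + 0.25 * np.roll(out, -1, axis=1))          # mild horizontal blur
            return np.clip(out, 0.0, 1.0)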

    • nutshell89 12 days ago

      There are software solutions that exist, yes (DOSBox Staging as mentioned below, and certain RetroArch shaders look very good), but neither really represents an actual product: DOSBox Staging is supposedly zero-config but simultaneously requires configuring a DOS environment. Replicating a shadow mask or a curved screen, for example, requires fiddling with shaders, not to mention aspect ratio, size, or display resolution.

      In any case, most emulation software is geared towards gaming — many art galleries for example still use CRTs for preservation (https://www.nytimes.com/2023/10/17/t-magazine/technology-art..., https://youtu.be/rHBtmPZx82A?t=1828) and these software solutions don't really fill the gap of a general-purpose CRT display.

    • p1mrx 12 days ago

      DOSBox Staging has the best CRT emulation I've seen. It's subtle, but it really feels like going back to a VGA monitor.

      See the screenshots on https://dosbox-staging.github.io/

mistyvales 12 days ago

Nobody seems to remember plasma TVs? They were the closest thing to a CRT that I used. 240 Hz, 1080p bliss. Yes, they were kinda heavy and power hungry, but I still miss my 2013 Panasonic Viera.

Edit: It was actually only up to 96 Hz, but gaming on it didn't feel like there was too much motion blur.

snakeyjake 11 days ago

I find neither the aperture grill nor shadow mask aesthetically appealing.

Throughout the 80s, 90s, and early 2000s I spent obscene amounts of money trying to minimize them. Smaller and smaller grills, the switch to shadow masks and their infernal lines, it never stopped.

To me, any screen technology where I can look at the clock in the upper right or lower left corner and see the pixels comprising the numbers (or a blurry smear) is trash.

The same with colors. Any screen technology that cannot accurately reproduce colors is trash. Asking a CRT to accurately reproduce colors, even something as minimally acceptable as Rec.709, is like asking me to perform brain surgery: it ain't happening brah.

bombcar 13 days ago

The big advantage for these was the flatness of the screen, created by what seemed to be inches of glass to refract the curve straight.

For those big sizes, there was nothing comparable - everything else was so convex that it was hard to use as a monitor once you got above 19 or 21 inches.

NikkiA 13 days ago

I had one of them. It was a great monitor except for three things: 1) its interlaced mode was literally painful, and it was all I could ever get at higher than 1080p; 2) it weighed 50 kg; and 3), partially related to 2, it was so deep a tube that the only place it would go on my desk was a corner section, where it made the whole thing sag badly.

Oh, and thinking about it, I just had a horrible flashback to the intersection of two horrible aspects of that time of my life: a day when I needed to use the interlaced mode and the neighbouring resort had one of their weekly reggae concerts at 120+ dB. My headache was epic.

  • jeffbee 13 days ago

    Was this 1080 mode a limitation of your graphics card? I drove mine with a Matrox G400 Max (360MHz RAMDAC) and it was good for 1440p.

    • NikkiA 12 days ago

      Most certainly, yes. I can't recall which card I had the experience with, but around that time I upgraded from a G200 to a GeForce 2 Ti of some type; whichever it was, its RAMDAC was almost certainly not beefy enough for it, and it resulted in a ~43 Hz interlaced mode. It was rare enough that I needed to run with a Y resolution of > 1280, so it wasn't a huge deal, but it physically hurt when I did. The ViewSonic that I'd used before the FW900 had a more palatable interlaced mode.

axegon_ 13 days ago

For the longest time I thought it was survivorship bias + nostalgia. Back in the late 90s my dad coughed up a lot of money on what was then the second family computer, with all sorts of peripherals and accessories, one of which was a 17-inch Sony Trinitron CRT. I had this monitor until the second half of the 2000s, when I went to study abroad. None of the computers and monitors I've had since have felt like an upgrade over the Trinitron, even to this day, even with several high-end OLED monitors at home. Seems it wasn't survivorship/nostalgia...

WalterBright 12 days ago

I used CRTs for maybe 20 years. Once I acquired an LCD, that was the end of CRTs for me. I didn't like the weight, the size, the heat, the warmup time, the power consumption, the whine from the flyback transformer, the resolution, or the fuzzy fonts.

The LCD beats it in every metric. Even in price - I can get an LCD monitor for $10 from the thrift store.

P.S. I can't hear the whine anymore in a CRT. Gettin' old.

  • prpl 12 days ago

    I tend to think of the heat from the song “Sleeping is the only Love” by the Silver Jews

    “I had this friend his name was Marc with a C, his sister was like the heat coming off the back of an old TV”

moudis 12 days ago

I had the fortune of owning a Dell P1110 (rebranded Sony Trinitron CPD-E500) many years ago, bought quite cheaply at a surplus auction while in college. 21", flat glass, and weighed in right at 70lbs. Lugged it to a few LAN parties here and there.

It wasn't until OLED monitors came around that I finally felt like flat panel displays had really caught up.

  • M95D 12 days ago

    I still have my Dell P1130. I too lugged it to a LAN party. Only once. It bent the table and I still have back pains after 20 years.

mcbuilder 13 days ago

I'm glad to have a 19in CRT near my desk, I do enjoy the retro vibes, but the big disadvantage has gotta be their weight and heft. There is no way the average consumer wants an FW900 taking up their entire desk. It would be cool if the CRT industry hadn't completely died and instead existed as a niche product.

  • jeffbee 13 days ago

    It weighs 40 kilos.

    I owned one, but I didn't want to. My 21" Trinitron blew up in the final month of its 5-year warranty and they didn't have a way to replace it so they sent me the 24". Crazy, must have cost a fortune for them just to ship it.

    • deadlydose 12 days ago

      It was never the weight of them that bothered me; it was the uneven distribution of the weight. 40 kg isn't much until it's all shifted to one side (the front) of a cumbersome object.

TradingPlaces 12 days ago

I held on to the last Sony XBR Trinitron, the KD-34XBR970, for an unreasonable amount of time because the picture was so lively. Eventually sold it to a retro gamer, who was thrilled, and so was I.

RajT88 12 days ago

One of the gems of my retro collection is my 36" Sony WEGA which has a native 720p resolution.

HD image which works with the many retro light gun games.

sssilver 12 days ago

I occasionally get the itch of purchasing a brand new CRT monitor, and fail to find any.

Has humankind stopped manufacturing CRTs entirely?

g42gregory 12 days ago

LCDs still cannot change resolution away from native without the screen getting blurry. As I recall, LCDs started getting wide adoption around 2004 or so; it has only been 20 years. Why can't they be made to run at resolutions other than native? CRTs could do that with no problem.
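
For what it's worth, the blur comes from resampling onto the panel's fixed pixel grid: unless the ratio is an integer, each output pixel lands between source pixels and the scaler has to blend neighbours. A minimal NumPy sketch of the difference (illustrative values only):

    import numpy as np

    src = np.random.rand(600, 800)                 # an 800x600 "frame" (rows x cols)

    # Integer ratio (2x): nearest-neighbour just repeats each pixel, so edges stay sharp.
    sharp_2x = src.repeat(2, axis=0).repeat(2, axis=1)

    # Non-integer ratio: stretching one 800-pixel scanline onto a 1366-pixel panel row
    # forces interpolation between neighbouring source pixels; that blending is the blur.
    row = src[0]
    xs = np.linspace(0, 799, 1366)
    soft_row = np.interp(xs, np.arange(800), row)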

Night_Thastus 12 days ago

I'm sorry but I really don't believe a 540 Hz or even 360 Hz LCD is going to be beaten in total end-to-end latency by any kind of CRT. I've yet to see any real world tests to the contrary.

And of course, modern OLED beats the pants off of both in that regard.

  • ranger_danger 12 days ago

    CRT is still the king of motion clarity by a sizeable margin.

    • Night_Thastus 12 days ago

      Is it though? Could you really say a 200 Hz CRT is going to beat a 540 Hz LCD with backlight strobing? Has anyone actually done thorough testing to confirm it? Motion clarity is made up of a lot of parts; it can't be ballpark guessed.

      See here: https://www.youtube.com/watch?v=-bEKOp1GLDs
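
      One way to put rough numbers on the persistence part of motion clarity (a sketch with illustrative values; real strobe widths and phosphor decay times vary by model):

          # Persistence (how long each frame stays lit) is a big driver of perceived blur.
          def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
              return 1000 / refresh_hz      # pixel stays lit for the whole frame

          print(sample_and_hold_persistence_ms(540))   # ~1.85 ms for a 540 Hz sample-and-hold LCD

          strobe_pulse_ms = 1.0      # assumed backlight strobe width (illustrative)
          crt_phosphor_ms = 1.5      # assumed phosphor decay ballpark (illustrative)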

  • xcv123 12 days ago

    CRT has zero input lag. It is pure analog electronics with no buffering. "Racing the beam". Digital buffered displays are down to a few milliseconds but cannot ever be as low as CRT.

    • ranger_danger 12 days ago

      As an FPGA engineer that has developed custom monitors, I see no technical reason why an LCD panel cannot be driven with zero latency or buffering. It may just be that almost all monitors/TVs sold on the market happen to implement some type of buffering to have room for image enhancements etc. but it is not a requirement.

    • voldacar 11 days ago

      I don't see how you do a fair comparison. How do you even drive a CRT with a modern graphics card? Presumably you have some sort of digital to analog converter box, but wouldn't that add a touch of latency and defeat the purpose of the CRT?

      • xcv123 11 days ago

        When the analog signal enters the CRT, there are nanoseconds of lag before that signal causes the phosphor to be lit up on the other side. That is different to how a typical digital display works (with buffering) and that is where you get the zero input lag of CRT.

        > presumably you have some sort of digital to analog converter box

        Yes the CRT will display the output of that DAC in real-time. No additional buffering/latency after that conversion.

        The Atari 2600 video hardware had no framebuffer and the code running on the CPU was racing the electron beam, updating the graphics mid-scanline in real-time while the electron beam was scanning that line on the display. That was about as raw and zero lag as it gets. The other 8 bit and 16 bit consoles also did not have framebuffers.

        Another guy has commented here saying that in theory you could do the same with digital displays. In practice they buffer the input and add latency.
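
        For a sense of the timescales involved (back-of-envelope NTSC numbers; exact figures vary a little by standard):

            # Rough NTSC timing behind "racing the beam".
            line_rate_hz = 15734          # scanlines drawn per second
            field_rate_hz = 60            # fields per second

            line_time_us = 1e6 / line_rate_hz      # ~63.6 microseconds per scanline
            field_time_ms = 1000 / field_rate_hz   # ~16.7 ms to paint one whole field
            print(line_time_us, field_time_ms)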

    • 0xcde4c3db 12 days ago

      Depending on the display and the input type, some CRT displays do digitize and buffer the signal so that they can do some of the filtering and decoding digitally. The amount that they buffer is generally only on the order of a few lines, though.

    • imtringued 12 days ago

      Tell that to people having problems with screen tearing.

      • xcv123 12 days ago

        CRT has zero input lag from signal to phosphor (maybe a few nanoseconds due to speed of electronics and the electron beam itself) but it will take a few milliseconds to scan an entire frame.

        With vsync enabled I guess you could have effectively lower input lag from a well engineered 500hz digital display, when considering lag for complete frames.

        If the system is fully engineered to race the beam then you can't beat zero input lag for CRT on mid scanline.

      • ranger_danger 12 days ago

        tearing is a software issue, it's completely unrelated to CRTs or the display technology in general.

dekhn 13 days ago

Oh hell no. We celebrated when I finally got my 36"(?) Sony Trinitron WEGA out of the house. It was so heavy it took 2 strong people to move. It was enormous and hot. The image quality was great, but quickly eclipsed by LCDs. But the weight. It just made it impossible to move.