tqkxzugoaupvwqr 5 years ago

The new lighting makes such a dramatic difference. Despite the same low polygon count, the levels seem much more lifelike. If all the game classics are re-released with raytracing engines, I will be all over it. The next years are going to be good.

  • pizza234 5 years ago

    I'm very skeptical.

    This is not just RTX; there's a great deal of rework involved. Likely, any old game would look significantly better with this kind of development effort (in other words, with any good remastering).

    This could be considered a great marketing stunt - the graphical improvement is touted as being due to RTX (ray tracing), while it's actually due to many factors.

    On top of that, it misleads people into thinking that this type of improvement is typical of ray tracing and that it comes at virtually no cost, when currently ray tracing takes a big performance hit and is generally hard to notice.

    Q2VKPT is close to the concept "improved with ray tracing", and while certainly impressive, it doesn't yield the same "AMAZING!" effect.

    • ladon86 5 years ago

      But it's not down to polygons or details in the environment - the simple architecture still looks realistic with the addition of realistic lighting. In the quest for realism, modern games have so many details, dirt and grit everywhere [1]. Perhaps going forward artists can let the lighting do more work and still represent realistic clean environments. I loved this blocky room in Half Life 1 for the dramatic (prebaked) lighting: http://combineoverwiki.net/images/thumb/3/37/C2a4e_lobby.jpg...

      [1] just look at Doom 2016: https://cdn.arstechnica.net/wp-content/uploads/2016/05/20160...

      • fwip 5 years ago

        A whole lot of the benefit is the added "normal and roughness maps for added surface detail."

        If they hadn't done the hard work of basically retexturing every single surface, it wouldn't look half as good. Without that surface detail, ray-traced lighting looks incredibly flat, like this: http://builder.openhmd.net/blender-hmd-viewport-temp/_images...

        • a_f 5 years ago

          I believe the original proof of concept didn't do the retexturing, and just focused on the lighting; it looks fantastic: http://brechpunkt.de/q2vkpt/

        • int_19h 5 years ago

          How hard would it be to automatically derive such maps from textures? After all, they are already drawn to visually imply that with shading, so wouldn't it be possible to e.g. train a model to extract that information, just as our eye does?
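
          For reference, the classic non-ML heuristic is to treat the texture's luminance as a height field and turn its gradients into normals; a rough C++ sketch (the function name and single-channel float input are just illustrative assumptions), which is exactly the kind of thing a trained model could improve on for painted-in shading:

            #include <algorithm>
            #include <cmath>
            #include <vector>

            struct Vec3 { float x, y, z; };

            // Treat the texture's luminance as a height field and convert its
            // central-difference gradients into a tangent-space normal map.
            // 'strength' controls how pronounced the bumps come out.
            std::vector<Vec3> heightToNormals(const std::vector<float>& height,
                                              int w, int h, float strength)
            {
                auto at = [&](int x, int y) {
                    x = std::clamp(x, 0, w - 1);
                    y = std::clamp(y, 0, h - 1);
                    return height[y * w + x];
                };
                std::vector<Vec3> normals(w * h);
                for (int y = 0; y < h; ++y) {
                    for (int x = 0; x < w; ++x) {
                        float dx = (at(x + 1, y) - at(x - 1, y)) * strength;
                        float dy = (at(x, y + 1) - at(x, y - 1)) * strength;
                        float len = std::sqrt(dx * dx + dy * dy + 1.0f);
                        normals[y * w + x] = { -dx / len, -dy / len, 1.0f / len };
                    }
                }
                return normals;
            }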

    • karrotwaltz 5 years ago

      > it's actually due to many factors

      To add to this point, at 1:35 in the video they compare with / without volumetric lighting, except, well, volumetric lighting has been used in games for a while now.

      The "before" should show a non-RTX lighting instead of just nothing at all.

      • robin_reala 5 years ago

        I remember one of the big selling points of the Dreamcast in 1998 was volumetric lighting.

        • corysama 5 years ago

          Fun fact: The Dreamcast GPU was a precursor to the PowerVR GPUs used in iPhones and many Androids.

          Also, Qualcomm’s popular Adreno line of Android GPUs has its roots in the Xbox 360 GPU.

        • krige 5 years ago

          Unreal, a Quake 2 contemporary, already had it, and managed it on meager hardware like a P200MMX.

          • TomBombadildoze 5 years ago

            If you consider a slide show "managing it", then sure.

            • jl6 5 years ago

              My P200MMX with a Voodoo 2 managed it just fine.

              • elsonrodriguez 5 years ago

                It was more the Voodoo than the MMX. I tried playing some MMX "accelerated" games and it was a lo-fi slideshow.

                I think the MMX era was the start of Intel's tradition of hyped up graphics related press releases followed by a total lack of execution.

                • rasz 5 years ago

                  MMX was useless for games. MMX is integer math only, good for DSP things like audio filters or making a softmodem out of your sound card, but unsuitable for accelerating 3D games. What's worse, MMX has no dedicated registers and instead reuses/shares the FPU ones, which means you can't use MMX and the FPU (and all 3D code before Direct3D 7 hardware T&L relied on the FPU) at the same time.
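
                  To make the register-sharing point concrete, this is roughly what MMX code looked like with intrinsics (a sketch, the function is invented): every switch back to x87 floating point costs an EMMS, which is why mixing MMX integer work with the FPU-heavy 3D math of the era was a non-starter.

                    #include <mmintrin.h>  // MMX intrinsics

                    // Sketch only: MMX registers mm0-mm7 alias the x87 FPU stack,
                    // so after any block of MMX integer work you must issue EMMS
                    // before touching floating point again.
                    void add_saturated(short* dst, const short* a, const short* b, int n)
                    {
                        for (int i = 0; i < n; i += 4) {
                            __m64 va = *(const __m64*)(a + i);
                            __m64 vb = *(const __m64*)(b + i);
                            *(__m64*)(dst + i) = _mm_adds_pi16(va, vb);  // four 16-bit saturating adds
                        }
                        _mm_empty();  // EMMS: hand the shared registers back to the x87 FPU
                    }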

                  It all comes down to Microsoft and Intel coming up with the concept of purely software peripherals around 1997. Intel released things like MMX, AC'97, the Communications and Networking Riser (CNR), and the Audio/Modem Riser (AMR) standards, all in an effort to push hardware vendors out of the market by handling those roles in software on the CPU. More work for the CPU = more demand for fast CPUs = more profit.

                  Funnily enough, AMD's 1998 3DNow! did actually add floating-point support to MMX, and it was useful for 3D acceleration until hardware T&L came along two years later.

                  Intel paid a few dev houses to release make-believe MMX enhancements, like POD (1997):

                  https://www.mobygames.com/images/covers/l/51358-pod-windows-...

                  1/6 of the box was covered with Intel MMX advertising while the game used it only for some sound effects. Intel repeated this trick in '99 when introducing the Pentium 3 with SSE: Intel commissioned Rage Software to build a demo piece showcasing the P3 during Comdex Fall. It worked... by cheating on graphical detail ;-) Quoting hardware.fr: "But looking closely at the demo, we notice - as you can see on the screenshots - that the SSE version is less detailed than the non-SSE version (see the ground). Intel, would you be trying to roll the journalists in the flour?" Of course, Anandtech used this cheating demo, which was never released publicly and only pretended to be a game, in all of their Pentium 3 tests for over a year.

                  https://www.vogons.org/viewtopic.php?f=46&t=65247&start=20#p...

                • tomc1985 5 years ago

                  MMX was always more selling point than substance, from my experience... a Voodoo card, on the other hand...

              • jackfraser 5 years ago

                Glide, man, such a great era for PC gaming if you had the right equipment.

                I'll never forgive the computer store guy when I was 14 who pushed me to buy the $40 cheaper S3 Trio3D instead of the Voodoo. Instead of something useful, I ended up with a glorified paperweight that only worked well for Direct3D. Guess how many games back then bothered to use that?

                I think I got Quake 1 with some shitty hacked MiniGL driver working _once_.

    • deadbunny 5 years ago

      Yeah, checking out some of the comparison photos, the textures have definitely received a bump in resolution/quality.

      Most obvious to me in the quake logo/skull on the left of this[1] image.

      1. https://images.nvidia.com/geforce-com/international/comparis...

      • sosuke 5 years ago

        > Quake 2 XP high-detail weapons, models and textures; optional NVIDIA Flow fire, smoke and particle effects, and much more!

    • _bxg1 5 years ago

      Given all the additions they should've placed less emphasis on the before/after. The really cool thing about the demo isn't "look how great a remastered version of Quake 2 can look", it's "look, we can run an entire game in real-time with nothing but raytracing, even if it requires a low polygon count". The Metro and Battlefield demos only use RTX for small pieces of their environment where it has the greatest effect; this is a whole level using no rasterization at all. The other improvements distracted from that somewhat.

    • Justsignedup 5 years ago

      I was thinking the same thing. They re-textured and enabled RTX. So it would be nice to see the new game turn RTX on / off and see what the actual difference is. Just that feature.

    • lanevorockz 5 years ago

      Not sure what you can be skeptical about. The difference from RT Quake to Quake is only that the lights are not precompiled at the beginning of the level. Any other game would just need to rework the light-baking step to use RTX, and job done.

      The only downside that can be pointed out is that you are hardware-dependent. It would be amazing to see this in a console; all games having dynamic lighting would help developers and gamers.

      • mbel 5 years ago

        > The difference from RT Quake to Quake is only that the lights are not precompiled

        In the linked demo video they also use new high-resolution textures, which include not only color data as in the original Quake, but also normal maps and maps with PBR parameters (metalness, roughness).
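
        For a sense of what that means per texel, a rough sketch (field names are illustrative, not the remaster's actual format):

          // Roughly the per-texel data a PBR material set carries, versus the
          // single RGB color of the original textures:
          struct PbrTexel {
              float albedo[3];    // base color
              float normal[3];    // tangent-space normal, from the normal map
              float metalness;    // 0 = dielectric, 1 = metal
              float roughness;    // 0 = mirror-like, 1 = fully diffuse
              float emission[3];  // self-illumination for glowing surfaces
          };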

      • KallDrexx 5 years ago

        That's not the only difference shown in the video. In the official keynote, when Jensen was talking about it, he specifically said the "RTX Off" footage was using the software rasterizer. So it's not comparing Vulkan RTX on with Vulkan RTX off, but Vulkan RTX on with software rendering.

  • navaati 5 years ago

    Now, it's not like it's only an engine change: for this to work they added PBR layers (metalness, emission, normal, yada yada) to all the textures here, and that's a big piece of artistic work.

    That said… gimme gimme gimme Half Life 1, plssssssss :D !

    • kowdermeister 5 years ago

      HL1: Yes please, it would look so cool.

      > they added PBR layers

      and fluid simulation for explosions

      I'd be rolling in joy for Max Payne and GTA San Andreas :)

    • LeonM 5 years ago

      HL1 already has a modern remake, check out 'Black Mesa' on Steam. It's really good.

      • thecatspaw 5 years ago

        As someone passionate about HL1, I don't like Black Mesa. Yes, it looks nice, but gameplay-wise it seems lacking to me. It feels very slow to move, and bhopping has been nerfed to death.

        It feels more like a reboot than a remaster

        • crysin 5 years ago

          From what I remember of Black Mesa, it felt more like Half-Life 2, control wise, than Half-Life. So for people who played Half-Life 2 first, Black Mesa probably feels more natural.

          • yellowapple 5 years ago

            This is unsurprising, since they're both on the Source engine.

    • sbarre 5 years ago

      Maybe teams could use tools like AI Gigapixel (discussed on here yesterday) to upsample older textures?

      I agree with you that there is asset work to be done, but I bet there are tools out there that can simplify and automate this process quite a bit today.

  • yogthos 5 years ago

    I would love to see a kickstarter or something to create high quality assets for Open Arena, and add raytracing. Personally, I find that Quake 3 gameplay is more fun than any modern shooter that I've played. It's fast paced, and really well streamlined. It's just pure adrenaline.

    Adding things like health regen, cover, alternate fire modes, and so on ultimately just distracts from the action in my opinion. We don't really have any more games like Quake nowadays.

    • dkrikun 5 years ago

      try reflex arena

  • partiallypro 5 years ago

    It's cool, but even back in the day people who played Quake 2 & 3 competitively would turn every detail down to very low, except the resolution. In part because it helped with lag, but it also made spotting enemies much easier. The lighting in this demo is very nice, but at least in the Quake world no one would really use it, because it's very distracting.

    • dahart 5 years ago

      I would use ray tracing to get a non-linear camera, specifically so I could have a wider field of view with less distortion than the linear camera & wide fov. Back on Q3, everyone would use 120 or 130 degree fov because that was the best compromise between seeing things to your side and everything getting squished into the center of your screen. If I could have played with 180, I would have.
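
      This is only practical because a ray tracer lets you pick any ray direction per pixel, while rasterization is tied to a planar projection. A minimal sketch of the kind of non-linear mapping I mean (names illustrative, camera orientation omitted):

        #include <cmath>

        struct Vec3 { float x, y, z; };

        // Per-pixel ray direction for a panoramic-style camera: the horizontal
        // angle varies linearly across the screen, so a 180-degree (or wider)
        // horizontal FOV doesn't stretch at the edges the way a planar pinhole
        // projection does.
        Vec3 panoramicRayDir(float px, float py, int width, int height,
                             float hFovRadians)
        {
            float yaw   = (px / width - 0.5f) * hFovRadians;
            float vFov  = hFovRadians * height / width;
            float pitch = (0.5f - py / height) * vFov;
            return { std::sin(yaw) * std::cos(pitch),
                     std::sin(pitch),
                     std::cos(yaw) * std::cos(pitch) };
        }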

      Your lighting point is reasonable in the sense that this demo was made to demonstrate the dramatic lighting without respect to gameplay. On the other hand games are already clearly going toward HDR and physical and dramatic lighting. If Quake had HDR lighting, it would look just as good but it would be less distracting because it would have been directed and designed to support the gameplay.

      This version also has dynamic adaptation which would have been awesome to have back in the day - some of the dark areas are too dark and it’s great to have the lighting automatically adjust so that at least I can see.

    • PavlikPaja 5 years ago

      They would, as you could use the lighting to spot enemies you can't see directly, like you sometimes can in e.g. CSGO.

  • mikepurvis 5 years ago

    Hopefully it doesn't become one of those annoying tropes, though. Like Photoshop lens flares: every game has to have a bunch of arbitrarily reflective materials and light shafts just because the hardware now makes it possible.

    • tachyonbeam 5 years ago

      When bump mapping was starting to become a thing, every surface had to look ultra glossy for a while (example: http://mojolabs.nz/screenshot.php?id=1740). Eventually the excitement subsided and people used this effect more tastefully. I think most people are just happy with more natural looking lighting though.

quadcore 5 years ago

The original textures were done by Adrian Carmack. I think he's as much of a genius as John. He was capable of making a difficult theme really work: a mix of dark future and fantasy. Fantasy with guns especially is terribly difficult, and I don't recall having seen a game that succeeded in making that mix work as well as the Quake series.

  • gagege 5 years ago

    Maybe this isn't what you're talking about, but Bungie's game Destiny is a pretty great example of medieval fantasy transposed several thousand years into the future.

    • shahar2k 5 years ago

      There's a ton of great art happening in games these days, and a lot of it comes down to a few factors: there are some excellent concept art outsourcing houses (check out opusartz for one), the tools and the artists trained in them are EXTREMELY capable of recreating whatever style, and the technology available for real time can do a fantastic job of baking down extremely high-poly, high-res assets to whatever the final system requires.

      This is not to diminish artists working today (I'm one of them!).

      Adrian Carmack, though, comes from a whole other breed of creators who had to be both artistically brilliant and technically resourceful enough to create art within EXTREMELY tight constraints.

      The menagerie for Doom, which he is responsible for, is still a masterwork of readability, efficiency of design, and communication of purpose in many ways (the textures in that game very easily still hold up today).

      I work with a ton of amazingly talented artists, and still, it's rare that I see artistic, illustrative, and technical talent all in one person. (It helps that teams are enormous, so those skills can be spread amongst multiple people.)

    • jsgo 5 years ago

      Is it? I only played Destiny 2, and from the parts I really got into (pre-expansions), I didn't get a very medieval fantasy vibe at any point. Most of the architecture was fairly modern or alien. It was nice, though.

      Anthem has some buildings that have an older feel out in the freeplay areas, but even that would be a stretch.

      • nessus42 5 years ago

        It's much more apparent in Destiny 1, with the Hive architecture, which is like something out of Mordor. And in the Forsaken expansion to Destiny 2, where The Dreaming City seems like a high-fantasy kingdom, but under attack by evil forces.

        • jsgo 5 years ago

          Ah, nice. I can't remember if it was the Hive or a different faction, but while playing the core game of D2, I thought it was nice because the environment you dealt with them in reminded me a lot of Alien (without the xenomorph, obviously). I think that was one thing D2 did well: in the core game, there are multiple factions that are pretty distinct, so it almost feels like different games at times.

          • nessus42 5 years ago

            Yes, you are right: the Hive architecture is also rather Alien-ish!

  • LoSboccacc 5 years ago

    well there's this 40k thing

pornel 5 years ago

I love RTX, but it's so hard to demonstrate it. It doesn't add the presence of any "wow!" effect (apart from demos that overdo reflections); it only gives you the absence of incorrect shadows, and humans are bad at noticing the absence of things.

Instead of "wow" you get "well, duh, that's how things should look".

  • kuzehanka 5 years ago

    Path tracing has a huge wow factor. The problem with RTX is that on current-gen hardware it runs at laughably low sample rates, and the result is very noisy even after DLDN.

    Games are handling this by dialling the amplitude of RTX effects way down and having a smoothing pass. After all that is done, the wow effect is gone and you're just left with a more dynamic version of the same or worse aesthetics that we're already used to.

    I expect it'll truly take off on next gen hardware, whenever that rolls out.

    Here's what RTX+DLDN actually looks like if not dialled down and smoothed: https://youtu.be/CuoER1DwYLY?t=553

    My takeaway is: yes, that Quake 2 demo looks really cool, but you just know that they used every black-magic hack in the book to get there, and most of it can't be replicated in a modern high-fidelity game. It's definitely not a case of flicking an 'RTX on' switch.

    • echeese 5 years ago

      What's DLDN? I tried Googling it with some relevant keywords but it just brings me back to this comment.

      • kuzehanka 5 years ago

        Deep learning denoiser. I don't know if NVIDIA came up with some marketing term for it. It's what allows RTX to produce meaningful images at all despite the renderer running at 0.5-2 samples per pixel which looks little better than random noise. I kind of assumed they'd call it DLDN because they called their deep learning supersampling DLSS. I guess not, go figure.

        https://www.youtube.com/watch?v=YjjTPV2pXY0

  • Wohlf 5 years ago

    I've noticed something similar with a lot of the game remakes from a few years ago, particularly the N64 Zelda remakes on the 3DS. The remakes look how you remember the games looking back in the day, but if you compare them side by side, the remakes look vastly better.

    • mikepurvis 5 years ago

      Zelda 64 is a particularly interesting case, though. As such a beloved title with a long-standing fan following, there has been an almost endless parade of remakes/demakes over the 20 years since it came out.

      If you thought the official 3DS one was pretty, wait until you see the fan work done on a UE4 version:

      https://youtu.be/OrMRjK_5RRU?t=113

  • dahart 5 years ago

    That’s a good way to put it. There’s some kind of uncanny valley of rendering, I think. Not exactly the same idea, but I noticed this starting decades ago while playing on an SGI: the demos with flat shading oddly seemed more impressive than the ones that had an environment map and chrome reflections. It felt like the computer was doing more work when the image was less realistic, and the more realistic it got, the less it felt like billions of amazing calculations were happening under the hood. It’s still true, and it’s ironic: when I’m seeing these amazing demos today, with real-time path tracing and astounding amounts of complexity, the visceral feeling is more like looking at photos than amazement at how incredible the rendering is. I have to force myself to think harder about what’s going on in order to appreciate it.

  • zamalek 5 years ago

    > humans are bad at noticing absence of things.

    Until you ask them whether a scene is a photo or a render. They won't know to look for shadow acne, but they will still make a judgement that includes it. RTX has a few things that we've had for years, but it seems to do a way better job at energy preservation.

apk-d 5 years ago

RTX really shines when applied to a low-fidelity game like Q2. The difference isn't as profound in modern titles, though, where a plethora of lighting techniques and tricks already approximates physical lighting closely.

I wonder if in the end (should we eventually see universal adoption) raytracing will prove more of a boon towards developers rather than end users, as it has been with many other hardware advancements.

  • noir_lord 5 years ago

    > The difference isn't as profound in modern titles where a plethora of lighting techniques and tricks.

    That's the key thing though: RTX removes the need for all those tricks (which are hard to get right; humans are good at spotting lighting flaws in a scene), which is a net win for game developers, who can focus that time/effort elsewhere.

  • omilu 5 years ago

    Ray tracing removes a huge development workload: you no longer have to think about modeling light. It's like real life; you simply have materials and light sources, and bang, everything just works and looks natural.

  • jplayer01 5 years ago

    But... That's the biggest benefit of raytracing/path tracing. It's why the movie industry is already all-in on it and has been for years. It's just in games where for some reason gamers don't understand the benefits and are so eager to dismiss it if they don't immediately see everything look 10x more amazing or realistic, entirely missing the point.

    • billfruit 5 years ago

      Perhaps many games do not require realistic lighting; rather, some type of simplistic model serves gameplay well enough. I do not know what kind of lighting model was used in Sunset Overdrive, but that type of oversaturated lighting works for some games.

  • kuu 5 years ago

    I guess better tools for developers could mean a benefit for the player: less time working on one thing could mean more time on others, giving more room for improvement. I guess...

    • apk-d 5 years ago

      I'm somewhat sceptical because I'm not really seeing this happen anywhere throughout the software development industry. Sure, we get more software than ever, and we get it faster and cheaper, but it's all bloated, low-quality (web/Electron) apps that are often less usable than their ages-old counterparts. The same thing is happening to games to some extent: most developers choose development speed and convenience over performance.

nickjj 5 years ago

Wow that's a big difference.

It reminds me of what it was like to first see 3dfx OpenGL mode in Quake II when I got a Voodoo graphics card.

  • abrugsch 5 years ago

    Yeah, this exactly. I was halfway through playing QII the first time round when I went from software rendering to a Voodoo II. I'll never forget firing a plasma rifle down a tunnel and watching the lighting effects coming off the bolt as it went down the tunnel. There hasn't been such a step change in graphical capability since those days of going from a software renderer to a hardware one. (Also around the same time there was Unreal... I never understood why that one room near the start was "slippery" until I got that Voodoo II; suddenly that room had a mirror finish and the slipperiness fell into place.) I think real-time ray tracing in games is now that next step change, but even then, it's nowhere near as dramatic as software to hardware was back in the day.

    • cyxxon 5 years ago

      I would argue that VR is the next step like this. Even taking an "old" title like Skyrim drives home that no matter how nice the renderer makes it look on a screen, it is still flat. With VR you are in the place. Does the raytracing look really, really nice? Yes, definitely. Is it that much better? I don't think so (personal opinion, of course)...

      • mattnewport 5 years ago

        Working in VR, I think ray tracing is really going to come into its own with VR. Some of the effects that are mostly just nice visuals on a monitor actually make VR more usable. Accurate contact shadows for example are super important to our brain when picking up and putting down objects off surfaces to help get an accurate sense of distance and are extremely hard to do well with rasterization based techniques but they "just work" with ray tracing (given enough performance).

        Other things like more accurate reflections and specular effects add more to VR where subtle head movements that don't produce the correct response our brain expects impact immersion even when we're not able to consciously identify why the rendering is not quite right.

        The additional performance demands of VR mean it will be a while before we really see these benefits realized but I think when it all comes together it will really improve immersion.

        There are some other potential benefits to basing VR rendering around ray tracing too that might take even longer to realize. For example, you can directly compensate for the optics of the lenses when generating rays, avoiding the whole post-process warping step, and foveated rendering is very easy to incorporate naturally.
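
        A sketch of what folding the lens model and a gaze-driven sample budget into primary-ray generation could look like; the distortion coefficients and thresholds are made-up placeholders, not any headset's actual profile:

          #include <cmath>

          struct Vec2 { float x, y; };
          struct Vec3 { float x, y, z; };

          // Toy radial distortion model standing in for a headset's real lens
          // profile (coefficients are made up).
          Vec2 undistort(Vec2 uv)
          {
              float r2 = uv.x * uv.x + uv.y * uv.y;
              float s  = 1.0f + 0.2f * r2 + 0.05f * r2 * r2;
              return { uv.x * s, uv.y * s };
          }

          // Fold the lens model into primary-ray generation (no post-process warp)
          // and pick a per-pixel sample budget from the distance to the gaze point
          // (foveation). uv and gaze are in [-1, 1] view coordinates.
          Vec3 primaryRayDir(Vec2 uv, Vec2 gaze, int& samplesOut)
          {
              Vec2 t = undistort(uv);                          // where this pixel really looks
              float d = std::hypot(uv.x - gaze.x, uv.y - gaze.y);
              samplesOut = d < 0.2f ? 8 : d < 0.5f ? 4 : 1;    // crude foveated budget
              float len = std::sqrt(t.x * t.x + t.y * t.y + 1.0f);
              return { t.x / len, t.y / len, 1.0f / len };
          }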

        • dahart 5 years ago

          This is a good point I hadn’t thought of that much. Ray tracing is more easily able to get to ‘stereo correctness’ than all the screen space effects and other great tricks that have worked so well for games.

        • frigaardj 5 years ago

          Presumably the sampling rate can also be limited in places the user isn't looking (with eyetracking etc)?

          • mattnewport 5 years ago

            Yeah, that's what I meant by foveated rendering. You can spend your samples where they are most valuable based on where the user is looking among other factors.

  • jacobush 5 years ago

    Not Glide mode? ;)

    • nickjj 5 years ago

      Now that you mention it, I don't remember if it was OpenGL or Glide, it was over 20 years ago. Same effect tho.

      I just remember seeing all of the ambient lights and glows and thinking how amazing it looked. Specifically some early campaign map where you're going through dark sewers that had a bunch of little lights that emit a glow.

      But after the coolness factor wore off, it was back to using the most minimal settings possible to maximize visibility and frame rates to play it online.

      • SketchySeaBeast 5 years ago

        If it was Voodoo you'd be using Glide. That was their proprietary API.

        But yeah, the biggest improvement was putting it into Glide/OpenGL and watching all the new lights appear.

    • qplex 5 years ago

      Quake and Quake II didn't actually use Glide but a MiniGL wrapper for translating the OpenGL calls.

      • jacobush 5 years ago

        YES! Now I remember. Unreal Tournament used glide though. UT was my vice.

  • cdnsteve 5 years ago

    Loved my Voodoo card!

narrator 5 years ago

This is great to see. Quake and Doom were all about elaborate fake 3d and elaborate fake raytracing to run fast and look amazing on slow hardware. As technology improved, first we got a real 3d engine and now we finally have a real raytracing engine.

  • dexen 5 years ago

    >Quake and Doom were all about elaborate fake 3d

    A nitpick: DooM (1 and 2), as well as Duke Nukem 3D, were indeed "2.5D" elaborate fakery of 3D. One example of the ruse: a level was basically a (floor, ceiling) height map, and there could be no two overlapping rooms, one above the other.

    On the other hand, Quake (1) was an early pioneer [1] of real, actual 3D, including an OpenGL backend and all that. That's one of the reasons it required significantly beefier hardware: needing a Pentium for smooth gameplay, while DooM would run perfectly fine on a mid-end 486, which is a generation older.

    [1] cf. Marathon

    • taneq 5 years ago

      > On the other hand, Quake (1) was an early pioneer [1] of real, actual 3D

      The Quake rasterizer with its overlapped divide for perspective correction is a thing of beauty and possibly my favourite piece of code ever. If there's anyone out there who loves optimisation and hasn't read Abrash's "black book", it's well worth a read.
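
      For anyone who hasn't read it, the trick is roughly this (a from-memory sketch, not the actual Quake source): do one true perspective divide per 16-pixel chunk and lerp inside the chunk, while the expensive FDIV overlaps with the integer work that fills the pixels.

        // Texture coordinates are made perspective-correct only every 16 pixels
        // (one real divide per chunk) and linearly interpolated in between.
        void drawSpan(int x, int count,
                      float u, float v, float w,        // u/z, v/z, 1/z at span start
                      float dudx, float dvdx, float dwdx)
        {
            while (count > 0) {
                int run = count > 16 ? 16 : count;
                float su0 = u / w, sv0 = v / w;                    // correct at chunk start
                float w1  = w + dwdx * run;
                float su1 = (u + dudx * run) / w1;                 // correct at chunk end
                float sv1 = (v + dvdx * run) / w1;
                for (int i = 0; i < run; ++i) {
                    float t = float(i) / run;
                    float s = su0 + (su1 - su0) * t;               // cheap lerp in between
                    float q = sv0 + (sv1 - sv0) * t;
                    // plot(x + i, sampleTexture(s, q));           // framebuffer write omitted
                    (void)s; (void)q;
                }
                x += run; u += dudx * run; v += dvdx * run; w += dwdx * run;
                count -= run;
            }
        }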

    • int_19h 5 years ago

      One interesting thing about Quake is that not only was it 3D, but it was excessively 3D in many ways. For example, they rendered all flames as 3D models, even though sprites and/or procedurally generated 2D flames (as used in e.g. Unreal two years later) would look much better given the very low poly counts.

      https://media.moddb.com/images/members/1/240/239733/qv1.gif

      Curiously, explosions were still sprite + particles. But I can't remember anything else outside of UI that was using a sprite.

      Q2 took it even further - it has 3D models for explosions, and even puffs of smoke when bullets hit walls (these were simple particle effects in Q1). IIRC the only thing that was a sprite was the BFG10K plasma ball?

      Back then, I just found it strange. In retrospect, I wonder if it was an attempt to associate some kind of "uncompromising real 3D" branding with id engines. It's interesting that Q3 - which was the first engine id released after they got serious competitors in that space (Unreal, Source) - switched back to sprites for fire, smoke and explosions. And Unreal especially made it a point that it wasn't just 3D, but it was shiny:

      https://www.youtube.com/watch?v=zqdeyseQAP4

    • josteink 5 years ago

      > A nitpick: DooM (1 and 2), just as well as Duke Nukem 3D were indeed "2.5D"

      Double nitpick: Duke Nukem could actually have stacked floors and bridges over other playable areas.

      As such, even if it didn’t use “real” 3D, its 2.5D scheme was considerably more elaborate than Doom's or Wolfenstein's.

      • fb03 5 years ago

        Yes, but I remember you still had to use a really contrived way of explaining that 3d 'geometry' on the editor (BUILD.exe).

        Basically you still drew in 2d but you could overlap rooms with other lines that wouldn't intersect with any already existing lines, and that would configure 'a room above/below the current room'. Then you had to slope the floor until it actually connected with the newer room.

        It was pretty strange but yeah, some people built insanely detailed things with Duke3D :-)

        • TomBombadildoze 5 years ago

          > Basically you still drew in 2d but you could overlap rooms with other lines that wouldn't intersect with any already existing lines, and that would configure 'a room above/below the current room'.

          Elaborating a bit -- the Build engine had a feature called "sector effectors" that, when triggered, would teleport the player to a completely different sector in the map. The transition was seamless and permitted interesting effects like the illusion of having stacked floors.

          The designers used this feature to most profound effect in the level Lunatic Fringe. In order to complete a full circuit of the central ring room, the player had to move through a 720-degree circle to return to the entrance. It was _extremely_ confusing playing that level for the first time.

          https://dukenukem.fandom.com/wiki/Lunatic_Fringe

          Fabien Sanglard wrote an excellent series about the engine here: https://fabiensanglard.net/duke3d/

          • estebank 5 years ago

            Interestingly enough, the Portal games leverage similar level-design tricks in the Source engine, not only for the obvious user-controlled portals but also to stitch levels together seamlessly, including impossible architecture (the long bridge leading to the GLaDOS confrontation in the first Portal shows the room you're about to enter, but it's actually too small and shaped differently to hold that room).

      • dexen 5 years ago

        Thank you a lot, josteink; I mixed up the map format's limitations with the renderer's.

  • mbel 5 years ago

    What do you mean "fake 3d"? They are as 3D as any other computer generated images with linear perspective. Quake is even using a rasterizer like nearly all games today.

    • anc84 5 years ago

      I'd give the benefit of the doubt and assume the poster is oblivious to the vast difference between Doom and Quake; they probably just threw them both into the "early 90s 2.5D FPS" bucket.

    • dahart 5 years ago

      The word fake doesn’t bother me, but maybe “limited 3d” would be a better term for it. The 3d it had was still correct 3d, sure, it was just limited in what kinds of 3d it could render, unlike the full 3d games that came soon after.

    • PavlikPaja 5 years ago

      It wasn't really 3D: you couldn't look up or down at all in the original Doom, I think, and in the later games where you could, everything got horribly distorted, as the engine couldn't actually show the scene from a different perspective.

    • kkapelon 5 years ago

      Doom used 2D sprites, so "fake 3d" if you ask me.

      • morganvachon 5 years ago

        The terms "fake 3D" and "2.5D" in reference to Doom and similar games has nothing to do with sprite based mobs, it's about not having a truly 3D map. Doom's maps were drawn in 2D top-down with a height variable for any wall or platform. You couldn't have one room above another, which is why elevators in the game were solid (you couldn't stand under the floor). The various level editors for Doom based games looked like primitive 2D CAD programs.

        Quake was a whole new paradigm, it was based on a true 3D engine and its maps were generated in three dimensions. You could stand under stairs, under the room above, and underneath lifts.
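
        To put that "height variable for any wall or platform" into code, a minimal sketch of the data layout (field names illustrative, not the real WAD structures):

          // The geometry is purely 2D, and each sector just carries two heights.
          // Since any point on the 2D map belongs to exactly one sector, a room
          // can never sit on top of another.
          struct Vertex { float x, y; };      // 2D only, no z

          struct Sector {
              float floorHeight;
              float ceilingHeight;
              int   lightLevel;
          };

          struct Linedef {                    // a wall segment drawn on the 2D map
              int v1, v2;                     // indices into the vertex list
              int frontSector;                // sector on each side (-1 = solid wall)
              int backSector;
          };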

      • mbel 5 years ago

        So do a plethora of today's AAA titles, for subtle details like particle effects, grass, and others.

        • kkapelon 5 years ago

          Yes, but sprites (i.e. enemies and weapons) are the primary method of interaction in Doom.

          Grass and particle effects are secondary....

          How many FPS AAA titles today use sprites for enemies and weapons?

          • jandrese 5 years ago

            Funnily enough, the sprite-based enemies were a real strength for Doom compared to many of the fully 3D games that came out in the next generation, including Quake. Because rendering full 3D enemies was taxing for the hardware, you couldn't put too many of them on screen at once; in some games more than 2 or 3 would cause it to chug. So you only ever ran across one or two enemies at a time in those games, unlike Doom, which wasn't afraid to throw a swarm of pinkies at you because you'd just picked up the chainsaw.

            • kkapelon 5 years ago

              Agreed. I think "Painkiller" is one of the few games that actually managed (much later) to throw a healthy number of enemies at you, just like Doom did.

          • sneakernets 5 years ago

            Doesn't matter; to the Doom engine they are tall cubes, albeit with increasingly awful collision detection as the cubes increase in size.

            • kkapelon 5 years ago

              So according to your definition Doom is a full 3D game, even though as a player I actually see 2D weapons and 2D enemies?

              • sneakernets 5 years ago

                Of course it is in the "game" sense. It handles X, Y, and Z axes in game on mobs and the player, although it is selective in which mobs get the "Z treatment". Monsters are infinitely tall, projectiles aren't. It's not a limitation of the engine as much as it was a design choice, as Y-shearing was too ugly in Carmack's opinion, so it was removed in favor of a very forgiving auto-aim.

                Nothing's stopping someone from adding code to render 3D models or "portals" in the game, and in fact it has already been done, with no 3D acceleration needed.

  • _Microft 5 years ago

    Do you maybe mean "raycasting" as used in Wolfenstein or Doom compared to a proper 3D engine like the Quake games were using?

mstade 5 years ago

I hope raytracing will enable some remasters of amazing old games. No need for new content, changing the stories or gameplay, just vastly improved graphics. I always preferred the Quake series for multiplayer, but Half-Life was great single player. Max Payne would be a great replay with better graphics. Some old flight sims like Jane's WWII Fighters, X-Wing v Tie Fighter, Wing Commander... I'd pay cash monies for this.

biosed 5 years ago

"Glass, which reflects everything around it" except the character standing in front of it! It seems like it is over done but not done properly. I know it is a demo but still, its trying so hard!

  • sbarre 5 years ago

    Quake 2 did not have a rendered character model because it's a first-person shooter, so you never see yourself.

    If you look at the water reflections, you will however see your gun model and a disembodied/floating hand holding it.

    • mikepurvis 5 years ago

      Your gun hand pokes out from the cloak of invisibility you're wearing.

    • biosed 5 years ago

      ahhh, didn't realise. Gun thing is mad

      • jerf 5 years ago

        TIL Quake 2 is a sequel to The Addams Family.

        Somehow... it almost works....

qwerty456127 5 years ago

> id Software’s Quake II launched in 1997, bringing gamers a new single-player campaign, a long-awaited, addictive multiplayer mode that we played for years on pitifully-slow 56K modems, and a jaw-dropping engine

As for me, Quake 2 was kind of a disappointment after Duke Nukem 3D. Only Half-Life brought back the depth Duke Nukem 3D had. E.g., seeing something like a ventilation cover and not being able to break it and climb through felt infuriating.

  • scruffyherder 5 years ago

    Honestly Quake 2's biggest feature was going open source.

    When I was porting Quake 2 to MS-DOS, I have to admit that it's the most I ever played of it. It really fit Carmack's view that stories have no place in gaming.

    It's a shame Valve found a much bigger market than games, but it leaves the mantle for others to pick up.

    • qwerty456127 5 years ago

      > stories have no place in gaming

      What does this mean? Being told and participating in a story has always been by far the primary thing I play a game for. An interactive visual fiction experience that induces compassion and involvement, bundled with a moderate challenge and a degree of freedom and realism (that's why I don't like heavily scripted, on-rails games), is what I want from a game. The visual part should be reasonable (mostly needed for the atmosphere) and the controls should be convenient. In my opinion Half-Life and Fallout (all the parts) were the best games I ever played. The story unfolding, the sense of heroically helping people in the imaginary world (something that is way too hard and dangerous for an average Joe to do in real life, where you can't save & load), and exploration are the things that make a game addictive for me.

      • scruffyherder 5 years ago

        It's an old Carmack quote:

        > Story in a game is like a story in a porn movie. It's expected to be there, but it's not that important.

        It's why DooM was such an incredible novelty as a fast-paced 3D shooter, and Quake... well... it's devoid of character.

        It's why HL & Fallout are so beloved.

        I'm 100% with you

    • furicane 5 years ago

      Quake 2 is one of the most difficult FPS games to play competitively. I played the game for 10 years, 1998 - 2008. I can tell you're looking at the game from a different perspective (I'm assuming you're referring to its singleplayer); however, its multiplayer component was nothing short of amazing. The game actually takes a huge physical toll on the player due to its complex controls (strafe jumps, circle jumps, double jumps, circle-strafes, the occasional rocket / grenade jump), and the movement aspect combined with aim and general awareness made it one of the best games I've had the honor to play.

      • scruffyherder 5 years ago

        Part of the reason I started the DOS port was that the QuakeWorld port had dropped to zero activity. I was trying to validate a different style of building GCC and all the servers were empty.

        When q2 had shipped out I was so busy working I never had time to play it seriously.

        Quake 4 felt far more engaging to me. And q3.. I had the tin box set for Linux. Remember that push to make retail Linux gaming a thing? Shame it never panned out.

    • int_19h 5 years ago

      But it wasn't the first id game to go open source.

      • scruffyherder 5 years ago

        I didn't mean to imply that, just that the only reason Q2 gets as much love these days is that its source is available.

        And iD did a great job of writing modular and portable code.

        It was really exhilarating to get it running on a different platform with essentially no real changes, other than platform code and initially removing the DLL support.

        Thankfully DJGPP had progressed far enough that its DLL support works great.

        • int_19h 5 years ago

          I remember there was a DOS extender that could run Win32 console and very basic (full-screen window) GUI binaries. Just enough to run Quake 2 in software mode.

andybak 5 years ago

No mention of how to download it - either in the negative "You can't..." or positive.

Seems a strange omission.

  • geoah 5 years ago

    The creator's website is mentioned in the article, and the code is available on his GitHub [2].

    > As Christoph states on his site [1] ...

    [1] http://brechpunkt.de/q2vkpt [2] https://github.com/cschied/q2vkpt

    • andybak 5 years ago

      That's q2vkpt. It sounds like Quake II RTX is built on top of that but with more features.

      • arianvanp 5 years ago

        No, I think they are one and the same. q2vkpt is built on top of VK_NV_ray_tracing, which is the Vulkan interface to RTX. It won't run on non-NVIDIA hardware AFAIK.

        • Strom 5 years ago

          They're not the same.

          “But what’s new with Quake II RTX compared to Q2VKPT?”, you ask. A lot. We’ve introduced real-time, controllable time of day lighting, with accurate sunlight and indirect illumination; refraction on water and glass; emissive, reflective and transparent surfaces; normal and roughness maps for added surface detail; particle and laser effects for weapons; procedural environment maps featuring mountains, sky and clouds, which are updated when the time of day is changed; a flare gun for illuminating dark corners where enemies lurk; an improved denoiser; SLI support (hands-up if you rolled with Voodoo 2 SLI back in the day); Quake 2 XP high-detail weapons, models and textures; optional NVIDIA Flow fire, smoke and particle effects, and much more!

  • steinuil 5 years ago

    I hope they're going to release it later, after cleaning it up. Quake 2 is GPL'd, so I think they're compelled to, and it would also be a good example of how to use the VKRay API and what it looks like in an actual game.

  • bni 5 years ago

    An Nvidia guy answers this question in the comments with a "stay tuned".

  • dahart 5 years ago

    It’s going to be released, it just hasn’t happened yet.

iofiiiiiiiii 5 years ago

This looks like RTX versus the Quake 2 software renderer, so the pictures are somewhat deceptive, as they do not use the more powerful original Quake 2 renderers.

I would be interested in how RTX compares to the other renderers Quake 2 originally came with.

hellofunk 5 years ago

Is this ray tracing or path tracing? I'd think that path tracing would still be hard/impossible to do in real time considering the number of samples needed, but I'm not sure. People throw around these terms interchangeably, which complicates the discussion, though the algorithms have notable departures from each other.

  • crote 5 years ago

    0:36 in the video, "all based on path tracing". Easy tell: the soft shadows, volumetric lighting, indirect lighting, complex shaders. Some of it could probably be faked, but it's trivial with path tracing.

    About speed: it's still very hard to do real-time, but it seems like denoising algorithms are getting good enough that it's getting doable. Keep in mind that it is a game, so small graphical glitches aren't that big of a deal.

  • seanalltogether 5 years ago

    I thought path tracing was just ray tracing with additional random scatters traced, is it not?

    • dahart 5 years ago

      Depends on who you ask, the term is overloaded now.

      These days, it does tend to mean doing global illumination using ray tracing.

      Originally, the term “path tracing” was used to refer to taking a single random scatter at every step along a path of connected segments, as opposed to the idea of taking multiple random scatters at a point and averaging them, recursively. It’s a way of thinking of Monte Carlo rendering as taking one sample in a very high dimensional space - where a chain of ray segments, or “path” is a single sample - rather than thinking of each ray segment separately as a sample.

      Ray tracing, FWIW, can sometimes refer to situations where you’re not even rendering. It has a more general meaning of doing line based visibility queries, which you can use for lots of things.
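
      A compile-only sketch of that "one random scatter per step" idea (intersectScene and sampleHemisphere are stand-ins for the actual renderer; importance sampling, PDFs, and Russian roulette are omitted):

        #include <random>

        struct Vec3 { float x = 0, y = 0, z = 0; };
        Vec3 operator+(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
        Vec3 operator*(Vec3 a, Vec3 b) { return { a.x * b.x, a.y * b.y, a.z * b.z }; }

        struct Hit { bool found = false; Vec3 point, normal, emitted, albedo; };

        // Stand-ins for the real renderer: scene intersection and a random
        // direction on the hemisphere around a normal.
        Hit  intersectScene(Vec3 origin, Vec3 dir);
        Vec3 sampleHemisphere(Vec3 normal, std::mt19937& rng);

        // "Path tracing" in the original sense: at each bounce take ONE random
        // scatter direction and continue, so a whole chain of segments (a path)
        // is a single sample. Averaging many such paths per pixel gives the image.
        Vec3 tracePath(Vec3 origin, Vec3 dir, int maxBounces, std::mt19937& rng)
        {
            Vec3 radiance, throughput{ 1, 1, 1 };
            for (int bounce = 0; bounce < maxBounces; ++bounce) {
                Hit h = intersectScene(origin, dir);
                if (!h.found) break;
                radiance = radiance + throughput * h.emitted;   // pick up any light we hit
                throughput = throughput * h.albedo;             // attenuate along the path
                origin = h.point;
                dir = sampleHemisphere(h.normal, rng);          // the single random scatter
            }
            return radiance;
        }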

      • hellofunk 5 years ago

        If I’m not mistaken, regarding your last paragraph, general visibility tests for lines that don't otherwise involve actual rendering are typically just called ray casting.

        • dahart 5 years ago

          You’re not mistaken, ray casting would be more common, but ray tracing is also used. This is especially true among people that design path tracing renderers, where we consider “ray tracing” to be a visibility primitive. “Trace a ray” and “cast a ray” are synonymous. I also think using “ray tracing” for visibility might start to become more popular now with RTX hardware, since the “ray tracing” hardware only provides the visibility test, not a renderer.
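
          A small sketch of that non-rendering usage (firstHitDistance stands in for whatever BVH or hardware query is available):

            #include <cmath>

            struct Vec3 { float x, y, z; };

            // Stand-in for the engine/hardware query: distance to the first hit
            // along a ray, or a negative value if nothing is hit.
            float firstHitDistance(Vec3 origin, Vec3 dir, float maxDist);

            // A ray trace used purely as a visibility query, with no rendering
            // at all: AI line of sight, audio occlusion, hitscan weapons, etc.
            bool canSee(Vec3 from, Vec3 to)
            {
                Vec3 d{ to.x - from.x, to.y - from.y, to.z - from.z };
                float dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
                Vec3 dir{ d.x / dist, d.y / dist, d.z / dist };
                float t = firstHitDistance(from, dir, dist);
                return t < 0.0f || t >= dist;   // nothing in between
            }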

    • kuzehanka 5 years ago

      In the realtime graphics world,

      Ray tracing = initiate rays from light sources, bounce them around, accumulate them on camera. This is accurate but incredibly inefficient. This is how pre-baked light mapping and GI works.

      Path tracing = initiate rays from camera, induce bounce rays toward light sources. Not quite as accurate but orders of magnitude more efficient in terms of producing an acceptable image.

      • hellofunk 5 years ago

        This is confusing and not true. RTX is specifically referred to as "ray tracing", but no one does forward ray tracing, which is what you are describing. Both general ray tracing and path tracing involve "backwards ray tracing", starting from the camera. The distinction is not where the ray originates; it's the algorithm for generating and following those rays.

        • kuzehanka 5 years ago

          This guy is right. I have been inadvertently perpetuating a misconception for the last 5 years.

woodrowbarlow 5 years ago

can someone help me understand what "ray tracing" means in this case?

i've read a few technical articles about ray-tracing, but they mostly describe it as a rudimentary (largely outdated) way of achieving 3d effects with relatively little code, and with severe limitations (walls must be orthonormal, changes in elevation require extra work, etc.). i understand ray-tracing at this level (break the view into columns, set the fill height of the column according to the distance of the ray).

then i see marketing-style articles like this one that use the same term to describe advanced lighting effects, material reflections, etc. and i don't understand the technological jump. what am i missing?

edit: i was thinking of "ray casting", not "ray tracing". thank you for the corrections below.

  • kllrnohj 5 years ago

    What you are describing sounds more like ray casting than ray tracing.

    Ray casting is the old Wolfenstein 3D look: https://en.wikipedia.org/wiki/Ray_casting

    Ray tracing is what Pixar uses for Toy Story, Monsters Inc, etc... https://en.wikipedia.org/wiki/Ray_tracing_(graphics)

    And then the new hotness for offline rendering is Path Tracing, which is a really good but really hard to accelerate form of ray tracing: https://en.wikipedia.org/wiki/Path_tracing

    And yes fundamentally all of these are "shoot rays out from the camera", hence the very specific terminology used for each to disambiguate which approach to shooting out rays from the camera is being used and what happens when rays hit a thing.

    • tntn 5 years ago

      I'm pretty sure Toy Story 1 didn't use ray tracing. IIRC Cars was the first that did.

  • xsmasher 5 years ago

    The technique you are talking about, used for Wolfenstein 3D in '92, is more commonly called "ray casting." You cast one ray per screen X (or fewer), see what object it hits, and then draw a slice of that object. https://en.wikipedia.org/wiki/Ray_casting

    Ray tracing as used here means simulating light rays bouncing off of objects - it's a high fidelity technique generally not used in games in the past, because it is so slow. https://en.wikipedia.org/wiki/Ray_tracing_(graphics)
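
    To make the distinction concrete, this is roughly the entire ray-casting renderer (a sketch; the grid map and drawColumn are placeholder assumptions), which is why it only works for maps that are essentially a 2D grid:

      #include <cmath>

      // Wolfenstein-style ray casting on a 2D grid map: one ray per screen
      // column, march it until it hits a wall cell, then draw a vertical slice
      // whose height falls off with distance.
      const int MAP_W = 8, MAP_H = 8;
      int worldMap[MAP_H][MAP_W] = { /* 1 = wall, 0 = empty */ };

      void drawColumn(int x, int sliceHeight);   // stand-in for the framebuffer code

      void castFrame(float px, float py, float angle, float fov,
                     int screenW, int screenH)
      {
          for (int x = 0; x < screenW; ++x) {
              float a = angle + fov * (float(x) / screenW - 0.5f);
              float rx = px, ry = py, dist = 0.0f;
              while (dist < 32.0f) {                             // march the ray
                  rx += std::cos(a) * 0.05f;
                  ry += std::sin(a) * 0.05f;
                  dist += 0.05f;
                  if (rx < 0 || ry < 0 || rx >= MAP_W || ry >= MAP_H) break;
                  if (worldMap[int(ry)][int(rx)] == 1) break;    // hit a wall
              }
              dist *= std::cos(a - angle);                       // fisheye correction
              drawColumn(x, int(screenH / (dist + 0.0001f)));    // one slice per column
          }
      }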

  • tntn 5 years ago

    Sounds like the first thing you are describing is closer to ray casting. Ray tracing doesn't have the limitations you mention.

    Roughly speaking, ray tracing techniques are variations on the concept of tracing a ray from the camera back into the scene and seeing where it ends up. The different variations distinguish themselves by what happens at a ray intersection. Classical ray tracing will spawn a shadow ray towards each light source. Path tracing will stochastically sample some distribution of ray directions (usually based on a combination of light locations and material properties) in order to do Monte Carlo integration of the rendering equation. There are other variations.

    Ray tracing (w/ only shadow rays) doesn't give you global illumination. Path tracing (this work) does.

yedpodtrzitko 5 years ago

That looks really impressive from both a technical and a marketing point of view. Especially the latter keeps Nvidia ahead of AMD. As much as I like AMD, they need to step up their game somehow.

cdnsteve 5 years ago

How do I get to play this again? Was my favorite FPS game of all time.

  • Freak_NL 5 years ago

    Buy the game on Steam or gog.com, and use the assets with any client you like (on practically any operating system) such as: https://yamagi.org/quake2/

    Of course that doesn't get you this ray tracing thingy, although you might be able to run that when they release it and if your hardware supports it.

sabujp 5 years ago
  • Strom 5 years ago

    That's the source for Q2VKPT. I wonder if the source for Quake II RTX, which is the main subject of discussion in this Nvidia article and builds upon Q2VKPT, will also be released.

    • mattnewport 5 years ago

      They said at GTC it will be, but they haven't given a date yet.

PorterDuff 5 years ago

This kind of demo always makes me think of how long it used to take to render a single frame of anything non-trivial.

leowoo91 5 years ago

"We are not faking it" - how about the hardware? Is it exact same that consumer could buy?

  • lanevorockz 5 years ago

    RTX is based on the Turing architecture, which is a pretty amazing beast. The technology of raytracing has been around for a long time through Iray; what Nvidia did is bake in the necessary formats, a neat interface, and slight hardware adjustments.

    • leowoo91 5 years ago

      I only worry about the number of cores being used in these demos. The tech might be the same, but it's quite possible Nvidia could use high-powered prototypes to enhance the results. There are no restrictions on that. A clear statement like "we are using the exact same RTX card we are shipping to consumers in September" would really be a selling point for me.

      • tntn 5 years ago

        I think Titan RTX is a "perfect die," so to speak, so even if they used the maximum number of cores they possibly could for this demo, that same silicon is for sale.

        • leowoo91 5 years ago

          Looking at the Titan's price range, that's something I would only stretch to for "max" luxury gaming, and only if it can give those exact results.