NVIDIA’s New Ray Tracing Tech Should Be Impossible!

❤️ Try Macro for free and supercharge your learning:

📝 The paper "3D Gaussian Ray Tracing: Fast Tracing of Particle Scenes" is available here:

📝 My paper on simulations that look almost like reality is available for free here:

Or the original Nature Physics link with clickable citations:

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Alex Balfanz, Alex Haro, B Shang, Benji Rabhan, Gaston Ingaramo, Gordon Child, John Le, Juan Benet, Kyle Davis, Loyal Alchemist, Lukas Biewald, Martin, Michael Albrecht, Michael Tedder, Owen Skarpness, Richard Sundvall, Taras Bobrovytsky, Thomas Krcmar, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here:

My research:
X/Twitter:
Thumbnail design: Felícia Zsolnai-Fehér –

#nvidia


  • @imsatoboi says:

    i am simple man, i see 2 minute paper, i click

  • @phantomabid says:

    Research Papers: RTX ON

  • @theaslam9758 says:

    “Two Minute Papers released a video 2 minutes ago”

  • @vlividTV says:

    Amazing tech. Can’t wait for it to become available.

  • @firefox8713 says:

    What a time to be two papers down the line!

  • @benveasey7474 says:

    Awesome if this could be used in Blender for fast/lightweight ArchVis backgrounds.

  • @HarhaMedia says:

    This is very cool! Heaps more interesting than the generative AI papers.

  • @mm-rj3vo says:

    Holy hecking shoot

    First law of papers!!!

    Two more papers down the line, we’re going to get real time full realistic scenes !!!

    Now it’s about getting those “models” to be movable and interactive in real time

  • @mm-rj3vo says:

    I’m looking for holodeck-type stuff, simply put. AI that can make a chair when I say “chair”, or “give me some chair types”, “more exciting” or “more basic”, or “make this wood, make that metal, put a bar here and screws there”, etc.

    I want a world that I can describe into existence and edit at a whim, through language

  • @singularonaut says:

    Looks like PS7 will be matched with real-life quality of graphics) or even PS6 Pro)

  • @davidrenton says:

    finally the mortgage for that 8090 will be justified

  • @14zrobot says:

    It does not compute: what is the difference between point-based ray tracing and regular ray tracing? It would be cool to get some more details down the line

  • @NigraXXL says:

    This is unbelievable. If we could get Gaussian splatting a bit more developed, to the point it can be rigged and animated, that would go so well with this new light simulation support and could make stuff like Unreal’s Nanite-level of detail actually available on more hardware

  • @mindful_clip says:

    as soon as it’s possible.. i feel like “who cares”
    this can’t be good..

  • @incription says:

    okay, but nothing in the scene can move?

  • @puzzzl says:

    So is this something they’ll be able to just slot into the rendering pipeline on the graphics card and the developers won’t have to worry about it?

  • @mtrivelin says:

    I’m a 3D artist who started on the Amiga with LightWave, and I currently (continue to) work with Maya and V-Ray. Nodes, lights, shaders, render, change things, render again until it looks good.

    But seeing these new technologies, which seem to perform one miracle bigger than the last every few days, makes me feel like a caveman trying to understand our world.
    I have no idea how to use these things. I feel like I’ve been instantly outdated.

  • @julinaut says:

    I think what these papers really need to improve the world we live in is attention. You’re doing god’s work, Károly!

  • @LarryPanozzo says:

    Polyscope! 👏🏼

  • @aymericrichard6931 says:

    Waiting for this to come to 3D posing tools like Daz
