NVIDIA’s New AI: Impossible Ray Tracing!

❤️ Check out DeepInfra and run DeepSeek or many other AI projects:

📝 The #nvidia paper "3DGUT: Enabling Distorted Cameras and Secondary Rays in Gaussian Splatting" is available here:

📝 Our Separable Subsurface Scattering paper:
📝 SSS For Gaussian Splatting:

Sources:

📝 My paper on simulations that look almost like reality is available for free here:

Or here is the original Nature Physics link with clickable citations:

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Benji Rabhan, B Shang, Christian Ahlin, Gordon Child, John Le, Juan Benet, Kyle Davis, Loyal Alchemist, Lukas Biewald, Michael Tedder, Owen Skarpness, Richard Sundvall, Steef, Taras Bobrovytsky, Thomas Krcmar, Tybie Fitzhugh, Ueli Gallizzi
If you wish to appear here or pick up other perks, click here:

My research:
X/Twitter:
Thumbnail design: Felícia Zsolnai-Fehér –


  • @shiverwind says:

    What a time to be alive

  • @carel_dfx says:

We’re reaching a new level in CGI. Thank you, Gaussian Splatting and AI.

    • @hombacom says:

Gaussian splatting looks very static, though, compared to modern real-time 3D.

    • @TheAkdzyn says:

@hombacom Yes, but it has higher fidelity. Its applications are still relevant for the static objects in a scene/game/environment, and since it’s new, it has a lot of room to grow with more papers. 😜

  • @panzerofthelake4460 says:

    0:13 yeah, this is too accurate, I can’t live happily without the shiny stuff!!!

  • @klzeccwozi1290 says:

“Are you thinking what I’m thinking?”
Probably not, since I’m thinking about how much this is going to cost.

  • @TheAkdzyn says:

I love your explanation!! You introduced me to Gaussian splatting and Unreal Engine 5, and expanded my understanding of ray tracing.
Applying secondary rays to Gaussian splatting while still producing high-speed simulations is very promising for the technology. Gaussian splatting is the highest-fidelity image quality I’ve ever seen in 3D graphics. I like that we’re revisiting old examples with the new technology so we can compare how far the industry has come in the last 5 years. Thank you!! 🤓

  • @FritzSchober says:

    “3D GUT” means “3D GOOD” in German. Even better than intestines 🙂

  • @raaasin says:

    This is the paper GTA 6 was waiting for

  • @Keylough says:

    What a time to be alive. Just a bit on the expensive side

  • @milky655 says:

Long-time lurker, and your videos have amazed me every time I see them.
There is one other cursed usage for fisheye… security cameras. Pair that with subsurface scattering, and we’re verging on the precipice of security footage showing you in places you’ve never visited, with no artifacting or telltale signs of AI interaction. Absolutely wild! We’re venturing into a post-post-truth era where there can be videos of me, with my voice, in places I’ve never been, all generated from a single image from social media. Scary but fascinating times.
Thanks so much for your concentrated “all the best bits” journalism.

  • @The_Orgin says:

    Me, an idiot being called a fellow scholar by a PhD

  • @NicosLeben says:

    3:25 This looks a bit odd because the new objects do not cast any shadows.

    • @Alloveck says:

      I was bothered by that exact same thing.

Everybody has their 3D graphics visual pet peeves, and mine is lighting. Especially things not casting shadows when they should. So I couldn’t not see that lack of shadows instantly.

    • @M4-PERFECT says:

I think I see the shadows: the light source is right above the objects, so the shadows are under them and only slightly visible. Background objects have more well-defined shadows.

    • @WilsoniStudios says:

Some proxy geo set to Shadow Catcher would probably do as a workaround for now… until the next paper.

    • @phizc says:

      The added objects aren’t reflected in the scene either. The metal part in the center of the table should reflect them even if the roughness is quite high.

  • @zergidrom4572 says:

Researchers do this, researchers do that. Yet game graphics with all these “technologies” are literally jumping kilometers backwards: low FPS, sick input lag, freaking AA ghosting, blurred visuals.

    • @Fermion. says:

@zergidrom4572 The tech will get better as it matures.

Plus, we can’t put the blame solely on the tech. Devs and management are slow to adopt newer tech, due to the learning curve of implementation and an apprehension to stray from their “business as usual” models.

      If it’s anything like the corporate world, even something as seemingly insignificant as switching email clients has a mountain of red tape (change requests, documentation, new SOPs, meetings, meetings, oh, and did I mention meetings?)

    • @MrTomyCJ says:

To be fair, the new tech shown here does not apply to video games, nor would it solve any of those issues; it’s an unrelated topic.

  • @YTisProMentalillness says:

    ‘and you only need a $4000 GPU – the 5090! It only delivers 30 FPS at 4K, but what a time to be alive!’

    • @Torpi95 says:

The original white paper is 2 years old, and they managed to bring the tech to market. Give it another 1.5 to 2 years and we should see it distributed with driver software, with backwards compatibility.

    • @thomasgoodwin2648 says:

      So… a couple MORE years before it works on my C64, right? 😉

  • @darviniusb says:

Yes. The source code is out there for people to perfect and optimize, so that other companies can close it down later, or branch it and sell it. Not even USD and MaterialX (never mind Gaussian splats or ACES) are standardized across the major 3D applications, just because of the laziness of devs who are happy with their monthly cash cows. Probably in 10 years we will have something out of this. The subscription model made everyone extremely lazy and unmotivated.

  • @alexdevcamp says:

    Can’t wait to play this in a video game in 10 years

    • @SulavNiroula-b1c says:

10 years? More like 20.

    • @Bubblegummonsters says:

You can already do this in Unreal. Not the 3DGUT version, but standard Gaussian splats can be imported and run. Not sure if you could make a huge GTA level, but you can get room-sized environments down to about 200 MB.

  • @Cheddar_13 says:

    1:12 I literally had this idea ages ago

  • @fabricioestevam6297 says:

    Reality is no longer as we imagined it a few papers ago.

  • @mm-rj3vo says:

If I can still tell any objects are synthetic, then I am not yet fully satisfied.

  • @djayjp says:

    Can’t wait for the FIRST Gaussian Splatting shot game…!

  • @Drone256 says:

    But can Nvidia actually produce and sell a new gaming GPU? They seem to have forgotten about gamers.
