Ray Tracing: How NVIDIA Solved the Impossible!

Published 2022-10-15
❤️ Check out Lambda here and sign up for their GPU Cloud: lambdalabs.com/papers

📝 The showcased papers are available here:
research.nvidia.com/publication/2021-07_rearchitec…
research.nvidia.com/publication/2022-07_generalize…
graphics.cs.utah.edu/research/projects/gris/
users.cg.tuwien.ac.at/zsolnai/gfx/adaptive_metropo…

Link to the talk at GTC: www.nvidia.com/en-us/on-demand/session/gtcfall22-a…

If you wish to learn more about light transport, I have a course that is free for everyone, no strings attached:
users.cg.tuwien.ac.at/zsolnai/gfx/rendering-course…

❤️ Watch these videos in early access on our Patreon page or join us here on YouTube:
- www.patreon.com/TwoMinutePapers
- youtube.com/channel/UCbfYPyITQ-7l4upoX8nvctg/join

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bryan Learn, B Shang, Christian Ahlin, Eric Martel, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Jonas, Jonathan, Kenneth Davis, Klaus Busse, Kyle Davis, Lorin Atzberger, Lukas Biewald, Luke Dominique Warner, Matthew Allen Fisher, Michael Albrecht, Michael Tedder, Nevin Spoljaric, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: www.patreon.com/TwoMinutePapers

Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu/

Károly Zsolnai-Fehér's links:
Instagram: www.instagram.com/twominutepapers/
Twitter: twitter.com/twominutepapers
Web: cg.tuwien.ac.at/~zsolnai/

All Comments (21)
  • @PsuedoNymrod
    Thank you for demonstrating these papers (and including the links!). Glad we already have access to one of these techniques, and excited to see more commercially available tools based on the other pieces of research you've highlighted!
  • @kobby2g8
    As the render times get shorter, the 2 minute papers get longer 😛
  • @UON
    I feel there's no way I can hold my papers down with these ferocious winds of change
  • It is such an honor to get an invitation to GTC to talk about light transport. Thank you so much! 🙏 So happy! 😊
  • @scaredyfish
    I’m always amazed how even as hardware gets faster, algorithms still keep getting smarter, so you get a double boost in performance.
  • @Wobbothe3rd
    Still squeezing in 2022! I remember my sister and me trying to render Bryce pictures on a Pentium 90. It took HOURS to make a sphere over water, lol. Miraculous stuff here.
  • @trapfethen
    You can get a better sense of what paper 4 is doing when you look at the edges of the image where the train is just coming into frame. You can see the detail jump levels as that spot on the train spends more time in the frame. A possible workaround for this particular artifact is to render to a virtual screen slightly larger than the actual screen and crop out the edges when displaying on screen.
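    The overscan idea in the comment above can be sketched in a few lines. This is an illustrative toy, not any paper's method: `render_with_overscan`, `fake_renderer`, and the margin size are all made up for the example. The point is simply that edge pixels get rendered (and can accumulate samples) before they are cropped into view.

    ```python
    import numpy as np

    MARGIN = 16  # extra pixels rendered on each side (illustrative value)

    def render_with_overscan(render_fn, width, height, margin=MARGIN):
        """Render an oversized frame, then crop away the unstable borders."""
        full = render_fn(width + 2 * margin, height + 2 * margin)
        return full[margin:margin + height, margin:margin + width]

    # Toy stand-in for a renderer: just returns a noise image of the
    # requested size so the cropping logic can be demonstrated.
    def fake_renderer(w, h):
        rng = np.random.default_rng(0)
        return rng.random((h, w))

    frame = render_with_overscan(fake_renderer, 320, 240)
    print(frame.shape)  # (240, 320): display-sized, borders cropped
    ```

    In a real temporally accumulated renderer the hidden margin would give newly revealed regions a head start on convergence, at the cost of rendering extra off-screen pixels each frame.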
  • @Musikpunx
    This is really an amazing time to be alive. In my particular case, I was a child when the video game "Pong" was released. Looking at the development of computers and AI from then to now is unbelievable. By the way, I was born in 1967, which means 2 years before astronauts landed on the moon with a computer on board that had 80 kilobytes of memory.
  • I did my master's on this. Computer graphics. And I must confess this is insane. Like no way this is possible. I need to read all four of the papers. Wow.
  • @aidanm5578
    I've superglued my papers to my hands. Please help.
  • I remember last year, as I was browsing a few (not yet reviewed/incomplete/old) papers on arXiv, I stumbled across a thesis that was talking about cone tracing... I downloaded it and was like "hey, one day I'll read it". I never did. This is probably not the same paper, but the second you pronounced "voxel cone tracing" I was like "of COURSE". Thanks for the vid, love your work. To think they actually thought about that ten years ago... EDIT: turns out I still have the PDF of the thesis, it's "Audio and Visual Rendering with Perceptual Foundations", which doesn't match any of NVIDIA's papers, yet is dated 2009, so...
  • @nettsm
    Didn't we see this video about a year ago? Why are we seeing this again? 🤔
  • @guaposneeze
    The modern tech is fantastic. But it's also quite complex. One of the really amazing things about classical ray tracing was that you could famously learn to get started "In One Weekend." Even as somebody who has been at least somewhat engaged in CG for many years (you've all seen stuff I worked on when I worked in VFX), the new stuff is less exciting because I have pretty much zero illusions that I am ever going to write an implementation of a full modern renderer by myself. Even just trying to use NVIDIA's ReSTIR implementation in my own engine, rather than implementing it myself, is more complex than writing a whole renderer used to be!
  • @Zylenxx
    This lineup of papers isn't just for high-end graphics cards, either. Considering that they're optimized where possible and still pushing for more, smaller graphics cards and devices that aren't normally MEANT to do such things could possibly opt in to a similar method for demos! This is a big change. Better light transport for the same amount of time and effort is wonderful to keep track of. Thank you a lot!
  • @psiga
    Oof. Sympathy and empathy to the people who saw that research from 11 years ago and mistakenly thought that widespread adoption and advancement of it would be just around the corner. At least it's in the right hands now. Thank you for bringing it to our attention, Károly!
  • @trenton9
    I grow more and more in love with how incredible solutions to incredible problems are often collecting dust in a drawer somewhere waiting for proper implementation.
  • @kangsan2014
    I wish you would tell us what kind of graphics cards these researchers are using, and which software versions, so that we can understand whether they rely on the newest GPUs or CPUs. I feel like that would be more informative. Your videos are vastly and greatly informative, but those little bits of information would really tell us whether they are using current software, GPUs, and CPUs. Thank you so much and keep the excellent information coming! :)
  • @protocol6
    It's interesting how you think of the noise as a limitation of ray tracing when cameras and even your eyes have the same problem in low light.
  • @therealsourc3
    Another way to make light transport faster is distance-based downsampling of the light resolution. Far-away caustics and reflections aren't going to be visible anyway given the number of pixels in the image, so there is no need for them to be super refined. As long as the stuff closer to the camera is rendered at good enough resolution, everything further away can be progressively downsampled to reduce memory usage and possibly increase accuracy where it matters more.
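    The progressive-downsampling idea in the comment above resembles a mip-style level-of-detail scheme. A minimal sketch, with all names and thresholds invented for the example: halve the lighting resolution each time the camera distance doubles past a chosen full-detail range, capped like a mip chain.

    ```python
    import math

    FULL_DETAIL_DIST = 10.0  # everything closer gets full resolution (assumed)
    MAX_LEVEL = 4            # cap on halvings, like a mip-chain depth (assumed)

    def downsample_level(distance):
        """How many times to halve lighting resolution at this distance."""
        if distance <= FULL_DETAIL_DIST:
            return 0
        # One extra halving per doubling of distance beyond the threshold.
        level = int(math.log2(distance / FULL_DETAIL_DIST))
        return min(level, MAX_LEVEL)

    for d in (5.0, 20.0, 80.0, 1000.0):
        print(d, downsample_level(d))  # 0, 1, 3, then clamped to 4
    ```

    The same mapping could drive samples per pixel instead of resolution; the trade-off either way is cheaper distant lighting in exchange for possible popping when objects cross level boundaries.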
  • @Analoque444
    I've been fascinated by raytracing since "Imagine" on the Amiga. It's unbelievable how far raytracing has developed today. Back then, it took me four weeks to do one picture and it was a dream that one day it would happen in real time, although it's not perfect yet. Thank you for your videos.