Perfect Pairing: NVIDIA’s David Luebke on the Intersection of AI and Graphics
Released Sep. 11th, 2020

NVIDIA Research comprises more than 200 scientists around the world driving innovation across a range of industries. One of its central figures is David Luebke, who founded the team in 2006 and is now the company’s vice president of graphics research.

Luebke spoke with AI Podcast host Noah Kravitz about what he’s working on. He’s especially focused on the interaction between AI and graphics. Rather than viewing the two as conflicting endeavors, Luebke argues that AI and graphics go together “like peanut butter and jelly.”

NVIDIA Research proved that with StyleGAN2, the second iteration of the generative adversarial network StyleGAN. Trained on high-resolution images, StyleGAN2 takes a random numerical code, known as a latent vector, as input and produces realistic portraits.

The results are comparable to imagery generated for films, where a single frame can take weeks to create; the first version of StyleGAN needs just 24 milliseconds to produce an image.
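To make the idea concrete, here is a minimal, hypothetical PyTorch sketch of the generation flow described above: a random numerical input (a latent vector) is mapped to an intermediate style code, and a synthesis network turns that code into an image. The TinyMappingNet and TinySynthesisNet classes are simplified stand-ins for illustration only, not NVIDIA's actual StyleGAN2 implementation.

# Toy illustration of the StyleGAN-style generation flow: latent z -> style code w -> image.
# Simplified stand-in for illustration; not NVIDIA's StyleGAN2 code.
import torch
import torch.nn as nn

class TinyMappingNet(nn.Module):
    """Maps a random latent vector z to an intermediate style code w."""
    def __init__(self, z_dim=64, w_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, w_dim), nn.LeakyReLU(0.2),
            nn.Linear(w_dim, w_dim),
        )

    def forward(self, z):
        return self.net(z)

class TinySynthesisNet(nn.Module):
    """Turns a style code w into a small RGB image tensor."""
    def __init__(self, w_dim=64, img_size=16):
        super().__init__()
        self.img_size = img_size
        self.net = nn.Sequential(
            nn.Linear(w_dim, 3 * img_size * img_size),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, w):
        x = self.net(w)
        return x.view(-1, 3, self.img_size, self.img_size)

if __name__ == "__main__":
    mapping, synthesis = TinyMappingNet(), TinySynthesisNet()
    z = torch.randn(1, 64)      # the random "numerical input"
    with torch.no_grad():
        w = mapping(z)          # latent vector -> style code
        img = synthesis(w)      # style code -> image tensor
    print(img.shape)            # torch.Size([1, 3, 16, 16])

The real StyleGAN2 generator follows the same latent-to-style-to-image structure, but with much larger convolutional networks trained on high-resolution photographs.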

Luebke envisions the future of GANs as an even larger collaboration between AI and graphics. He predicts that GANs like StyleGAN will learn to produce the key elements of graphics: shapes, materials, illumination and even animation.

Key Points From This Episode:

  • AI is especially useful in graphics, where it can replace or augment components of the traditional computer graphics pipeline, from content creation to mesh generation to realistic character animation.
  • Luebke researches a range of topics, one of which is virtual and augmented reality. It was, in fact, what inspired him to pursue graphics research — learning about VR led him to switch majors from chemical engineering.
  • Displays are a major stumbling block in virtual and augmented reality, he says. He emphasizes that VR requires high frame rates, low latency and very high pixel density.

Tweetables:

“Artificial intelligence, deep neural networks — that is the future of computer graphics” — David Luebke [2:34]

“[AI], like a Renaissance artist, puzzled out the rules of perspective and rotation” — David Luebke [16:08]

You Might Also Like

NVIDIA Research’s Aaron Lefohn on What’s Next at Intersection of AI and Computer Graphics

Real-time graphics technology, namely, GPUs, sparked the modern AI boom. Now modern AI, driven by GPUs, is remaking graphics. This episode’s guest is Aaron Lefohn, senior director of real-time rendering research at NVIDIA. Aaron’s international team of scientists played a key role in founding the field of AI computer graphics.

GauGAN Rocket Man: Conceptual Artist Uses AI Tools for Sci-Fi Modeling

Ever wondered what it takes to produce the complex imagery in films like Star Wars or Transformers? Here to explain the magic is Colie Wertz, a conceptual artist and modeler who works on film, television and video games. Wertz discusses his specialty of hard modeling, in which he produces digital models of objects with hard surfaces like vehicles, robots and computers.

Cycle of DOOM Now Complete: Researchers Use AI to Generate New Levels for Seminal Video Game

DOOM, of course, is foundational to 3D gaming. 3D gaming, of course, is foundational to GPUs. GPUs, of course, are foundational to deep learning, which is, now, thanks to a team of Italian researchers, two of whom we’re bringing to you with this podcast, being used to make new levels for … DOOM.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.


Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.




-- Source: http://feedproxy.google.com/~r/nvidiablog/~3/7YDcsDNazjU/