NVIDIA’s New AI: Journey Into Virtual Reality!

  • Published on Dec 13, 2021
  • ❤️ Train a neural network and track your experiments with Weights & Biases here: wandb.me/paperintro
    📝 The paper "Physics-based Human Motion Estimation and Synthesis from Videos" is available here:
    nv-tlabs.github.io/physics-po...
    🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
    Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Martel, Gordon Child, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Klaus Busse, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Peter Edwards, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
    If you wish to appear here or pick up other perks, click here: www.patreon.com/TwoMinutePapers
    Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
    Wish to watch these videos in early access? Join us here: thexvid.com/channel/UCbfY...
    Meet and discuss your ideas with other Fellow Scholars on the Two Minute Papers Discord: discordapp.com/invite/hbcTJu2
    Károly Zsolnai-Fehér's links:
    Instagram: twominutepa...
    Twitter: twominutepapers
    Web: cg.tuwien.ac.at/~zsolnai/

Comments • 578

  • Mo's Tech Room

    Your reactions make me happy

  • Blender Guru

    Whoa. Putting the VR use aside, this pose estimation looks good enough for most mid-budget animations. Can't wait till we can just act out a scene in front of a camera and effortlessly port it to a rigged character.

  • Emanuel
    Emanuel  +415

    These new papers are getting more and more insane. Technology is moving so fast I can't even follow it anymore, and that's amazing.

  • Harper
    Harper  +324

    I cannot help but notice how the final virtual animation does not coincide perfectly with the input.

  • santino66
    santino66  +113

    I really don't want to pre-emptively call VR "the Metaverse" and hand that nebulous technology over to a monopoly just yet. We can't just assume that such a morally bankrupt company would have any integrity on the matter, and they shouldn't be trusted as the proprietor and administrator of our new synthetic realm. The tech in this looks amazing, however! Always rocking it, Two Minute Papers!!

  • RDB
    RDB  +121

    This seems awesome for the future of VR. We are so behind with just head and hand tracking. Maybe this could aid in moving around realistically without needing a whole motion capture studio.

  • Samuel Snowden

    This truly is one of the most impressive tech developments I've seen recently. Great paper, great delivery as always.

  • Exilum
    Exilum  +64

    I'm looking forward to seeing whether Nvidia has an interest in turning this into one of their tools that uses tensor cores. It would be interesting to use this type of high-level motion tracking with Nvidia cards, and I'm pretty sure no game developer would be against trying to make a VR game with this type of controls. It's an opportunity to take the lead in a potential new part of the VR market.

  • Matthieu Simard

    I'm strongly against META's metaverse but I'm really excited about bringing accurate animations to 3D characters with this technique!

  • Edward Mitchell

    Dear Two Minute Papers, would it be possible to summarize how progress in various domains has been underpinned by combining simulations (physics, Monte Carlo, etc.) with DNNs, or by combining DNNs with one another (GANs, cognitive systems)?

  • fudgesauce

    Another use for this is in frame interpolation. When synthesizing "tween" frames, having a physical understanding of people allows the image interpolation to account for occlusions between legs and between arms and body, and to generate a physically plausible interpolation. Many image-based systems have bad artifacts in such situations.
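
The idea in the comment above is easy to see with a toy example: once you have an estimated skeleton for the frames on either side of a gap, you can interpolate the pose itself instead of blending pixels, which keeps limbs coherent where one occludes another. A minimal sketch in Python, with hypothetical joint arrays standing in for real pose-estimator output:

```python
import numpy as np

def interpolate_pose(joints_a, joints_b, t):
    """Linearly interpolate two skeleton poses.

    joints_a, joints_b: (J, 3) arrays of 3D joint positions for the
    frames before and after the gap; t in [0, 1] is the tween position.
    Interpolating the pose (rather than blending pixels) avoids the
    ghosting that appears where an arm passes in front of the body.
    """
    return (1.0 - t) * joints_a + t * joints_b

# Hypothetical usage: three in-between skeletons for a 4x interpolation
# between two estimated poses (random arrays stand in for real estimates).
pose_prev = np.random.rand(17, 3)
pose_next = np.random.rand(17, 3)
tweens = [interpolate_pose(pose_prev, pose_next, t) for t in (0.25, 0.5, 0.75)]
```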

  • arsnakeheart

    This is actually amazing

  • D
    D  +2

    This genuinely reminds me of when Dave Perry from Shiny Software figured out not to bother rendering hidden objects. It's so simple and elegant. Now, if only I could get my hands on a reasonably priced 3070.

  • ShockForce612

    I imagine pose estimation of this quality will have a huge impact for independent video game developers and animators! Now instead of renting an expensive mocap suite a dev could just take a quick video in their driveway of the animations they need, and convert them to usable animation data via this NN.

  • ProjectPhysX

    This is actually very cool. Using physics constraints to fill in missing data. I tried tracking once with multiple Kinects from different angles, and it turned out to be rather tedious. A lot is gained from a simpler hardware setup and doing the rest in software.

  • Luckytime

    Great work! I love these little summaries because I would never know this existed without them.

  • patmw
    patmw  +4

    This is incredible. Does it run in real time?

  • Rafael Pozzi

    Kinect is finally evolving! When I played Kinect as a kid, the movements were horrible: it didn't detect anything right and was full of false positives. It had several 3D sensors, yet nowadays Tesla only needs a camera, just like humans. It's amazing how things are progressing. I hope one day I can play Kinect with perfect motion tracking that only needs a webcam, if the TV doesn't already come with a camera that could be used for that and for controlling the TV itself. Goodbye remote control!! Goodbye useless infrared sensors!!

  • yourordinaryme

    The flickering is actually a problem that is trivial to solve using algorithmic approaches as opposed to neural nets, since you can use a state machine to represent the objects in 3D space instead of having to "re-detect" them on every frame.
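
A minimal sketch of the kind of stateful tracking described above: keep a persistent estimate per joint and blend new detections into it, so missing or low-confidence detections don't make the result flicker. The class, array shapes, and smoothing factor are illustrative assumptions, not anything from the paper:

```python
import numpy as np

class JointTracker:
    """Keep a persistent 3D state per joint and blend in new detections.

    A simple exponential-moving-average filter: instead of trusting each
    frame's detection independently (which flickers), low-confidence or
    missing detections fall back to the carried-over state.
    """

    def __init__(self, num_joints, alpha=0.6):
        self.alpha = alpha                      # weight given to a new detection
        self.state = np.zeros((num_joints, 3))  # persistent 3D position per joint
        self.initialized = False

    def update(self, detection, confidence):
        """detection: (J, 3) joint positions; confidence: (J,) values in [0, 1]."""
        if not self.initialized:
            self.state = detection.copy()
            self.initialized = True
            return self.state
        # Blend per joint: trust the detection in proportion to its confidence.
        w = (self.alpha * confidence)[:, None]
        self.state = (1.0 - w) * self.state + w * detection
        return self.state
```

The same idea generalizes to a Kalman filter, or to the physics-based correction the video discusses, but even this much removes most per-frame jitter.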

  • Mucheru Njaga

    Kudos to you!! It cannot be overstated that the content and work you and your team are producing is a gift to humanity. Please keep sharing your milk of human kindness with the world.