From Mesh To Yarn... In Real Time! 🧶

  • Published on Dec 10, 2021
  • ❤️ Check out the Gradient Dissent podcast by Weights & Biases: wandb.me/gd
    📝 The paper "Mechanics-Aware Deformation of Yarn Pattern Geometry" is available here:
    visualcomputing.ist.ac.at/pub...
    🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
    Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Martel, Gordon Child, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Klaus Busse, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Peter Edwards, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
    If you wish to appear here or pick up other perks, click here: www.patreon.com/TwoMinutePapers
    Wish to watch these videos in early access? Join us here: thexvid.com/channel/UCbfY...
    Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
    Meet and discuss your ideas with other Fellow Scholars on the Two Minute Papers Discord: discordapp.com/invite/hbcTJu2
    Károly Zsolnai-Fehér's links:
    Instagram: twominutepa...
    Twitter: twominutepapers
    Web: cg.tuwien.ac.at/~zsolnai/
  • Science & Technology

Comments • 192

  • metacob
    metacob 5 months ago +279

    ACM Transactions on Graphics? What's that?
    SIGGRAPH? Who cares?
    Two Minute Papers? Better make some TMP-themed demos, this is serious!

  • Crabby Boi
    Crabby Boi 5 months ago +599

    2030: Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. Today we have a full universe simulation at subatomic detail! What a time to be alive! But hold on to your papers, because unlike the method we covered last year, this new method runs in real time.

    • playbyan1453
      playbyan1453 5 months ago

      @glibg10b yes but it can render properly

    • The GAME is here!
      The GAME is here! 5 months ago

      This comment is gold.
      thexvid.com/video/ydTs5ClSE-s/video.html

    • Saka Mulia
      Saka Mulia 5 months ago +6

      2033: Dear Fellow Scholars, this is Two Minute Papers with Dr. Károly Zsolnai-Fehér. Today we have an update to the full universe simulation at subatomic detail. It looks like the simulated people are starting their own SIGGRAPH! What a time to be alive!

    • NAMAN KANSAL
      NAMAN KANSAL 5 months ago +2

      You should have included - "What a time to be alive" 😂

    • Chomusuke
      Chomusuke 5 months ago +3

      @Rylac Zero it simulates faster than reality, allowing us to see into the future

  • EnDaBeeL
    EnDaBeeL 5 months ago +188

    When he said there would be orders of magnitude of improvement a few papers down the line, he wasn't lying.

  • Wango
    Wango 5 months ago +47

    This has literally become my favourite youtube channel
    Gotta hold those papers

  • 10th letter
    10th letter 5 months ago +214

    Winter clothing in a game is interesting. Imagine like a scarf fluttering around

    • The GAME is here!
      The GAME is here! 5 months ago

      2 million polygons on screen for clothes is not gonna happen anytime soon in any game. So maybe the next Crysis will push graphics cards forward again.
      thexvid.com/video/ydTs5ClSE-s/video.html

    • Vít Pokorný
      Vít Pokorný 5 months ago +1

      @Creepin Bro I mean true, but look at ray tracing for example: it's been a thing for quite a while and people try to push it into games, but at the end of the day, it's only an option on high-end devices, and even then the difference it makes isn't really worth it.

    • Creepin Bro
      Creepin Bro 5 months ago +3

      @Vít Pokorný Well that's the point of these papers: as each one grows upon the last, the methods become more efficient. They could already do simulations like this; the point of this paper and the ones that build upon it is making it efficient so that you _can_ run it in a practical manner.

    • Vít Pokorný
      Vít Pokorný 5 months ago +2

      Can't wait to not be able to run the game.

    • Quinnlan Kuhl
      Quinnlan Kuhl 5 months ago +3

      thank you for that insight. Your ambition is limitless

  • Felixxenon
    Felixxenon 5 months ago +69

    I remember when Lalf life 2 came out, and I was blown away by a physics based crate being pushed over. Makes me think how I will look back at this when all simulations are equal to real life, and we still have graphical power to spare.

    • Benvel
      Benvel 5 months ago +1

      Lortal 2 and Leam Fortness 2

    • Dead Pianist
      Dead Pianist 5 months ago

      holy lafl life

    • Kenny Hill
      Kenny Hill 5 months ago

      Mind you, when they say real time with a GPU, they mean a relatively modern one.
      If you could even run this sim on an HL2-era GPU, it would perform like ass.
      Though a huge amount of perf/watt is being left on the table at the moment due to stretching single-die GPUs to the max of the perf/power curve, before the power draw skyrockets.
      Once MCM/chiplet GPUs are introduced, they will dial back the clock frequency/voltage per die and we will see a stupendous increase in perf/watt as they scale the number of dies.
      Unfortunately this also means more overhead, so I/O may eat into some of that perf/watt to get chiplet GPUs past a certain point.

    • Felixxenon
      Felixxenon 5 months ago +5

      Whoops! typo...Too good to edit😅

    • Johnrex Bernal
      Johnrex Bernal 5 months ago +8

      Laf life 3

  • GrapesAndSand
    GrapesAndSand 5 months ago +21

    This really looks like a dream! I love the return of those tiny holes and bumps, and real time simulation is insane. The only problems I see now are even smaller details like the holes at the seams of the sock in 2:50, the edge yarn seemingly being held by nothing, the light simulation mentioned at 4:04, and I do fear yarn overlapping or snapping at 0:22. There's no doubt those will be fixed 2 papers down the line though, if the huge improvement seen in this paper is anything to go by.

    • Vocassen
      Vocassen 5 months ago

      Another "problem" is that the simulation is a standard mesh based simulation and the yarns are just visual, which results in the harsh transitions at 2:15. So it's absolutely great for realtime stuff, at the cost of an actual yarn simulation, so not entirely fair to compare to full yarn simulations.

  • Austin Jacks
    Austin Jacks 5 months ago +4

    TMP showing a new method: "Look how beautiful this is!"
    TMP after showing a better method: "Look at how bad this old method was"

  • London England
    London England 5 months ago +7

    Thank you for giving these amazing scientists the popularity they deserve! This is like beating Usain Bolt's record by a wide margin!

  • WhiteSythle
    WhiteSythle 5 months ago +13

    I'm glad you covered what appears to be a woefully underappreciated paper! It's fascinating stuff and makes me wonder what we'll see in the future!

    • Kenny Hill
      Kenny Hill 5 months ago

      Depends what you mean by underappreciated.
      I can guarantee you that the people making 3D DCC software appreciate it a lot.

  • Bryce
    Bryce 5 months ago +6

    Oh my! I honestly can't wait to see what people do with this real-time yarn... I mean, when you're in the fractions of a millisecond, you have a lot of flexibility to work with!! Awesome paper!

  • NIP TV
    NIP TV 5 months ago +2

    It boggles my mind how game-changing it could turn out to be if someone implemented all these amazing techniques in an actual game product. Character animations have always avoided stuff like putting clothes on and off in games, due to various technical issues making realistic and robust cloth simulation unfeasible on typical hardware. Looks like this is about to change in the following years.

  • 2openhere
    2openhere 5 months ago +1

    I was watching The Matrix Awakens tech demo yesterday, and kept on thinking of Two Minute Papers all the way through, and wondering how many of these papers are in that demo and what papers are still to come. What a time to be alive!

  • 14zRobot
    14zRobot 5 months ago +2

    Interesting. I wonder why the motion looks so fluid. In the sock part of the video, when it is being put on, the flying parts look like water, not yarn.

  • epicthief
    epicthief 5 months ago +1

    When the simulation is so good you need a new lighting algorithm, love this

  • David Koch
    David Koch 5 months ago +4

    "Views are not everything, not even cloth" : Well done, very well done...

  • Duane
    Duane 5 months ago

    All of these 2 minute papers are fascinating on their own. Would love to see combinations of these (like 4+) put together into several example videos!

  • Markosz
    Markosz 5 months ago +1

    From days of computing to only hours of computing (with compromises) and now to REAL TIME computing with pretty accurate simulation. Incredible progress!

  • TheDofflin
    TheDofflin 5 months ago

    Can't get enough of this stuff. Absolutely love following computer graphics research, and there's something especially satisfying about sub-millisecond processing times.

  • Tonda
    Tonda 5 months ago +1

    I'm doing a PhD in computer graphics. I'm working on the same topics as Georg. I hope one day my contribution is going to be on 2min papers.

  • Cy Starkman
    Cy Starkman 5 months ago

    thank you for making some of the most exciting research accessible

  • juliandarley
    juliandarley 5 months ago +10

    Very nice. I wonder if the algorithm can be applied to something made in Marvelous Designer? Also, the authors mention that they get these impressive results on just a GTX 1080 Ti (and an i7-7820X) - this is probably less than many gamers have in their desktop machines (this is a guess - I do not have scientific proof!).

    • juliandarley
      juliandarley 5 months ago

      @Render Wire many thanks for the info. CPUs at least seem to be easier to get hold of than GPUs.

    • Render Wire
      Render Wire 5 months ago +1

      @juliandarley Newer 6-core i5 and Ryzen 5 CPUs are a little bit faster than the 8-core 7820X. The i5-10400 and R5 3600 are close to equivalent, and the i5-12400 and R5 5600 can be up to 20% faster depending on the task.
      The 7820X was an 8-core workstation CPU, so it still holds up pretty well.

    • Tsz Fung Li
      Tsz Fung Li 5 months ago

      @I'm the captain now If the chip shortage goes away by the time this method is actually implemented in games, it should not be a huge problem. And devs are not putting such highly detailed clothing into games any time soon.

    • juliandarley
      juliandarley 5 months ago

      @I'm the captain now At least 3060s are available now. Suggestions for something better than the 7820X (but still at a very reasonable price - e.g. a secondhand 7820X can be less than £150) would be welcomed. Perhaps AMD?

    • I'm the captain now
      I'm the captain now 5 months ago

      A 1080 Ti and a 7820X are still very much top of the line. To slightly beat a 1080 Ti you'd need an RTX 3060 from 2021.

  • bubbleg
    bubbleg 5 months ago

    All these amazing physics and games somehow still utilize old single-core physics :D

  • delpinsky
    delpinsky 5 months ago

    Spectacular! Clothing simulation is one of the most difficult tasks when it comes to computer graphics. Let's imagine a UE5 or similar graphics engine running a game with such details!
    It's the same for hair movements.

  • Zavier Miller
    Zavier Miller 5 months ago +3

    Wow, in one paper the cloth simulation surpassed the speed of the light simulation. That's insane.

  • MrVipitis
    MrVipitis 2 months ago +1

    That's a beautiful progression, and not just two papers down the line, but more like a year of research and development.

  • B D
    B D 5 months ago +1

    I am a textile designer with an interest (and a bit of a background) in robotics and computation, so this is right up my alley. A designer named Hanifa presented a 3D runway show last year, and I could see this hyperrealism driving innovation in that area. Look up her show, it is incredible!

  • R B
    R B 5 months ago +1

    Would love to hear one sentence in these great presentations explaining the new techniques. Just one sentence please. Not easy, but you can do it.

    • Archival Copy
      Archival Copy 5 months ago

      That could be a good idea. 1:39 gives a sentence explaining the change, but does require info from two sentences ago, and doesn't explicitly state the simulation uses both CPU and GPU to cut down on overall rendering time (which is said at the end of video).

  • The Weeb
    The Weeb 5 months ago +2

    May I request a paper?
    It should be duct tape! Yes, realistic duct tape that sticks to physics objects, isn't that awesome?

  • funny fella
    funny fella 5 months ago

    Now imagine this combined with a realistic tearing simulation! 😱

    • Mihoshika Furude
      Mihoshika Furude 5 months ago

      So basically, realistic ripping of clothes off big breasted ladies?

  • Daenoril
    Daenoril 5 months ago

    What great progress!

  • Raenin
    Raenin 5 months ago +6

    Humanity is slowly working towards the tech needed for Deep Dive. I love it.

    • Martiddy - Sama
      Martiddy - Sama 5 months ago

      I would say that within 20 years Sword Art Online will be a reality.

    • EduSSantoz
      EduSSantoz 5 months ago +4

      @Ariahhhh Just imagine this progress in simulation, and Neuralink-like BCIs (brain-computer interfaces) that in the future could feed audio, video, touch, smell, etc. directly into our brains being implemented; it could all converge in a way for us to get into a "matrix", or some Black Mirror episodes, where people can "live" in a virtual world in their brain without needing screens.

    • Ariahhhh
      Ariahhhh 5 months ago

      what do you mean by deep dive? VR?

  • D L
    D L 5 months ago

    Without a doubt, my favorite TheXvid follow. What a time to be alive!

  • Om3ga Let's Play
    Om3ga Let's Play 5 months ago

    What I'm scared of is that some day a solar flare will wipe out all of these amazing computations. Glad that the "paper" is literally available on real paper too.... I hope.

  • Sky Anthro
    Sky Anthro 5 months ago +11

    I would love a Two Minute Papers blanket lol ^-^

    • James Hudson
      James Hudson 5 months ago

      I want the small knitted animal.

    • YOEL _44
      YOEL _44 5 months ago +3

      Sign me up for a mug!!!!

  • Tony C.
    Tony C. 5 months ago

    What an amazing breakthrough (ripthrough?)!!

  • Hello my name is Nobody

    Wow, i really wanna see this in future games.

  • Tramonta
    Tramonta 5 months ago

    Just a quick question: when they give the time needed to compute each frame, that's hardware-dependent, right? Which hardware are they using?

    • Jorge C. M.
      Jorge C. M. 5 months ago

      Another comment said it was a GTX 1080 Ti and a 7th-gen Intel CPU, or something like that.

  • ariel sandoval
    ariel sandoval 5 months ago +14

    Wish game developers had a team to do this, like "GTA V now runs on the integrated graphics card" or something.

    • Jorge C. M.
      Jorge C. M. 5 months ago

      @YOEL _44 True, at least for now, it may never do but at least parts of the games could use it with a fast SSD

    • YOEL _44
      YOEL _44 5 months ago

      @Jorge C. M. I don't think it works with soft-body physics

    • Jorge C. M.
      Jorge C. M. 5 months ago

      Nanite from UE5 and similar techniques make gpu usage lower

    • YOEL _44
      YOEL _44 5 months ago +1

      You can already run GTA V on iGPUs.
      Simulations can do trickery, approximation, or even be skipped completely in favour of an AI interpretation, but games need some level of consistency and some adjustable "rules" to be fair towards players, and graphics just need to come from somewhere. Nowadays you have DLSS, but that doesn't replace classic rendering methods; it just increases the high-frequency details of low-res images to fool our eyes.

  • Beregorn88
    Beregorn88 5 months ago +7

    I can't help but notice the conspicuous lack of comparison with the yarn-based simulation...

    • Markosz
      Markosz 5 months ago

      I'm sure the results aren't exactly like the simulation that takes days to compute, but seeing how it runs in real time and offers better results than the previous method, which didn't have any yarn simulation and still ran for hours, I'd say it's pretty impressive; it's not something to scoff at.

  • Onihikage
    Onihikage 5 months ago

    Hopefully the next paper will refine the friction exerted by these yarn-based meshes. That Two Minute Papers blanket shouldn't have been slipping off to one side the way it was!

  • Noah Boddy
    Noah Boddy 5 months ago

    I was impressed with the model's martial arts kicks, then you give us yarn. Unbelievable.

  • kiwirocket64
    kiwirocket64 5 months ago

    Two Minute Papers, how would I do all these simulations? Can I use Blender or would I need a different program? Can I use Windows or would I need to use Linux? I really like your interesting content, but I would really like an explanation of how to do these things.

  • The Black Baron
    The Black Baron 5 months ago

    Amazing. This is going to make CG movies even better. I don't think any games are going to use this intensive rendering of realistic mesh/cloth anytime soon.

  • Tortol Gawd
    Tortol Gawd 5 months ago +1

    want to see the implications of this on new movies!

    • nunya business
      nunya business 5 months ago +1

      Am I the only one who finds the big loose sweaters on the mannequin robot things around 1:50 super cute?

    • mr pwha
      mr pwha 5 months ago +1

      I want it on games

    • mittamoa
      mittamoa 5 months ago +1

      You won't (wouldn't?) see it ;)

  • alireza vahedi
    alireza vahedi 5 months ago +3

    Could this be integrated alongside fire simulations?

    • YOEL _44
      YOEL _44 5 months ago +3

      @Michał Bielawski I think he wants to see the world burn

    • Michał Bielawski
      Michał Bielawski 5 months ago +2

      Do you want to simulate a yarn blanket, fire crackling in the fireplace and a mug of hot chocolate?

  • Wilkens Brito
    Wilkens Brito 5 months ago

    This is Fing amazing!

  • shltr
    shltr 5 months ago +1

    How much time do I have to wait before seeing this in actual standard 3D programs like Blender, Maya or even Cinema 4D?

  • Blackout Gaming
    Blackout Gaming 5 months ago

    But when would we see this in Blender? Ever? Or would it be like a paid plugin? I don't even know what happens in the lifetime of these amazing works. Can someone explain??

  • Nathan ObJective
    Nathan ObJective 5 months ago

    Unreal Engine would snap up this invention in no time.

  • Zerg Radio
    Zerg Radio 5 months ago +2

    Now we can smother someone with perfection :)

  • SERVANT TO FRIEND
    SERVANT TO FRIEND 4 months ago

    @TwoMinutePapers Could you please tell us what these programs are? Have watched 4 of your videos in a row... Loved them all. Loved the displayed technology. Don't know what the hell they are. Was able to figure out Nvidia Canvas, because I could see it in the corner of your screen, but... Why not just say this? PLEASE... If you are going to show us cool tech, tell us what it is, who makes it, and where we can use/try it. Thank you.

  • Smiley P
    Smiley P 5 months ago +1

    Holy holding onto my papers. Woooow. And just imagine 2 papers down the line??

  • Rendra Rukmono
    Rendra Rukmono 5 months ago

    I guess it's better to put the link to the devs' video in the description, to show our respect to them.

  • Aljon
    Aljon 5 months ago +1

    i always wait for those motion capped scenes lol

  • CamberGreber
    CamberGreber 5 months ago

    From 40 hours to 33 ms. Holy shit, very impressive.

  • Amy Dentata
    Amy Dentata 5 months ago

    Looking forward to Yarn Fighter 3000

  • drdca
    drdca 5 months ago

    I misread the title, thought it said to real yarn, thought this was going to involve a process for computer design for yarn/cloth objects

  • WestOfEarth
    WestOfEarth 5 months ago

    This is the sort of detail and rendering speed Hollywood gfx requires.

  • Joshua
    Joshua 5 months ago

    So when does this technique get integrated into something like Blender?

  • CodingStuff
    CodingStuff 5 months ago

    What a time to be alive!!!

  • NosferatuOhneZahn
    NosferatuOhneZahn 5 months ago

    I'm a real noob in this field, can someone explain where I "could" (I couldn't, but I'm just curious) do this? Is this in Blender or is this in their own custom-made engine?

    • NosferatuOhneZahn
      NosferatuOhneZahn 5 months ago

      @TheNeonBop Thank you

    • TheNeonBop
      TheNeonBop 5 months ago +1

      I think you would probably have to spend a while trying to modify their code to get it to work in Blender or a modern game engine. I'm not an expert either, though, so I am not certain.
      Edit: it says "Our method can add yarn-level details onto any deforming triangle mesh: examples in this paper use deforming cloth meshes from ArcSim [Narain et al. 2013, 2012], position-based dynamics [Müller et al. 2007], and Blender [2020]." It sounds like you can export a basic cloth sim from Blender and use their code to give it yarn details (see the export sketch below).
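
      If you want to try that workflow, a plausible first step (just a sketch, assuming a recent Blender build and that the authors' code can read an Alembic mesh sequence) is exporting the baked cloth simulation from Blender's Python console:

      import bpy

      scene = bpy.context.scene

      # Assumes the cloth object is selected and its simulation is already baked.
      # Export the deforming triangle mesh as an Alembic cache; the yarn-detail
      # code would then be run on this exported mesh sequence.
      bpy.ops.wm.alembic_export(
          filepath="/tmp/cloth_sim.abc",
          start=scene.frame_start,
          end=scene.frame_end,
          selected=True,       # export only the selected cloth mesh
          apply_subdiv=True,   # bake subdivision into the exported geometry
      )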

  • killedamilx
    killedamilx 5 months ago

    Amazing!

  • SomeAssemblyRequired
    SomeAssemblyRequired 5 months ago

    Amazing!

  • Russell's Shorts
    Russell's Shorts 5 months ago

    damn, I was really hoping for some hot new two minute papers merch

  • dempsej
    dempsej 5 months ago

    Marry this to Unreal Engine 5, quick!

  • channel
    channel 5 months ago

    What a time to be alive!

  • zetathix
    zetathix 5 months ago

    Now modeling those clothes would be harder than simulating them.

  • Carlosgab Ellazar
    Carlosgab Ellazar 5 months ago

    I'm just a kid, but I enjoy watching these AI and simulation videos.

  • no name
    no name 5 months ago

    The only method of information retention that works for me any more is the swole armadillo

  • Vaeldarg
    Vaeldarg 5 months ago +1

    @Two Minute Papers That paper SIGGRAPH presentation video may have only had 61 views, but one of them was yours. Now it can be seen by your over 1M subscribers. So in this case it was quality over quantity, of views.

  • Adrian K
    Adrian K 5 months ago

    Mind-blowing 🤯🤯

  • Mark Zaikov
    Mark Zaikov 5 months ago

    From Hours to Milliseconds in just a year!?

  • matty_fat_batcat
    matty_fat_batcat 5 months ago

    That was a good yarn

  • mpauls85
    mpauls85 5 months ago +1

    The original presentation still only has 88 views... C'mon guys!

  • Carl Holcomb
    Carl Holcomb 5 months ago

    Someone send this method to Epic Games. The world could benefit from this in Unreal Engine 6

  • Mitcham Tuell
    Mitcham Tuell 5 months ago

    I would buy Two Minute Papers merch

  • Plasmabeam431
    Plasmabeam431 5 months ago

    So is this a continuum mechanics problem? And are they solving this using the finite element method?

  • Debajyoti Majumder
    Debajyoti Majumder 5 months ago +1

    Can't wait for it to come to Blender!!! @Károly, can you tell them to implement this in Blender? People will go crazy over it!

  • GRANDOS
    GRANDOS 5 months ago

    awesome!

  • asdawece
    asdawece Month ago

    I hope this will be implemented in Red Dead Redemption 3.

  • La Logic 2
    La Logic 2 5 months ago

    Imagine running minecraft on the matrix's computers

  • Novruz Gurbanov
    Novruz Gurbanov 5 months ago

    And you guys still think we are not in a simulation?

  • generrosity
    generrosity 5 months ago

    61 views, soon to have 61k views here - you do great work 👌💚

  • Kenneth
    Kenneth 5 months ago

    scary fast progression since your last cloth video...

  • Heinrich Wonders
    Heinrich Wonders 5 months ago

    A-MA-ZING!

  • Rienk Kroese
    Rienk Kroese 5 months ago

    Plays games at 60 fps.
    Put cloth on Ultra.
    Plays games at 1 fps

  • Aniket Adhav
    Aniket Adhav 5 months ago

    Hello guys, I want to learn how to make these cool simulations. Can you tell me how do I get started??

  • krunkle
    krunkle 5 months ago +1

    2:38 Already realizing the nefarious uses of this algorithm I see.

  • YOEL _44
    YOEL _44 5 months ago +1

    2:55 Asking that while having the numbers on screen, LOL

  • The Rock Procrastinator

    As you know, we can't buy affordable graphics cards, so I'm still waiting for an AI prodigy to make my old R7 265 graphics card run the games of today... of course AI should do that easily... or maybe not ($$$)!
    AI will never be able to stop fraud & scams!

  • Noah Zuckman
    Noah Zuckman 5 months ago

    Y'all better get ready for Knitting Simulator 2025

  • Doctor Nemmo
    Doctor Nemmo 5 months ago

    Holy geometry, Batman !

  • hooDio
    hooDio 5 months ago

    It might as well be a merch ad, wink wink.

  • Morgan Freeman
    Morgan Freeman 5 months ago +1

    What if, I’ve never held a paper…

    • Morgan Freeman
      Morgan Freeman 5 months ago

      @Evskee Skee this guy's good 😅

    • YOEL _44
      YOEL _44 5 months ago +2

      At this point you'd be over two papers up the line

    • Evskee Skee
      Evskee Skee 5 months ago +2

      Then it wouldn't be a time to be alive!

  • TheLazy0ne
    TheLazy0ne  5 months ago

    Adult games will benefit from this too.😏

  • Chestnut
    Chestnut 5 months ago

    2:38 My sock too moves like a sea slug having a seizure when put on.

  • mittamoa
    mittamoa 5 months ago

    Crazy!

  • Hugh
    Hugh 5 months ago

    Wow!

  • M Map
    M Map 4 months ago

    That looks like a very uncomfortable sock