Finally, This Table Cloth Pull is Now Possible! 🍽
- Published on Dec 6, 2021
- ❤️Check out Perceptilabs and sign up for a free demo here: www.perceptilabs.com/papers
📝 The paper "Codimensional Incremental Potential Contact (C-IPC)" is available here:
ipc-sim.github.io/C-IPC/
Erratum:
The cover page in the first frame of the video is from a previous paper. It should be pointing to this: ipc-sim.github.io/C-IPC/file/paper.pdf
❤️Watch these videos in early access on our Patreon page or join us here on TheXvid:
- www.patreon.com/TwoMinutePapers
- thexvid.com/channel/UCbfY...
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Haro, Andrew Melnychuk, Angelos Evripiotis, Benji Rabhan, Bryan Learn, Christian Ahlin, Eric Martel, Gordon Child, Ivo Galic, Jace O'Brien, Javier Bustamante, John Le, Jonas, Kenneth Davis, Klaus Busse, Lorin Atzberger, Lukas Biewald, Matthew Allen Fisher, Mark Oates, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: www.patreon.com/TwoMinutePapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Meet and discuss your ideas with other Fellow Scholars on the Two Minute Papers Discord: discordapp.com/invite/hbcTJu2
Károly Zsolnai-Fehér's links:
Instagram: twominutepa...
Twitter: twominutepapers
Web: cg.tuwien.ac.at/~zsolnai/
Ok, I have watched this channel for a long time and got my mind blown several times (holding on to a cubic meter of papers meanwhile).
THAT is crazy and useful beyond belief. How is this even possible? Something that thin? How long does it take to calculate? I'm baffled right now. Investigating...
I remember in approximately 2007 when raytracing was in a paper and they had a photo of a realistic rendering of a room. I read it and said to myself: this is the future. Back then it took hours or more to make a single image, and with RTX cards it became a thing. I conclude we will have high-grade simulation for everyone soon, like we can do graphics today with Unreal or Blender, with SDKs for programmers.
@Jib Cot On a CPU, not on a GPU. Also, tech-wise we're still in the early days.
He does say it is taking minutes per frame to calculate; the noodles and sand on the sheet look amazing but currently take around 10 minutes per time step.
@T4CO G4M3 And that's exactly the reason why the probability that our reality is not simulated is low. Just a project in the bag of some student from a reality with a different timescale and extent. :)
i love how technology is essentially progressing enough to be capable of creating a new layer of reality. Physics, characteristics, and interactions of objects in a virtual world have progressed so far that at a certain point in the future, say with games (more specifically, a shooter game), we won't need programmers to program "damage", "recoil" or "fire rate" - just give the simulation the characteristics of the weapon and the ammunition within it, etc., and let it do its thing.
It's insane how realistic this is. This is mind-blowing. Thank you for sharing all of the data & beautiful visuals as always! 🥰
In the early 80's I remember repeatedly reading a picturesque book I loved describing the future of gaming, one of the predictions being detailed full-color LCDs. These videos fill me with the same kind of amazement. 😎
This has to be up there in the like top five most impressive papers covered by this channel. This is *really* impressive.
You are very kind. I loved it too - thank you so much! 🙏
The whole thing is mind-blowing, but the twisting cloth actually made me say "wow" out loud.
Same
isn't this like really huge? I've always heard of clipping problems from others and this paper literally fixes that!!! Can't wait until this utilizes the GPU instead of the CPU 😁
Imagine this quality and RTX and haptic gloves in VR!!! My God I want Matrix to be real before I die.
How do you know it’s not real already? 😳🤯
@Panzer Of The Lake Give it a try. It's weird that the lead role is Keanu Reeves :)
@Alper Johnny Mnemonic? Never heard of it
@Alper i reckon in 30 years, life is gonna be completely different. very exciting indeed. what a time to be alive :)
@Panzer Of The Lake I'm 29. I was very little when I saw the movie Johnny Mnemonic. From that day on I wanted to be inside a computer program that I can feel. it's very close now and I'm very excited.
that twisting cloth, that was damn damn impressive
If not for real time this could be amazing for any kind of prebaked cloth simulations, be it video game cutscenes, fashion, etc.
Finally, a technique that can solve fabric squishing. Humanity is now one step closer to solving the wet cloth simulation problem!
Some say Two Minute Papers is actually an A.I. programmed to make awesome videos! What a time to be alive!
🤖
I'm no scholar, but I do really enjoy watching these videos on how, in different ways, computations and simulations are being improved upon, or made more efficient, in order to more closely resemble reality.
This video reminds me that in addition to "Like", TheXvid needs an "INSANE" button. This is incredible stuff.
Kinda wish for a series that takes the papers from previous videos and shows where they're at now in commercial software or whatever. I mean, I'd love to see this as the cloth sim in Blender for example, but I have literally no idea how realistic of an idea that is. Do these things ever make it to the general public or do they just stay within the research world and go nowhere? Always been really curious about this.
#gamedev imagine seeing this in an actual game 🙂 this would be awesome and open up a whole bunch of new possibilities for any genre 👍🏻👌🏼
I love these new animation techniques; my only concern is that it doesn't seem to take any kind of friction into consideration, and that some of the materials used also seem to be way more elastic and/or malleable than any real materials.
Damn, that's impressive. One "real time in next gen gaming engines" please!
Just WOW! , this is the most realistic thing I've seen on your channel, thanks for these information you're providing !!
As someone who's simulated cloth over the last 10 years, this is mind blowing. Seriously, what a time to be alive ;)
hey Károly, i've been performing card magic for nearly 2 decades, so i just wanted to let you know the way the cards were shuffled was all wrong: the bend in the packets should be reversed to first interlace the cards, then they should be bent upwards with pressure applied from above and below until you're ready for the bridge, where you remove the bottom support and slightly widen the distance between the two bottom edges so they interweave all the way.
also, given the way they're bent, i wouldn't expect them to fall straight down like they do in the simulation; they should be springing forward more as they fall
Sometimes it's hard to tell exactly how impactful a paper is going to be, but I think this one is pretty clearly up there, considering the assertion isn't "fewer" collisions, but "no" collisions. No collisions seems like the wet dream of every 3D graphics designer who has ever lived, and no collisions in a reasonable render time is kind of beyond belief. I'd love to know what the limits are on that, for example if it's a usable technique if the cloth is rendered as threads instead of as a whole object, and also how quickly the render time scales with the number of individual objects.
The last one was a massive flex where they threw in everything at once
I like two Minute papers.
I can't wait to see when something similar will be implemented into video games
No more hair going through clothes
Wow! I didn't squeeze my papers hard enough. This is incredible.
I'm just happy to be alive in this day and time. just to see this kind of simulation is jaw dropping in and of itself
4:41 For the sphere problem they cheated: they didn't compute the whole sphere-to-sphere interaction, but only a "bulk" behaviour, filling the volume with a "sphere texture". They didn't even use the proper hexagonal close-packed configuration, but instead the much easier (and much more unrealistic) simple cubic: you can see the spheres perfectly stacked one on top of the other...
@Beregorn88 you're mistaken. They're in hexagonal close packing, but with many irregularities because they're still in motion and not on a flat surface. It just looks like face-centred cubic packing because of the camera angle.
Those patterns form all the time in particle-based simulations, accidentally. It would be very, very strange if anyone went to all the effort of faking something that occurs naturally.
You're still perfectly free to investigate the source code if you're not sure, but first I'd suggest looking for pictures of "hexagonal close packing" to see if you can find any that don't also look like face-centred cubic packing at first glance.
@Alex McLeod have a closer look: that is NOT how balls stack. First of all, they don't form square lattices, but triangular ones; second, even if you had a square lattice the "sheets" would not be stacked directly one on the top of the other, but they would be offset, since the ball would naturally fall in the gaps of the "sheet" below. The only way to get the configuration shown in the picture would be carefully stacking the balls inside the corner of a cube, but even then any small perturbation of the system would make it collapse into one of the more stable configurations.
Have a closer look - that's a pretty normal packing pattern for identical spheres. They've published the source code, though, so you could check to see how it works for yourself - or if you're not a programmer you could just buy a bag of ball-bearings and reproduce the effect IRL.
Besides that, though, it would've been much, much harder to fill the volume with a "sphere texture" that flows, and contains realistic irregularities in its packing, and also has freely-moving individual particles. That would be a research paper in itself!
They didn't simulate that though; that was the sand falling onto the simulated object. The sand was probably just from another paper, or (if that is even implementable in such an algorithm) a Blender fluid simulation.
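For anyone following the packing argument above: the two lattices being debated have quite different densities, which is one reason poured spheres settle into (possibly irregular) close packing rather than simple cubic. A quick sanity check of the textbook packing fractions (this is just standard geometry, nothing from the paper's code):

```python
import math

# Packing fraction = fraction of space filled by identical spheres.
# Simple cubic: one sphere of radius r per cube of side 2r.
simple_cubic = (4 / 3) * math.pi / 8          # = pi/6 ≈ 0.5236
# Hexagonal close packing (same density as FCC): pi / (3*sqrt(2)).
hcp = math.pi / (3 * math.sqrt(2))            # ≈ 0.7405

print(f"simple cubic packing fraction: {simple_cubic:.4f}")
print(f"hcp/fcc packing fraction:      {hcp:.4f}")
```

Close packing fills about 74% of space versus about 52% for simple cubic, so the two really do look different in aggregate, even when individual columns line up from certain camera angles.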
I am a 3D artist and I have a genuine question. How much time does it take for these developments to find its way into animation programs such as Maya, Blender, etc? There seems to be a massive gap between what's possible and what's getting implemented.
For 2D stuff it would usually go into photo-editing phone apps. So if you want to make use of the sharpen AI for faces, you'll probably export your Photoshop image to the phone and then back. Maybe it's the same for 3D.
I need to know that too, so I am tagging along. If the author of this paper made this free, why can't or won't developers implement this in Blender?
Man, these simulations are getting so realistic. Some day we could simulate an entire universe down to individual atoms and particles. Possibly even complex life forms. Hell, they might even evolve into intelligent beings that can create their own simulations… Uh oh…
Ad infinitum ad nauseam
I wonder if one day we'll get a paper that breaks the "squeeze your papers" barrier.
My papers are only 0.1 mm and as I was holding onto my stack of papers, my hand did not intersect through them.
Watching a two minute papers video that produces fantastic visuals while my computer is running some training loops for a deep learning project that has yet to produce any visuals (it generates shader code by looking at all examples on Shadertoy)
I wonder what a simulation game implementing all these cutting-edge techniques from a bunch of papers would look like... Game devs, take note!!
An engine developer would be more likely to implement a lot of these.
Extremely impressive results.
4:55 So you are telling me that science has found a way to simulate me dropping breadcrumbs on myself, in a hammock, with a blanket? Outstanding!
Would love to see this implemented in Marvelous Designer
I want it in Blender.
*piece of cloth gets simulated*
"What a time to be life!"
Now just what is time and what it means to be alive.
Renting a renderfarm would be perfect for calculating the simulation, if multiple CPUs are supported.
this is the real deal - when calculated in real-time
Would it be possible to implement this in a 3D software (houdini, blender etc) right now...? is the code for this available...?
cant wait to see this used in games
What is this sorcery ?!
I'm truly blown away. Knowing that this has the potential to be rendered in real time is astonishing.
Good! Hope to see this in a Skyrim mod next year
This is so magnificent.
The double kinking on the twisted cloth SUPER impresses me. Is that emergent or did they have to fiddle with it to get it to work?
It would have worked more reliably if the trick was performed in the standard way (pulling down).
Very interesting. Thank you.
Games in 10 years are going to be mindblowing
Waiting for it to come to blender
Wow! Great quality!
I want this in Blender right now
All of these would go really well in a VR poker game.
This is the biggest nerdgasm I've ever had. He twisted the cylindrical cloth and I screamed.
can't wait for blender to adapt this technique
0:40 is so funny lol
hahaha! haha? haahahaha, haha.
hahahahahahaha
Been waiting for this day my entire life.
2:56 interesting pattern at the bottom
gravity is a little too high in these simulations.
So how can we use this? Is there any software offering this technique?
The sand looks a little too fluid. Real sand isn't made of tiny balls but rather infinite variations of complex shapes. I get that it's going to significantly increase render time, but that's the cost of realism at this point in our abilities.
I hope the justice system is paying attention, because video evidence is going to need some serious artifact validation.
The armadillo's bed looks really uncomfortable, especially after someone left crumbs all over it.
What a time to be alive indeed
Which software will I get this feature in? I love your videos but don't know where it gets implemented in real life. I am no coder, I am a VFX artist.
2:12 why does the cloth start interacting before the ball hits it?
@Johnny-Anthony Khawand each simulation was paused as the cloths were swinging; once the balls spawned, the video and swinging continued. If you're seeing something else, then I can't see what you're seeing even with slow motion turned on.
😅
@mikosoft No, all three packs have the same behavior, each at separate times, just before the ball hits them.
continuation of the previous simulation (of the cloths falling)
Garry's Mod in the mean time: **Cries in Source Engine physics**
Bitter lamentations
it's more realistic than reality itself
Finally! I can have a simulated super thin condom that actually works without unwanted penetration
Once this gets to real-time, hyper-realistic VR will be possible. And Half-Life: Alyx already looked great.
@Aanand Bajaj The simulations here take minutes per frame for small scenes with very limited detail. There will always be compromises; something that looks nice will still break down whenever examined closely.
In real life you can fit 10 trillion atoms on the head of a pin... there's no way you can accurately simulate a whole solid pin in real time, let alone a planet or galaxy. Someone may have vivid dreams that feel real, but there are always inconsistencies and breakdowns. Ultimately these are tools to solve problems and entertain. Of course you don't want life to be loaded with deepfakery... but the simulation theory is debunked when you know the numbers.
yeah which means we are likely living in a simulation if it’s ever possible to create a hyper realistic universe
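For what it's worth, that "10 trillion atoms" figure roughly checks out as a back-of-envelope estimate, assuming a ~1 mm wide pinhead and ~0.3 nm atomic spacing (both assumed values, not from the comment above):

```python
# Rough order-of-magnitude check of the "10 trillion atoms on a pinhead" claim.
# Assumptions (mine, for illustration): ~1 mm wide head, ~0.3 nm atom spacing.
pinhead_width = 1e-3      # metres
atom_spacing = 3e-10      # metres

atoms_per_side = pinhead_width / atom_spacing
surface_atoms = atoms_per_side ** 2   # a single atomic layer on the flat head

print(f"atoms in one surface layer: {surface_atoms:.1e}")  # on the order of 1e13
```

A single atomic layer already lands around 10^13, i.e. roughly ten trillion, and a solid pin contains vastly more, which is the commenter's point about brute-force simulation.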
Oh wow, Vellum dethroned!
Existence of this realistic simulation is unrealistic
What about fabric and water?
I doubt it can get even better than this.
Create a way to calculate that on the GPU and put that baby into Blender pls :'c
Hooray for inertia :D
Pixar engineer when they see this method
*JOY*
Am I overthinking that cat and glass table joke?
I never tire of repeating it. Yeah, we are simulated.
How do i implement this in blender
Any blender developer in here?? 😉 You know what you need to do.
At last! The human race can now simulate noodles!
This can already cheat my eyes to be honest.
Hold on to your paper early gang
This has always been possible; the trick is to disable physics on everything except the tablecloth.
Can I shuffle a virtual deck to generate a random number? 🤔
One of the things about simulations is that they're often not random at all. If the cards start in the exact same position, are let go at the exact same moment, and fall in the exact same way, every time, then there's no randomness to it. Same start, same finish.
If you want different starts, then you've already done the work of producing randomness.
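That determinism point can be shown with a toy sketch: a fixed-step integrator (the `step` and `run` helpers below are made up for illustration, not from any real card simulator) always produces the same trajectory from the same starting state, so any randomness has to be injected via the initial conditions:

```python
def step(state, dt=0.01, g=-9.81):
    """One explicit-Euler step for a falling particle (toy stand-in for a card)."""
    pos, vel = state
    return (pos + vel * dt, vel + g * dt)

def run(initial, n=100):
    """Advance the toy simulation n fixed steps from a given initial state."""
    state = initial
    for _ in range(n):
        state = step(state)
    return state

# Identical start -> identical finish, every time.
assert run((1.0, 0.0)) == run((1.0, 0.0))
# Randomness only appears if the starting state itself varies.
assert run((1.0, 0.0)) != run((2.0, 0.0))
```

So a "shuffle" in a deterministic simulator is only as random as the entropy you feed into its starting configuration.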
now when do i get to use this in MMD :)
Thank you
very cool
What happened to his voice? You can hear the massive difference from his normal intro to the one in the video. It's been a while since I watched, but it seems like it's been like this for a while. The way he talks changed quite a bit as well.
and the flowers are still standing!
Seeing the bottom view of that bowl of noodles is triggering something primal inside me
Here early enough to see the like count change when I absolutely demolished it.
On paper closer to the matrix
0:57 😂🤣
I'M SQUEEZING MY PAPERS!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Blender.org 3.0 just released..
i didnt hold to my papers
wow
@2:05 This is why I don't trust these animations. Somehow the sheets anticipated the ball falling? They begin to react prior to the ball making contact. Maybe the simulations grossly overestimated their mass and gravity caused the sheets to pull up toward the ball. :D
@KillFrenzy Thanks for clarifying that for me... after watching again with the idea that the simulation was paused, I now see the continuation of motion of the sheets. Quite interesting how my point of view was much different when stepping through only the frames after the ball was spawned.
The simulation was just paused. The simulation only continued once the ball was spawned. Look at the previous footage where the cloth was swinging up after falling then was paused.
those animations are simulated in one go - sheets falling, then sphere falling. So the movement you see is just continuation of the sheets falling.