Advanced Groom Dataflow Setup in UE 5.7

https://www.youtube.com/watch?v=cER2jT8oxKY


Hi everyone, my name is Michael Froh and I'm a senior physics engineer at Epic. Today I'm going to present the new advanced Groom Dataflow setup that has been introduced in UE 5.7. The agenda for today: we're first going through the motivation behind this initiative, why we decided to change how the Groom is set up. Then I'll briefly introduce the Dataflow framework, for people who are not aware of it or have never used it; it will be a really simple, really brief introduction. Then we'll go through the Groom integration and the deformation pipeline, and we'll finish with a simple example to show all of that in action.

Groom has been around for quite some time; it was released, I think, five or six years ago. And while a lot of things have evolved since then, the way you set things up didn't really change. You were doing all your work and all your setup in the host application, which could be Maya or Houdini, and after that you were importing your groom. You were able to modify it a bit inside the engine, and then you were using it. But the way you really set things up did not change. We wanted to improve on that, to improve the flexibility of the way things are set up. For example, if you wanted to add a new attribute to your Groom, you had to go back to the DCC, add the attribute, re-export your asset, re-import it into the engine, and basically modify the source code to be able to access these attributes in the deformers and in materials. It was a really painful process with a lot of steps. So we wanted a dynamic set of attributes, something that could be done fully in engine without going back to the DCC, and we wanted those attributes directly available in deformers and materials. We wanted more features as well.
Sometimes, for example, we had a project with a simulation where we wanted a part of the groom to be stiffer at the front and softer at the end. Unfortunately, that was not really possible: we were not able to spatially select some areas and just modify them. So we wanted to introduce painting and selection into the tool. We also wanted a better bridge with other editors, and a node-based framework, to give users and artists more flexibility in the way they set things up. That's one of the goals, at least.

We wanted better scalability as well. You can see, for example, on the right, a lot of MetaHumans. That could happen in your project: you could have a lot of them, and you can imagine that each one of them can have a mustache, eyebrows, hair, all different. In the end you can have a lot of assets to deal with. And if at some point during your project, maybe at the end of it, you say, "I want to decrease the computation time a bit and reduce the complexity of my groom," you have to go through each one of them and modify them, go back to the DCC, re-export them. It's a really long process. So we wanted a template-based system: you define some kind of recipe, and then you can reuse that recipe across a lot of assets. This makes late changes possible. Even at the end of your project, you can just say, "I'm going to decrease the number of guides by 10%," and apply that recipe to all your assets with one button.

Another feature we wanted to target was art directability. I don't know if some of you watched the demo we had at Unreal Fest Orlando, The Witcher one. It was a collaboration with CD Projekt, and the demo was really awesome. In that demo we had a sequence where the horse was running, this one, and the deformation was really great.
But in that example we did not choose the groom for the demo. One of the reasons was that we were lacking art directability. It was not possible to say, "I want the tail to be here, or here, at some point." For sure, you can simulate the full tail, but you are not able to really control exactly where it is at a given point in time. We wanted to give that freedom to the artists, so they could control the tail with Control Rig, for example, and have full control of the deformation. That is why we picked Dataflow for this: Dataflow is a really flexible framework.

I'm now going to present a bit of what it is. I'm not going to go into too much detail; if you have more questions about how it works, I will be around after the talk so we can discuss a bit more. But I'm going to briefly describe the framework in case you have never used it. Dataflow is an asset generation tool. It's used for now in most of the Chaos asset editors: for Cloth assets, Flesh assets, Geometry Collection assets, and I think in 5.7 it's going to be used for Physics Asset generation as well, and now for Groom assets too. It's used more and more, and I hope it will be used even more in the future; it's a really great tool.

I'm going to describe the UI really briefly. On the bottom part you have all the graph-based widgets; on the top part you have all the view-based widgets. The central part is the graph editor. The graph editor is a standard dependency graph, in which you have a collection that flows from node to node until a terminal node, which basically writes your asset. On the left side you have the member variables; on the right side you have the node details. It's quite standard if you are used to Blueprint or other graph-based tools.
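The collection-flow idea above can be shown in a few lines of Python. This is only an illustrative sketch of a dependency chain where each node caches its output so it can be inspected, like in the construction viewport; the node names and the dict-based "collection" are invented and this is not the Unreal Dataflow API.

```python
# Minimal sketch of a Dataflow-style dependency graph: a "collection"
# (here just a dict) flows from node to node until a terminal result.
# Node names are hypothetical; this is not the Unreal Engine API.

class Node:
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn
        self.last_output = None  # cached so a viewport could inspect it

    def evaluate(self, collection):
        self.last_output = self.fn(dict(collection))
        return self.last_output

def evaluate_chain(nodes, collection):
    """Evaluate nodes in order; each node's output feeds the next."""
    for node in nodes:
        collection = node.evaluate(collection)
    return collection

# Hypothetical nodes: resample the geometry, then smooth it.
resample = Node("Resample", lambda c: {**c, "vertices": c["vertices"] * 2})
smooth   = Node("Smooth",   lambda c: {**c, "smoothed": True})
asset    = evaluate_chain([resample, smooth], {"vertices": 100})
# Intermediate results stay inspectable per node, like clicking nodes
# one after the other: resample.last_output, smooth.last_output.
```

Because every node keeps its last output, clicking through the chain shows the evolution of the collection, which is what makes intermediate debugging possible.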
On the top you have the construction viewport, the simulation viewport, the simulation timeline, the tool palette on the left, and the preview scene, asset details, and the scene outliner on the right. I will now focus on some of them, because they are a bit different from what you are used to; most of them are quite standard for Unreal.

The goal of the construction and simulation viewports is to give you instant feedback on what's going on, but they give different kinds of feedback. The construction viewport is there to give you a visual representation of what's happening at each node: you go through each node output, and the output is rendered. That gives you the freedom, when something is flowing from node to node, to click on several nodes one after the other and see the evolution of the collection flowing through. It's a bit like Houdini, if you are used to that. So when you're modifying the geometry, for example, and you say, "I'm going to resample my number of vertices and smooth my geometry a bit," you can directly see the result live in your viewport. The construction viewport is used for all the tooling as well. We have some tools available in Dataflow, and these tools only operate in the construction viewport: when you want to paint an attribute or select something, everything happens there.

The simulation viewport is a bit different. The goal there is to have the final rendering and the final deformations. You set up this viewport in a slightly different way, because in the preview scene you have an option to set a blueprint. By default, each asset has its own preview blueprint, I would say.
But if you want to do something custom, you can create your own blueprint, put whatever you want inside, set that blueprint in the preview scene, and it will be displayed here. For example, this one is a simple blueprint with one asset. But if you want to, let's say, test the scalability of your asset, you can create a blueprint with one of the characters running on screen, and you can see, for example, the LOD transitions directly in your simulation viewport. You have a timeline here as well, to record the deformation in case you want some animation and deformation there.

Subgraphs and variables are quite important features for Dataflow, because they let you reuse things as much as possible. A Dataflow subgraph is fairly standard: you group a lot of nodes, and then you can reuse that subgraph in the main graph. It's a way of reducing the number of nodes in your graph, and probably of making fewer errors when setting it up. Dataflow variables are super important as well, because you can imagine that without them, your graph wouldn't be a recipe anymore. For example, if you want to target a skeletal mesh for your groom, you would need to embed the skeletal mesh in a node inside your graph, and your graph would no longer be a template that works for every kind of asset; it would be really specific to one asset. If you did that, you would need to create a Dataflow asset for each one of your Groom assets. With Dataflow variables, you can define a variable and override it per asset, but the graph itself stays unique and can be used across all the assets you have.

Another really interesting widget in Dataflow is the collection spreadsheet, because it allows you to debug the collection that is going from node to node.
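The "one recipe, per-asset overrides" idea behind Dataflow variables can be sketched like this. The names (`GRAPH_DEFAULTS`, `build_groom`, the asset names) are all made up for illustration; the point is only that one graph definition serves many assets, each overriding just the variables it needs.

```python
# Sketch of the template/variable-override idea: one graph definition,
# many assets, each overriding only the variables it needs.
# All names here are hypothetical, not the Unreal API.

GRAPH_DEFAULTS = {"target_skeletal_mesh": None, "guide_percent": 10}

def build_groom(asset_name, overrides):
    """Resolve variables for one asset: defaults overridden per asset."""
    variables = {**GRAPH_DEFAULTS, **overrides}
    return {"asset": asset_name, **variables}

# The same "recipe" applied to two different assets:
hero  = build_groom("Hero_Hair",  {"target_skeletal_mesh": "SK_Hero"})
crowd = build_groom("Crowd_Hair", {"target_skeletal_mesh": "SK_Crowd",
                                   "guide_percent": 5})
```

A late global change (say, halving the default guide percentage) then only touches the shared defaults, not each asset, which is exactly the "one button" late-change workflow described earlier.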
And it adapts to the selection as well. If you want to debug the output of a node, you can directly see all the groups, the attributes within each group, and the values within each attribute, right in the collection spreadsheet.

Now I'm going to go through the Groom integration, what we did to integrate the Groom into Dataflow. The first thing we had to do was change the data structure a bit. Before, the Groom was holding strands and guides. That was mainly core internal data that you had to import through the DCC, and it was available in the deformers and materials, but as a rigid set of attributes. We extended that, and added a managed array collection to the groom that holds the data for dynamic attributes. These attributes are available to deformers. We plan to expose them in materials as well, in case you want, for example, to define some rendering attributes in Dataflow that could be used in materials. That's on the roadmap; I hope it will come quite soon.

Here is a description of what a managed array collection is, because I have been speaking about collections since the beginning. You saw that it's the main data structure flowing from node to node in Dataflow; it's used here in the groom, and in many of our assets. To put it really simply, a managed array collection is a data container. You define groups, you define a size for each group, and all the attributes that belong to a group share exactly the same size. For example, here you have a vertices group with n elements. If you define a position attribute and a velocity attribute in that group, both attributes will belong to the vertices group and both will have n elements. The same goes for the faces group or the geometry group.
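The group/attribute layout just described can be captured in a toy container. This is only a sketch of the idea (groups declare a size, every attribute in a group is an array of that size), not Unreal's actual managed array collection implementation.

```python
# Toy managed-array-collection: declare groups with a size; every
# attribute added to a group is an array of exactly that size.
# Illustrative only, not the Unreal implementation.

class Collection:
    def __init__(self):
        self.groups = {}   # group name -> element count
        self.attrs = {}    # (group, attribute name) -> list of values

    def add_group(self, group, size):
        self.groups[group] = size

    def add_attribute(self, group, name, default=0.0):
        # Every attribute in a group shares the group's element count.
        self.attrs[(group, name)] = [default] * self.groups[group]

c = Collection()
c.add_group("vertices", 4)  # a vertices group with n = 4 elements
c.add_attribute("vertices", "position", (0.0, 0.0, 0.0))
c.add_attribute("vertices", "velocity", (0.0, 0.0, 0.0))
# Both attributes automatically have 4 elements, one per vertex.
```

The same pattern would apply to a faces group or a geometry group: the group's size is the single source of truth for every attribute inside it.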
Once we had that in place, we started to integrate some nodes to bring features into Dataflow. One of the first things we tried to tackle was card generation. I don't know if you have ever used it, but there's an option for Groom to generate cards out of the strands, to build some LODs. The current pipeline is: you have a button in the Groom asset, you press it, it opens a pop-up window with some options, you change the options, and it generates the cards. But sometimes things could go wrong and you could be unhappy with the end result, and it was painful to figure out where the issue came from, because you can't really inspect the intermediate results. You just launch it and hope for the best. We wanted to change that, to have more granularity, to be able to see intermediate results, and to have parameters controlling each intermediate step. So we added different nodes for that: one to build the clumps, one to build the geometry, one to build the textures. Each one of them has a different set of parameters, and you can see the result of each node individually, which is really handy, because you can now start debugging and understanding things.

Another thing we absolutely wanted was binding all the curves coming from the strands or the guides to a skeletal mesh, because we wanted more art directability: being able to define a Control Rig or a simulation on joints, and having the simulation target that, or directly deforming the curves based on that rig. So we added a node to target a skeletal mesh. It does a transfer of skin weights: we transfer the skin weights from a skeletal mesh, set up in the Skeletal Mesh Editor, onto the curves.
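The skin-weight transfer just mentioned can be sketched with the simplest possible strategy: for each curve point, copy the (bone, weight) pairs of the closest mesh vertex. Real transfers are more sophisticated (barycentric lookups on triangles, filtering), so treat this purely as an illustration of the idea; all names are invented.

```python
# Sketch of transferring skin weights from a skeletal mesh onto curve
# points via nearest-vertex lookup. Not the engine's algorithm.
import math

def closest_vertex(point, mesh_vertices):
    """Index of the mesh vertex nearest to `point` (3D tuples)."""
    return min(range(len(mesh_vertices)),
               key=lambda i: math.dist(point, mesh_vertices[i]))

def transfer_skin_weights(curve_points, mesh_vertices, mesh_weights):
    """mesh_weights[i] is a dict {bone_index: weight} for vertex i;
    each curve point copies the weights of its closest vertex."""
    return [mesh_weights[closest_vertex(p, mesh_vertices)]
            for p in curve_points]

mesh_vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
mesh_weights  = [{0: 1.0}, {0: 0.5, 1: 0.5}]
curve_weights = transfer_skin_weights([(0.9, 0.1, 0.0)],
                                      mesh_vertices, mesh_weights)
```

Because the transfer is just an initial guess, it makes sense that the editing and painting tools described next exist: the transferred weights are a starting point the artist then refines.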
But again, sometimes things could go wrong, and you might not be happy with the transfer. So we also integrated the skin weight tools from the Skeletal Mesh Editor, because artists really loved them, into Dataflow. Now we have, in Dataflow, a way of editing, visualizing, and modifying all the skin weights that have been transferred by the node. And if you want a more procedural correction, because you don't want to paint by hand, you can select some areas and just run a correction node. There are different types of correction: you can relax the skin weights, prune them, clamp them, normalize them, with standard algorithms that are already available in the painting tool and in Maya. So you can procedurally modify them. This tool has been used quite a lot, for example for cloth, in a cloth resizing project, and it can be used for all the assets we have in Dataflow.

Another feature we tackled was guide generation. Before, the guide generation was a bit spread out. The pipeline was more or less: you had the choice to import the guides directly from the DCC, or, as a second option, you imported only the strands and regenerated the guides from a percentage of the strands. And once you had that, on the simulation side, when we were reading the guides, we were resampling them, doing some inverse dynamics to get the rest pose, and we were able to smooth them as well. There were lots of different operations done at runtime, and it was not really necessary; you could do all of that way before, at editor time. We basically wanted to consolidate that and put more ways of generating guides inside Dataflow. So we added a lot of nodes: for generating them out of the strands, for resampling them, smoothing them, and creating LODs directly.
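Two of the guide operations mentioned above, resampling and smoothing, are simple enough to sketch. For brevity the "curve" here is a list of scalars rather than 3D points; the functions are illustrative stand-ins for the engine nodes, not their actual implementation.

```python
# Sketch of guide-curve operations: resample a polyline to a fixed
# sample count, and run one Laplacian smoothing pass. Illustrative only.

def resample(points, count):
    """Resample a polyline (scalars here, for brevity) to `count`
    evenly spaced samples by linear interpolation."""
    out = []
    for i in range(count):
        t = i * (len(points) - 1) / (count - 1)
        j = min(int(t), len(points) - 2)   # segment index
        f = t - j                          # fraction along the segment
        out.append(points[j] * (1 - f) + points[j + 1] * f)
    return out

def smooth(points, alpha=0.5):
    """One Laplacian pass: pull interior points toward the average of
    their neighbors; endpoints (root and tip) stay fixed."""
    out = list(points)
    for i in range(1, len(points) - 1):
        avg = 0.5 * (points[i - 1] + points[i + 1])
        out[i] = (1 - alpha) * points[i] + alpha * avg
    return out
```

Chaining these per selection is the shape of the workflow described next: paint an area, extract the selection, generate guides from it, then smooth them.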
There's a lot more to come in the future, but at least now we have a lot of options to generate them. And it works with selection as well: you can say, "I have my strands, I'm going to paint some areas, extract that selection, use it to generate that many guides, and then smooth them in that way." There's a lot of flexibility there.

To have even more flexibility, we also added the bones tool that was introduced recently in the Skeletal Mesh Editor. It's a tool to add bones and move them around wherever you want. We wanted that because, if we wanted to change the skeletal mesh we were targeting, and that skeletal mesh was dependent on how the guides or the strands were set up, we wanted a way of modifying the input skeletal mesh without going back to the Skeletal Mesh Editor; we wanted to do it right here. So we added that tool into Dataflow. The painting tool has been extended by our team as well, with a lot more features, and now it's really easy to select things, paint things, and then reuse the attributes you set up here in the deformers, for example.

Now that we have been through all the new ways of setting things up in Dataflow, we had to modify the deformation pipeline a bit, because we were not really able to use everything we had defined before. Before, even in 5.6, the deformer graph could be used on a Groom component, but you were only able to modify the strands, and that's it. We wanted way more: we wanted to be able to deform the guides first, to have access to all the Dataflow attributes that were set up before, and to have more coupling. I'm going to explain what coupling means here. We had some projects in the past where, for example, you had an asset with different components.
And we wanted to have interaction between these Groom components. But it was not really possible, because each component has its own solver, so one component could not see the others. It was not possible to have them in the same solver, except by merging the Groom components into one, to be able to access all the data at the same time. We wanted to change that, to have a really strong coupling between components, to be able to say: I have this component and this component, they may even be on two different characters, but at some point the two characters are going to be close to each other, and we want interaction between the two. So we wanted strong coupling.

We wanted weak coupling as well. For example, imagine you have a character with a cloth that is simulating, and on top of it you have a groom. You say, "I want some kind of interaction between the two: I want to grab some information out of the cloth and send it, for example, to the groom." That's weak coupling, in a sense: it's not solved at the same time with the same solver, but we extract some information out of one solver and pass it to another one.

That's why we integrated the deformer graph into other kinds of assets. The first one is the Groom solver component, which I'm going to describe just after, and another one is the Dataflow simulation, which I'll describe a bit later. The Groom solver component has been introduced for strong coupling. It's basically a component container on which you can register blueprint functions: you can add and remove Groom components, and you can add and remove collisions (by collision, for now, I mean skeletal meshes or static meshes, but it could be extended to other things in the future). And you have a way to set a deformer graph or a Dataflow simulation on this component as well.
And once you have that, the deformer graph has been extended to support the Groom solver. Now we have two bindings in the deformer graph. Before there was only one, the GroomAsset binding, with only the strands data interface. Now, from the GroomAsset binding, you can access the guides, the strands, all the Dataflow attributes you set up before, and all the mesh data, meaning all the skeletal meshes that have been bound to the Groom in Dataflow. And we now introduce a GroomSolver binding as well, which has access to all the evolution parameters of the solver, plus the collision data interface. With these two bindings, you can have more interaction between all the assets in the Groom solver, plus you have collisions with all the skeletal meshes and static meshes you registered on the solver.

Now I'm going to present the Dataflow simulation a bit. It's really, really experimental; I would not advise you to use it out of the box now, it's far from production ready. But since it's available, and since it's used, for example, in the other editors, I wanted to mention it. The Dataflow simulation is basically a way to run a Dataflow at runtime. It's not a construction graph; it's evaluated at runtime. The way you set it up: you can have a lot of solver components. You can have a Flesh solver component, a Cloth solver component, a Groom solver component, even a Chaos solver actor, and you can register a Dataflow simulation asset on these components. Then there is a manager, a subsystem, that gathers all the components in the world that are using a Dataflow simulation, creates some proxies, and assigns those proxies to each Dataflow asset. After that, you open a Dataflow asset and you say: I want to grab all my cloth proxies, I want to filter these proxies by this tag,
and I want to advance the simulation in time, and then extract some information out of the solver, take other proxies, the Groom ones, enqueue a deformer graph here, and pass some data to that deformer graph. It's a central place where you can launch a lot of solvers, and now it has been extended to launch a deformer graph as well. So it's really handy.

This is a simple setup example. On the top you have an outliner with a root actor, and you have a Flesh component, a Cloth component, a Groom component, and a Geometry Collection component; each one of them has been assigned a Dataflow simulation asset. The simulation manager registers some proxies with each one of the solver assets, and you have access to those proxies within each Dataflow simulation asset. What's nice as well, if I go back here, is that when you add a deformer graph into the Dataflow simulation, all the variables you have set up in your deformer graph are exposed on the node, and you can start plugging things into the inputs of the node. Those things could be another variable from the Dataflow simulation asset, or something that comes from other solvers. Let's say, for example, you have a cloth simulation running, and you want to extract a velocity field out of it and plug that velocity field directly into your deformers to drive a bit of the deformation. These things become doable.

With all these new deformations in place, we wanted a way of recording them. We modified how the Groom cache works to be able to write caches; it was only reading caches in the past. And we plugged that into the new Dataflow recording pipeline.
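The gather/filter/advance flow described above can be sketched as follows. Everything here (`Proxy`, the tags, the returned "cloth data") is invented to illustrate the shape of the system: a manager holds proxies for all registered solver components, a step filters them by tag, advances one group, and hands its output to the next.

```python
# Sketch of the Dataflow simulation flow: proxies gathered from the
# world, filtered by tag, advanced, with one solver's output passed
# onward (e.g. into a deformer graph). All names are hypothetical.

class Proxy:
    def __init__(self, tag):
        self.tag = tag
        self.time = 0.0

    def advance(self, dt):
        self.time += dt
        # Stand-in for real solver output, e.g. a velocity field.
        return {"tag": self.tag, "time": self.time}

def filter_by_tag(proxies, tag):
    return [p for p in proxies if p.tag == tag]

def step(proxies, dt):
    """Advance cloth solvers first, then feed their output to grooms."""
    cloth_out = [p.advance(dt) for p in filter_by_tag(proxies, "cloth")]
    groom_in = {"cloth_data": cloth_out}  # what a deformer graph would get
    for p in filter_by_tag(proxies, "groom"):
        p.advance(dt)
    return groom_in

world = [Proxy("cloth"), Proxy("groom"), Proxy("groom")]
out = step(world, 1.0 / 30.0)
```

This ordering is the weak-coupling idea from earlier: the cloth and groom solvers stay separate, but data extracted from one feeds the other within the same step.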
So now, within Dataflow, you are able to cache the result of any kind of deformation that you set up inside a deformer graph or somewhere else.

I'm now going to do a simple setup example. It's really simple, but it shows you more or less the different steps you have to go through to have something running. What we want to achieve: I have a skeletal mesh that is going to be simulated, or controlled by Control Rig, and I want all my curves, from my guides or my strands, to follow it. The first thing you have to do is set up a skeletal mesh with a physics asset. In the Skeletal Mesh Editor, you can set up a physics asset with all your bodies and all your constraints. That's it. Then you use that skeletal mesh inside Dataflow. In the Dataflow asset, you say: I'm going to transfer the skin weights from the skeletal mesh I just set up onto my guides, for example. You can edit them. For example, here is a visualization of my skin weights: you select the bones in the outliner, it automatically shows the different skin weights, and you can modify them, paint them, modify each vertex if you want, do whatever you want. Once you have that in place, you can create a deformer graph. In the deformer graph, we grab the guides, and then the bone indices, the bone weights, and all the bone transforms that you will have at runtime. Then you plug all of these buffers into an HLSL shader, and there you basically do linear blend skinning, to bind the curve points directly onto the bones. Once you have that, you create a blueprint in which you set up your Groom component and your Skeletal Mesh component, and you can assign the deformer graph you just set up directly onto your Groom component.
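The linear blend skinning done in that shader is just a weighted sum of per-bone transformed positions. A minimal sketch of the math, reduced to 2D affine matrices for brevity (the real shader works with 3x4 bone matrices in HLSL), with all names invented:

```python
# Sketch of linear blend skinning: each curve point is deformed by the
# weighted sum of its bone transforms. 2D for brevity; illustrative only.

def apply(matrix, point):
    """Apply a 2x3 affine matrix [[a, b, tx], [c, d, ty]] to a 2D point."""
    x, y = point
    return (matrix[0][0] * x + matrix[0][1] * y + matrix[0][2],
            matrix[1][0] * x + matrix[1][1] * y + matrix[1][2])

def skin_point(point, bone_indices, bone_weights, bone_matrices):
    """Linear blend skinning: weighted sum of per-bone transformed points."""
    out = (0.0, 0.0)
    for idx, w in zip(bone_indices, bone_weights):
        px, py = apply(bone_matrices[idx], point)
        out = (out[0] + w * px, out[1] + w * py)
    return out

identity  = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
translate = [[1.0, 0.0, 2.0], [0.0, 1.0, 0.0]]  # move +2 along x
# A point weighted half to a fixed bone, half to a moved bone,
# ends up halfway between the two transformed positions.
p = skin_point((1.0, 1.0), [0, 1], [0.5, 0.5], [identity, translate])
```

The bone indices and weights per point are exactly what the skin weight transfer produced earlier; the bone matrices are what the animation or Control Rig provides at runtime.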
And once you have that blueprint ready, you put it into the Dataflow preview scene, and you will have this. After that, you just set an animation, you are able to move it around, and all your curves follow your bones. Again, it's a really simple example, but it shows at least all the different steps you have to go through to have something working. And once you have this kind of binding between the two, you can do a lot of different things. You can even use Control Rig directly: for example, with an asset like that, you can just modify the bones, and all the curves, all the strands, will follow exactly what you are doing. Now you have full control of what you want to do.

Just to recap what we went through in this session: we are now using Dataflow. It enables you to add a lot of additional data to the Groom asset, and it really opens the door to a lot of different things. We have spent some time working on the pipeline, getting everything ready, and now we are going to spend way more time producing content and new nodes, to do a lot of stuff. You can maybe even imagine, in the future, being able to groom things directly in that tool, because you will have all the nodes to modify the geometry and all the tools to paint and modify things. So you will be able to do a lot of different things; you already can, but it will be expanded in the near future. We have also improved the bridge to the deformer graph, to enable custom deformations on the GPU. On the roadmap, we should probably have that available in materials quite soon as well; that way you could also define some rendering parameters inside Dataflow and use them directly in materials. And we now have more interaction between the different Groom components.
And between all the solvers, thanks to the Dataflow simulation, even if it's a work in progress. And thanks to that, we will quite soon have a new solver that will replace the old one. It's a work in progress; it will be available in 5.7, but really at an experimental stage. It will be improved over time and should be ready for the 5.8 release. So, thank you for your attention, and if you have any questions, feel free to ask. Thank you.