https://www.youtube.com/watch?v=PgzSGQnWVcU

Welcome to Animation Hub. Today we're going to be looking at best practices for using a webcam for MetaHuman Animator. Before we can use the MetaHuman Animator face, it's important to make sure the plugins are enabled, so I'll go to Plugins here. I don't know exactly which of these you need (I imagine you need most of them), but certainly for the Live Link side of this you'll need the MetaHuman Live Link plugin enabled. In my case I'm using a Logitech Brio, which is a nice, accessible web camera. To get it set up on this MetaHuman (a very standard preset MetaHuman, by the way, nothing special here), I'm just going to go to Add Source in my Live Link window and then choose MetaHuman Video.

As you can see, I've got my Brio there, and you'll notice that it's capable of doing 1280 by 720 at 90 fps. There are a lot of options here, and we've found this 90 fps one is ideal. The extra frame rate is worth more than resolution: as you can see, I could do 60 fps at 1080p here, but 90 fps at 720p is considerably better for visual fidelity and fine facial movements. Now, if your target frame rate is 30 fps, you'll probably find that capturing the face at 60 is absolutely fine. But if your target frame rate is 60, then 90 really does benefit you. I'm able to get this 90 fps option because I've got the Logitech webcam plugged into a fast enough USB port, so bear that in mind: you need to plug it into a USB 3.0 port that has the necessary bandwidth.

Once we've done that, I can hit Connect, and if I then go and select this, you'll see it's gone green. That's because it can see me, and if I go to Input Video, you can see here I am. But you'll notice straight away that we're actually only achieving 36 frames a second out of that 90, which is not ideal, and that could be for a number of reasons.
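On the USB point, a rough back-of-the-envelope calculation shows why a USB 3.0 port matters. This sketch assumes the worst case of uncompressed YUY2 video at 2 bytes per pixel (the Brio may actually negotiate MJPEG, which compresses, but the uncompressed figure makes the bandwidth gap obvious); the throughput numbers are typical effective rates, not the raw bus speeds:

```python
# Rough bandwidth estimate for a 1280x720 @ 90 fps webcam stream,
# assuming uncompressed YUY2 at 2 bytes per pixel (worst case; the
# camera may negotiate MJPEG compression in practice).
width, height, fps = 1280, 720, 90
bytes_per_pixel = 2  # YUY2 / YUYV packing

bytes_per_second = width * height * bytes_per_pixel * fps
print(f"Stream needs ~{bytes_per_second / 1e6:.0f} MB/s")

usb2_effective = 35e6   # roughly realistic USB 2.0 throughput
usb3_effective = 400e6  # roughly realistic USB 3.0 throughput
print(f"Fits USB 2.0: {bytes_per_second < usb2_effective}")
print(f"Fits USB 3.0: {bytes_per_second < usb3_effective}")
```

That works out at roughly 166 MB/s, far beyond what a USB 2.0 port can deliver, which is why the 90 fps mode only appears on a fast enough port.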
One of the clues as to why we're only hitting 36 fps is right here, where we can see Dropping is Yes. What that means is that video frames are being dropped because they cannot be processed fast enough, which tells us the webcam is sending us 90 frames per second but we're just not able to process them all. That's because we're essentially using the same GPU to render this MetaHuman as we are to solve the face. I'll get into later what you can do to get around this if you want the best-quality MetaHuman being solved in real time. But for now, let's look at what we can do to the MetaHuman in the viewport, treating that viewport just as a preview of the kind of facial quality we can achieve, to get 90 frames per second in Live Link here. One very quick way is simply to turn the hair off. Let's try that. We're no longer dropping frames, and now we're achieving 90. So there you go, that was a really nice quick fix. But whatever you need to do to your environment to give yourself a nice lightweight render and reach 90 frames per second here is what you should do.

The other thing that could be impacting the frame rate is the Logitech camera settings themselves. You do need the Logitech drivers installed; the app is called LogiTune. If I bring the LogiTune app up, I've got mine set up already, but yours will probably look something along these lines. One thing that can impact your frame rate here is auto exposure, because the exposure is controlling the timing of the camera. So first of all, let's just turn HDR off; we don't need that. I'm going to leave auto focus on because it's not causing any harm, but if it's jumping around you might want to disable it. Now let's disable auto exposure, and I'm going to show you why this has been a problem.
If I raise my exposure up, you'll see we've got a nicely exposed image here, but we're now only getting 33 frames per second. That's because at minus 5 the camera sensor is staying open so long to achieve this exposure that it can't then capture at 90 frames per second. In my case, on this camera, minus 6 is what I require to get up into the higher frame rates; as you can see, we're now up in the 80s. Minus 7 also achieves it, but it's not the best picture, really. So that works quite well. There's also this gain setting. Obviously you can raise the gain to get a good exposure, but really you want that gain as low as possible, because a noisy image is going to give you a noisy solve on the MetaHuman.

Speaking of which, let's get this MetaHuman animating. I'm going to click on the MetaHuman, go to Live Link Subject, choose the Brio, and click Use Live Link Source. And there we go: we're now doing a real-time solve, and at 90 fps, so it's a really lovely high-fidelity facial solve that's catching tiny little movements in my face and giving me really good lip sync. The noise here is worth discussing for a second. If I open my jaw and hold it open, you'll be able to see my lower teeth in the MetaHuman, and that's a really good way to assess the quality of the solve. Holding my jaw open, you can see the kind of quality we're getting. Now I'm going to turn my lights off and do it again, and you'll see the difference: that's a considerably noisier face, and as soon as I turn the lights back on we get a much smoother face. So light is very important too. You want to make sure there's enough light, so that you can keep your LogiTune gain settings as low as possible while using an exposure setting that still allows you to achieve the higher frame rates.
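The exposure numbers above are easier to reason about if you know the convention behind them. Many webcam drivers expose shutter time on a log2-of-seconds scale (so minus 5 means 1/32 of a second), and on that scale the frame rate can never exceed one over the shutter time. That convention is an assumption about this particular driver, and real cameras don't always track the theoretical ceiling exactly, but it lines up well with the roughly 33 fps we saw at minus 5:

```python
# Theoretical max frame rate for a given exposure setting, assuming
# the common driver convention where the exposure value is log2 of
# the shutter time in seconds (e.g. -5 -> 1/32 s). Real cameras may
# deviate from this scale, so treat it as a rough guide only.
def max_fps_for_exposure(ev: int) -> float:
    shutter_seconds = 2.0 ** ev
    return 1.0 / shutter_seconds

for ev in (-5, -6, -7):
    print(f"exposure {ev}: shutter 1/{int(2 ** -ev)} s, "
          f"max ~{max_fps_for_exposure(ev):.0f} fps")
```

The takeaway is simply that a longer shutter caps your frame rate: to capture at 90 fps, the shutter must stay open for less than 1/90 of a second, which is why lowering the exposure value unlocks the higher rates.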
Now, we mentioned earlier what happens if you want a really high-quality MetaHuman at the same time. That's where we can leverage another PC running Live Link Hub: we'll solve the face on that machine, send the animation data over Live Link, and drive this MetaHuman in full screen on this machine. Let me show you how to set that up.

So here we are on my laptop, and we're going to want to run Live Link Hub. If you've got the Live Link Hub plugin enabled, you can just go down to Tools and launch Live Link Hub; or if you want to run it standalone, it's available in your Engine Win64 directory, and it's just this application here. So let's run it. Here we are in Live Link Hub, and the workflow is actually very similar: we'll just go to Add Source, choose MetaHuman Video, the Logitech Brio is there, and I just hit Connect. There we go, and that's basically the same workflow, so this is now ready to send data over into Unreal. But as you can see there are no clients; this is where you would connect to another machine on the network. The reason it's not connected is that at the moment Live Link Hub is not adding any clients automatically, so I can drop this down and just go to All, and there you go, we can see our machine and we're sending that Live Link data.

It's really important that Live Link Hub and your instance of Unreal on the other machine are on the same network. A common issue, especially if you're using a VPN, is that the UDP endpoint ends up bound to the wrong network, so if we go to Settings and then UDP Messaging, just make sure the unicast endpoint is selecting the network you wish to communicate on. The same is true in Unreal: in Project Settings, go to UDP Messaging and check the unicast endpoint there is correct. So now we can head back over to our main workstation. Here we are back in Unreal, and Live Link Hub is running on my laptop, with the Logitech Brio now plugged in there instead.
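For reference, the unicast endpoint can also be pinned in the project config rather than set through the UI. In an Unreal project that's the UDP Messaging section of DefaultEngine.ini; the IP below is a placeholder you'd replace with your machine's own LAN address:

```ini
[/Script/UdpMessaging.UdpMessagingSettings]
; Bind message bus traffic to the NIC on your capture network.
; Replace 192.168.1.50 with this machine's LAN address; port 0
; means "pick any free port".
UnicastEndpoint=192.168.1.50:0
```

Pinning it this way means a VPN or a second network adapter can't silently steal the binding the next time the editor starts.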
So we've completely removed that solve from this GPU, and you can see in the Live Link window that Live Link Hub is sending through the Logitech Brio subject. Because the name hasn't changed, it maps to this MetaHuman straight away, so the solve is back on the MetaHuman. Let's turn that hair back on, and let's look at it full screen. So there we go: a 90 frames per second capture of the MetaHuman, which is getting all of my lip fidelity nicely and all the subtle (well, not so subtle) facial expressions. This is basically the ideal setup if you're doing a live MetaHuman capture. So as you can see, it's looking great. Thank you very much, and I hope this is really useful.