https://youtu.be/8TaVVKX-FRk?si=YaKB55Jaa356SzOB

Cool. Hello, everybody. Welcome. I hope you're having a great Fest so far. It's been very busy already, a lot of cool things happening. Today I'm going to talk about what's coming in 5.7, so it's a big overview of everything we have in flight, and I'll touch on some of the things coming after 5.7 as well, so there's a bit of a preview of that too. My name is Arjan Brussee, I'm a technical director at Epic. Let's get started.

It's software development: all plans are tentative and subject to change. We have the preview out, and we're pretty confident we'll ship all the features that are in the preview, but some things are still being ironed out on the way to the finish line.

Quick recap of 5.6, because I think it sets the stage for what's happening in Unreal land. The big thing about 5.6 was performance. We've gotten the feedback; we know the kind of work people have to do to get performance where it needs to be. Unreal 5 was a big bet on big open worlds: huge complexity, orders of magnitude more detail in the world. But getting that at high performance was a challenge. So it was great to work with CD Projekt on this and have some really amazing content to throw at the engine, to figure out what we needed to do to make it run at steady performance. Not just max performance, but steady performance.

Streaming is one of those things. World Partition is very nice, you don't have to write your own streaming solution anymore, but it had trouble navigating from sector to sector, cell to cell. So it's great to see streaming performance get so much better, especially for simple geometry that you don't have to interact with.

Overall performance as well. Now that we have these big open worlds, and an engine that's generally capable of them, you want to do dynamic time of day. That means changing the sun position or the moon position (one of them, not both) potentially every frame, so you have to re-update virtual shadow maps (VSMs) and a lot of related state constantly. We had to optimize things like VSM to be really fast in the uncached state, not just the cached state, because you can't cache anything if the lighting constantly changes. (There's a small sketch of what that per-frame sun update looks like at the end of this section.)

And then hardware ray tracing: finally really embracing it. It's an interesting story, because when the PlayStation 5 and the rest of the Gen 9 hardware came out, nobody really thought it was possible to do full games with hardware ray tracing. Now we're in the second half of that generation, and suddenly it's possible; licensees are telling us, yeah, we're actually doing this, this is cool. So now we embrace it as well. You see this every generation: the second half brings the next leap in fidelity, using all those things.
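Back on the dynamic time-of-day point: a minimal sketch of what "the sun moves every frame" means in code, assuming a directional-light actor you assign yourself. This is illustrative, not Epic's implementation; the reason it matters for VSM is that every shadow page lit by that light is invalidated each tick, so the uncached path sets your shadow cost.

```cpp
// Minimal time-of-day sketch (illustrative only). Rotating the sun every
// tick means virtual shadow map pages can't be cached, which is why the
// uncached VSM path had to get fast.
#pragma once
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/DirectionalLight.h"
#include "SunCycle.generated.h"

UCLASS()
class ASunCycle : public AActor
{
    GENERATED_BODY()
public:
    ASunCycle() { PrimaryActorTick.bCanEverTick = true; }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (Sun)
        {
            // One full day every DayLengthSeconds; pitch drives sun elevation.
            const float DegreesPerSecond = 360.f / DayLengthSeconds;
            Sun->AddActorLocalRotation(FRotator(DegreesPerSecond * DeltaSeconds, 0.f, 0.f));
        }
    }

    UPROPERTY(EditAnywhere)
    ADirectionalLight* Sun = nullptr;  // assign your sun light in the editor

    UPROPERTY(EditAnywhere)
    float DayLengthSeconds = 600.f;    // 10-minute day for testing
};
```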
Generally, parallelization, and I have some slides about that as well. Let's go to the next slide, because it's always interesting, always fun.

That's a screenshot of Insights on the Witcher demo. The joke, and it's a painful joke we always make, is that this does not look like Unreal. Unreal has often been more game-thread bottlenecked. We've spent a lot of time on going really wide. Going wide in animation: there's a new animation framework that can go across all the threads and limit the sync points between your worker threads and your game thread, so you can do much, much more. The RHI going wide, physics going wide, everything going wide. Now we're getting problems of how to keep everything synchronized and orchestrated, but if you're hitting this level of thread utilization, you have other problems. It's working, and we get a lot of cool things out of it. Animation performance is part of the open-world story: you're building a big world, you want crowds, you want rich simulations with lots of interactivity, so animation is a big thing to be able to parallelize. It's still very experimental; we'll talk a bit more about that later, but that's basically what you see here: animation going wide.

Then I have a slide, or two, or three, with tips about how to use UE effectively. Maybe it's common knowledge, but we do get questions about it, and I think it's good to reiterate some of these things. Experimental, beta, those kinds of labels: what do they really mean? In the past we were reluctant to show people experimental stuff because it wasn't ready; we didn't want to support it, we didn't have the documentation yet. Now we sometimes show experimental things pretty early, but we still don't have everything ready: the tooling, the documentation, all of that is not final. And with experimental, we're still working hard on those features, which means we can change the APIs, we can break your content, we can break the file formats. That's really the state of experimental. But experimental is a spectrum: some things are very experimental, and some things are almost there, where if you started a game right now you could probably ship with that feature. So there's a bit of a caveat there.

Beta for us really means we've probably shipped internal projects on it, something in Fortnite or so, and that it should reasonably be ready to ship with in about two releases, so a year, maybe a year and a half. At that point we shouldn't be breaking the API anymore; if we change something, we deprecate it first, so you have time to update your plugins or your code for the new API. That's beta. And then there's production ready, where we say you should be able to ship that feature in your game completely.

Some people say, I don't want to touch any experimental stuff, I'm going to wait until everything is production ready. But you're embarking on the journey of making your game, which is probably three, four years, hopefully shorter, but it will take time. So you can probably start using some experimental features.
You can probably start using beta features. But be careful: think about what the label means. And if you're really eager, like, I'm going to jump on all the experimental stuff, then pick a couple that are important for you and maybe talk to us. Ask for advice, and we can probably give you some guidance.

The other thing is modifying the engine. The source is there, but with great power comes great responsibility. Long-term, grizzled Unreal developers have learned there are dangers; there be dragons. We've had people change the engine so much that they could never upgrade, ever. Meanwhile our developers are happily typing away, doing amazing stuff like what we're talking about today, and you want to be able to take on those new features, so you have to be way, way smarter about this. I think the general sentiment after UE4 has been: never again with these awful integrations.

There are some tricks to doing it more safely. Probably the biggest one is to be more componentized, in a plugin kind of way, and I think everybody is starting to do that: Game Feature plugins that can contain Blueprints, code, and data, combined into a plugin you can load and unload, even live, with hotfixes and such (there's a small load/unload sketch at the end of this section). Be careful about where you extend: if there are extension points, use them, and if you want a new extension point, propose one; we'll probably take pull requests for reasonable ones. If you change things, comment them. Try to get rid of the tech debt again by backing your own changes out; it's hard, I know, you're committed. There are also things like doing shadow integrations on a branch: keep integrating our new drops onto your branch, see if it runs, maybe do A/B tests. We see people doing all of these things to be successful there.

And one last thing: we have Ari's great presentation, fun to watch, with some really good information in it. You can do all of this yourself; people use TeamCity or whatever their own CI/CD system is, and that's all fine. But we're starting to see that for these systems to deliver the highest performance and fastest iteration, they all need to work together, and when it all clicks, that's when you see magic happen. You have to understand we have thousands of developers at Epic, working on three, four different games, basically in a single executable, all in the same branch. Dev iteration for us is critical: we'd be losing so much money, it would be so slow going, we wouldn't even be able to generate builds if we didn't fix this, so we're super motivated to fix these things. So yes, Horde for orchestration, but it also does telemetry, introspection on your build processes; I have something about that later. And Zen is not just for cooking on the server: if you run it on your workstation, it accelerates everything. It accelerates editor load, it accelerates play-on-target, all of those things, because it abstracts the file system, so you don't have to deal with the Windows file system and virus scanners and that kind of crap anymore. It's much, much faster.
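On the Game Feature plugin point above, a hedged sketch of the load/unload shape, assuming the GameFeatures plugin is enabled. The subsystem and calls below exist in recent engine versions, but exact signatures have shifted between releases, so verify against yours; the plugin name is a made-up placeholder.

```cpp
// Sketch: loading/unloading a Game Feature plugin at runtime.
// Assumes the GameFeatures plugin; verify signatures against your
// engine version, as they have changed across releases.
#include "GameFeaturesSubsystem.h"

void ToggleMyFeature(bool bEnable)
{
    UGameFeaturesSubsystem& GFS = UGameFeaturesSubsystem::Get();

    FString PluginURL;
    if (!GFS.GetPluginURLByName(TEXT("MyGameFeature"), /*out*/ PluginURL))
    {
        return; // plugin not registered under that name
    }

    if (bEnable)
    {
        GFS.LoadAndActivateGameFeaturePlugin(PluginURL,
            FGameFeaturePluginLoadComplete::CreateLambda(
                [](const UE::GameFeatures::FResult& Result)
                {
                    // React to success/failure of activation here.
                }));
    }
    else
    {
        GFS.DeactivateGameFeaturePlugin(PluginURL);
    }
}
```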
And Unreal Build Accelerator as well: distributed builds across platforms, including Windows and Mac, and shader compilation going wide over multiple machines. Super critical. So all this stuff will help you, we hope, and we're seeing it.

So, 5.7: what's new? Tentative dates: we're just in preview, and release is mid-November; the second week of November is currently the plan for the first one. Let's talk about the themes. They're similar to 5.6, and I think that's probably a good thing. With 5.0, starting with Nanite and Lumen and everything else, there were lots of new features, and we kept throwing new features at you. But stabilizing those features, getting the performance right, the tuning right, the documentation right, making them work in every case: that's important. So, performance as a feature, basically: running at 60 hertz across all platforms with the feature set we think we can deliver. Scalability: you should be able to run on every platform, scaling up and down to whatever you need to target. And workflows: not just cooking faster and loading faster, but making the tooling better so you're faster at developing the game and authoring content.

Let's go into some detail. Rendering. Substrate will be production ready in 5.7, so if you're not using it yet, you should probably start taking a look. It's our new, super advanced material system: multiple layers, physically correct, super cool stuff. Starting with 5.7, if you enable the flag, it will take your old content and just render it through Substrate; if you don't enable the flag, everything stays as it was. We'll probably deprecate the old, non-Substrate rendering path in about a year. Out of the box, if you enable it on your project, the goal, and the thing we test against, is no performance degradation: it should run at similar performance to your old content. But then you obviously want to do cool new stuff with Substrate, add all the bells and whistles, make cars that look photoreal. There we need to temper expectations a bit: some of that is possible, but it gets more expensive. It's not as simple as a previous-gen shader; there are intricacies, and you'll need good profiling tools to figure out where to spend the performance. Basically, out of the box with what we call the blendable G-buffer, it's the same performance, and you can start making Substrate materials that are even more complex. If you want multiple slabs, multiple layers of physically correct materials, you need the adaptive G-buffer, and that's the path that still needs further optimization. Anyway, super cool stuff. We'll release a sample pack for people to play around with and see how to author this content, and we're working on tools as well.
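For code that needs to branch on whether a project ships with Substrate on, a small hedged sketch; I'm assuming the read-only console variable is named `r.Substrate`, so verify that name against your engine version.

```cpp
// Sketch: detect at runtime whether Substrate is enabled for the project.
// Assumes the read-only project setting is exposed as "r.Substrate";
// check your engine version if FindConsoleVariable returns null.
#include "HAL/IConsoleManager.h"

bool IsSubstrateEnabled()
{
    static IConsoleVariable* CVar =
        IConsoleManager::Get().FindConsoleVariable(TEXT("r.Substrate"));
    return CVar && CVar->GetInt() != 0;
}
```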
Then MegaLights is getting a lot of work done, which is super cool: it's going to beta in 5.7, and the aim is to get it to production ready in 5.8, so we're accelerating some of this. MegaLights runs on platforms that have capable hardware ray tracing, so it's not something that can run on mobile right now, although that's another discussion (there's a small capability-check sketch at the end of this section). We're adding new features: MegaLights was a bit noisy, so we've been looking at better noise control, support for grooms, translucency shadows, surface types like grooms and such. So there's a lot of feature work and performance tuning to keep the resolution and update rate under control.

Then Nanite foliage and skinning is experimental in 5.7. On that experimental spectrum I talked about, this is at a pretty advanced stage: it's already running at performance on Gen 9 consoles and on PC. The big thing is that the rendering works; the skeletal assemblies work, the voxel stuff works. The biggest remaining work is really on the tooling side: how do you author this kind of content efficiently, how do you place it in the world, plus things like how collision works with this kind of crazy content. Those are the things we're still spending time on, but generally it's working really nicely. The next question we always get is: can we use this to run crowds and characters and MetaHumans? Not yet. You can do simple characters with Nanite skinning if they just play something like an anim bank, so maybe far-distance stuff. But a full MetaHuman, with translucency in the eyebrows and eyes, and blend shapes in the face, is not possible yet. It's in R&D; we're hoping for somewhere next year, maybe a bit later, but that's next up for the team.

Then general rendering architecture. This is the ongoing 60-frames effort. We have an initiative internally that we call 60 hertz, or 60 frames per second, by default: making sure all these features, all the settings, all the scalability options are set up to actually run at that performance, and finding where we need to optimize further. A couple of really interesting things are in flight. There's a sub-initiative on frame pacing, improving the fluency of frames when performance bounces around a bit. Bindless is a big thing people constantly ask about; it's still experimental. People are starting to use it, but we'd like more fine-grained control: right now it's an all-or-nothing switch, either everything is bindless or nothing is, and we want more control, like some shaders being bindless and others not. So there's more work to do, and maybe more platform support, before we can call it done.

And then the GPU profiler: Luke Thatcher is doing a talk on that. We released the profiler as experimental, and it's super important to start learning that tool, because it has much more information about the scene. You can see where things get kicked off to the renderer, how work flows through the pipeline, and where the potential bubbles are. We used it extensively to squeeze the bubbles out and get the Witcher demo, for instance, running at that performance level. And there's a bunch of other things.
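Since MegaLights is gated on capable hardware ray tracing, here's a hedged capability-check sketch; `GRHISupportsRayTracing` and `IsRayTracingEnabled()` are real engine symbols, but treat the header locations as approximate for your engine version.

```cpp
// Sketch: gate MegaLights-style features on ray tracing capability.
// GRHISupportsRayTracing reports what the RHI/hardware can do;
// IsRayTracingEnabled() also folds in project and platform settings.
#include "RHI.h"          // GRHISupportsRayTracing
#include "RenderUtils.h"  // IsRayTracingEnabled()

bool CanUseHardwareRayTracingFeatures()
{
    return GRHISupportsRayTracing && IsRayTracingEnabled();
}
```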
I'm not going to talk through all the bullet points, but: VSM, the Material Translator; there's a great talk by Dan Elksnidis about that as well. There's a lot of work in that realm, because shader explosion, permutations, and the resulting PSOs cause a lot of pain. We need to get the shader explosion under control, and there are efforts all along the chain: first in authoring, how can we make shaders that don't explode into permutations; then optimizing the whole compilation process and finding ways to streamline it; and even going as far as asking what fundamental changes we can make.

Let's go into simulation. In simulation we have both Niagara and Chaos, and the interesting thing is that it's all interactive, it all works together. That's the reason we have Chaos and Niagara: they're interoperable. The foliage can use the physics sim, the water, the fluids, and everything combines into complete simulations of whatever you want to do in the world. Niagara can do super advanced stuff like heterogeneous volumes: 3D textures animated over time, pre-baked sims, awesome stuff. We have internal shadowing, better permutation handling, better control over scalability. We've actually shipped this in some Fortnite events, displaying these things live on Gen 9 consoles. You can't fill the whole screen with them, but you can have a couple, and it looks super cool. On the flip side, we also have cheaper particles: stateless emitters, with which you can run many more emitters, including on mobile, because they're much more efficient to compute. So if you want a crazy number of particles, stateless emitters are the way. And, going back up the bullet list so I don't cause confusion: Niagara can now also emit Nanite meshes, and emit MegaLights, so every particle could carry a light that feeds into the MegaLights system. That would be cool. So it's really advanced stuff, both at the super high end and at the more affordable end if you're doing cross-platform work. Then there are a lot of other things; you could do a whole presentation about SimCache and the rest, and you should probably look into what you can use for your own projects. To help everybody a bit, we're working on a new sample setup, to show best practices and example content, showing off the complexity you can achieve.
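As a small grounding example for driving Niagara from gameplay code (general Niagara usage, not specific to 5.7): spawning a system and setting a user parameter. The `SpawnCount` parameter name is a made-up placeholder.

```cpp
// Sketch: spawn a Niagara system from C++ and drive one of its user
// parameters. "SpawnCount" is a hypothetical parameter you would have
// defined on the system asset.
#include "NiagaraFunctionLibrary.h"
#include "NiagaraComponent.h"
#include "NiagaraSystem.h"

void SpawnBurst(UWorld* World, UNiagaraSystem* BurstSystem, const FVector& Where)
{
    UNiagaraComponent* Comp =
        UNiagaraFunctionLibrary::SpawnSystemAtLocation(World, BurstSystem, Where);
    if (Comp)
    {
        Comp->SetVariableInt(FName("SpawnCount"), 512);
    }
}
```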
Then on the Chaos side there's also a lot in flight, a lot of depth. Think about character simulation, where we run much of the complexity of character simulation through Chaos: realistic, performant physics for cloth, hair, flesh, garments, attachments, all through Chaos and the Dataflow system. You can do layered, complex interactions of skin sliding over muscle over bone. You can do crazy things like you see here, that horse in the Witcher demo. Chaos can do that super advanced stuff.

We're also looking at a groom initiative, to improve the art directability of hair and grooms and turn that into a robust pipeline you can tune hair with and make custom hairstyles with, with physics response and all of that. Dataflow is an authoring framework with which you can build procedural systems really quickly; it's a really powerful tool for building physics sims and setups, and a great thing for your tech artists and tech designers to use for complex systems. There's a bunch of work extending destruction, and we're looking at how to turn this into simulation of fields and foliage and those kinds of things.

Then there's this slide about continued optimization. We have big efforts in flight to improve performance across the board. We've been shipping Chaos in all our projects for quite some time, and we've seen many licensees get really good results with it. We'd always love to hear if you're running into problems, and about the specific setups you have. You can look at the Chaos Visual Debugger to see what's going on, share that with the dev community, and we can probably guide you along. But yes, a lot of optimization in flight.

And last but not least, environmental simulation: can we make full-on sims of wind and fluids that drive water, foliage, particles, cloth, everything combined? That's super exciting as well. We have hybrid setups, where some things are computed in the editor and can then be dynamically perturbed at runtime, for instance by players walking through the water. So you combine a pre-baked sim with runtime sims and still get interactivity, offloading some of the computation.

Then on to world creation. PCG is hitting production ready. PCG has been a pretty seminal tool for us, because we want you to be able to build these super huge worlds, and that means not placing individual assets anymore. Sure, you can use external tools for that, but then you lose what-you-see-is-what-you-get: you want to see it in Nanite, with Lumen on top, at full fidelity. PCG is also interesting in a way I keep reminding people about: it's not just an editor-time authoring tool. You can run it at runtime, at load time, and on the GPU. You can do things that respond to your game: when gameplay changes, the world can change around the player. You can have content that only shows up close to the player and only lives on the GPU, never even existing on the CPU side. You can still interact with it, and it adds a huge amount of fidelity and detail pretty cheaply and efficiently. Super cool. If you're not using it, you should really dig into it. We've been making it faster every release and improving the tooling, and now it's production ready. We're super stoked about that.
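To make the "PCG at runtime" point concrete, a hedged sketch; I'm assuming `UPCGComponent` still exposes the BlueprintCallable `Generate`/`Cleanup` pair it has had in recent releases, so verify against your version.

```cpp
// Sketch: regenerate a PCG component in response to a gameplay event.
// Assumes UPCGComponent's Generate()/Cleanup() API; verify against
// your engine version.
#include "PCGComponent.h"

void RegenerateAroundPlayer(UPCGComponent* PCG)
{
    if (PCG)
    {
        PCG->Cleanup();                  // drop previously generated content
        PCG->Generate(/*bForce=*/true);  // rebuild from the PCG graph
    }
}
```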
The other thing is that we're building a new vegetation authoring system. With Nanite able to make trees out of components, little branches and pieces like you see up there, all hooked together in a skinned skeletal mesh assembly: that's hard, and there are no tools that dump out that kind of content in the right format for you. So we're working on the tool ourselves. There's a presentation about that; I don't know if it's happened already, but it's a cool one, so hopefully you can catch it, or otherwise start playing around with the tool shortly. It's a very early version. But the fortunate thing is that our Quixel people have been working on procedural foliage tooling for many years, in an external tool, and now we're bringing all that knowledge and logic inside the Unreal editor, so you can build really complex foliage right in the editor, all Nanite friendly.

Then, world building in general. We have a new fast geometry streaming plugin that optimizes streaming a lot. Basically it targets all the simple stuff around you: most of the world is probably not dynamic, it's mostly static simple meshes or instances of meshes. The fast geometry streaming plugin, which you can enable (it's not on by default), takes those simple assets and streams them in really fast. With it we were able to stream through the City Sample at 500 kilometers per hour, where previously it would hitch and fail to load in time. These are the kinds of improvements, and that demo is still proving very valuable to test performance against. We also have memory tracking: you build this whole world with all these cells, but some cells will have higher loads than others, and you want introspection to see where the memory is going (there's a small snapshot sketch at the end of this section). Then we have the ability to make custom HLODs, which people have been requesting. And maybe most importantly on this slide: read the World Building Guide. We're on version 3 of it, just updated for 5.6. It's essential reading if you're building open worlds, and even if you're building more linearly streamed worlds: there are a lot of tips and tricks on using all these systems the right way, the most efficient way, for performance and for content authoring. Really critical reading.
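On that memory-tracking point: one simple way to capture comparable snapshots while you stream through a world is the long-standing `memreport` console command. A minimal sketch of firing it from code:

```cpp
// Sketch: dump a memory report while streaming through the world.
// "memreport -full" is a long-standing console command; reports land
// under Saved/Profiling/MemReports for offline comparison per cell.
#include "Engine/Engine.h"
#include "Engine/World.h"

void DumpMemorySnapshot(UWorld* World)
{
    if (GEngine && World)
    {
        GEngine->Exec(World, TEXT("memreport -full"));
    }
}
```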
So, on to characters and animation. A huge amount of stuff in flight there as well. The question is always: why are we doing all these things, why are we building animation tooling inside the editor? Again, we want to limit DCC round trips: we don't want you going from the editor to an external tool, import, export, that whole painful cycle. You also want to work in context. We see a lot of people making movies or cutscenes, and if you have the shot in a fully lit scene, in your actual 3D world, maybe you don't have to animate something at all, because it's behind something else or the camera is looking the other way. In an external tool you're just looking at a gray box, polishing parts of the animation that don't actually need to be perfect. And we have a lot of experienced people who really want to innovate on the state of the art of the tools: with Control Rig, forward and inverse kinematics, and physics combined, we can build meaningfully faster ways to author content and make animation.

So that's the sort of thing in flight: a lot going on in rigging. Control Rig physics is really cool: it automatically generates all the secondary physics, which makes things a lot faster. Retargeting: there's a demo in one of the booths of differently sized characters with animation completely retargeted, with contact resolution and all that; floor contacts resolve correctly because we run part of the IK chain again, with what we call relative IK, to prevent self-collision and such. Super cool to reuse your animation set across a wide variety of differently scaled characters. Animation authoring itself: a lot of quality-of-life improvements, because we have animators using the tools themselves and giving us feedback. They say, I hate this, this needs to be faster, we need to iterate on this a lot. So a lot of quality of life is going into authoring. And last but not least, the sequencing side: we want people to build ever more complex, even dynamic cutscenes, things that can be localized, that can have different-length audio (which we did for 5.6), and better camera integration, from the game camera to the sequence camera and back. A lot of work on Sequencer. The big thing in 5.7 is tooling to help validate that sequences are set up correctly, so you can run them in production more reliably.

This light is really something else. At least I can see you all now. This is better.

The Unreal Animation Framework. This is the thing I mentioned: it's still really deep in the experimental state. We've been working on it for quite some time. We're essentially replacing Animation Blueprints, really rebuilding the animation system from scratch, and that's a lot of work: everything we've built up since UE4 has to be redone in the new framework. The proof of life, the proof of concept, was the Witcher demo, which showed it actually working, and working at performance. But now we're at the stage of: we learned things from the demo, we want to redo some things, change things, fix things. It's still in heavy flux, and no internal project is using it right now; Fortnite isn't using it. It's purely in R&D land. So I just want to say: this is painful experimental. I know everybody wants to use it, but you'll have to wait a version or so. Or explore it on the side, with some smart people figuring out what you can learn, and which concepts will help you integrate it when it actually comes down the pipe. But very cool things are ongoing.

And then Mover is also not yet done, but it's very far along on that spectrum again. We have licensees starting new projects entirely on Mover right now; they're happy, and they're seeing good performance and good responsiveness compared to the old Character Movement component, the capsule-type, old-school stuff. We're trying to get it into beta by 5.8, at which point we also want Iris networking to be supported by Mover. 5.8 will probably be spring next year. So we're on a path to get that over the line as well; it's going well.
Mutable. It's been integrated in Unreal for a few versions now, and it's a great way to make customizable characters right inside the game. You can do some super cool stuff with it; if you're interested in customizable characters, really look at Mutable, it will solve a lot of things for you. The big thing in 5.7 is what we're calling going data-less, which means the character is described more completely as parameters, instead of baking out a lot of stuff in the editor and dumping a lot of textures and models into the cook. It generates what it needs at runtime, just from the parameters, which can even be sent in patches. So you can do things like: if you have a new character to put in your game, you just upload the parameters, and that character appears, generated at runtime or load time. Cool stuff happening in Mutable (there's a small parameter-driven sketch at the end of this section).

And then, certainly, MetaHuman. The biggest recent thing is that MetaHuman is now part of UE itself, so you don't have to run it in the cloud anymore; it runs inside the editor. With 5.7 it also runs on Mac and Linux, which is useful for people who care about that. Lots of updates. We now see MetaHuman as a full character production pipeline: you can make super high-fidelity faces, but also body shapes, clothing, and grooms using Creator. And MetaHuman Animator lets you animate any MetaHuman using various inputs: your phone, complex stereo cameras, video footage, and so on. The fidelity we're getting even out of pretty low-end phones is amazing, so maybe it's good enough for secondary characters, and you can really accelerate creating good humans. Another thing people are really excited about is that MetaHuman is now available in Maya and Houdini, so you can export it out, run more complex things that maybe don't exist in Unreal, like the parametric groom tools we've also released as a plugin for Houdini, and then move it back into Unreal again. Those are cool ways to do advanced things.

On to framework and gameplay systems. Iris replication is going into beta. We're using it across Fortnite right now, and in all the use cases we're seeing much better performance than the old replication system. We really think you should start taking a good, hard look at it. We also have support for seamless travel, if you're using more old-school streaming and want to keep certain actors loaded between levels. And there's a better way to handle async loading: if Iris says you have to spawn a new actor, you can schedule it, so it doesn't all land on the same exact frame, which could cause hitches. It's a way to order things and avoid those hitches. And maybe another really interesting thing: we have a multi-server replication plugin, which is experimental. Some people have found it and started using it. It's a way to start building MMO-level tech: multiple servers all communicating with each other, with things that can replicate between servers. We're working on various tech in that realm. Interesting stuff. If you want to hear more, I think we just published a knowledge base article that explains more of it. So, cool new things in flight.
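Going back to Mutable's parameter-driven model: a very hedged sketch of what "the character is just parameters" looks like from code. The class and method names assume the Mutable plugin's `UCustomizableObjectInstance` API, and the parameter and option names are made up, so treat all of it as something to verify against your version.

```cpp
// Sketch: a character described as parameters (Mutable's model).
// API names assume the Mutable plugin's UCustomizableObjectInstance;
// "Hair", "Ponytail", and "Height" are hypothetical parameters/options
// you would have authored in your Customizable Object.
#include "MuCO/CustomizableObjectInstance.h"

void ApplyCharacterLook(UCustomizableObjectInstance* Instance)
{
    if (Instance)
    {
        Instance->SetIntParameterSelectedOption(TEXT("Hair"), TEXT("Ponytail"));
        Instance->SetFloatParameter(TEXT("Height"), 0.8f);

        // Rebuild the skeletal mesh asynchronously from the new parameters.
        Instance->UpdateSkeletalMeshAsync();
    }
}
```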
Then, in gameplay systems generally, there's a whole bunch going on; let me cover a couple. StateTree: we now have a way to extend StateTree compilation so you can provide your own data into the StateTree, plus quality of life like the compiler manager. The Gameplay Camera system, the procedural camera system, is still experimental but already very powerful: it gives you more procedural control over your camera, to frame whatever is dynamic in your scene (multiple characters, one character, differently sized characters, and so on), and also to blend between gameplay and cutscenes and such. Then on Mass, we've been optimizing memory by refactoring how shared fragments are handled, and there's better configurability for running it across multiple threads, so we can get more performance out of those systems. That's starting to run really well.

Going into core: some interesting low-level things happening down in the foundations of Unreal. One thing we're releasing with 5.7 is a memory allocator improvement specifically for PlayStation 5, where we've seen up to 15% speed improvements on CPU threads with the new allocator. It's kind of crazy how big an impact a memory allocator can have, so we're doubling down, and there's further work to improve it. The other platforms were already running some of this new tech; PlayStation 5 was behind, and now it's running the new stuff too. Really interesting.

The other thing, the last bullet, is that we're really looking at engine memory and code size. We probably have one of the largest C++ codebases in the world; maybe Windows is bigger. Everything is starting to squeak and grind: you need crazy computers, you need things like UBA, just to compile and be efficient. We're fighting it and doing pretty well, but we need to look at fundamentals. We're looking at a lot of things: templates, includes, FORCEINLINE, constructors, FNames, across the board. For all the programmers who have been watching memory, this is probably all good news: we've seen multiple tens of megabytes easily removed from memory and from code size across the board with these changes. Good news, but it's hard; we have a lot of fucking source.
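To illustrate the FName point (my example, not Epic's): FName interns a string once in a global table and compares by index, so identifiers stored as FNames cost far less than per-instance FStrings.

```cpp
// Sketch of why FName identifiers save memory: the string is interned
// once in a global name table, each instance stores a small handle, and
// comparison is an index compare instead of a string compare.
#include "UObject/NameTypes.h"
#include "Containers/UnrealString.h"

struct FHeavyTag { FString Id; };  // one heap-allocated string per instance
struct FLightTag { FName   Id; };  // a few bytes per instance, shared storage

void Example()
{
    FLightTag A{ FName(TEXT("Enemy.Grunt")) };
    FLightTag B{ FName(TEXT("Enemy.Grunt")) };  // reuses the interned entry
    const bool bSame = (A.Id == B.Id);          // O(1), no strcmp
}
```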
Another thing in that realm is incremental cooking. The legacy from UE4, basically, is that there was no clear dependency graph between objects and files in the engine. We're solving that now with the Zen server, which stores all that metadata for files and knows how files are connected: which files, when changed, force other files to be recooked. That ultimately gives us incremental cooking, where we only cook what's needed instead of cooking everything all the time. That will also help patch sizes tremendously, because everything becomes much more deterministic, and we can really keep patch sizes down. Overall, cook speeds are way improved. So that's a really exciting beta in 5.7; hopefully it goes production ready next year.

The other thing is build health analytics. You have to introspect what's going on: how often are people crashing, what's the load time for PIE, what's their Perforce sync time, all those kinds of things. If you don't introspect it, and you look a year later and everything takes two hours, it's really hard to fix. But if you watch the graphs, with somebody like a build master keeping things under control, you can stay on top of it. So we're dumping out more analytics, and you can pipe them into your own dashboards if you want. It's really critical for keeping control over what's happening.

Another thing on Zen: we can now use Zen to stream directly to target platforms, consoles but also, for instance, mobile. Previously you had to stage, deploy, boot, and load a build onto a console, and that could take up to something like 26 minutes for a game like Fortnite. And people want to test on a console: a console behaves differently, the controller is different, lots of things. Now, with Zen, we stream it over a 10-gigabit network connection, directly from the in-memory file cache in the Zen server to the console, and we can load it in one minute nineteen in this specific case. That's amazing: really critical perf improvements.

One more thing: with all these improvements, it's sometimes hard to see what we actually gain, because content on our end grows so fast; we have thousands of people making new Fortnite skins every day. We make things faster, but the content maybe grows even faster. So we tested the City Sample between 5.1 and 5.7, and we can now cook it in a quarter of the time, with 20 gigabytes less peak memory. Dan is doing a talk about that; a lot of it is shaders and making those better. Super cool to see.

Unreal Build Accelerator we already talked about; nothing brand new in 5.7, but we're constantly working on it, with a lot of quality-of-life improvements. A big thing you might have missed from 5.6 is the experimental cache server: basically everything compiles under a standard path, like a subst drive, which lets people on the network share whole PDBs and such with each other. That accelerates compilation speed again by a huge amount. Really impactful.

Finally, last but not least: audio. MetaSounds we really see as our audio equivalent of materials or shaders in Unreal. You don't have to commit to MetaSounds and lose the ability to use anything else: there's a way to use your favorite middleware and MetaSounds next to each other if you so choose. But we see people using and really embracing MetaSounds, procedural generation, and advanced DSP in there.
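A small grounding example for that coexistence point: a MetaSound Source is just a `USoundBase`, so it plugs into the ordinary audio APIs, and inputs on the graph can be driven from gameplay. The `Intensity` parameter name here is a made-up placeholder.

```cpp
// Sketch: play a MetaSound through the normal audio path and drive one
// of its inputs. "Intensity" is a hypothetical input you would have
// defined on the MetaSound graph.
#include "Kismet/GameplayStatics.h"
#include "Components/AudioComponent.h"
#include "Sound/SoundBase.h"

void PlayStinger(UObject* WorldContext, USoundBase* MetaSoundSource)
{
    if (UAudioComponent* Comp =
            UGameplayStatics::SpawnSound2D(WorldContext, MetaSoundSource))
    {
        Comp->SetFloatParameter(FName("Intensity"), 0.75f);
    }
}
```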
And we see people getting really rave reviews on audio with it. So take a look; don't be so scared. Audio Insights is the way to debug it: we're spending a lot of time there on profiling, debugging, playback, and recordings, to answer why sounds are playing and why the mix sounds a certain way, all of that. A lot of cool stuff in flight for audio. And here too, beyond quality of life, we have a new sample project. A lot of these things are complex, and you need to actually throw data at them, play with the data, and try to understand what's happening; we think the sample projects are super useful for that, so I'm happy we have this new one.

And we managed to make it to the end. So thank you so much. Thank you.