https://www.youtube.com/watch?v=4zFQTp23xY8

Hi, everybody. Thanks for coming today. So my name is Mike Weiss. I'm a senior solution architect at Epic, and I help a lot of you folks visualize your products. I've spent a lot of time in automotive, but also in aerospace, retail, and pretty much anything that you want to see look really nice and unreal, I tend to do. And I'm having a blast here in Stockholm. After all I've seen in this city and this stuff this morning, I think it's safe to say that I'm a quality addict. Everybody's got that in common here. We're trying to push to the highest level, highest quality, and I've spent many years on this relentless pursuit of pushing myself and the tools to try and tell better stories in higher fidelity for some of the biggest brands in the world. Being in your shoes in my former life, as well as visiting you today, I know that this isn't necessarily new, but it's becoming truer and truer every day: visual trust is a project requirement now. In the past, we could do cool experiences, put a bunch of post-production on it, and people would accept it as cool, and that was good enough. But we need to have the ability to trust, when we work with our teammates that are focused on the materials that we work with and the lights and how those things interact, that we're looking at the actual product, so that we can make the best decisions and get past that point of rework or second-guessing or extra confirmation just to know that what you're looking at on that screen is that product. And I wanted to take that challenge to Unreal specifically. And so the goal today is I want to make something accessible. I want to make something that's prescriptive, something that you can actually do today. So this presentation is not short. I have about 93 slides, but it's very visual, and that's how I learn. So I've done this as though this is what I would have loved to have two years ago when I started this project. Also, it has to be repeatable.
So it's neat if you can do it, but you have to be able to do it over and over and over every day, because change management is one of the biggest parts of what we do in product visualization. It has to be consistent, and that goes with that trust point, because if yesterday it looked good, but then today it looks a little different, what happened? Right? And I think that comes from a measurable foundation in the real world. And I've stolen this "portal from reality" phrase from some friends of mine. And I think it's just so relevant. And I'll show you what I mean by that portal. But today we're going to build a circle of trust. And that circle of trust is going to have these four components. I'm trying to, again, make this accessible, right? There's some math, there's some color science, but I want to bring it to you in a way that you can digest it and you can feel good about it. So one of the things that we want to really start with is our materials pipeline. And we need to know that we can accurately represent real-world materials in Unreal, because that's the thing that we're looking at, right? The thing that we're looking at has to be right, or else I can't get everything else in line. So we showed the Electric Dreams demo in 2023. And we showed off how cool it was. And it was, you know, this amazing new thing coming called Substrate. And now it's production-ready today. Literally. And that's a big deal. And so I knew, as we were working on this behind the scenes, that folks needed a way to be able to judge and understand their materials in a way that works in the real world as well, so that you have that real-world analog, because while we're digital experts, our colleagues aren't. They're working very traditionally, right? So "Everything You Wanted to Know About Substrate but Were Too Afraid to Ask" is a do-not-miss session, in my opinion.
Nathaniel is going to be presenting it tomorrow at 4 o'clock in Room 8, and the intention right now is that it will be available online. But just as a quick primer for Substrate, in case somebody missed the earlier session or has been away from it, because this is on by default now in Unreal. So when you start a new project, your old projects can still be in the legacy mode. You can still turn it off, okay? But by default, we're going with Substrate. And Substrate defines how physical matter is represented in Unreal. And the coolest part about it, finally, is that it scales, similar to what we do with Nanite, based on your budget, from high-end visualization down to mobile. So you have one shader definition that you can use across all your platforms. And you guys know we ship a game that runs on a lot of platforms. So it's going to help us a lot. But I think it actually helps you even more in the product visualization space, because you don't need your ray trace switches anymore. You don't need your, you know, what am I going to do on mobile, am I going to do this, all these crazy variants, because that was very hard to pin down the look, which contributed to that lack of trust. So I'm not going to go into everything about Substrate, because Nathaniel's got his talk. Daryl showed some stuff earlier. But the important thing I want to focus in on in this talk is what makes a finish. And the thing that we do with Substrate that's really great is that we separate out how a paint coat has, and this is a very simple one, three layers. And each layer has different blends of different materials, like the flakes suspended in the base coat. And it makes this structure. And what's awesome is that that is exactly how Substrate works. It's the same idea. So there's a concept called slabs in Substrate. And a slab is what defines one element of that matter. And so it has a medium and an interface.
And the medium is what defines how that light interacts with that surface: how it's absorbed, how it changes the light with subsurface scattering, and any translucency. And then the interface is how those different slabs interact with each other, like the top coat. You saw, in that Rivian, the dirt that was on top of the opalescent. You know, there's a ton of stuff going on in that opalescent, but then we have that nice coat. Daryl showed it earlier in 5.7, with the titanium with the oil on it, and how you can play with that. It's very powerful. But knowing that those layers have transmission, and that that transmission is physical, it has a physical thickness, is really empowering for our business in visualization, because we can talk to the manufacturer and get the spec to find out what the physical makeup of this material is. And so we have the ability to do that, like the mix with clear coat here that you see. And this is that same sample hallway that I'm always talking to you about. Get the sample content project out of the launcher. There's a Substrate hallway that has a ton of materials, like how we demonstrate how these two materials can be layered on top of each other, or mixed with each other. So it's a vertical and a horizontal mix. And that's all I'm really going to cover on Substrate, because you just need that foundation for what we're going to talk about here. But do dig in. So we took it a step further, and we said, well, what if we could scan real-world materials to power Substrate? We have this physical thing, so can I take a physical material and do that? And so, partnering with X-Rite Pantone, we looked at their AXF scanning technology. And their AXF file is in use by a lot of you today. You have these massive libraries of thousands of materials you've collected over all the years, and it's this digital appearance capture of the spectral data, as well as all these other pieces. And we worked with them
and said, okay, if I use this. I used the MA-T12 for my experiments. This thing's a multi-angle spectrophotometer, and what it does is let you classify these materials by scanning them and then taking them into their software. But basically, at the base of it, it's like a BSDF model, a BxDF type of model, and you're getting all this data from these different angles, and it does all this really cool science stuff, and at the end, you get an appearance. And with that appearance, you can describe the material in a very scientific way, because people use this for quality assurance in their pipeline. So if you don't have this and you're in a major organization, somebody in your org may have this already, and you should talk to them. Typically, it's either on the assembly line or on the floor. So there's one thing that Unreal doesn't do, and that's spectral rendering. So if you're getting into infrared, near-infrared, UV-type studies, the team at United Visual Researchers, UVR, has basically a connection to Unreal to be able to do that extra step of spectral visualization. And they do that for lots of good reasons. Thomas is in the crowd. You can talk to him after if you're interested in taking it to that level. He also has a talk today at 4 o'clock that you should catch right next door. And he'll show you the difference between what I show you today and where they like to go. And we rolled this out at Orlando. So there's already a full session on this AXF topic. We show how you use it, all that stuff. But AXF scans, in the end, aren't like what you're used to, right? Usually, when you bring in a data type, it's got this hard connection to its original source, and it's kind of unobtainium. But in Unreal, AXF scans become Substrate materials. And that's really important, because then once you bring it in, you can just ingest it into your pipeline just like anything else.
A couple of the pitfalls that folks run into when they start working with Substrate materials in the physical way is that they don't know the thickness of the layers, and that distorts how those things are coated and stacked. The texture scale gets off. On the AXF scan, you can actually know what the size of the original scan was, because you're sampling reality. So if you know that, you can plug it into your material tiling, or we use a UV scale in our implementation, where you can actually scale that out to a real-world size. And that leads to transmission and how this stuff works: if you don't use transmission, you won't be able to see it. So a cool example is if I have a base layer that has a gray primer on it, and then I put a base coat over it. With Substrate, I can actually scratch it, because I can put a thickness map on there and basically scratch it down to that bottom layer. But if you don't scratch it, you don't see it. And if you don't see it, it doesn't get rendered, so it gets optimized out. It's really powerful. And I think there's some really cool stuff to be done there in the manufacturing space. And then the other thing is, if you're trying to complement these materials, if you're using these in a mix, you really need to make sure that you're using your albedo and that your albedo is properly defined. And you'll see some examples why in a minute, but that's kind of the biggest point: that everything's physically based. Speaking of physically based, I just found this site, physicallybased.info. It's a cool database. I find a lot of my stuff from the original manufacturers' websites, but what they have is physical measurements of common things. And you'll notice here that Unreal Engine, you can select all the engines, right?
And we're just one of them. And even more important, you can pick the color space. And don't worry, we're going to spend some time on color space in a minute. But, you know, even pure snow's base color, its albedo, is well below 1. And I get a lot of scenes that I review that are using like 0.98, almost one. And that just doesn't exist in nature. So really understanding how materials are put together, plus Substrate, plus that power of a physically based shading model, will really give you a step ahead. The other thing that's really cool about this site is they also have a database of common light sources. I just took a clip, like a car headlight LED, right? 500K, 300 lumens. And then also cameras. So I have the A6000, but one of the teammates I had, Thomas, his camera is the A6400. So when I'm recreating that camera inside Unreal, Unreal is ready to take those physical sensor sizes and those aspect ratios and that crop factor and plug it right in. So now we have our material, and we've got our camera covered. So I'll go into it in a minute more, but this camera I have in Unreal, and I know it's this camera, because I went to the manufacturer's website. And that's one thing I appreciate about this site: they do list their references. So you can go and just see them for yourself, but it's a great tool. So this is where I ask you, you know, who is doing a formal review on physical and digital materials? Just a quick raise of hands. Cool, two, three. Nope, two, that's a head scratch. Three. Okay, awesome. Four. Great. I think it's going to become really important across all the industries. I work with the media and entertainment group as well to be able to have a way to do this, right? And when I say do this, I mean look at your materials in a way that's trusted. I chose the SpectraLight QC light booth to do my experiment, because I see it everywhere. X-Rite's got this thing installed all over the place. It's measured. It's calibrated.
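The camera-matching step above is mostly a bit of arithmetic. Here's a minimal sketch, assuming the commonly published APS-C sensor dimensions for the A6000; verify against the manufacturer's spec sheet before trusting the numbers you plug into Unreal's Filmback settings.

```python
import math

# Illustrative sensor data; check the manufacturer's published specs.
FULL_FRAME = (36.0, 24.0)   # mm, 35mm reference format
A6000 = (23.5, 15.6)        # mm, APS-C sensor width x height

def diagonal(width_mm, height_mm):
    """Sensor diagonal in mm."""
    return math.hypot(width_mm, height_mm)

def crop_factor(sensor, reference=FULL_FRAME):
    """Crop factor = reference diagonal / sensor diagonal."""
    return diagonal(*reference) / diagonal(*sensor)

cf = crop_factor(A6000)
print(f"A6000 crop factor: {cf:.2f}")            # ~1.53
print(f"A 50mm lens frames like {50 * cf:.0f}mm on full frame")
```

The same width/height pair goes straight into the Cine Camera's sensor settings, so the field of view falls out correctly without guessing.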
I'm not implying that everybody out here should go and get one, but that's what I chose, because that's what a lot of my friends in the industry are using. And I knew that it has a lot of benefits. I'll get into that in a moment. But what I really want you to be able to do when you leave this session is make your own stuff. I want it to be so that you can go home and reproduce your own marketing lighting setup for your design purposes. I want you to be able to go and assemble a box, if you want to, and feel confident about it. Because that's the trust we're trying to build. The SpectraLight QC, you know, if we put ourselves in the shoes of a material designer, or a color and trim person, or somebody that's trying to decide which coating they want to use for their product: they use this light box often. They put their materials in there because it has a classified lighting setup. There are multiple light modes in this box. And that gave me a really cool test bed for my experiments, to know that I wasn't just hitting one target. I was hitting, I think, five or six. And so when you're this person that's been doing this every day for 25 years, and somebody shows up with a laptop, right, they want to touch it. They want to experience it. They want to work with it, right? And so that laptop next to that box, I want to have Unreal on it. And then I also want to be able to do all these different modes, because, yes, "everybody just wants D65" is what I was told. But that's not true, because a lot of people also told me the other modes matter. So everybody has their own process. Everybody has their own requirements. But I think trust is now one of those requirements that can't be ignored. So here's an example of AXF-scanned materials in the light booth. I did zero authoring on these materials. I literally just scanned them and dragged and dropped them in. Watch that Orlando talk if you want to get more into that. But, you know, this is just one light mode.
But it's really, this is just the normal Unreal viewport. I just hit play. I didn't even hit play, honestly. I just used an editor-tickable actor and just spun it. Because that's how we're going to work. We work in the editor. Yes, you can do this at runtime. That's the cool part. But we're working in the editor. We're authoring these experiences, right? So not the best lighting I've ever done, but it shows you what this digital twin looks like. So X-Rite sent me the CAD data for the booth. I brought it in through Interchange with Datasmith. And I just lit the outside of the booth with a couple of lights, just so you guys could actually see it, because typically you're doing this in the dark. The only light that's on is the booth light. So it doesn't super matter what the rest of it looks like, but because I'm a quality addict, I wanted it to look right. So on the right, you can see in the Details panel, when you select this Blueprint actor that's in your viewport, you can toggle all these different modes. And there are three different implementations for each lighting scenario. And I'll go into that in a moment here. But the thing is, this is the Unreal one here. And when I have my AXF-scanned swatches, it's really cool, because I have them. They're physical. I have them here so I can work with them and use them as a reference across my scenes, along with my color checker. So the color checker, we're also giving away through the niceness of X-Rite. That's the X-Rite ColorChecker Classic, AXF-scanned and derived from the spectral data into Unreal. That's a standard tool, right? And then you have all these different light modes that you can go into, just like I showed you with the real booth. And that'll let you take your products and put them in there and make good decisions. That's the portal to reality. In my case, you're going to build your own. And it's really empowering. So I just made a little turntable tickable actor, right? So I can spin it in there.
But literally, this is an OBS capture of, you know, it's just a capture of my working desktop, right? And it's really empowering. So that's the viewport. There's a thing I want to show you. Here's a tip and trick. When you're doing your material preview in Unreal in 5.7, you can modify the mesh that's shown in that viewport. Not everybody knows that. That's what that last little brick button is. You just pick something in the Content Browser that's a static mesh and click that button, and you get that, and you can tumble it just like you would in your viewport. Then, to take it one step further, you can actually change your scene preview. So, effectively, we ship out these ones. It's clear sky, neutral. We have a new data prep viewport setting. And you can turn on and off the floor, because you don't usually want that. But then you can add a profile. So you can go in, and at the bottom, add a profile in the settings for that material preview. And you can bring in any cubemap. So you can either generate one from Unreal with the things I show you today, or you can use a standard one, like the one that ships with the X-Rite software, so that in their software and in our software, we're using the same lighting when we're making our judgments. Okay? And so know that you can add that profile, and when you do so, you can customize the heck out of this. You can deal with the post-processing of this specific viewport. That's super powerful, okay? So that's the material side. Everything is grounded on trustworthy reference. And so I already showed you the color checker, right? This is an industry-standard tool. If you're in cinematography or photography, you're very familiar with these things. And I used it to gather reference. I'm going to go through how I captured my reference for my project, so you can see it and do it yourselves. So this ColorChecker Classic is what we used. You can get it online. It doesn't super matter.
There are a lot of different versions of it. And because there are a lot of versions, it's really helpful to know that BabelColor has a database of the ColorChecker that has all the reference values that you'll see me referring to. So whoever makes this usually ships a data sheet that has the known luminance values. It's got 24 patches. Some of these are for skin tones. Some of these are for, you know, just a base white balance value and all that stuff. But all of that is reference. So this becomes physical reference in your photography. So, you know, first things first, right? Precision. So all those little tapes and all that positioning: I know exactly where everything is. I know that on my camera, this little mark is where the sensor is inside the camera. So when I'm in Unreal, I know where to measure from, because our cameras, you know, have that nodal point. I want to match it really closely. Really understand your gear. Understand your gear and its limitations. We had an interesting one where the precision of one of our devices wasn't what we assumed it was. And all of a sudden, when we realized that, everything else made sense. And, you know, what I'm showing you here, you know, lock it down. You see how much tape we used to keep this tripod where it needed to be when we were capturing the light box in the lab. And, again, you know, document more than you think you need, because, you know, you see I have two light meters going there, right? I just have redundancy. Know that you have everything, because once you move stuff, then you've got to kind of start over, right? Because you don't know what's locked down anymore. And then once you have those measurements, Unreal becomes super easy. Because if you go into any orthographic viewport and middle-click, you get a measuring tape. And they're in real units. So you can go in and set centimeters or inches or whatever crazy unit you want to work in, right inside Unreal.
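If you want to turn that reference data into a pass/fail check for a formal material review, the patch comparison itself is simple math. A sketch using the common ΔE76 metric; the L*a*b* numbers below are made up for illustration, so pull the real ones for your exact chart batch from the BabelColor data.

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.dist(lab1, lab2)

# Hypothetical values for one gray patch -- replace with your chart's
# published reference Lab values and your measured render/photo values.
reference = {"neutral 8": (81.2, -0.6, -0.3)}
measured  = {"neutral 8": (80.5, -0.2, 0.4)}

for patch, ref in reference.items():
    de = delta_e76(ref, measured[patch])
    status = "OK" if de < 2.0 else "CHECK"   # ~2.0 is a common review gate
    print(f"{patch}: dE76 = {de:.2f} [{status}]")
```

More perceptually uniform metrics like ΔE2000 exist, but even this simple distance gives you a number to log per patch, per light mode, every day, which is exactly the repeatability the circle of trust needs.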
It's as precise as CAD software that way. And I don't know that everybody knows, but you can define your snap size in here, too. So if you go into the level editor viewport settings, you can actually add more grid snaps, or grid sizes, to that list that you see at the top in the new viewport. And I just wanted to throw it in as an extra. When you're working with real-world stuff, typically it's smaller. You know, these things are not big. So you can set your near clip plane as a cvar, and you can save it off in a cvar collection or something. But typically, 0.01 keeps it from clipping when you're getting close. Okay, so now I'm going to speed up a little bit, just because that was kind of the foundation for where we're going. So now let's go there. So I took six hours and challenged myself to go to my basement and set up a photo studio, similar to what I've helped others in the world do already in their organizations, to show you how accessible this is. So I've got my rabbit friend here that I 3D-printed for model scale, that isn't licensed IP, so I could actually show it. You see a lot of the toys I brought today in there. Got some basic ring lights and panel lights off of an online shopping site. And borrowed a drip from a friend and just set it up. And so, you know, just to give you kind of a feel for the layout, this is, you know, super basic amateur stuff, basement stuff, but it is pretty close to how the real world works. You'll have silks, you'll have a little bit fancier light boxes, but the principles are all the same. And it took me about 10 minutes, maybe a little bit more, 12 minutes, to do a RealityScan of that space before I tore it down. And I highly recommend you do it, because it worked out amazing. So my sensor position in Unreal for my light meter is a little bit different. But you can see, getting the angle of that color checker, or getting that rabbit's position, or the position of these plastic pieces digitally, it was priceless.
It helped me model the photo backdrop. You know, I just used our modeling tools, did a quick poly model. And it all just fell into place, because I had done my research. I had done my measurements. What's really cool is that I intentionally didn't use this photo studio scan first when I placed the objects with my measurements. And then when I turned it on, the only thing I had to change was the position of the light meter a little bit, because the sensor was different, and the angle of the color checker, because it's really hard to get that right. So, you know, use RealityScan. Use it. If you're taking reference for anything, just whip out your phone, wave it around the room for a while, send that sucker to the cloud, and pull it down out of Fab. I use GLB as my transport. There are all kinds of options. But then you've got a photo studio. I'm glad it's after lunch. I was going to be cruel if this was morning, right? I'm in Stockholm. It's Quixel assets, right? It's almost like a cheat button. But here I've recreated my lighting setup. This is an Unreal viewport screenshot. This isn't path traced. It's nothing fancy. It's just a viewport screenshot, because that's where we work. We work in the viewport. Then I grabbed some Twinmotion assets from my friends. I added a reflector card. So the photo studio backdrop has an AXF scan; I used that MA-T12. But this material is actually out of the Automotive Materials pack. The reason I did that is these are legacy materials that were translated into Substrate, which I then tuned. The biggest thing I needed to tune was the saturation, because, as I said earlier, the albedo is really important. So that base color is the main thing I did. And same thing with the shoes. And I wanted to do something funky, right? I didn't want to just do another boring pair of shoes. So now it's going to get technical from here on out. But I hope you understand the value of why I'm doing this.
Because now we have something like Quick Render, which is super sweet, because you can set up a Movie Render Graph to render out as many variants of that as you want with one click. And so I just set up my base. It's super basic, right? I wanted to do a path-traced and a deferred render on every click, so I could compare them, and it's very basic. There's documentation that I share later where you can learn how to do this. Okay, so materials is done. So now, sorry, reference is done. So how do we make sure that our color is accurate? It's actually the hardest part. Our eyes, this camera, this monitor, this monitor, that monitor, that monitor, that camera, that camera, that projector: they ain't the same. So unfortunately, you've got to calibrate every display that you make decisions on. That means on the artist side and on the consumer side. This is the calibrator that you've seen in the Unreal viewport. This is the X-Rite i1Display. It's an old one. I've had it forever. This is a newer version. It's the Calibrite 1, 2, 3, whatever. But you've got to calibrate all of them. And once you do that, the next step is you need to consciously choose your color space. This is the point in time to decide the standard, through your project, of what color space you want to use. If color space is something that's brand new to you, and you're like, "I have no idea what you're saying, Mike. I have no clue where you're going with this": ACES is the Academy Color Encoding System. It's used in media and entertainment to make it so that when I share footage between studios, it looks the same. And so the big thing we're going to talk about here is color space. So when we talk about color space, it leads to tone mapping. So you can see on the left, we have the un-tone-mapped version of this. And then there are different versions, and it's kind of harder to see on there. It comes out more on this monitor, because the monitors are different.
The SDR video is slightly cooler than the Rec.709, and the un-tone-mapped is way cooler. So which one's right? If you want to learn more about this, Rod Bogart did an amazing talk in Seattle about the life of a pixel, from texture to display. And the key point I want you to take away from this is that inside Unreal, whatever you bring in, you can tell it what color space it is. I can say it's ACEScg, or I can say it's Rec.709 linear sRGB, because I consciously chose that that's what I work in. And I'll show you why that matters in a minute. But then Unreal takes it in and makes it a high-dynamic-range linear image, or linear values. It's not an image to Unreal, it's values. And they all work in a color space, so it's not explicit, it's implicit. You get to set the working color space inside Unreal to decide what you're going to work in. By default, Unreal works in linear sRGB / Rec.709. But you can choose others. And I'm going to leave it at that, because it gets a little hairy from there. But there's another thing. There's a frame buffer pixel format that you can raise the precision on in your projects. And you probably want to do that if you're doing this kind of work. And another place where you can see this color space really easily, when you're trying to figure this out, is in the color picker. We have this all over the editor. And if you turn off the sRGB preview, you'll see it in the working color space. And those are the same values, but they're very different colors, right? So what we're really talking about here is the difference between scene-referred color and display-referred color. The display has its own color, but the scene should be what the light is in the real world, or in the raw linear Unreal values. It's kind of like doing that conversion in post. It's like when we hear about post-processing and adding all this bloom stuff and all these color shifts. That's all working on top of the scene-referred color.
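A quick way to feel why the pixel math has to happen on linear, scene-referred values: average two pixels in linear light, then compare against averaging their display-encoded values. A small sketch using the standard sRGB transfer functions; the two pixel values are arbitrary.

```python
def srgb_encode(x):
    """Linear light -> sRGB display encoding (IEC 61966-2-1)."""
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1 / 2.4) - 0.055

def srgb_decode(y):
    """sRGB display value -> linear light (inverse of the above)."""
    return y / 12.92 if y <= 0.04045 else ((y + 0.055) / 1.055) ** 2.4

# Two pixels in linear light: black and a mid-bright value.
a_lin, b_lin = 0.0, 0.5

# Scene-referred: average in linear, then encode once for display.
correct = srgb_encode((a_lin + b_lin) / 2)

# Display-referred: encode first, then average. Same inputs, wrong answer.
wrong = (srgb_encode(a_lin) + srgb_encode(b_lin)) / 2

print(f"average in linear : {correct:.3f}")
print(f"average in display: {wrong:.3f}")   # noticeably darker
```

Blurs, exposure changes, HDR merges: they're all weighted averages like this, which is why mixing in a display-referred curve partway through the pipeline produces exactly the "weird offsets" the talk describes.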
So we want to capture all of our reference scene-referred. And the main thing I want you to take away from here, and it looks worse here again, same reason: this one is really flat, and this one is really saturated and beautiful, right? And that's because when I'm doing the calculations on these pixels, I need them to be linear, so that I get the same shift. I don't get these weird offsets. And Rod talks a lot about that. So here's how I capture that, in the scene-referred linear color pipeline. So my camera is configured to linear. I turned off all of the fancy stuff. It really wants to make it beautiful for me. It wants to make it this beautiful display mode, right? So everything's set flat. Exposure: the only thing I change when I shoot my brackets is my shutter speed. Everything else is locked down, manual. Watch out for dynamic range optimization, for example. It changes your color, right? And that's what we're going to be really fighting here. So I take my raw camera files, and this is my bracket of raw data. Not the JPEGs, the RAWs. Because the RAW is the sensor data on this camera, of how it perceived the light in the real world when it captured it. Anything else coming out of this camera is shifted in color, and that breaks the calculation. So you end up, sorry, play: when I mix my color spaces in the middle, it's not the same anymore, right? That's that curve thing I keep talking about. And this only happened because of my merge: when I processed and developed these RAWs, I wasn't actively managing the color space. I'm going to release all these slides, so there's a lot of reading material on here for you guys, but we're just going to rip through it. Okay. Don't panic. But the main thing here is to actively manage that and save your color values. Right? You've got to protect them, because every step along the way, these tools want to make it look beautiful for you. So I use RawTherapee to develop it.
And this white balance is just an agreement on what white is in the scene, and everything shifts the same. So when you have two different cameras, you really want to use a consistent white point. And this is where you're setting that. And so in RawTherapee, I made sure that everything was flat. Exposure zero, tone curve zero, like, flat. I made sure that when I did my profile lens correction, I did not use the chromatic aberration adjustment, because it changes the color. Right? Same thing for my camera profile. Don't use the tone curve. Don't use the look table. If there's a button, you're going to touch it, right? Don't touch it. Turn it off, because it shifts the color. We want the sensor values of those photons and that luminance in that world. And then we kick those out as a TIFF. So here's a quick example. This is the JPEG; that's the TIFF. If you look at the box I circled, it's kind of the most apparent. The TIFFs are, they're not linear, but they're relational, where in the JPEGs, everything's kind of blown out, and you don't have as much there, because it's got that color table. It's got that tone curve, okay? So then we're going to merge it up. So we're going to merge it just like our typical HDR workflow that's been around forever, okay? I use Photoshop. The thing you want to watch out for in Photoshop is that it also has a color space. And when I mixed the color spaces, you saw the shift didn't work the same anymore. So be explicit with your color space. I'm choosing ACEScg as my choice. Just choose one. Don't change it. The other thing you want to do is make sure that you tell it to tell you when it doesn't have the right color space. So check those two boxes. So it says, hey, this footage is weird, and you can make a conscious choice: do I want to convert it? Do I want to leave it alone? Save the color values. Can't say it enough. Then what we end up with is this relative-referred HDR. This is an EXR.
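The merge step itself is the classic weighted-average HDR technique, not anything Unreal- or Photoshop-specific. A per-pixel sketch, assuming flat-developed linear bracket values in [0, 1] and the matching shutter speeds; the weighting curve here is a simple illustrative choice, not the one any particular tool uses.

```python
def merge_brackets(pixels, shutter_times):
    """
    Merge one pixel's exposure-bracket values into a relative radiance.
    `pixels` are flat-developed linear values in [0, 1], one per bracket;
    `shutter_times` are the matching shutter speeds in seconds.
    """
    num = den = 0.0
    for p, t in zip(pixels, shutter_times):
        # Trust mid-tones most; down-weight near-black and near-clipped values.
        w = max(0.01, 1.0 - abs(p - 0.5) * 2.0)
        num += w * (p / t)   # normalize each bracket by its exposure time
        den += w
    return num / den         # relative radiance: camera scale, not cd/m^2

# Same scene point shot one stop apart: both brackets agree on ~10.0,
# which is why the merge only works if nothing bent the values in between.
radiance = merge_brackets([0.1, 0.2], [1 / 100, 1 / 50])
print(f"relative radiance: {radiance:.1f}")
```

Note what breaks this: if a tone curve or color table touched either bracket, `p / t` no longer lands on the same radiance for both exposures, and the merged EXR inherits that disagreement. That's the whole reason for the flat development above.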
It's a high dynamic range image, but its values are relative-referred: they're on the camera's scale, not the real world's scale. So how do we evaluate real-world luminance to get that real-world scale? I'm going to go through this quickly, but I will give all of this away, there are like five EDC articles coming on it, so don't panic.

The important thing to know: when we look at Pele's toga, that blue toga has an albedo, which is how much of the incoming light it reflects. We're trying to figure out how bright it appears, that is, to observe its luminance. When we measure with a light meter, we're measuring how much light arrives in a given space, that's illuminance, and correlating that to the real world. Then there's the really fancy lab equipment, this is the Minolta LS-110. It's a spot luminance meter: you can point it at any object and it returns that object's luminance. It's kind of unobtainium, but I used it just to validate.

This card has a known albedo of Neutral 8, so about 81%. So I shot this side first, sorry, we shot this one. These all have a known reference. I went in and decided what I was going to measure, that's the first column of numbers. Then I took the reference measurements, because I know what these are, and I documented them. I did the upper and lower card because facing angle matters a lot; we're working with light, and if I measure here, here, and here, they're all different values. Then I used the device to capture the luminance at those points. Lots of documentation, right? And that's really cool. But how do you do that?
You don't have that lab equipment. But I found out something really neat that I want to share with you. The Lambertian assumption lets you take an ordinary lux meter and position it at a known distance from the target; in this case, I'm reproducing our lights. I align the lux meter coplanar with the reference patch so they face the same direction, and I record the reading: in this case, 255.4 lux. I know, as I said, that this patch is about 81% reflectance, so I can plug that into the equation: luminance equals lux times the reference patch reflectance, divided by pi. That gives about 66, which is what I would have written in my table. When I measured it with the lab equipment, and remember, this is a range, it fluctuates, it's photons, not a static number, I got 63. Which means I don't need crazy lab equipment to do this. All I need is something with a known reflectance and a lux meter, and I can derive what the luminance in my photo should be.

This is HDRView, a way to measure. I recreated all my measurement points inside it, so point number 5 has a value. The software doesn't measure luminance directly, but you can use that same equation to figure out your scale factor, so you can shift from the camera-based luminance you have today into real-world values you can measure forever. One little thing I added: if you want to know the matching Photoshop Exposure slider setting, just take log2 of that scale factor, and that's what you punch into Photoshop. It all falls into place. One other thing: this is the ACEScg luminance equation. In HDRView you choose your color space, we talked about that a lot, and down at the bottom it shows the RGB luminance values for that color space, so you can just cheat them out of there.
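Pulling the measurement math above together in one place. This is a small sketch with the talk's numbers; the `camera_value` of 4.1 is a made-up EXR reading, just there to show how the scale factor and the Photoshop slider fall out:

```python
import math

# Lambertian-assumption luminance from a lux meter reading. Assumes the
# reference patch is a diffuse (Lambertian) reflector with known albedo.

def luminance_from_lux(lux, albedo):
    """Luminance (cd/m^2) of a diffuse patch: L = E * rho / pi."""
    return lux * albedo / math.pi

measured_lux = 255.4   # lux meter held coplanar with the patch
albedo = 0.81          # "Neutral 8" card, ~81% reflectance
L = luminance_from_lux(measured_lux, albedo)  # ~65.9; the talk rounds to 66 (lab meter read 63)

# Scale factor from camera-relative HDR values to real-world luminance.
camera_value = 4.1     # hypothetical pixel value read off the merged EXR at that patch
scale = L / camera_value

# Photoshop's Exposure slider works in stops, so the equivalent adjustment is log2:
exposure_stops = math.log2(scale)
```

Multiplying the whole image by `scale` (or applying `exposure_stops` in Photoshop) is what turns the relative-referred EXR into the real-world, scene-referred reference.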
For sRGB, the luminance equation uses these values instead. Remember, Unreal runs in Rec.709/sRGB primaries under the hood, so if you're doing things for Unreal, that's the world it lives in. Then you scale your luminance, and you've gone from camera-based luminance to the real world: an actual copy of the real-world, scene-referred luminance that you can use as a reference later.

Now you can do the same thing in Unreal with those values, using the Pixel Inspector. The Pixel Inspector is a tool that lets you inspect a pixel, though it's actually a grouping of pixels, which is good, because at the smallest scale everything is a gradient; zoom in and out and the value will change, so you have to work with it. What you're looking for is "Luminance before tonemap"; that's where you'll get the value you want.

The other thing you'll want in Unreal is the virtual production pass-through post-process volume, in the Virtual Production Utilities. It basically strips the scene of post-process effects like bloom and vignetting; it's our team's take on what the most scene-referred image looks like. There's also a viewport mode now, HDR before tonemap, that you can use. That's what the Pixel Inspector is looking at: the image before tone mapping. I also have a Blueprint light-meter tool that I'm going to put on the EDC. It teaches you how to create your own sensor so you can replicate your own equipment, like I did. It builds on that Pixel Inspector concept: because I can get the luminance before the tone map, I can query the scene wherever I am and measure that value.
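For reference, the standard luminance (Y) weights of the two color spaces mentioned, and why it matters which one you use; the sample RGB triple is arbitrary:

```python
# Relative luminance from linear RGB depends on the color space's primaries:
# the same RGB triple means a different luminance in ACEScg than in Rec.709/sRGB.
# The weights are the Y rows of each space's RGB->XYZ matrix (ACEScg rounded).

REC709_Y = (0.2126, 0.7152, 0.0722)   # Rec.709/sRGB -- Unreal's world under the hood
ACESCG_Y = (0.2722, 0.6741, 0.0537)   # ACEScg (AP1 primaries, D60 white)

def rel_luminance(rgb, weights):
    """Weighted sum of linear RGB channels."""
    return sum(c * w for c, w in zip(rgb, weights))

linear_rgb = (0.4, 0.3, 0.2)
y709 = rel_luminance(linear_rgb, REC709_Y)   # ~0.314
yap1 = rel_luminance(linear_rgb, ACESCG_Y)   # ~0.322 -- close, but not the same
```

This is exactly why the talk keeps saying "be explicit with your color space": mixing these weights mid-pipeline silently shifts every measured value.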
And so here you can see that wherever I place it in the scene, it measures. There's a flicker because of Lumen, and it's always a couple of ticks behind because of how it uses a render target, but basically I'll show you how to make your own light meter. It has ISO, and the new one I'm doing for the article has f-stop approximation based on shutter speed, all that fun stuff, plus some logging. I'll give you my sample, but I really want you to build your own, because mine's not that useful to you; yours is right there in your hand. That's coming to the EDC shortly, and I'll have a link at the end to all of this.

What's cool is that UE lights are physically correct. If you set a point light to 3.1415 candela, place it a meter away, and measure, that's what you should read on your meter. That's how you know it's correct, and that's what I needed to know in order to do this. If you're working at really, really bright values, some friends of mine are trying to reproduce flash photography, which is incredibly bright, set the CVar r.SceneColorFormat, because otherwise you clip the frame buffer and don't get the precision you need. That matters once your values get above about 205,000.

Okay, now we're on the home stretch: we just have to set up our lights. This is what the booth looked like, and these are the lights inside it; you can see me all set up doing my thing. As I said earlier, there are three different implementations of every light, because which one you pick depends on the type of project you're doing and what your trust criteria are. In the first slot here, I used the UE point light.
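The sanity check described above is just the inverse-square law, which is quick to verify on paper or in code:

```python
import math

# Sanity check for physically based point lights: illuminance from a point
# source is E = I / d^2, so a light of pi (~3.1416) candela measured at
# 1 meter should read ~pi lux on a lux meter.

def illuminance(candela, distance_m):
    """Illuminance (lux) at a given distance from an ideal point source."""
    return candela / distance_m ** 2

assert abs(illuminance(math.pi, 1.0) - math.pi) < 1e-9       # ~3.1416 lux at 1 m
assert abs(illuminance(math.pi, 2.0) - math.pi / 4) < 1e-9   # inverse-square falloff
```

If the in-engine meter reads the same number, the light, the units, and the meter all agree, which is the whole point of the exercise.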
What you're seeing is that there's basically a mirror inside the UE viewport so we can look up into the lamp. With the UE point light, because I have the CAD data, you can see all the bits and bobs up in there, and it's shaped the same as the real light. This is the incandescent A-light, so there's one halogen bulb.

The second implementation is the full lamp: the entire light panel is one HDR light texture. With big rect-light textures you get Lumen artifacts, because of the specular lobe and the way things work. That doesn't mean it's bad; it just means you need to know that's what happens with big lights. When you path trace, which is what the bottom row shows, there's a little difference between the full lamp and the crop; a bit more light comes out of the full lamp. But that artifact only appears in the Lumen viewport, and only at the top, where the light first hits. In the last implementation, I cropped to the logical extent of the light, where it actually shines, and I don't get that artifact, so it may not freak out your people as much. The cool thing about using an HDR texture in a rect light is that your specular reflections are actually reflecting something, whereas with plain Unreal lights you just get those square or circular specular pops.

So how do we recreate our own light kits? The same way we captured our reference; that's why I dragged you through all of that. You shoot the light itself, with the exact same process. Save your color values. And when you make your texture, I don't care if the capture is 16:9 or 3:2 or 1:1: the texture is going to be 1:1, and it has to be power of two. It makes a huge difference. You'll shape the light with the rect light in Unreal.
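A tiny helper for that texture-prep step, assuming you resample the capture to a square power-of-two texture and cap it at 1K, as the talk suggests; the function names are my own:

```python
# Find the square power-of-two size to resample a captured HDR light texture to.
# The rect light in Unreal handles the aspect ratio, so the texture can be 1:1.

def next_pow2(n):
    """Smallest power of two >= n."""
    p = 1
    while p < n:
        p *= 2
    return p

def target_texture_size(width, height, cap=1024):
    """Square power-of-two side length, capped -- 1K is plenty per the talk."""
    side = next_pow2(max(width, height))
    return min(side, cap)

size_big = target_texture_size(3000, 2000)   # 4096 capped down to 1024
size_small = target_texture_size(300, 300)   # 512
```

The actual resampling would happen in whatever image tool you develop the EXR with; this only decides the target size.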
That's where you worry about your aspect ratio. And your source texture doesn't have to be 8K; 1K is plenty, just keep that in mind. So basically: you set up your light, measure it as we talked about, shoot your brackets, process them the same way I showed you, and you get your HDR exposure. Then you set the intensity on your light to 100 candela. The reason is that now I have a 100% slider that I'm going to calibrate into; it's a set value. You end up with your rect light, and see, it's not square, it's rectangular, because that's my LED light, and it has the same value I measured. And because my light had modes, I made a quick Blueprint actor: when I click the change-mode button, it assigns a different texture map to the light's source texture.

What you end up with is this: I have my light booth, where I bring in my product and judge its materials in a known lighting condition, and I have the studio I reproduced, which I can trust, where I just drop the product in, it looks great, and I render it out.

So I want y'all to go out and build your circle of trust. That's my goal, it's a call to action. Pick one light in your house that you think looks neat. Not a point light, those are easy; get something with some interest. These were LED panels I was shooting. Go harness the power of Unreal to build that out. The last point I want to make in this little recap is that when you're working in an untone-mapped linear color space, things look kind of boring, kind of gray, kind of flat. But you work in that, and then you have fun: shift your camera, do the color effects, add another post-process volume with really cool saturation that uses the color grading.
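One way to use that 100-candela baseline is as a simple linear calibration, since light intensity scales linearly; this is a hypothetical helper, and the in-engine and real-world readings are invented numbers:

```python
# Calibrate a light's intensity against a real-world measurement. With the
# light set to a known baseline (100 cd), read the in-engine luminance at the
# reference point, compare it to the real-world reading, and scale linearly.

def calibrated_intensity(base_intensity_cd, measured_in_engine, measured_real):
    """Intensity (cd) that makes the in-engine reading match the real one."""
    return base_intensity_cd * (measured_real / measured_in_engine)

in_engine = 52.0   # hypothetical cd/m^2 from the Pixel Inspector at the patch
real_world = 66.0  # the value measured from the photo reference
intensity = calibrated_intensity(100.0, in_engine, real_world)  # ~126.9 cd
```

Because the baseline is a fixed 100, the result also reads directly as "percent of baseline", which is the 100% slider the talk describes.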
The reason you grade in a separate post-process volume is that you can then render out this footage just like any other footage in your pipeline and hand it to your comp artists or photographers, and they'll treat it just like real camera footage, because you've effectively made a scene-referred image directly out of Unreal, just like you did with your camera. So everybody take a picture now; if you're at home, grab a screenshot. I have an article up with samples of the plugins and additional resources for everything I showed you today: all the links, all the talks, all that stuff. I'm going to be at the Dev Hub on Wednesday from 9 to 10:30, and I'll be at the UE kiosk from 12 to 2. Come see me, let's geek out; I'm really interested in how this applies to your world, and if it does. Thank you, everybody. I appreciate you.