Q&A: Two of the leads on Unreal Engine 5 tell us how the new engine will help give developers the ability to make near-photorealistic games.
By Eddie Makuch
Epic Games recently showcased Unreal Engine 5 and announced that the next-generation game development toolsuite is now available in early access ahead of its planned full release in early 2022. Epic showed off the capabilities of the new engine with a thoroughly impressive tech demo called Valley of the Ancient.
To learn more about Unreal Engine 5, Valley of the Ancient, and what Epic has in store for the future of gaming, we spoke with Unreal Engine senior technology designer Chance Ivey and VP of engineering Nick Penwarden. They touched on numerous exciting updates about Unreal Engine 5, including how it will allow for a new level of visual fidelity that wasn’t possible before and how, overall, the engine gives power to developers to streamline their workflows and create games faster and more efficiently.
The overall idea for Unreal Engine 5 was to create “new, better, streamlined” development workflows, Penwarden says. The new Nanite and Lumen technologies inside Unreal Engine 5 are critical to what Epic is trying to achieve and unlock with the toolsuite. Another major component of Unreal Engine 5 is its World Partition system, which allows developers working on large-scale games to break worlds down into smaller pieces that team members can work on collaboratively, section by section, to help move things along and work more efficiently.
As a practical example of one of Unreal Engine 5’s key innovations, Ivey said that in the past, making lighting adjustments to a massive scene required “re-baking,” a process that takes so long that developers would do it and then go home for the night. But with Unreal Engine 5, developers can more freely iterate and experiment with big changes without needing to sit around and wait.
“It makes the actual edit time, the creation time, the iteration time way, way, way faster, because we can make changes and see how things look at different times of day and be able to change the environment and lighting in ways we couldn’t before,” he said.
Our interview touches on all of these subjects and more, including how the advancements to Unreal Engine 5 might help virtual production workflows for Unreal’s non-gaming applications, like the technology that Epic created to help Lucasfilm make Star Wars: The Mandalorian’s virtual sets. Check out the full interview below.
How did you settle on the Valley of the Ancient demo to showcase the new UE5?
Ivey: So basically we knew we were going to release early access to the engine early in 2021, and whenever we go about doing that–putting tools in people’s hands–we generally like to put content in their hands too, to show them what they can do with those tools. They can learn from them, they can see how we would approach certain workflows for things that they’ve never seen before.
So I talked with Nick and a handful of our engineering directors and dev leads on different teams and said, ‘Hey, what do we feel is going to be at a certain level of maturity by early access that we want to encourage people to go explore and test out?’ So we got a list of features from there, and then we hit the books to say, ‘How can we best show these features by building a project in a few months that can show what’s possible today in early access but also be something that developers can easily dig apart to learn how it works?’
Every time we build a demo like this we want to give the source out to other folks so they can learn how we do things and then use the assets in their own projects as well. So we basically wanted to say, A: What do people need to know about, and how can we demonstrate those features for them? B: How can we give them really great content that they can use in their own projects if they want to as they get started with UE5? And C: Let’s go through the process of building and releasing something with early access, to see how far we can push some of those boundaries and how we can make the toolset better for folks at early access.
Is Valley of the Ancient a playable game that you’re going to sell or offer?
Ivey: It will be available for anyone to download and run themselves. The demonstration runs at 30fps on both PS5 and Xbox Series X; we wanted to target the next generation of consoles to see what the next generation of games might be able to do with the tech.
The new UI is maybe one of the most obvious things that people might notice first. Was this re-design in response to feedback or something you wanted to address on your own?
Penwarden: I think it came from a combination of feedback from users, from our own internal developers, and from just new ideas that we had on how we can make an editing experience a bit more immersive and try to make sure that developers have the detail they need on screen but can really focus on the task at hand rather than being bombarded with the UI.
With UE5 you’re going for a more streamlined and optimized-for-workflow approach. Is that something you think UE4 didn’t do such a good job at?
Penwarden: I wouldn’t say UE4 didn’t do it well. I would say we’re trying to do it even better with UE5. And really looking at a combination of where the new technology that we’ve been able to build can help to create new, better, streamlined workflows. So with Nanite, the ability to ingest high-polygon models directly rather than needing to go and build custom LODs for that purpose, or with Lumen not having to go pre-bake [global illumination] and create lightmap UVs and go through an overnight baking process, but rather being able to dynamically light your scene. I think World Partition and the UI redesign all have that common thread of, ‘How do we streamline and make the development experience better across the board?’
For World Partition, the idea of being able to edit the world as if it’s just one big map rather than having to subdivide it up manually into sub-levels for the purposes of editing or streaming content.
What are the major keys and takeaways for Nanite and Lumen that you think developers are going to be able to improve upon with these new tools?
Penwarden: I think the first key takeaway is being able to reach a level of visual fidelity that wasn’t possible before. Being able to create environments at a higher quality with less effort is another key takeaway. And the third, maybe more specific to Lumen, is being able to create more dynamic experiences while preserving a very high level of fidelity with high-quality global illumination. So the fact that you can make changes to the environment in real time, or you can change the time of day or change the lighting scenario and see all the high quality GI update in real time, opens up new kinds of gameplay experiences that developers can create for players.
Ivey: All that hyper high resolution geometry, we’re lighting this scene with one directional light. Just one. To do all of this. And we have a time of day slider that we use for a lot of our cinematics. But pretty much as we rotate our light around the scene, all of this lighting is dynamic. Everything here.
Just doing simple things like this in past games, as an environment artist or as a gameplay or mission designer: if I had to change something in the world as far as the geometry and layout goes, in a scene like this [he shows us a massive map set in the Moab desert], we might be re-baking lightmaps for production overnight, or longer than that.
So the fact that we can do this now not only opens up a bunch of new gameplay possibilities–this rock could be destroyed in the game and it wouldn’t destroy our live mix or anything–it also makes the actual edit time, the creation time, the iteration time way, way, way faster, because we can make changes and see how things look at different times of day and be able to change the environment and lighting in ways we couldn’t before. This is one light. We light this scene like the sun would light the earth, as opposed to having to do a lot of trickery to get the same kind of results.
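The “one light, like the sun” idea Ivey describes reduces to simple directional-light math: a time-of-day slider rotates one light vector, and every surface responds immediately. A minimal sketch of that idea (plain Python, with a made-up simplified sun arc; this is not engine code or Unreal’s API):

```python
import math

def sun_direction(hour):
    """Direction the sunlight travels for a time-of-day value in [0, 24).
    Simplified east-to-west arc: sunrise at 6:00, noon at 12:00,
    sunset at 18:00. No latitude or seasons (made-up model)."""
    angle = (hour - 6.0) / 12.0 * math.pi  # 0 at sunrise, pi at sunset
    return (-math.cos(angle), 0.0, -math.sin(angle))

def lambert(normal, hour):
    """Diffuse intensity of a surface with the given unit normal,
    lit by the single directional 'sun' at the given hour."""
    d = sun_direction(hour)
    # Light travels along d, so a surface facing the sun faces -d.
    dot = -(normal[0] * d[0] + normal[1] * d[1] + normal[2] * d[2])
    return max(dot, 0.0)

# Dragging the slider just re-evaluates the same math at a new hour:
up = (0.0, 0.0, 1.0)          # an upward-facing surface
noon_light = lambert(up, 12.0)   # brightest at noon
dawn_light = lambert(up, 6.5)    # dim, sun near the horizon
```

The point of the sketch is only that a dynamic system re-evaluates this per frame, whereas a baked pipeline would have precomputed the result for one fixed hour and required a re-bake to change it.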
Can you walk me through what’s happening with Quixel and the Bridge and why it’s so significant for UE5?
Ivey: Quixel, as a library, is hyper-realistic, photo-real assets. Bridge has historically been the tool you use to acquire things from the Quixel library. We’re trying to keep workflows that make your life better; doing more with less work. Bringing Bridge into the editor allows you to access the library of content there and import it directly into your project, as opposed to having to go through the traditional process of getting an asset over here and bringing it in there. It’s just one more way to blend those tools together and keep you as close as you can be to your creation. Artists can pick the asset they want, bring it into the scene, and then it’s lit, as opposed to having to find it, import it into their project, and once it’s there, drag it into the scene. So it’s just trying to get you as close as possible to your results.
Penwarden: And I think, more generally, having access to the Quixel Megascans library is really nice for enabling developers to get these really high quality assets into their games and experiences quickly, whatever they’re creating. Because rather than having to do the work to model it all themselves and build the individual content, they have access to this huge library of photorealistic content at their fingertips to go and populate their world and build a really high fidelity experience very, very quickly.
Ivey: As a designer, who is not an artist, generally in the past what I would do is find free assets I could use to kind of prove out an idea and then later have an artist come in and fix them. So for someone like me, having access to the Quixel library, it’s not just assets that can help me feel out a design or really understand what I’m trying to build; it’s actually items I can ship a game with, because they are that high quality. They are better than anything I could ever find elsewhere.
World Partition seems like it’ll be one of the big takeaways for optimization of workflows and efficiencies. Is it in response to how game sizes are growing these days, so something like this is more of a necessity?
Penwarden: Yeah, the reason we did it was to allow for easier creation of large worlds. So, a couple of things–on the edit time side, previously you’d need to think up front about, ‘How do I want to build this world in a way that will allow all of the artists and designers on the team to collaborate without stomping on each other’s toes?’ So you might split up the world spatially, or split up the world into layers, so lighting is in one layer and effects are in another, and then you have the designers and artists managing what the file structure looks like for the game so they can work together. Now, to everybody working on the project, this just looks like one big map. And behind the scenes, the editor stores each individual object in its own file on disk, which means as long as we’re not editing the same object in the scene, we can both work in the same physical space in the map and collaborate much more easily that way.
The other side of it is the runtime streaming side. When you build a large world, you need to think about how much content can be in memory at any given time, and how you need to break things up so that, as the player traverses the world, you can get content in fast enough that people are going to be able to see it. What World Partition does is separate those concepts. At runtime, we take all of the content and divide it up into grids, or streaming chunks, that the engine can then stream. The size of those chunks is fairly easily configurable by developers, so they can change it based on their game’s needs, and very quickly get to a point where they can stream these large worlds and just have players move through them easily, without having to go back and re-work a ton of content to meet the new requirements of their game.
As an example, if you were building a large-world game and then all of a sudden you decided, ‘Oh wait, we want to have a sequence where you hop on a motorcycle and drive at 75mph down a canyon road,’ all of a sudden you’d need to go back and think, ‘OK, well, we can’t stream in that content quickly enough.’ So you’d need to go back and rebuild it, or take apart different sections and rebuild them with those streaming times optimized. Now you can a little more easily go in and change streaming distances and not have to micro-manage files on disk.
I understand there have been a few games announced for UE5 and presumably many more behind the scenes. Not asking for any details on that, but curious what you’ve been hearing from developers who are working on UE5 so far?
Penwarden: I think what I can share there is, we’ve had a couple of experienced Unreal Engine licensees testing out UE5 for a little bit now; a very early version of UE5. And so far, the response has been overwhelmingly positive. I think developers are really excited about the new technology that we’re building into UE5 and what they’re going to be able to make with it.
“We’re not there yet in terms of true 100% photorealism, but we really are getting closer and closer.” — Penwarden
Because Unreal Engine in general is used by so many teams across games, movies, apps, all different kinds of things. When you’re thinking about new features you want to add to UE5, how do you go about prioritizing what to work on when the engine itself is used for so many distinct purposes?
Penwarden: So it’s interesting. I think one of the nice things is photo-realistic real-time rendering is a key component to so many different industries, and actually a number of the core technologies overlap really, really well. So Nanite, which is going to help transform what next-generation games look like, is also something that is going to be really empowering for film and virtual production workflows, where you really care about being able to bring in film-quality assets without having to go and make a game-optimized version. Same thing with Lumen. So we have this core feature set that is broadly applicable, and then it’s really a matter of working with partners and adding some of these specific workflows and support for certain [digital content creation] packages and so on, for enabling new industries to take advantage of Unreal.
Are the advancements to UE5 going to have an impact on the tools that they use to make The Mandalorian?
Penwarden: I don’t know that I can talk specifically about that show, but I can say all of the tools and technology we’re building now will be in UE5, and we’ll take the virtual production workflows that we’ve built in UE4 and make sure they’re working in UE5 as well. So all of these technologies will come together with the full release of UE5 to power virtual production workflows, and technology like real-time high-quality global illumination, as well as being able to import film-quality assets directly, I think is going to have a huge impact on those workflows and on the ability to iterate on set.
You’re delivering a lot of exciting new features and functionality with UE5. When you’re thinking about the feedback over the years from UE4, I’m sure there was a lot that came in, but was there any consistent feedback that developers wanted to see updated or improved upon for UE5?
Penwarden: I think probably the most consistent feedback that we get is around trying to optimize developers’ workflows. Developers always expect Epic to push the boundaries of graphics technology and visual fidelity, so we’re doing that, and I think everybody expected that. But the workflow changes, and how we can make developer workflows more efficient, is consistent feedback from licensees that we’re really embracing and trying to make better with UE5.
You’ve both mentioned photo-realism and film-quality assets for UE5. I’m wondering if you can both share your thoughts on the uncanny valley as graphics get more and more sophisticated.
Penwarden: The uncanny valley, you know, I don’t actually have that many great things to say about that, to be honest. I look at some of the shots from the Valley of the Ancient demo. We saw photographs of the Moab desert and then see a render of Valley of the Ancient, and if you’re not careful you might mistake one for the other. We’re not there yet in terms of true 100% photorealism, but we really are getting closer and closer.