How Painkiller RTX Uses Generative AI to Modernize Game Assets at Scale

Painkiller RTX sets a new standard for how small teams can balance massive visual ambition with limited resources by integrating generative AI. By upscaling thousands of legacy textures into high-quality Physically Based Rendering (PBR) materials—a process that would traditionally have taken years—the team dramatically reduced the burden of repetitive work.

This approach was especially impactful for contributors without traditional modding backgrounds, freeing them to focus on creative decisions: refining materials and ensuring the game's iconic atmosphere responds appropriately to ray-traced lighting. Learn how the team architected a production pipeline that blends automation with artistic judgment across 35 unique levels.

To explore the motivations, solutions, and lessons behind these technical challenges, we spoke with McGillacutty (environment reconstruction and material lead), Quinn Baddams (team lead and founder of Merry Pencil Studios), and NightRaven (creator of PBRFusion).

What's your professional background and current role?

McGillacutty: My background spans architectural design, technical art, and game analysis, with a focus on real-time environments. I currently work independently, combining teaching and technical client work with development on RTX Remix projects like Painkiller RTX. My role centers on environment reconstruction, material authoring, and building AI-assisted asset pipelines.

Quinn Baddams: My career has focused on building and optimizing complex systems—first in business strategy and digital infrastructure, and more recently in computer graphics. I'm currently studying computer science with a focus on AI and machine learning, which directly informs my work as team lead on Painkiller RTX and founder of Merry Pencil Studios. I apply systems thinking to architect our production pipeline and integrate generative AI as a practical solution to problems of scale.

NightRaven: I'm currently a systems engineer handling everything from full-stack automation to administering VMware and cloud environments.

What made you want to become an RTX Remix modder, and what brought you to Painkiller?

McGillacutty: I came to RTX Remix from a visual and architectural perspective, without any modding background, a year ago. When Quinn showed me Painkiller's towering gothic interiors, I immediately saw how well they would lend themselves to ray-traced lighting—stained glass, stone, metal, and deep interior spaces. RTX Remix offered a way to renovate those environments by rebuilding the materials so the lighting could finally behave realistically, which pulled me straight into the project.

Quinn Baddams: I've been interested in computer graphics and technical art since the early days of 3D accelerator cards like Voodoo and TNT. At the time, real-time ray tracing felt like something we would see far in the future, but advances in denoising and technologies like NVIDIA DLSS made it viable much earlier than expected.

RTX Remix naturally pulled me in. I've always found physically based rendering principles satisfying, and path tracing suits that mindset well. After experimenting with several games with varying levels of compatibility, Painkiller stood out. It has solid mod support, an active community, and it was also one of my favorite games back in the GeForce 2 GTS era.

You're among the early adopters using generative AI to rebuild textures and materials at scale. How did you use models like PBRFusion to convert low-resolution assets into high-quality PBR materials?

McGillacutty: With minimal texture reuse across 35 levels, manually rebuilding thousands of materials simply wasn't feasible for a small team. PBRFusion became the backbone of our pipeline, allowing us to batch-convert large sets of legacy textures into a usable PBR baseline at unprecedented scale.

The model automatically generated base color, normal, roughness, and height maps, which let us bring entire levels into a physically based context in a fraction of the time. Coming into modding without a traditional background, this AI-driven approach was critical—it removed the friction of repetitive work and let me focus on creative decisions, like refining materials, preserving the game's iconic atmosphere, and ensuring everything responded appropriately to ray-traced lighting.
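To make the shape of that batch pass concrete, here is a minimal Python sketch of a loop over legacy textures. The `pbrfusion` command-line invocation and the directory layout are assumptions for illustration; the actual tool's interface may differ.

```python
import subprocess
from pathlib import Path

LEGACY_DIR = Path("textures/legacy")  # original game textures (assumed layout)
OUTPUT_DIR = Path("textures/pbr")     # generated PBR baseline maps go here

def run_pbrfusion(src: Path, dst_dir: Path) -> None:
    """Invoke a PBR-generation model on one texture.

    Assumes a hypothetical `pbrfusion` CLI that writes base color, normal,
    roughness, and height maps next to the given output prefix.
    """
    dst_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["pbrfusion", "--input", str(src), "--output", str(dst_dir / src.stem)],
        check=True,
    )

def batch_convert() -> None:
    # Walk every legacy texture and generate a PBR baseline for it;
    # artists then review and refine these maps by hand.
    for tex in sorted(LEGACY_DIR.rglob("*.png")):
        run_pbrfusion(tex, OUTPUT_DIR / tex.parent.relative_to(LEGACY_DIR))

if __name__ == "__main__":
    batch_convert()
```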

Quinn Baddams: PBRFusion makes it possible to batch-upscale an entire project's textures and quickly generate normal, roughness, and height maps, which is a great starting point. That said, convincing results still require material-by-material judgment.

Many surfaces don't benefit from height maps at all, while others, especially metals, require far more careful treatment. Most metallic materials in Painkiller RTX were hand-crafted. Glass, transparent surfaces, and skin also needed custom values and maps, particularly for subsurface scattering.

Hero materials received additional attention using a combination of techniques, including mixing CC0 PBR materials, AI-assisted generation, and procedural workflows in tools like InstaMAT Studio. AI provided the baseline, but traditional material authoring was essential for achieving quality and control.

What got you interested in generative AI, and what motivated you to fine-tune a model for RTX Remix?

McGillacutty: Scale was the primary driver. With thousands of textures spread across 35 levels, rebuilding materials by hand would have been impractical for a small team. I was already using generative AI for rapid iteration and visual exploration in other design contexts, so adapting it for RTX Remix felt like a natural extension.

Fine-tuning a model gave us a way to process large volumes of stylized legacy textures efficiently while maintaining cohesion across levels. Instead of treating each asset as an isolated problem, AI helped establish a consistent material baseline that we could then refine artistically.

Quinn Baddams: My interest came from a practical, production-focused curiosity. While experimenting with asset pipelines, I noticed a clear technical gap: there were no generative AI models tailored to the specific challenges of game development, particularly removing baked lighting and shadows from legacy textures, which is a major obstacle when converting assets to PBR.

That problem overlapped directly with my academic focus on AI and machine learning. RTX Remix provided a real-world production environment where I could bridge that gap by fine-tuning models to solve a real pipeline bottleneck, turning research into something that directly addressed Painkiller's scale.

NightRaven: RTX Remix was my entry point into generative AI. It was exciting to see older games brought back to life with modern rendering, and while learning how to mod with Remix, it quickly became clear that high-quality PBR materials are one of the most important factors in making path tracing work.

I started using the available PBR generation tools, but I wasn't satisfied with the results. Despite having no formal background in AI, I decided to build my own solution, which became PBRFusion. It went through three major iterations and more than a thousand hours of work to reach version 3—the version used in Painkiller RTX. One of my goals was also to lower the barrier to entry for RTX Remix, making it easier for more creators to experiment and contribute.

Why was it important in your texture pipeline to combine AI-generated outputs with traditional hand-crafted work, rather than relying on a single approach?

McGillacutty: It comes down to scalable quality. AI-generated outputs were essential for handling the sheer volume of assets and establishing a consistent visual baseline across the project, but they're not a substitute for artistic judgment. The manual refinement phase is where we pushed quality further and preserved Painkiller's distinct character.

That’s where we reinterpreted ambiguous source textures, corrected materials that broke under physically accurate lighting, and made intentional creative decisions. This hybrid approach allowed us to automate roughly 80% of the repetitive work, so we could focus human effort on the 20% that ultimately defines the project’s quality and vision.

Quinn Baddams: AI-generated roughness, normal, and height maps provide a strong starting point, but they often require adjustment to achieve physically accurate results. Correct values can be very specific, and many materials need manual tweaks or custom painting informed by real-world PBR references.

Painkiller also relies heavily on texture atlases, which can confuse AI models when a single texture contains multiple unrelated surfaces. Mixing AI automation with hand-crafted work let us remove most of the repetitive busywork while maintaining precise control over both artistic intent and physical accuracy.
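One common workaround for the atlas problem (not necessarily the exact approach used on Painkiller RTX) is to split an atlas into tiles, process or hand-assign each region separately, and stitch the results back together. A minimal sketch with Pillow, assuming fixed-size tiles:

```python
from PIL import Image

def split_atlas(path: str, tile: int = 256) -> list[tuple[int, int, Image.Image]]:
    """Cut a texture atlas into fixed-size tiles so each region can be
    processed independently. Fixed tiling is an assumption; real atlases
    need boundaries that match the actual sub-textures."""
    atlas = Image.open(path).convert("RGB")
    w, h = atlas.size
    tiles = []
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            box = (x, y, min(x + tile, w), min(y + tile, h))
            tiles.append((x, y, atlas.crop(box)))
    return tiles

def reassemble(tiles: list[tuple[int, int, Image.Image]], size: tuple[int, int]) -> Image.Image:
    """Stitch processed tiles back into a single map at the original layout."""
    out = Image.new("RGB", size)
    for x, y, t in tiles:
        out.paste(t, (x, y))
    return out
```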

NightRaven: PBRFusion was always intended to be a tool, not a drop-in replacement for material creation. I'm glad the Painkiller team approached it that way—using the tool to speed up their workflow rather than treating it as a crutch.

Since the model isn't perfectly accurate, especially for roughness generation, it can get things wrong. Human verification and adjustment are essential to ensure materials behave appropriately under physically based and path-traced lighting.

How did you maintain a consistent style and quality bar across more than 35 levels while integrating AI-generated content?

McGillacutty: Consistency at that scale required defining constraints early and treating AI output as a baseline system rather than as individual, isolated assets. PBRFusion's content-consistent super-resolution produced cohesive results across large material sets, which helped establish a shared visual language for the project.

We regularly evaluated materials in context using in-engine captures, then iterated so that both AI-generated materials and hand-crafted hero assets reinforced the same style and quality bar.

Quinn Baddams: We set a small number of core guidelines early on. Small or distant textures weren't upscaled unnecessarily, height maps were limited to large, flat surfaces, and roughness maps were treated as a primary driver of perceived material quality.

We referenced real-world PBR materials to validate roughness values and paid close attention to how albedo maps behave in a physically based workflow. In practice, consistency was achieved largely by reviewing and adjusting roughness maps to ensure materials behaved as intended under lighting.
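Guidelines like these are easy to encode as a simple triage step in front of the generation pass. The thresholds below are illustrative assumptions, not the team's published cutoffs:

```python
from pathlib import Path
from PIL import Image

# Illustrative thresholds only; the team's actual cutoffs aren't published.
MIN_UPSCALE_SIZE = 128     # skip tiny or distant textures entirely
MIN_HEIGHT_MAP_SIZE = 512  # reserve height maps for large, flat surfaces

def triage(texture_path: Path) -> dict[str, bool]:
    """Decide which maps are worth generating for a given legacy texture."""
    with Image.open(texture_path) as img:
        w, h = img.size
    return {
        "upscale": min(w, h) >= MIN_UPSCALE_SIZE,
        "height_map": min(w, h) >= MIN_HEIGHT_MAP_SIZE,
        # Roughness is always generated and then reviewed by hand, since it
        # is the primary driver of perceived material quality.
        "roughness_map": True,
    }
```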

The materials and textures now react far more realistically to light. How did you rethink your material, texture, and lighting workflows to achieve that result across so many environments?

McGillacutty: Introducing physically based lighting into Painkiller meant rethinking the entire relationship between materials and light. The game's environments are abstract and otherworldly, designed around dramatic contrast rather than realism, so simply adding realistic lighting wasn't enough.

We began by stripping baked lighting information from the original textures, then rebuilt contrast intentionally through material definition—using grime, surface variation, and physically meaningful roughness values. That way, the drama came from correct interactions with light rather than painted-in shadows.

All lighting in Painkiller RTX was hand-tuned at the scene level, which allowed us to carefully shape mood and composition across each environment while still preserving the game's signature atmosphere.
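The project relied on a fine-tuned model for this de-lighting step, but the underlying idea can be illustrated with a crude, non-AI approximation: divide the texture by a heavily blurred copy of its own luminance so that large painted-in shadows and highlights flatten out. A minimal sketch, for illustration only:

```python
import numpy as np
from PIL import Image, ImageFilter

def flatten_baked_shading(path: str, blur_radius: int = 32) -> Image.Image:
    """Rough approximation of removing baked shading: normalize the texture
    against its own low-frequency luminance. A fine-tuned model does this far
    better; this only shows the intent of the step."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    lum = img.mean(axis=2)                     # per-pixel luminance estimate
    blurred = Image.fromarray((lum * 255).astype(np.uint8)).filter(
        ImageFilter.GaussianBlur(blur_radius)  # low-frequency baked shading
    )
    shading = np.asarray(blurred, dtype=np.float32)[..., None] / 255.0
    flattened = img / np.maximum(shading, 1e-3) * lum.mean()
    return Image.fromarray((np.clip(flattened, 0.0, 1.0) * 255).astype(np.uint8))
```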

Quinn Baddams: We took an iterative approach and learned early on that incorrect lighting responses couldn't be fixed by simply adjusting texture brightness. The original game relied heavily on baked shadows, which added contrast that no longer made sense in a PBR workflow.

After removing that baked lighting, we reintroduced contrast through roughness variation, stronger normal maps, and controlled self-shadowing. Standardizing physically plausible light values across scenes was also critical to achieving consistent, believable results.

Full-scene path tracing, volumetric lighting, and advanced techniques all work together in Painkiller RTX. How did you combine these systems to shape the game, and what did each contribute that you couldn't get from more traditional rendering?

McGillacutty: Full-scene path tracing and volumetric lighting fundamentally changed how materials behaved, which meant material work needed to be developed in close alignment with lighting. While lighting and volumetrics were handled by the team lead, my role was to ensure materials responded appropriately once those systems were in place.

Path tracing exposed properties like roughness, reflectivity, and wetness much more clearly than traditional rendering ever could. In areas with rain or fog, I adjusted materials to include puddles and surface ripples so they would interact believably with volumetrics and moving light.

A great example of this is RTX Skin, particularly on characters and semi-translucent surfaces like marble. For assets such as the nun or the lab fish, RTX Skin allows light to genuinely scatter through the surface. You can see it in haggard skin or gelatinous flesh; this subsurface scattering creates a sense of depth that simple surface highlights can't achieve.

RTX Skin has been a particularly helpful tool. It's allowed me to make these characters feel like tangible, physical parts of the ray-traced world we're building. It's especially rewarding to see a game from 2004 transformed to such an extent.

Quinn Baddams: Full-scene path tracing fundamentally changed how lighting and materials interacted, exposing inaccuracies that would have been hidden in traditional rendering. Volumetric lighting added depth and atmosphere, particularly in large interior spaces. While traditional techniques can approximate these effects, path tracing and volumetrics allow light to behave consistently across the entire scene.

RTX Skin was a major part of making all of this work together. For a project rebuilding a classic game, it solved two important problems. First, it allowed us to get much more out of our low-detail character models. The mesh geometry is exactly the same, but RTX Skin makes it appear significantly more detailed. A lot of that comes from the normal maps generated through PBRFusion, while RTX Skin itself helps smooth sharp edges, making low-poly geometry appear denser and less jagged.

Second, and more importantly, it gave us true artistic control over subsurface scattering for the first time in a real-time pipeline. You can define exactly how much light scatters through a surface and how its color changes as it does. We used this on the wings of the demon Alastor, where the interior veins are only visible because of RTX Skin—an effect we didn't think possible before.

To my knowledge, this level of ray-traced subsurface scattering hasn't been available to game developers in a practical, real-time way. It was previously limited to offline rendering. Having it available through RTX Skin is fantastic—not only as a technical leap, but because it's genuinely enjoyable to work with. We're only scratching the surface of what's possible.
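For readers who want the intuition behind that control (how much light survives passing through a material, and why its color shifts with thickness), the classic Beer-Lambert relation captures the idea. This is general subsurface-transmission physics, not the specific parameterization RTX Skin exposes:

$$ T_c(d) = e^{-\sigma_{t,c}\, d}, \qquad c \in \{R, G, B\} $$

Here $d$ is the distance light travels inside the material and $\sigma_{t,c}$ is a per-channel extinction coefficient. Because the three channels attenuate at different rates, thin regions such as wing membranes transmit a different color than thick ones, which is why interior structures like veins become visible.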

For developers inspired by Painkiller RTX who want to take a first step toward similar visuals, which features or workflows would you recommend experimenting with first?

McGillacutty: My advice is to start with a simple, focused artistic goal. Don't try to rebuild an entire level. Instead, capture a single, iconic scene and focus on the relationship between a few key materials and the lighting.

Use the RTX Remix Toolkit to replace the original textures with basic PBR materials, then iterate using path tracing and lighting tools. Once you understand that core dialogue between materials and light, you can introduce AI tools like PBRFusion. Used this way, AI becomes a rapid iteration engine—letting you test different visual hypotheses within the same scene.

Quinn Baddams: Start with the RTX Remix Toolkit itself. Capture a scene, apply basic materials, and start experimenting with lighting and path tracing to understand how they interact.

The RTX Remix community is also an important resource, with shared tools, scripts, and active support. Most importantly, experiment freely—hands-on iteration is the fastest way to build intuition for these workflows.

How do you think generative AI has changed modding and game development, and what tools are you looking forward to next?

McGillacutty: As someone relatively new to modding, the biggest change I've seen is accessibility. Generative AI dramatically reduces the time and technical overhead required to experiment, iterate, and ship meaningful work. This opens development to creators from a wider range of backgrounds.

For my next project, I’m looking forward to more advanced material and geometry tools and AI-assisted workflow scripting.

Quinn Baddams: Generative AI represents a paradigm shift—away from memorizing systems toward creating with a support layer that understands them. AI acts as both a tutor and a problem-solving partner.

I'm particularly excited about further advances in AI-assisted asset cleanup and using retrieval-augmented generation to work with undocumented legacy codebases.

NightRaven: I'm already nearly finished with the next version of PBRFusion, which will hopefully be of great benefit to the modding community.

Join us at GDC

Join us at GDC to explore how NVIDIA RTX neural rendering and AI are shaping the next era of gaming. Get a glimpse into the future of game development with John Spitzer, Vice President of Developer and Performance Technology at NVIDIA, as he unveils the latest innovations in path tracing and generative AI workflows.

Then, join Bryan Catanzaro, Vice President of Applied Deep Learning Research at NVIDIA, for an interactive "Ask Me Anything" session covering the latest trends in AI. Together with two full days of additional sessions, these events offer a front-row seat to the technologies enabling new kinds of player experiences.

Resources for game developers

See our full list of game developer resources here and follow us to stay up to date with the latest NVIDIA game development news.


