🖼️ Stop Shipping PNGs In Your Games

September 4, 2025 • Mason Remaley • gamedevtechzig

Are you shipping textures to players as PNGs? The goal of this post is to convince you that this is suboptimal, and walk you through a better approach.

I’ll share my implementation of the suggested approach, but if you’d rather do it yourself, I’ll also provide the information you need to get started.

If you’re using a game engine, it is almost certainly doing what this post suggests automatically, but it doesn’t hurt to double check!

What’s wrong with PNGs?


PNGs are great for interchange. They’re lossless, they compress well, and support is ubiquitous. PNG is my image interchange format of choice.

This post isn’t a criticism of PNGs–it’s just that the PNG format is designed for image data, not texture data.

Here are some examples of features you would expect out of a texture format that you’re not going to find in an image format:

- Premultiplied alpha
- Pregenerated mipmaps
- Cubemaps, with all faces stored in a single file

Can you work around all these issues? Sure.

You can premultiply and generate your mipmaps at load time. You can ship separate images for each cubemap face. But now you’re resigned to cheap mipmap generation, and cubemaps that are difficult to downsample correctly.
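
For example, premultiplying at load time is just a per-pixel multiply over the decoded image. Here’s a rough sketch in Zig, assuming tightly packed 8-bit RGBA pixels (the names are mine for illustration, this isn’t code from any particular engine):

```zig
// Premultiply alpha in place on a decoded RGBA8 image
// (4 bytes per pixel, tightly packed).
fn premultiplyAlpha(pixels: []u8) void {
    var i: usize = 0;
    while (i + 3 < pixels.len) : (i += 4) {
        const a: u32 = pixels[i + 3];
        pixels[i + 0] = mulDiv255(pixels[i + 0], a);
        pixels[i + 1] = mulDiv255(pixels[i + 1], a);
        pixels[i + 2] = mulDiv255(pixels[i + 2], a);
    }
}

// (c * a) / 255 with round-to-nearest.
fn mulDiv255(c: u8, a: u32) u8 {
    return @intCast((@as(u32, c) * a + 127) / 255);
}
```

Easy enough, but a texture format lets you bake this once offline, and filter the mipmaps from the premultiplied data, instead of paying for it on every load.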

You can certainly make it work, but you’re making things unnecessarily difficult for yourself by using the wrong tool for the job.

Furthermore, texture formats have a killer feature not mentioned above–support for GPU compatible texture compression like BCn.

An in-depth explanation of GPU compression formats is out of scope for this post, but at a high level, these formats store each block of pixels as a couple of endpoints and a method for interpolating between those endpoints.

This trades mild degradation of image quality for improvements in storage, VRAM usage, and sampling performance. It’s so good it feels like you’re cheating thermodynamics.
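
To make that concrete, here’s BC1, the simplest member of the BCn family (just an illustration, not anything specific to my tooling): every 4x4 block of texels is stored as two RGB565 endpoint colors plus sixteen 2-bit selectors, so 8 bytes per block versus 64 bytes of uncompressed RGBA8.

```zig
const std = @import("std");

// One BC1 (a.k.a. DXT1) block, covering a 4x4 group of texels.
const Bc1Block = extern struct {
    color0: u16, // endpoint 0, RGB565
    color1: u16, // endpoint 1, RGB565
    selectors: u32, // 16 x 2-bit indices picking between the endpoints
                    // and two colors interpolated between them
};

comptime {
    // 8 bytes per 16 texels (0.5 bytes per texel) vs 4 bytes per texel for RGBA8.
    std.debug.assert(@sizeOf(Bc1Block) == 8);
}
```

BC3 (DXT5) and BC7 spend 16 bytes per block to handle alpha and higher quality color, which still works out to 1 byte per texel, and the GPU decodes blocks directly when sampling.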

GPUs can’t decompress PNGs on the fly. As a result, if you ship PNGs you either can’t take advantage of this compression, or you have to first decompress each PNG and then run an extremely expensive compression step to convert it to the desired block-based format every time a player loads the game.

That’s a little goofy, right?

(EDIT: Well, it’s goofy when done naively–see the discussion w/ Ignacio Castaño here; something along these lines can become viable if you can transcode quickly.)

What texture formats are out there?

[Image: some of Way of Rhea’s data files]

Texture formats like Khronos’ KTX2 and Microsoft’s DDS are designed for exactly our use case. They’re just headers followed by some image data that you can upload directly to the GPU without any additional processing.

Well, unless you use supercompression. GPU compression formats don’t provide great compression ratios on their own, so it’s typical to apply lossless compression on top (think zlib or lz4). In that case you decompress first, then upload.
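
To make “a header followed by image data” concrete, here’s the fixed-size part of the KTX2 header as I read the Khronos KTX 2.0 spec (a sketch; double check the field layout against the spec before relying on it). The supercompression scheme field is what tells you whether you need that decompress step before uploading:

```zig
// Fixed-size fields at the start of a KTX2 file (per the Khronos KTX 2.0
// spec). An index of byte offsets, per-mip-level offsets and lengths, and
// some metadata follow, and then the image data itself.
const Ktx2Header = extern struct {
    identifier: [12]u8, // magic bytes identifying the file as KTX2
    vk_format: u32, // VkFormat of the payload, e.g. a BC7 format
    type_size: u32,
    pixel_width: u32,
    pixel_height: u32,
    pixel_depth: u32,
    layer_count: u32, // array layers, 0 if not an array texture
    face_count: u32, // 6 for cubemaps, 1 otherwise
    level_count: u32, // number of mip levels stored
    supercompression_scheme: u32, // e.g. none, Zstandard, or zlib
};
```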

The meta here is to design your lossy compressor to be aware that its output is going to be losslessly compressed afterwards. This lets it make decisions that reduce entropy, improving the effectiveness of the lossless step.

I used DXT5 + lz4 compressed DDS files for Way of Rhea; I’m switching to BC7 + zlib compressed KTX2 files for my next game. Both approaches are reasonable.

Note: I primarily develop games for desktop platforms. IIUC, on mobile, hardware support for various types of GPU compression varies but the formats are similar-ish, so the meta is to use something like Basis Universal to quickly transcode to the correct format on load.

Exporting to KTX2


At this point, you’re likely looking through the export menu of your image editor of choice for KTX2 and DDS, and not seeing any results.

Unfortunately, AFAICT most people end up rolling their own exporters. People used to use Nvidia Texture Tools, but it’s archived as there wasn’t funding to maintain it. It’s still a great reference. Nvidia has a closed source fork, but I don’t love having a closed source dependency for such an integral part of my engine.

I’ve implemented an open source texture tool that you’re welcome to use directly or as a reference for your own implementation: Zex.

It can be used as a command line tool, or as a Zig library. It reads PNGs using stb_image and converts them to KTX2, with support for BC7 compression + rate distortion optimization from bc7enc_rdo, and supercompression via zlib.

It supports most standard features, such as mipmap generation with configurable filters and address modes.

I haven’t implemented cubemap exports yet as my current game isn’t using them. If you need support before I get around to it, PRs are welcome–it should be a pretty straightforward addition.

If you want to implement your own exporter, here are some useful references. Keep in mind that you don’t need to support all possible features, just the ones your engine uses:

Texture Viewers

[Image: animated screenshot of Tacentview]

Most image viewers won’t be able to open texture formats like DDS/KTX2. This sorta makes sense–image viewers are typically designed to show a single image, whereas a texture may be made up of multiple mipmaps and cubemap faces and such, and may be HDR. This requires a fancier UI.

I’m personally a fan of Tacentview for this use case. It’s open source, cross platform, and supports a large number of formats.

Preserving Alpha Coverage

[Image: a 3D tree. Firewatch inspired me so I made a tree and then never used it for anything.]

Pregenerating your mipmaps gives you a chance to be a little more “correct” about them.

For example, if you’ve ever tried to render a tree or a chain link fence in-game as a cutout (or with alpha to coverage) but found that it vanishes when you get far away, your mipmap filtering likely isn’t taking into account the alpha test.

You can see Zex’s alpha-test-aware resize here. It isn’t battle tested yet; compare results visually in-engine to see if it provides a benefit for your artwork.
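
The general idea behind alpha-test-aware resizing (this is the standard technique, not necessarily a literal transcription of Zex’s code): measure what fraction of texels pass the alpha test at full resolution, then scale each mip level’s alpha so roughly the same fraction still passes. A rough Zig sketch, assuming a floating point alpha channel per mip level:

```zig
// Fraction of texels whose (scaled) alpha passes the alpha test cutoff.
fn coverage(alpha: []const f32, cutoff: f32, scale: f32) f32 {
    var passing: usize = 0;
    for (alpha) |a| {
        if (@min(a * scale, 1.0) >= cutoff) passing += 1;
    }
    return @as(f32, @floatFromInt(passing)) / @as(f32, @floatFromInt(alpha.len));
}

// Binary search for an alpha scale that makes this mip level's coverage match
// the coverage measured on the full resolution image. Multiply the mip's
// alpha channel by the result (clamped to 1.0) before compressing.
fn scaleForCoverage(mip_alpha: []const f32, cutoff: f32, target_coverage: f32) f32 {
    var lo: f32 = 0.0;
    var hi: f32 = 4.0;
    var scale: f32 = 1.0;
    for (0..16) |_| {
        scale = (lo + hi) / 2.0;
        if (coverage(mip_alpha, cutoff, scale) < target_coverage) {
            lo = scale;
        } else {
            hi = scale;
        }
    }
    return scale;
}
```

Without something like this, averaging alpha during downsampling pushes more and more texels below the cutoff, so coverage shrinks with each mip and the foliage fades out in the distance.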

Automation

[Image: a screenshot of the Oven GitHub repo]

You probably don’t want to convert all your images by hand. I did this for Way of Rhea for a while, but eventually realized that it was a waste of time. Every time a texture changes you have to go back and figure out what settings you used last time. Just automate it.

I’ll probably write a follow up post describing my strategy for automating this at some point in the future, but if you want a sneak peek, check out Oven. It’s not exactly general purpose right now, but might be an interesting reference.
