Enables loading textures (and uploading them to the GPU) asynchronously using compute shaders.
This package has yet to be tested thoroughly.
Here's how you could create a texture:
```csharp
var httpClient = new HttpClient();
var image = await httpClient.GetStreamAsync("https://picsum.photos/2048");
var texture = await AsyncTextureLoader.Instance.LoadTextureAsync(image);
```
See Quickstart for basic usage. The platform has to support compute shaders for this to work.
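You could, for example, guard against unsupported platforms with a runtime check. This is a minimal sketch using Unity's SystemInfo API; the fallback path is up to you:

```csharp
using UnityEngine;

if (!SystemInfo.supportsComputeShaders)
{
    // Fall back to a synchronous loading path here (e.g. ImageConversion.LoadImage)
    // instead of using AsyncTextureLoader.
    Debug.LogWarning("Compute shaders are not supported on this platform.");
}
```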
The first time the compute shader is used, a hiccup can occur in some cases. To prevent this, call AsyncTextureLoader.Prewarm() somewhere you don't care about hiccups, e.g. as sketched below.
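A minimal sketch of such a prewarm call; the component name is made up, and you can just as well call Prewarm() from any existing bootstrap or loading-screen script:

```csharp
using UnityEngine;

// Hypothetical helper: attach it to a loading or bootstrap scene so the one-time
// compute shader initialization happens where a hiccup doesn't matter.
public class AsyncTextureLoaderPrewarmer : MonoBehaviour
{
    void Start()
    {
        AsyncTextureLoader.Prewarm();
    }
}
```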
Take a look at the implementation of LoadTextureAsync() to get a good overview of the inner workings:
```csharp
var image = await DecodeImageAsync(input);
var texture = await AcquireTextureAsync(image.Width, image.Height);
await UploadDataAsync(texture, 0, 0, image.Width, image.Height, 0, image.Data);
return texture;
```
As you can see, if you have a RenderTexture, you can also update only part of it if you want. This could, for example, be used to create and manage a texture atlas. (The origin is at the top left.)
Here's how you could update part of a texture:
```csharp
var httpClient = new HttpClient();
var textureSize = 1024;
var tileSize = 256;
var rt = await AsyncTextureLoader.Instance.AcquireTextureAsync(textureSize, textureSize);
var tasks = new List<Task>();

for (int x = 0; x < textureSize / tileSize; x++)
for (int y = 0; y < textureSize / tileSize; y++)
{
    int xPos = x;
    int yPos = y;

    tasks.Add(Task.Run(async () =>
    {
        var data = await httpClient.GetStreamAsync($"https://picsum.photos/{tileSize}");
        var decoded = await AsyncTextureLoader.Instance.DecodeImageAsync(data);
        await AsyncTextureLoader.Instance.UploadDataAsync(rt, xPos * tileSize, yPos * tileSize, tileSize, tileSize, 0, decoded.Data);
    }));
}

await Task.WhenAll(tasks);
```
UploadDataAsync() is protected by a semaphore that works in FIFO order, so you can call it from anywhere, at any time. Use the cancellation token to abort upload processes.
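A possible cancellation pattern is sketched below. It assumes the CancellationToken is accepted as the last argument of UploadDataAsync (the examples above omit it), so check the actual signature in the package:

```csharp
using System;
using System.Net.Http;
using System.Threading;

var cts = new CancellationTokenSource();
var httpClient = new HttpClient();

var stream = await httpClient.GetStreamAsync("https://picsum.photos/256");
var image = await AsyncTextureLoader.Instance.DecodeImageAsync(stream);
var rt = await AsyncTextureLoader.Instance.AcquireTextureAsync(image.Width, image.Height);

// Assumption: the token is the last parameter of UploadDataAsync.
var upload = AsyncTextureLoader.Instance.UploadDataAsync(
    rt, 0, 0, image.Width, image.Height, 0, image.Data, cts.Token);

// Abort the upload, e.g. because the consumer of the texture was destroyed.
cts.Cancel();

try
{
    await upload;
}
catch (OperationCanceledException)
{
    // The upload was aborted before it completed.
}
```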
The package comes with StbImageSharp, a C# port of the well-known stb_image.h header. While it's good enough in most cases, you may want to replace it with a variant optimized for your specific use case. To do so, simply implement IImageDecoder and hook it up:
```csharp
AsyncTextureLoader.Instance.ImageDecoder = new MyImageDecoder();
```
Naturally, this step is superfluous if you use AsyncTextureLoader.UploadDataAsync directly.
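For reference, a decoder skeleton could look roughly like this. The method name, its signature, and the ImageData return type are assumptions modelled after how DecodeImageAsync() is used above; check IImageDecoder for the actual members:

```csharp
using System.IO;
using System.Threading.Tasks;

// Sketch of a custom decoder. The interface member and the ImageData type are
// assumptions; adapt them to the actual IImageDecoder definition.
public class MyImageDecoder : IImageDecoder
{
    public Task<ImageData> DecodeImageAsync(Stream input)
    {
        // Decode with your library of choice here and return RGBA32 pixel data
        // plus width and height (see the note on the data layout below).
        throw new System.NotImplementedException();
    }
}
```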
Note that the class currently expects the data layout to be RGBA32.
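For illustration, uploading raw RGBA32 data directly could look like this. The sketch fills a small texture with opaque red and assumes the data is accepted as a plain byte array in R, G, B, A order:

```csharp
var size = 64;
var pixels = new byte[size * size * 4]; // 4 bytes per pixel: R, G, B, A

for (int i = 0; i < pixels.Length; i += 4)
{
    pixels[i + 0] = 255; // R
    pixels[i + 1] = 0;   // G
    pixels[i + 2] = 0;   // B
    pixels[i + 3] = 255; // A
}

var rt = await AsyncTextureLoader.Instance.AcquireTextureAsync(size, size);
await AsyncTextureLoader.Instance.UploadDataAsync(rt, 0, 0, size, size, 0, pixels);
```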
Currently, mipmaps are generated by default using RenderTexture.GenerateMips(), possibly causing a hiccup. The methods of AsyncTextureLoader have options to control the number of mips to be generated.
As of this writing, Unity only allows asynchronously uploading textures to the GPU when they are included in the build. If you need to load textures dynamically at runtime, however, you're left in a bind.
There are basically two approaches to solve this (as far as I know):
- Manage the texture using native plugins.
- Manage the texture using compute shaders.
The preferred approach would probably be #1, but it is also harder to manage because you have to cater for all supported platforms.
This package tries to implement approach #2.
Known issues:
- Seems to crash in some cases with the Vulkan backend (null pointer dereference); observed on a Samsung Galaxy S7.