I am currently implementing a C# wrapper on top of wgpu-native.
It works as expected. Like wgpu-native - very reliable!
Next I checked whether there are any memory leaks.
To make leaks easier to observe, I disabled VSync by setting presentMode = WGPUPresentMode_Immediate, so the frame rate is uncapped and any leak grows faster.
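For context, a minimal sketch of how that surface configuration looks in wgpu-native's C API (exact struct fields vary between webgpu.h versions; the format, usage and size values here are placeholders):

```c
#include "webgpu.h"

// Sketch only: assumes surface, device, width and height already exist;
// in real code the format should come from wgpuSurfaceGetCapabilities.
WGPUSurfaceConfiguration config = {0};
config.device      = device;
config.format      = WGPUTextureFormat_BGRA8Unorm;       // placeholder
config.usage       = WGPUTextureUsage_RenderAttachment;
config.width       = width;
config.height      = height;
config.alphaMode   = WGPUCompositeAlphaMode_Auto;
config.presentMode = WGPUPresentMode_Immediate;          // VSync off
wgpuSurfaceConfigure(surface, &config);
```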
A simple C# triangle example renders at ~11000 FPS (Vulkan) on Windows 11 with an Nvidia RTX 4070.
I use Task Manager and watch the "Memory" column to look for leaks.
In this case memory grows by ~100 KB/s, which at ~11000 FPS works out to roughly 9 bytes per frame (100,000 B/s ÷ 11,000 frames/s ≈ 9 B/frame).
I ran the same test on macOS on a Mac Mini M2. There the same application has no leak; memory stays constant.
I am also sure the C# layer does not allocate anything.
So I decided to run the triangle example that ships with wgpu-native itself.
After a painful :) setup as a VS solution I got the example running.
(I failed to set up the make environment successfully.)
What I observed:
If I comment out either wgpuQueueSubmit() or wgpuSurfacePresent(), the leak disappears.
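To make the two call sites concrete, here is a minimal sketch of the per-frame loop in the C API (assuming an already configured device, queue, surface and pipeline; the render-pass recording that draws the triangle is elided):

```c
// One frame of the triangle example (sketch, error handling omitted).
WGPUSurfaceTexture surfaceTexture;
wgpuSurfaceGetCurrentTexture(surface, &surfaceTexture);
WGPUTextureView view = wgpuTextureCreateView(surfaceTexture.texture, NULL);

WGPUCommandEncoder encoder = wgpuDeviceCreateCommandEncoder(device, NULL);
// ... record a render pass that draws the triangle into `view` ...
WGPUCommandBuffer cmd = wgpuCommandEncoderFinish(encoder, NULL);

wgpuQueueSubmit(queue, 1, &cmd);  // commenting this out -> no leak
wgpuSurfacePresent(surface);      // commenting this out -> no leak

wgpuCommandBufferRelease(cmd);
wgpuCommandEncoderRelease(encoder);
wgpuTextureViewRelease(view);
wgpuTextureRelease(surfaceTexture.texture);
```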
It could also be the case that the leak is in the GPU driver.
Maybe someone else could confirm or disprove this behavior on Windows 11 with an Nvidia driver.
WGPUAdapterInfo - Vulkan