glfwSwapBuffers causing access violation (only when rendering geometry using glDrawElements)


For context: I'm currently developing a C# .NET Core 3.1 application that uses the OpenGL.Net library in conjunction with glfw-net for window management. To my knowledge, OpenGL.Net has no need for GLEW, as it resolves extensions automatically at runtime (so long as you call a particular initialization function, Gl.BindAPI, after creating a window/context).

All OpenGL calls run on a dedicated render thread, separate from the main update thread, and I've ensured that no GL calls happen outside that thread. Additionally, any data shared between the update thread and the render thread is copied into a separate state (RenderState) so that data for the next frame can be computed while the render thread renders the previous one. I'm using VAOs to render static geometry via glDrawElements, and have done EXTENSIVE debugging (a few days now, reading what feels like every forum post under the sun) to ensure that there are NO faults with the buffer objects, the data stored inside them, the shaders/uniforms they use, or the texture objects they use (a texture2DArray, in case you were curious).
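To clarify the threading model described above, the hand-off works roughly like this. This is a simplified sketch with hypothetical member names, not my exact code:

```csharp
using System.Threading;

// Simplified sketch of the update/render hand-off: the update thread
// writes the next frame's state while the render thread draws from the
// previous one; two AutoResetEvents keep the threads in lock-step.
public static class Threading {
    // Set by the render thread once it has consumed the current RenderState.
    public static readonly AutoResetEvent FrameFinished = new AutoResetEvent (true);
    // Set by the render thread after input polling, releasing the update thread.
    public static readonly AutoResetEvent InputHandle = new AutoResetEvent (false);
}

// Update thread, per frame (hypothetical helper names):
//   Threading.InputHandle.WaitOne ();   // wait until input has been polled
//   RenderState.CopyFromGameState ();   // snapshot the data the renderer needs
//   Threading.FrameFinished.Set ();     // let the render thread start the frame
```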

The problem is this: after rendering these objects with glDrawElements, I call glfwSwapBuffers and get an access violation (0xC0000005), crashing the program. I've tried running without a debugger, in a Release build, targeting .NET Core 2.0 instead of 3.1, and even alternative builds of the GLFW DLL, all to no effect (using VS 2019, by the way).

It should be noted that the problem does not occur if glDrawElements isn't called, or, alternatively, if glfwSwapBuffers isn't called. I've also ruled out a general inability to get color to the window: clearing with glClearColor works just fine. The moment any kind of geometry needs to be rendered, it crashes.

EDIT: Forgot to note that it successfully renders a few frames before crashing, for no apparent reason. Yuck.

My question is, simply put: what in the world could cause this? I feel like I'm going insane, quite frankly. And yes, I'm aware of what the error code means. EDIT: Even just knowing why glfwSwapBuffers could cause a memory access violation would likely help significantly.
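For anyone trying to help me localize the faulting call: one diagnostic I can add is a GL debug-output callback, so the last logged message before the access violation points at the offending call. A minimal sketch, assuming OpenGL.Net exposes the debug-output entry points under their usual names (the exact delegate/enum names in the bindings may differ slightly), and assuming a 4.3+/KHR_debug-capable context:

```csharp
using System;
using System.Runtime.InteropServices;
using OpenGL;

static class GlDebug {
    // Keep a static reference to the delegate so the GC can't collect it
    // while native code still holds the function pointer (a classic cause
    // of access violations in C# GL/GLFW interop).
    private static Gl.DebugProc _debugProc;

    public static void Enable () {
        _debugProc = (source, type, id, severity, length, message, userParam) => {
            // The message arrives as a native pointer; marshal it to a string.
            string msg = Marshal.PtrToStringAnsi (message, length);
            Console.WriteLine ($"GL [{severity}] {type}/{source} ({id}): {msg}");
        };
        Gl.Enable (EnableCap.DebugOutput);
        // Synchronous output makes the callback fire on the offending call's
        // stack, instead of at some later flush point.
        Gl.Enable (EnableCap.DebugOutputSynchronous);
        Gl.DebugMessageCallback (_debugProc, IntPtr.Zero);
    }
}
```

This would be called once on the render thread, right after Gl.BindAPI.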

Code for render loop:

while (Game.StopIssued == false) {
    using NativeWindow w = new NativeWindow (Settings.WindowSizeX, Settings.WindowSizeY, Game.Version, m, Window.None);
    Glfw.MakeContextCurrent (w);
    Gl.BindAPI ();
    Shaders.Initialize ();
    VoxelTypes.CreateTileTexture ();

    Threading.CurrentWindow = w;
    Gl.Viewport (0, 0, w.ClientSize.Width, w.ClientSize.Height);

    while (w.IsClosing == false) {
        Threading.FrameFinished.WaitOne ();

        Glfw.PollEvents ();
        Threading.InputHandle.Set ();

        RenderFrame ();
        Glfw.SwapBuffers (w);
    }

    if (Game.ResolutionChange == false) {
        Game.StopIssued = true;
    }
}
Code for glDrawElements:

public static void RenderTiles (Matrix4x4 view, Matrix4x4 projection) {
    RenderState.UpdateTiles ();

    Shaders.TileShader.Use ();
    Gl.BindTexture (TextureTarget.Texture2dArray, VoxelTypes.TileTexture.ID);
    Gl.UniformMatrix4f (Shaders.TileShader.UniformView, 1, false, view);
    Gl.UniformMatrix4f (Shaders.TileShader.UniformProjection, 1, false, projection);

    foreach (KeyValuePair<Vector3Int, TileRenderObject> entry in RenderState.TRO) {
        Matrix4x4 model = Matrix4x4.CreateTranslation ((entry.Key - RenderState.FocalPoint) * Tile.Size);
        Gl.UniformMatrix4f (Shaders.TileShader.UniformModel, 1, false, model);
        Gl.BindVertexArray (entry.Value.VAO);
        Gl.DrawElements (PrimitiveType.Triangles, entry.Value.Length, DrawElementsType.UnsignedInt, 0);
        Gl.BindVertexArray (0);
    }

    Gl.BindTexture (TextureTarget.Texture2dArray, 0);
}
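For completeness, the kind of per-draw check I've been using while debugging looks roughly like this (a sketch; CheckGlError is a hypothetical helper name, not part of OpenGL.Net):

```csharp
using System;
using OpenGL;

static class GlErrors {
    // Sketch: drain the GL error queue right after a draw call so a bad
    // draw state shows up at the call site rather than as a crash later
    // in glfwSwapBuffers.
    public static void CheckGlError (string where) {
        ErrorCode err;
        while ((err = Gl.GetError ()) != ErrorCode.NoError)
            Console.WriteLine ($"GL error after {where}: {err}");
    }
}

// Usage, immediately after the draw:
//   Gl.DrawElements (PrimitiveType.Triangles, entry.Value.Length, DrawElementsType.UnsignedInt, 0);
//   GlErrors.CheckGlError ("DrawElements");
```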
asked on Stack Overflow Apr 22, 2020 by bitkoala • edited Apr 23, 2020 by Jonas W

0 Answers


User contributions licensed under CC BY-SA 3.0