Wx's GLContext not usable by PyOpenGL under Linux; Windows OK (OpenGL)

Hi,

Short background: I am developing an application whose UI is built with wxPython. I’d like wxPython to take care of window and OpenGL context creation, and I will use ctypes to call .so/.dll libraries that do the heavy lifting (data analysis). My first test case was ripping all the X11-specific code out of glxgears and putting only the GL calls in an .so; that worked under both Windows and Linux. This is my first foray into OpenGL, so I only found out after this exercise that glxgears is more than ancient. However, my attempts to use modern OpenGL are foiled by some issue with the GLContext.
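
To illustrate the intended architecture, here is a minimal sketch of that ctypes hand-off (the library name, entry point and signature below are placeholders, not my actual code):

import ctypes

# Load the analysis library (placeholder name)
lib = ctypes.CDLL("./libanalysis.so")

# Declare the signature of the hypothetical entry point before calling it
lib.run_analysis.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_size_t]
lib.run_analysis.restype = ctypes.c_int

# ctypes converts a c_double array to the pointer parameter automatically
data = (ctypes.c_double * 4)(1.0, 2.0, 3.0, 4.0)
status = lib.run_analysis(data, len(data))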

I found a suitable Python-only test case on Stack Overflow, using wxPython and PyOpenGL, that demonstrates the issue. This is my modified version:

#! /usr/bin/env python
 
# Adapted from:
# https://stackoverflow.com/questions/39734211/how-do-i-get-pyopengl-to-work-with-a-wxpython-context-based-on-this-c-modern
#
# Works on Windows but not on Linux
#
# Tested with:
# --------------------------------------
# Windows 7 AMD64
# python : 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 23:03:10) [MSC v.1916 64 bit (AMD64)]
# wxpython : 4.1.0 msw (phoenix) wxWidgets 3.1.4
# pyopengl : 3.1.5
# OpenGL : 4.6.0 NVIDIA 460.89
# --------------------------------------
# Fedora 32  kernel 5.9.14-100.fc32.x86_64
# python : 3.8.6 (default, Sep 25 2020, 00:00:00) 
# [GCC 10.2.1 20200723 (Red Hat 10.2.1-1)]
# wxpython : 4.1.1 gtk3 (phoenix) wxWidgets 3.1.5
# pyopengl : 3.1.5
# OpenGL : 4.6.0 NVIDIA 455.45.01
# --------------------------------------
#
# On Linux the GL context created by wx seems to be only partially visible to
# other "code": to PyOpenGL for one, but the same problem occurs in low-level
# code in a .so imported using the ctypes module - the code in the InitGL()
# function works up to the call glEnableVertexAttribArray().
# - maybe at the driver level the context is known and hence most GL stuff works?
# - but perhaps it is not "visible" in the X11-OpenGL interface, i.e. GLX?
#
# Also of interest: an earlier .so that used the drawing commands from glxgears,
# i.e. OpenGL 1.x only, did work. In any case, the current PyOpenGL-based example
# demonstrates the issue sufficiently, I think.

# I think wx uses libepoxy to load OpenGL. I tried to CDLL libepoxy, libGL and
# libGLX with RTLD_GLOBAL before importing PyOpenGL (import OpenGL) and wx,
# thinking maybe some global variable was defined more than once, but to no
# avail; see the sketch below.
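#
# That preload attempt looked roughly like this (a sketch; the exact .so names
# vary by distro), placed before the wx/OpenGL imports:
#
#   import ctypes
#   for _lib in ("libepoxy.so.0", "libGLX.so.0", "libGL.so.1"):
#       try:
#           ctypes.CDLL(_lib, mode=ctypes.RTLD_GLOBAL)
#       except OSError:
#           pass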
#
# Setting g_exclude_crash_trigger = True allows the code to run by not calling
# glVertexAttribPointer() (the crash trigger here, see the traceback below) and,
# since it then no longer makes sense, glDrawArrays() - in that case the window
# comes up with a dark green background, so glClear() *is* working.

import platform as pfinfo    # prevent conflict with OpenGL.platform
import sys

import OpenGL
#OpenGL.CONTEXT_CHECKING = True
from OpenGL.GL import *          # the package is called pyopengl
import wx
from wx import glcanvas
import numpy as np


g_exclude_crash_trigger = False

def show_version_info():
    print('--------------------------------------')
    try:
        import distro
        print('{} {}  kernel {}'.format(distro.name(),distro.version(),pfinfo.uname().release))
    except ImportError:
        print('{} {} {}'.format(pfinfo.system(), pfinfo.release(), pfinfo.machine()))
    print('python : {}'.format(sys.version))
    print('wxpython : {}'.format(wx.version()))
    print('pyopengl : {}'.format(OpenGL.version.__version__))
    print('OpenGL : {}'.format(glGetString(GL_VERSION).decode('utf8')))
    print('--------------------------------------')
    sys.stdout.flush()



vertexSource = """
#version 130
in vec2 position;
void main()
{
    gl_Position = vec4(position, 0.0, 1.0);
}
"""
fragmentSource = """
#version 130
out vec4 outColor;
void main()
{
    outColor = vec4(0.2, 0.3, 1.0, 1.0);
}
"""

class OpenGLCanvas(glcanvas.GLCanvas):
    def __init__(self, parent):
        glcanvas.GLCanvas.__init__(self, parent, -1, size=(640, 480))
        self.init = False
        #self.context = glcanvas.GLContext(self)
        ctxAttrs = glcanvas.GLContextAttrs()
        ctxAttrs.CoreProfile().EndList()
        self.context = glcanvas.GLContext(self, ctxAttrs=ctxAttrs)

        #self.Bind(wx.EVT_ERASE_BACKGROUND, self.OnEraseBackground)
        self.Bind(wx.EVT_PAINT, self.OnPaint)

    def OnEraseBackground(self, event):
        pass # Do nothing, to avoid flashing on MSW.
    
    def OnPaint(self, event):
        dc = wx.PaintDC(self)  # a wx.PaintDC must be created in an EVT_PAINT handler, even if unused
        self.SetCurrent(self.context)
        if not self.init:
            # Note: this is the proper time to initialise the OGL stuff. I
            # tried other events (EVT_SHOW, EVT_CREATE) that occur before the
            # first EVT_PAINT but the GL context does not exist yet
            # Note, the wx "view" of the GL context is that it has been created
            # and is proper:
            assert self.context.IsOK()
            self.InitGL()
            self.init = True
        self.OnDraw()
    
    def InitGL(self):
    
        # Vertex Input
        ## Vertex Array Objects
        vao = glGenVertexArrays(1)
        glBindVertexArray(vao)
    
        ## Vertex Buffer Object
        vbo = glGenBuffers(1) # Generate 1 buffer
    
        vertices = np.array([0.0,  0.5, 0.5, -0.5, -0.5, -0.5], dtype=np.float32)
    
        ## Upload data to GPU
        glBindBuffer(GL_ARRAY_BUFFER, vbo)
        glBufferData(GL_ARRAY_BUFFER, vertices.nbytes, vertices, GL_STATIC_DRAW)
    
        # Compile the shaders and combine them into a program
        ## Create and compile the vertex shader
        vertexShader = glCreateShader(GL_VERTEX_SHADER)
        glShaderSource(vertexShader, vertexSource)
        glCompileShader(vertexShader)
    
        ## Create and compile the fragment shader
        fragmentShader = glCreateShader(GL_FRAGMENT_SHADER)
        glShaderSource(fragmentShader, fragmentSource)
        glCompileShader(fragmentShader)

        show_version_info()  # needs a current GL context for glGetString()

    
        ## Link the vertex and fragment shader into a shader program
        shaderProgram = glCreateProgram()
        glAttachShader(shaderProgram, vertexShader)
        glAttachShader(shaderProgram, fragmentShader)
        glBindFragDataLocation(shaderProgram, 0, "outColor")
        glLinkProgram(shaderProgram)
        glUseProgram(shaderProgram)
    
        # Make the link between vertex data and attributes
        posAttrib = glGetAttribLocation(shaderProgram, "position")
        glEnableVertexAttribArray(posAttrib)
        if not g_exclude_crash_trigger:
            glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, None)

    def OnDraw(self):
        # Set clear color
        glClearColor(0.0, 0.2, 0.0, 1.0)
        #Clear the screen to dark green
        glClear(GL_COLOR_BUFFER_BIT)
    
        # draw the triangle
        if not g_exclude_crash_trigger:
            glDrawArrays(GL_TRIANGLES, 0, 3)
  
        self.SwapBuffers()

class Frame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="Blue triangle on dark green background", size=(640,480))
        canvas = OpenGLCanvas(self)

app = wx.App()
frame = Frame()
frame.Show()
app.MainLoop()

I have put some details of the issue as comments in the code so that it can all be found in one place. Under Windows this runs fine and a blue triangle is drawn on a dark green background; under Linux it does not (that is a first!). It fails with the following traceback:

Traceback (most recent call last):
  File "./glctxproblem.py", line 116, in OnPaint
    self.InitGL()
  File "./glctxproblem.py", line 162, in InitGL
    glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, None)
  File "/home/ichneumwn/.local/lib/python3.8/site-packages/OpenGL/latebind.py", line 63, in __call__
    return self.wrapperFunction( self.baseFunction, *args, **named )
  File "/home/ichneumwn/.local/lib/python3.8/site-packages/OpenGL/GL/VERSION/GL_2_0.py", line 469, in glVertexAttribPointer
    contextdata.setValue( key, array )
  File "/home/ichneumwn/.local/lib/python3.8/site-packages/OpenGL/contextdata.py", line 58, in setValue
    context = getContext( context )
  File "/home/ichneumwn/.local/lib/python3.8/site-packages/OpenGL/contextdata.py", line 40, in getContext
    raise error.Error(
OpenGL.error.Error: Attempt to retrieve context when no valid context

Any ideas what the issue might be? Is there a way to get the handle of the GLContext and somehow pass it on to PyOpenGL explicitly?

Cheers

Follow-up by myself :slight_smile:

Maybe someone could run the sample on a different flavour of Linux, to exclude the possibility that this is specific to my system?

Thx

Another follow-up by myself… in case anyone else stumbles over this. I have had to do what I was hoping to avoid: learn how to use GLX to create a context myself. I create a Frame, get its XID using GetHandle() and pass it on to my C code. If the Frame’s XWindow does not have the right Visual, I create an XWindow with the required Visual as a child of the Frame’s window.
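
The Python side of the hand-off is only a few lines. A rough sketch (the .so name and its entry point below are placeholders for my own code):

import ctypes
import wx

app = wx.App()
frame = wx.Frame(None, title="GL host", size=(640, 480))
frame.Show()

# On GTK, GetHandle() returns the X11 window ID (XID) of the widget
xid = frame.GetHandle()

# The C side uses GLX (glXChooseFBConfig/glXCreateContextAttribsARB) to
# create a context for that window; library and function names are placeholders
mylib = ctypes.CDLL("./libixmwn_glcontext.so")
mylib.create_glx_context.argtypes = [ctypes.c_ulong]
mylib.create_glx_context(xid)

app.MainLoop()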

I will attempt to clean up the code and make it available once I have the time (might be a while).

As far as I can tell wxPython calls the underlying wxWidgets functions directly, so I guess there is an issue in wxWidgets. Hopefully it will work one day without my roundabout way - I would be happy to bin the workaround code I created :slight_smile:

I guess I forgot to reply to this as I had intended. I don’t have an answer for you, and my OpenGL skills are very limited, but I did notice that I get a different error on OSX:

Traceback (most recent call last):
  File "/Users/robind/tmp/opengl-test.py", line 116, in OnPaint
    self.InitGL()
  File "/Users/robind/tmp/opengl-test.py", line 156, in InitGL
    glUseProgram(shaderProgram)
  File "/Users/robind/.myPyEnv/Py39/lib/python3.9/site-packages/OpenGL/error.py", line 230, in glCheckError
    raise self._errorClass(
OpenGL.error.GLError: GLError(
	err = 1282,
	description = b'invalid operation',
	baseOperation = glUseProgram,
	cArguments = (51,)
)

Maybe this will provide some sort of clue that will help you figure out what is wrong.

The best way to investigate that is to re-create your sample in C++ and try it there.

Thanks Robin,

I am guessing that the OSX error shows that GL contexts work differently on different platforms. It seems some GL stuff runs even without a context, and what runs without a context might differ between platforms - speculation :slight_smile:

On the other hand, the code that I got working with my own context creation is not quite the same as the code I originally posted; after I got it working, it co-evolved with the context-creation code. However, the working code (which I’ll include below without the context-creation part, as that still needs work) shows the same error as the problem code above, and it works when I replace the wxPython GLCanvas with my own (on Linux). As before, it works on Windows with the standard wxPython GLCanvas. You can see the “if False” at the top of the code that switches between wx’s GLCanvas and mine.

#! /usr/bin/env python

from OpenGL.GL import *

import wx
import wx.glcanvas as wxgl
if False:   # set to True to use my own GLX context module instead of wx's
  import ixmwn_glcontext as ixgl
else:
  import wx.glcanvas as ixgl
import numpy as np
import sys

vertexSource = """
#version 130
in vec2 position;
void main()
{
    gl_Position = vec4(position, 0.0, 1.0);
}
"""
fragmentSource = """
#version 130
out vec4 outColor;
void main()
{
    outColor = vec4(0.2, 0.3, 1.0, 1.0);
}
"""

class Frame(wx.Frame):
  def __init__(self):
    wx.Frame.__init__(self, None, title="Blue triangle on dark green background", size=(640,480))
    canvasAttrs = ixgl.GLAttributes()
    canvasAttrs.PlatformDefaults().MinRGBA(1, 1, 1, 0).DoubleBuffer().Depth(1).EndList()
    self.glcv = ixgl.GLCanvas(self, canvasAttrs)
    contextAttrs = ixgl.GLContextAttrs()
    self.glctx = ixgl.GLContext(self.glcv, ctxAttrs=contextAttrs)
    self.Bind(wx.EVT_PAINT, self.on_paint)
    self.Bind(wx.EVT_SIZE, self.on_size)
    self._must_init = True

  def InitGL(self):
    # Vertex Input
    ## Vertex Array Objects
    vao = glGenVertexArrays(1)
    glBindVertexArray(vao)

    ## Vertex Buffer Object
    vbo = glGenBuffers(1) # Generate 1 buffer

    vertices = np.array([0.0,  0.5, 1.0, -0.5, -0.5, -0.5], dtype=np.float32)

    ## Upload data to GPU
    glBindBuffer(GL_ARRAY_BUFFER, vbo)
    glBufferData(GL_ARRAY_BUFFER, vertices.nbytes, vertices, GL_STATIC_DRAW)

    # Compile the shaders and combine them into a program
    ## Create and compile the vertex shader
    vertexShader = glCreateShader(GL_VERTEX_SHADER)
    glShaderSource(vertexShader, vertexSource)
    glCompileShader(vertexShader)

    ## Create and compile the fragment shader
    fragmentShader = glCreateShader(GL_FRAGMENT_SHADER)
    glShaderSource(fragmentShader, fragmentSource)
    glCompileShader(fragmentShader)

    ## Link the vertex and fragment shader into a shader program
    shaderProgram = glCreateProgram()
    glAttachShader(shaderProgram, vertexShader)
    glAttachShader(shaderProgram, fragmentShader)
    glBindFragDataLocation(shaderProgram, 0, "outColor")
    glLinkProgram(shaderProgram)
    glUseProgram(shaderProgram)

    # Make the link between vertex data and attributes
    posAttrib = glGetAttribLocation(shaderProgram, "position")
    glEnableVertexAttribArray(posAttrib)
    glVertexAttribPointer(posAttrib, 2, GL_FLOAT, GL_FALSE, 0, None)

  def actual_on_size(self):
    size = self.glcv.GetClientSize()
    self.glcv.SetCurrent(self.glctx)
    glViewport(0, 0, size[0], size[1])
    self.actual_draw()

  def actual_draw(self):
    self.glcv.SetCurrent(self.glctx)
    glClearColor(0.0, 0.2, 0.0, 1.0)
    # Clear the screen to dark green
    glClear(GL_COLOR_BUFFER_BIT)
    glDrawArrays(GL_TRIANGLES, 0, 3)
    self.glcv.SwapBuffers()

  def on_size(self, event):
    if self.glctx.IsOK():
      wx.CallAfter(self.actual_on_size)   # CallAfter is necessary!
    event.Skip()

  def on_paint(self, event):
    if not self.glctx.IsOK():
      event.Skip()
      return
    if self._must_init:
      self._must_init = False
      self.glcv.SetCurrent(self.glctx)
      self.InitGL()
    self.actual_draw()
    event.Skip()

app = wx.App()
frame = Frame()
frame.Show()
app.MainLoop()

So it could be that the original code contained an error (although it did work on Windows, so probably not). Would you run this version on OSX, out of curiosity?

I don’t know C++ so I can’t check wxWidgets :frowning_face:, but my workaround does the job for me.

In any case, thanks again for looking into it

Good morning,
I ran both versions of your code under Ubuntu 20.04 and got the same result.
My system:
OS: Linux (5.4.0-70-generic)
Python: 3.8.5
wxPython: 4.1.1 gtk3 (phoenix) wxWidgets 3.1.5
PyOpenGL 3.1.5
Numpy: 1.20.1
Pillow: 8.1.2
Matplotlib: 3.3.4

and
OpenGL version: b'4.6 (Compatibility Profile) Mesa 20.2.6'
GLSL version: b'4.60'
Vendor: b'Intel'
Renderer: b'Mesa Intel® Iris® Plus Graphics 640 (Kaby Lake GT3e) (KBL GT3)'
GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS: 192
GL_MAX_CUBE_MAP_TEXTURE_SIZE: 16384
GL_MAX_DRAW_BUFFERS: 8
GL_MAX_FRAGMENT_UNIFORM_COMPONENTS: 16384
GL_MAX_TEXTURE_IMAGE_UNITS: 32
GL_MAX_TEXTURE_SIZE: 16384
GL_MAX_VARYING_FLOATS: 128
GL_MAX_VERTEX_ATTRIBS: 16
GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS: 32
GL_MAX_VERTEX_UNIFORM_COMPONENTS: 16384
GL_MAX_RENDERBUFFER_SIZE: 16384
GL_MAX_VIEWPORT_DIMS: 16384, 16384
GL_STEREO: False

Thanks bhomer, interesting to see that this is not limited to the NVIDIA drivers.

I was able to get this working on macOS Monterey, running on a 2017 iMac, by making the following change:

In the source for both shaders, replace "#version 130" with "#version 410".
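
That is, each shader source begins with the new version directive; e.g. the vertex shader becomes:

vertexSource = """
#version 410
in vec2 position;
void main()
{
    gl_Position = vec4(position, 0.0, 1.0);
}
"""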

Hope this is not too late to help.