Remove glapi.sh code generation, replace it with hand-written loading
code that checks extension strings before calling eglGetProcAddress.
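
A minimal sketch of that loading pattern, assuming a hypothetical
check_ext helper (the real wlroots code differs in the details):

    #include <stdbool.h>
    #include <string.h>
    #include <EGL/egl.h>
    #include <EGL/eglext.h>

    /* Returns true if ext appears as a full word in the
     * space-separated extension list exts. */
    static bool check_ext(const char *exts, const char *ext) {
        size_t extlen = strlen(ext);
        const char *end = exts + strlen(exts);
        while (exts < end) {
            if (*exts == ' ') {
                exts++;
                continue;
            }
            size_t n = strcspn(exts, " ");
            if (n == extlen && strncmp(ext, exts, n) == 0) {
                return true;
            }
            exts += n;
        }
        return false;
    }

    static PFNEGLCREATEIMAGEKHRPROC load_create_image(EGLDisplay display) {
        const char *exts = eglQueryString(display, EGL_EXTENSIONS);
        if (exts == NULL || !check_ext(exts, "EGL_KHR_image_base")) {
            return NULL;
        }
        /* Only look the entry point up once the extension is known to
         * be present: eglGetProcAddress may return a non-NULL pointer
         * even for unsupported functions. */
        return (PFNEGLCREATEIMAGEKHRPROC)
            eglGetProcAddress("eglCreateImageKHR");
    }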
The GLES2 renderer still uses global state because of:
- {PUSH,POP}_GLES2_DEBUG macros (sketched below)
- wlr_gles2_texture_from_* taking a wlr_egl instead of the renderer
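
For illustration, a simplified sketch of the first blocker (not the
exact wlroots definitions): the macros expand to globally loaded
GL_KHR_debug entry points and take no arguments, so they cannot be
threaded through a renderer instance.

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    /* Entry points loaded once, into globals, at init time. */
    static PFNGLPUSHDEBUGGROUPKHRPROC push_debug_group;
    static PFNGLPOPDEBUGGROUPKHRPROC pop_debug_group;

    /* No way to pass a struct wlr_gles2_renderer * through these. */
    #define PUSH_GLES2_DEBUG push_debug_group( \
        GL_DEBUG_SOURCE_APPLICATION_KHR, 1, -1, __func__)
    #define POP_GLES2_DEBUG pop_debug_group()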
This requires functions without a prototype declaration to be static.
This makes it possible to detect dead code, export fewer symbols and
put shared functions in headers.
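
A hypothetical illustration of the resulting convention (the helper
names are invented):

    #include <stdbool.h>

    struct wlr_gles2_renderer;
    struct wlr_texture;

    /* Shared helper: prototyped in a header (e.g. render/gles2.h) and
     * callable from other files. */
    void push_gles2_debug(struct wlr_gles2_renderer *renderer);

    /* File-local helper: no prototype in any header, so it must be
     * static. The compiler can now warn if it becomes dead code, and
     * it is not exported from the library. */
    static bool texture_is_opaque(struct wlr_texture *texture);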
Prior to this commit, compositors needed to render the texture to an
intermediate off-screen buffer using wlr_renderer APIs if they wanted to
use a custom rendering path (e.g. render to a 3D scene).
A new wlr_gles2_texture_get_attribs function exposes the GL texture
target and ID
so that compositors can render wlr_textures with their own shaders. An
example of a compositor doing so is available at [1].
[1]: 3db905b784/src/render.c (L227)
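
A minimal sketch of how a compositor might use it, assuming it manages
its own GL programs (bind_wlr_texture and the shader handles are
invented names):

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>
    #include <wlr/render/gles2.h>

    /* External textures need a samplerExternalOES sampler
     * (GL_OES_EGL_image_external); 2D textures use a sampler2D. */
    static void bind_wlr_texture(struct wlr_texture *texture,
            GLuint shader_2d, GLuint shader_external) {
        struct wlr_gles2_texture_attribs attribs;
        wlr_gles2_texture_get_attribs(texture, &attribs);

        glUseProgram(attribs.target == GL_TEXTURE_EXTERNAL_OES
            ? shader_external : shader_2d);
        glBindTexture(attribs.target, attribs.tex);
        /* ... set uniforms, draw ... */
    }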
We don't need our own enum for texture types. Instead we just use
GL_TEXTURE_{2D,EXTERNAL_OES}, which already describe the usage.
Also fixes a case where we were using GL_TEXTURE_2D when we should not
have: wl_drm buffers are always GL_TEXTURE_EXTERNAL_OES, no matter
whether they're RGB or any other format.
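
An illustrative sketch of the import side (import_image and
from_wl_drm are invented names; glEGLImageTargetTexture2DOES itself
has to be loaded through eglGetProcAddress):

    #include <stdbool.h>
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    static GLuint import_image(EGLImageKHR image, bool from_wl_drm) {
        /* The target itself encodes how the texture must be sampled,
         * so no wlroots-specific type enum is needed. wl_drm buffers
         * are always external, even when their format is RGB. */
        GLenum target = from_wl_drm ?
            GL_TEXTURE_EXTERNAL_OES : GL_TEXTURE_2D;

        GLuint tex = 0;
        glGenTextures(1, &tex);
        glBindTexture(target, tex);
        glEGLImageTargetTexture2DOES(target, (GLeglImageOES)image);
        return tex;
    }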
When a texture is destroyed between wlr_egl_make_current and
wlr_egl_swap_buffers, the destruction resets the current EGL surface
to NULL. This makes wlr_egl_swap_buffers fail.
If the EGL context is already current, there's no need to reset it.
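
A minimal sketch of the check (make_current_if_needed is an invented
name, not the wlroots function):

    #include <EGL/egl.h>

    static EGLBoolean make_current_if_needed(EGLDisplay display,
            EGLSurface surface, EGLContext context) {
        /* Skip the eglMakeCurrent call when the wanted context and
         * surface are already bound. */
        if (eglGetCurrentContext() == context &&
                eglGetCurrentSurface(EGL_DRAW) == surface) {
            return EGL_TRUE;
        }
        return eglMakeCurrent(display, surface, surface, context);
    }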
We were assuming GL_BGRA_EXT was always supported.
We now check that it's supported for rendering, and fail if it isn't,
since the Wayland protocol specifies this format as always supported.
We also check whether it's supported for reading pixels. A new
preferred_read_format function returns the preferred format that can
be used to read pixels. This is used by the screencopy protocol.
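
A sketch of how such a function can be built on GLES2's read-format
query; the body is an assumption, not necessarily how wlroots
implements it:

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    static GLenum preferred_read_format(void) {
        /* Ask the implementation which format/type pair glReadPixels
         * supports besides the mandatory GL_RGBA + GL_UNSIGNED_BYTE. */
        GLint format = 0, type = 0;
        glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &format);
        glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &type);
        if (format == GL_BGRA_EXT && type == GL_UNSIGNED_BYTE) {
            return GL_BGRA_EXT;
        }
        /* GL_RGBA + GL_UNSIGNED_BYTE is always readable in GLES2. */
        return GL_RGBA;
    }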