Remove WLR_DRM_NO_ATOMIC_GAMMA workaround

This is fixed on amdgpu, so we don't need this anymore.
Branch: master
Author: Scott Anderson (6 years ago), committed by Simon Ser
parent d201fc3506
commit b85f0cbff9

@@ -211,11 +211,7 @@ static bool atomic_crtc_set_gamma(struct wlr_drm_backend *drm,
 		uint16_t *r, uint16_t *g, uint16_t *b) {
 	// Fallback to legacy gamma interface when gamma properties are not available
 	// (can happen on older Intel GPUs that support gamma but not degamma).
-	// TEMP: This is broken on AMDGPU. Provide a fallback to legacy until they
-	// get it fixed. Ref https://bugs.freedesktop.org/show_bug.cgi?id=107459
-	const char *no_atomic_str = getenv("WLR_DRM_NO_ATOMIC_GAMMA");
-	bool no_atomic = no_atomic_str != NULL && strcmp(no_atomic_str, "1") == 0;
-	if (crtc->props.gamma_lut == 0 || no_atomic) {
+	if (crtc->props.gamma_lut == 0) {
 		return legacy_iface.crtc_set_gamma(drm, crtc, size, r, g, b);
 	}

@@ -7,8 +7,6 @@ wlroots reads these environment variables
   considered the primary DRM device.
 * *WLR_DRM_NO_ATOMIC*: set to 1 to use legacy DRM interface instead of atomic
   mode setting
-* *WLR_DRM_NO_ATOMIC_GAMMA*: set to 1 to use legacy DRM interface for gamma
-  control instead of the atomic interface
 * *WLR_LIBINPUT_NO_DEVICES*: set to 1 to not fail without any input devices
 * *WLR_BACKENDS*: comma-separated list of backends to use (available backends:
   wayland, x11, headless, noop)
