> OpenGL has had support for *real* 64b pointers in shaders through NV_shader_buffer_load/store extensions for almost 10 years now, why can't we get such a basic mechanism standard in all shading languages and "modern" APIs in 2018 ?
>
> — Cyril Crassin (@Icare3D) November 7, 2018
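
For context, NV_shader_buffer_load is the extension that lets GLSL declare and dereference raw 64-bit GPU addresses as pointers. The sketch below shows roughly what that looks like on the shader side; the `positions` uniform name is illustrative, and the application side (making the buffer resident, querying GL_BUFFER_GPU_ADDRESS_NV, uploading the address with glUniformui64NV) is assumed rather than shown.

```glsl
#version 330
#extension GL_NV_shader_buffer_load : require

// 'positions' holds a raw 64-bit GPU address supplied by the application
// (illustrative name; the app would query GL_BUFFER_GPU_ADDRESS_NV after
// glMakeBufferResidentNV and upload it with glUniformui64NV).
uniform vec4 *positions;

void main()
{
    // Dereference the pointer like an ordinary array.
    gl_Position = positions[gl_VertexID];
}
```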