error: GLSL 1.50 is not supported

Operating system: CentOS 7.6
Slicer version: 4.11.0
NVIDIA driver 418.87, CUDA 10.1

Expected behavior: displays a DICOM image
Actual behavior: nothing is displayed, and errors such as the following are issued:

error: GLSL 1.50 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, and 3.00 ES

I don’t know if the error message is connected with the lack of DICOM display.

Any suggestions?

Hi -

It can be a challenge to get GPU drivers working on Linux. Here are some instructions that might help, based on this document about how to start from scratch on a virtual machine. They are for Debian, but CentOS should be similar, I guess.

Execute the following and take note of the BusID:

sudo nvidia-xconfig --query-gpu-info

Open the X11 configuration file:

sudo vim /etc/X11/xorg.conf

and insert a BusID line, using the value you retrieved earlier, into this Section:

Section "Device"
   Identifier     "Device0"
   Driver         "nvidia"
   VendorName     "NVIDIA Corporation"
   BusID          "PCI:0:4:0"
EndSection
or if /etc/X11/xorg.conf does not exist, create a file in /usr/share/X11/xorg.conf.d/xorg.conf with the contents listed above.
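The steps above can be condensed into a short script. This is only a sketch under assumptions: the BusID shown is the example value from above (substitute the one nvidia-xconfig reports for you), and it writes to a demo path so you can inspect the result first; the real target is /etc/X11/xorg.conf, which needs root.

```shell
# Sketch of the steps above. Assumptions: BUSID is the value reported
# by `sudo nvidia-xconfig --query-gpu-info`; CONF is a demo path --
# in practice use /etc/X11/xorg.conf (requires sudo).
BUSID="PCI:0:4:0"          # placeholder -- substitute your own BusID
CONF="./xorg.conf.demo"    # in practice: /etc/X11/xorg.conf

cat > "$CONF" <<EOF
Section "Device"
   Identifier     "Device0"
   Driver         "nvidia"
   VendorName     "NVIDIA Corporation"
   BusID          "$BUSID"
EndSection
EOF
```

After checking the generated file, copy it into place with sudo and restart the X server.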

Thanks for replying. Sorry, I should have said: I have a GPU with the NVIDIA driver installed OK, but I'm running headless with a remote X11 display. I get the same errors even after creating xorg.conf files with the BusID.
The error seems to be output alongside a dump of vtkPolyData2DVS.glsl (54 lines, starting with 1: #version 150).

It’s probably your remote X11 session that’s leading to the crash then.

I suggest using VNC to connect and let the remote machine do the rendering, like is shown in this video (which used the setup described in the link I sent earlier):

My suggestion is to get VirtualGL installed on your Linux system, connect via VNC (the VirtualGL sister project TurboVNC works really well, even over broadband) or vglconnect (to tunnel), and launch Slicer (or any OpenGL-based application) via the vglrun command.

Instructions for installing on Linux:
https://cdn.rawgit.com/VirtualGL/virtualgl/2.6.3/doc/index.html#hd005001

Thanks for the tips. I had not heard of VirtualGL; interesting.

However I have found a possible solution!

MESA_GL_VERSION_OVERRIDE=3.2 ./Slicer

I am not sure why it works but I can at least see a DICOM image now.
And the error message has gone away.
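For anyone copying this, there are two equivalent ways to apply the workaround; MESA_GL_VERSION_OVERRIDE is a standard Mesa environment variable, and the ./Slicer path is an assumption to adjust for your install:

```shell
# Two ways to apply the workaround (the ./Slicer path is an
# assumption -- adjust for your install):

# 1. Per-invocation: the override applies only to this Slicer process.
# MESA_GL_VERSION_OVERRIDE=3.2 ./Slicer

# 2. Session-wide: every GL program started from this shell sees it.
export MESA_GL_VERSION_OVERRIDE=3.2
echo "$MESA_GL_VERSION_OVERRIDE"
# ./Slicer    # now launch Slicer from the same shell
```

The per-invocation form keeps the rest of the session untouched, which is safer if other GL programs on the machine work fine without the override.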


This worked for us too! We were having trouble getting Slicer to work over RDP. We actually set:

export MESA_GL_VERSION_OVERRIDE=3.3

according to our installed GL driver version:

CentOS Linux release 7.7.1908 (Core)

server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
Max core profile version: 3.3
Max compat profile version: 3.0
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.0
OpenGL core profile version string: 3.3 (Core Profile) Mesa 18.0.5
OpenGL core profile shading language version string: 3.30
OpenGL version string: 3.0 Mesa 18.0.5
OpenGL shading language version string: 1.30
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 18.0.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
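If it helps others, the override value can be derived from the glxinfo listing itself rather than typed by hand. A sketch, fed here from a captured sample of the output above; in practice you would pipe glxinfo (from the mesa-demos/glx-utils package) directly, as the comment shows:

```shell
# Sketch: extract "Max core profile version" from glxinfo output and
# use it as the override. GLXINFO holds a captured sample here; in
# practice: VER=$(glxinfo | sed -n 's/.*Max core profile version: //p')
GLXINFO='Max core profile version: 3.3
Max compat profile version: 3.0'
VER=$(printf '%s\n' "$GLXINFO" | sed -n 's/.*Max core profile version: //p')
export MESA_GL_VERSION_OVERRIDE="$VER"
echo "$MESA_GL_VERSION_OVERRIDE"   # -> 3.3
```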

Hi @kpopuri, given that your ES 3.0 already satisfies the minimum requirement:

OpenGL ES profile version string: OpenGL ES 3.0 Mesa 18.0.5

Did you figure out why you needed the hack in the first place?

@tbillah actually, setting export MESA_GL_VERSION_OVERRIDE=3.0 also does the trick, so I think the issue is not the availability of compatible OpenGL libs but rather letting Slicer know they exist. I am guessing Slicer checks some environment variable to figure out the OpenGL version and, if it cannot find that variable, defaults to an older GLSL version (GLSL 1.50), which the downstream rendering engine then reads and falsely raises an error about.

Hello @pieper, this may need more attention. If I am not parsing the log wrong, someone has:

OpenGL ES profile version string: OpenGL ES 3.0 Mesa 18.0.5

Yet, Slicer raised the reported error for them.

For what it’s worth, the hack worked for me on a cluster that has OpenGL ES 2.0.


I don’t use this approach for OpenGL on X (I use the methods I described above), so I don’t have much to suggest here. Do other VTK-based programs work? (e.g. try ParaView and also VTK examples built from source).