Hardware suggestions and graphics cards (Ubuntu or macOS)

New user here.
I work as a neuroradiologist and love to create 3D models for teaching and research.
The work that the developers are doing with this software is awesome.

I currently use 3D Slicer on a small desktop (Intel NUC i5-8259U, 16GB RAM, 512GB SSD, Ubuntu 20.04) and on a MacBook Pro (i5-7360U, 8GB RAM, 256GB SSD, Catalina). I usually make segmentations from DICOM images of interventional procedures to show anatomy and needles/catheters, and performance is quite poor on both, especially with “show 3D” activated.
But how can I improve it?

  • buying an eGPU that I could use for both?
  • buying a better desktop PC (maybe the new XPS 8940 i7-10700, RTX 2060)?
  • buying a 16" MBP with discrete GPU?

I’m especially doubtful about graphics cards, OpenGL, and drivers (GeForce 1660 Ti/2060 or Quadro P1000/P2000? Or maybe AMD? Are they compatible with Linux/Ubuntu?), which is the reason I’m considering the 16" MBP.
I am open to any OS, but if I could choose I would go with Linux/Ubuntu.

Thank you very much to anyone who can offer some suggestions.
Marco

Hi, you can get an idea here:

https://slicer.readthedocs.io/en/latest/user_guide/getting_started.html#system-requirements

Just as a data point:

I recently upgraded to a MacBook Pro 16" with the upgraded Radeon Pro 5500M 8GB graphics card and 64GB of memory. I also got a 2TB drive. My needs were geared specifically toward using Slicer.

I do micro-CT of living and fossil insects. My datasets range from 500MB to 3GB in size, and the models I produce run to tens of millions of polygons (20-50 million polys in a model is not uncommon). I do a lot of segmentation by hand, and I find my setup pretty usable. I could have gone for a super powerful Dell with an NVIDIA P5000, but that would have been significantly more expensive, and I tend to prefer Unix-based environments.

I’ve thought about an external GPU, but not taken the plunge. Interested to hear from anyone who has and how it’s worked out.

Hope this helps. Happy to discuss further.

Thank you, Herhold, for your answer.

That would be nice, as the laptop is great and I could work both at home (32" USB-C monitor) and everywhere else, but it is quite expensive compared to a desktop setup. The XPS I mentioned costs a third to a quarter of the MBP 16" you mentioned and is roughly as powerful (or maybe more).

Considering that my datasets are 500MB-2GB and that this is more of a hobby than my actual work, my specific doubts are:

  • is Quadro better than GeForce because of OpenGL (in the same price range)?
  • is NVIDIA really better than AMD? Or should I go AMD with Linux because of open source drivers?
  • if I understand correctly, the CPU is more important than the GPU even when showing 3D models during multiple segmentations. Is a 1660 Ti or 2060 enough, or would I see a substantial improvement with a 2070?
  • can I expect Slicer and Ubuntu to work flawlessly with the XPS I mentioned? Or would Windows work better?
  • does anyone have experience with Mac or Ubuntu and an eGPU for Slicer?

Quadro cards tend to be more powerful.

Oh boy, that’s like Ford vs Chevy, or take your pick of industry rivalries. Same on the subject of open source drivers. I would say that if you’re ever planning on doing any kind of machine learning, you should seriously consider NVIDIA. TensorFlow and things like that tend to be NVIDIA-only, or at least NVIDIA-preferred. It would be nice if someone would weigh in on this - I don’t do machine learning, largely because I have an AMD GPU.

Really, both are important, but if you’re pushing lots of triangles, you want a fast GPU. Also, if you’re doing volume rendering, a GPU is basically required. When you’re segmenting, a fast CPU is good for the segmentation part, and a fast GPU is good for displaying the 3D model. There are things you can do to speed things up (subsampling, decimation, turning off smoothing, etc.), and there are a lot of people here who can help you out with that.
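For example, you can lower the triangle count of a segmentation’s 3D surface from Slicer’s Python console through its conversion parameters. This is just a rough sketch - the node name and the parameter values are hypothetical starting points, not a recipe:

    # Reduce triangle counts for an existing segmentation (Slicer Python console).
    # "Segmentation" is a hypothetical node name; adjust values to taste.
    segmentationNode = slicer.util.getNode("Segmentation")
    segmentation = segmentationNode.GetSegmentation()
    segmentation.SetConversionParameter("Smoothing factor", "0.2")   # less smoothing work
    segmentation.SetConversionParameter("Decimation factor", "0.5")  # drop roughly half the triangles

    # Re-run the labelmap -> closed surface conversion so the new settings take effect
    surfaceName = slicer.vtkSegmentationConverter.GetSegmentationClosedSurfaceRepresentationName()
    segmentation.RemoveRepresentation(surfaceName)
    segmentationNode.CreateClosedSurfaceRepresentation()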

I don’t have any experience with Slicer on Linux but on Windows and Mac it’s fine. In my experience, cross-platform applications tend to be more solid. “Better” is probably relative. Plenty of people here use Linux.

Hope this helps.

Not necessarily.

The marketing difference is that Quadro is aimed at data center deployment (designed for dense installation) or commercial usage, with long-term driver support and some differences in floating-point operations (which may cause a performance difference if you do a lot of that). Also, the Quadro driver supports OpenGL over remote desktop quite well (not an option if you are using GeForce).

If you don’t need the large amount of video RAM that Quadro offers, an equivalent GeForce works well.

As for eGPUs, I have no experience. Is there even a Mac driver for those cards?

AMD continues to cap GL_MAX_3D_TEXTURE_SIZE at 2K, even on their high-end cards. For Slicer, this means you cannot render a 3D volume on the GPU if one of its dimensions is 2049 voxels or larger. On most new NVIDIA cards this limit is 16K or more.
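If you want to see what your own card reports, one quick way is to query it from Slicer’s Python console. This is only a sketch - it assumes PyOpenGL can be installed into Slicer’s Python environment and that the 3D view’s OpenGL context can be made current on your platform:

    # Query the driver's GL_MAX_3D_TEXTURE_SIZE from Slicer's Python console.
    slicer.util.pip_install("PyOpenGL")  # only needed once
    from OpenGL.GL import glGetIntegerv, GL_MAX_3D_TEXTURE_SIZE

    # Make the 3D view's OpenGL context current before issuing GL queries
    renderWindow = slicer.app.layoutManager().threeDWidget(0).threeDView().renderWindow()
    renderWindow.MakeCurrent()
    print("GL_MAX_3D_TEXTURE_SIZE =", glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE))

On a card capped at 2K this should print 2048; recent NVIDIA cards typically report 16384 or more.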

If you are using Linux, I think NVIDIA has much better driver support, at least on CentOS and Ubuntu, which I am familiar with. As for whether the XPS works flawlessly with Linux, you need to check whether Linux is supported by Dell for that particular line. Most likely it will work, but occasionally weird driver issues arise (e.g., I had an IdeaPad from Lenovo on which I couldn’t get the Wi-Fi card recognized, and the backlight didn’t work).

Yep, go with Murat’s advice - he’s far more knowledgeable.

I do know there are eGPUs for Macs, but they are AMD-based. Never tried one, but would like to.

Slicer works flawlessly on Linux (I am using it with Mint and Ubuntu). Have a large swap space, around 100GB. Also, I have used it on a Dell before, but not the one that you mention.

On Linux, NVIDIA definitely performs better.

NVIDIA started to ease up on this about 1-2 years ago. Now I don’t think anything actively disables remote desktop usage anymore. There may still be some driver incompatibilities. Also, shared usage on a server may violate licensing terms.

In the past we had more compatibility issues with Quadro cards in Slicer. Maybe because they are tested by orders of magnitude fewer users, on a much smaller subset of applications. Also, if someone reports an issue with a Quadro card, it is less likely that we (or VTK or other open-source software developers) can reproduce and debug the issue, because most developers use GeForce cards, too.

Thanks a lot for your support, it helps a lot.

I will go for the desktop i7 + GeForce + Ubuntu, as it is open source and seems cost-effective; if anything goes wrong with drivers, plan B is to switch to Windows.

What I meant by remote rendering is that in an RDP session, I still cannot start Slicer cold if the remote computer has a GeForce card installed. If the remote computer has a Quadro, you can launch Slicer cold and use GPU-accelerated rendering. This is with 451.xx series drivers on Windows 10 Pro.

You may need to install some extra packages - see https://www.khronos.org/news/permalink/nvidia-provides-opengl-accelerated-remote-desktop-for-geforce-5e88fc2035e342.98417181

Wow, I got really excited and installed as instructed, but it still doesn’t work for me. The splash screen comes up and then nothing else.

Do you use it on a shared server or to connect to a personal computer? I did not play with this much, as for a personal computer VNC, AnyDesk, etc. work about as well as RDP. For a shared server you might need some extra steps to allow sharing the GPU between multiple users.

It is between my two Windows laptops (one with a GeForce, the other with a Quadro). It is not a big deal, I just wanted to give it a quick try…


Resurrecting this thread after a long hiatus, sorry…

I’m not so sure about the 2048 restriction on AMD cards. I’m using a GL caps viewer on my MacBook Pro 16" with the AMD Radeon Pro 5500M, and the max 3D texture size is 16384. This is also backed up here: OpenGL capabilities report: GL_MAX_3D_TEXTURE_SIZE

It might be.

I am not a Mac user; we have one in the lab, and when I searched https://opengl.gpuinfo.org for Radeon GPUs, all the ones I looked at were capped at 2K. This was by no means an exhaustive search. And I see that most of them are reported on either Windows or Linux; there is a curious lack of macOS listings.

It would be great if you could verify that your Radeon can render volumes that are 2049 voxels or more in at least one dimension.

Easiest would be to resample MRHead extensively, or use data from MorphoSource.
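Something along these lines from the Python console should do it (a sketch - the CLI parameter names and the spacing value are from memory, so adjust as needed):

    # Load MRHead and upsample it along one axis past 2048 voxels,
    # then try GPU volume rendering on the result.
    import SampleData
    mrHead = SampleData.downloadSample("MRHead")  # 256 x 256 x 130 voxels, 1.3 mm slices

    outputVolume = slicer.mrmlScene.AddNewNodeByClass("vtkMRMLScalarVolumeNode", "MRHeadUpsampled")
    parameters = {
        "InputVolume": mrHead.GetID(),
        "OutputVolume": outputVolume.GetID(),
        # 130 slices * 1.3 mm / 0.05 mm is about 3380 slices, well past the 2K cap
        "outputPixelSpacing": "1,1,0.05",
        "interpolationType": "linear",
    }
    slicer.cli.runSync(slicer.modules.resamplescalarvolume, None, parameters)
    print(outputVolume.GetImageData().GetDimensions())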

Yeah, here’s an AMD entry for macOS that is relatively recent, and is 16K:

https://opengl.gpuinfo.org/displayreport.php?id=3777

I’ll make an example dataset using MRHead and make it like 100 x 100 x 3000 and see what happens.

Thanks!!


OK, here’s MRHead, cropped and resampled to 351x440x4096. So 2048 is not a global AMD limit.
