
Fedora and multi gpu usage

asked 2016-11-01 08:34:16 -0500

Elbi

I always have the same problem on GNOME: I don't know where to find the status of my GPUs. I run Fedora 24 on a laptop with an NVIDIA card as well as an Intel chip with Optimus technology, and I would like to know which of these I'm using and how I can manage them.

I use Linux for scientific computation and I don't care about video games. Do I need the NVIDIA drivers, or will nouveau be just fine?

The purpose is to use it with R and Matlab or Octave.

In lspci I got:

00:02.0 VGA compatible controller: Intel Corporation HD Graphics 5500 (rev 09)
04:00.0 3D controller: NVIDIA Corporation GM108M [GeForce 840M] (rev a2)
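For anyone wanting to reproduce this listing, a quick sketch of the lspci filter (the grep pattern simply matches the two controller classes shown above; audio and other devices are filtered out):

```shell
# List only the display adapters:
# "VGA compatible controller" = the integrated Intel chip,
# "3D controller" = the discrete NVIDIA GPU on Optimus laptops
lspci | grep -E "VGA|3D controller"
```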



2 Answers


answered 2016-11-01 13:50:26 -0500

Entropy813

Disclaimer: this is a somewhat self-promotional answer that links to another one of my answers here.

I also use Fedora 24 on a laptop with an Intel and NVIDIA GPU for scientific computing, though I don't use R, Matlab or Octave. I usually write my code in C++ and recently installed CUDA to write GPU kernels.

Generally speaking, graphics switching is not automatic in Linux; you have to specify which GPU a program should use when you run it: with DRI_PRIME=1 if you want to use your discrete GPU with the nouveau driver, or with optirun/primusrun if you install Bumblebee (see my other answer for instructions on installing Bumblebee with the proprietary NVIDIA driver). For example, after compiling some code with a GPU kernel, I have to execute it with

$ optirun ./Program_name

or

$ primusrun ./Program_name

otherwise the code doesn't work, since it can't see the NVIDIA GPU. Since R is an interpreted language, and Matlab is more of a standalone application, you can try launching the R interpreter with (assuming the nouveau driver)

$ DRI_PRIME=1 R

then run some scripts that you have and see whether they run faster than when you launch without DRI_PRIME. You could do similar tests with Matlab and Octave by launching them from the command line with DRI_PRIME=1.
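One quick way to confirm which GPU a given environment actually selects (a sketch, assuming glxinfo from the mesa-demos package is installed) is to compare the reported renderer string with and without DRI_PRIME:

```shell
# Renderer used by default (should report the Intel chip)
glxinfo | grep "OpenGL renderer"

# Renderer used when the discrete GPU is requested via PRIME offloading
# (should report the NVIDIA card if offloading works)
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"
```

If both commands print the same Intel renderer, the discrete GPU is not being picked up and your compute tests won't be using it either.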

If you do install Bumblebee, you can use the NVIDIA settings application to monitor the usage of the NVIDIA GPU, which lets you know without a doubt which GPU is being used when running your code. Launch NVIDIA X Server Settings from your application launcher (with Bumblebee it seems not to like being launched from the command line). Click on your GPU in the list on the left (likely "GPU 0 - (GeForce 840M)") and you should see an entry on the right side labeled GPU Utilization. Keep that window visible and run some GPU code to see if the utilization increases.

However, it seems that R may require CUDA in order to utilize a GPU, since the tutorial I found specifies that you should install the CUDA SDK. Installing the CUDA SDK is actually not that difficult; see my answer to this question for details on how I got it set up and working on my system. Using the proprietary driver may also provide a performance boost when using the NVIDIA GPU, even if CUDA itself is not needed, in much the same way that game performance can be improved. After all, either way, the GPU is just doing a bunch of calculations.
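If you do go the CUDA route, a minimal sanity check after installing is to make sure the compiler is reachable (a sketch; /usr/local/cuda is the SDK's default install prefix and may differ on your system):

```shell
# Add the CUDA toolkit to the current shell's PATH
# (assumes the default install prefix; adjust if you installed elsewhere)
export PATH=/usr/local/cuda/bin:$PATH

# Verify the CUDA compiler is reachable before trying to build GPU kernels
nvcc --version
```

You would typically put the export line in ~/.bashrc so it persists across sessions.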

I hope this helps!


answered 2016-12-01 23:35:48 -0500

Elbi

Thanks man, I did install Bumblebee. It works like a charm.



Seen: 731 times

Last updated: Dec 01 '16