The NVIDIA MEX-Plugin & Jacket

by melonakos on January 7, 2009

in CUDA

One of the first questions people ask when considering Jacket for GPU MATLAB computing is the following:

How is Jacket different from the MATLAB plugin on the NVIDIA website (found here:  http://developer.nvidia.com/object/matlab_cuda.html)?

The short answer is that the NVIDIA MEX-plugin requires you to write CUDA code, while Jacket does not.  This has many implications and results in a number of advantages for you as a MATLAB programmer.  First, let’s describe the features of the MEX-plugin:

  1. You write CUDA code that solves your problem.
  2. You use the MEX configuration files provided by NVIDIA to compile your CUDA code into a MEX file that is callable by MATLAB.
  3. MATLAB calls your MEX file, moves data out to the GPU, computes, then returns the data back to MATLAB.
  4. FFT2 is an exception: NVIDIA has already MEXified it for you, so you don’t have to write that one.
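In outline, that workflow might look like the following sketch (the file name “my_filter.cu” and the exact compile invocation are illustrative; consult NVIDIA’s package for the precise script names and options):

    % Sketch of the MEX-plugin workflow; names are illustrative.
    % 1-2. Write CUDA code (my_filter.cu), then compile it into a MEX
    %      file using the configuration files NVIDIA provides, e.g.:
    %      system('nvmex -f nvmexopts.bat my_filter.cu -lcudart')

    % 3. Call the resulting MEX function from MATLAB.  Data is copied
    %    out to the GPU, computed on, and copied back on every call:
    x = rand(1024, 1);
    y = my_filter(x);   % round-trip memory transfer inside the MEX file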

Now let’s explore the features of Jacket:

  1. Jacket code is MATLAB code enhanced by the simple addition of a new data type.  You don’t write any CUDA code.  For instance, we introduce the GPU analog to MATLAB’s single matrix type: “gsingle”.  Jacket’s “gsingle” behaves exactly like any other MATLAB matrix under manipulation (subscripted reference (full), subscripted assignment (full), display, and many MATLAB functions).  You can even index gsingles with gsingles!
  2. Jacket is a full runtime system which optimizes GPU-specific programming aspects for the MATLAB user, such as memory transfers, kernel configurations, and execution launches.
  3. Jacket allows memory to remain on the GPU between successive function calls, rather than resorting to a round trip memory transfer with each call.
  4. Jacket allows memory to remain on the GPU even when you want to visualize data.  That’s the point of the Graphics Toolbox, besides the fact that it’s just outright prettier than standard MATLAB visualizations (though it still needs “handle graphics” to do everything that MATLAB can do on the CPU – we’re working on that!).
  5. Jacket supports the exact M-language API, so that your functions will behave the same on the GPU as they do on the CPU (for fully supported features).
  6. Jacket lets you load GPU variables into the MATLAB workspace and manipulate them as you would any other MATLAB variable.  For instance, you can clear GPU variables.

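For example, a gsingle workflow might look like this sketch (function names beyond “gsingle”, “gones”, and standard M-language operations are used here on the assumption that Jacket mirrors the MATLAB API, per point 5 above):

    A = gsingle(rand(1000));   % push data to the GPU as a gsingle
    B = A * A + 1;             % computed on the GPU; result stays there
    C = B(1:10, 1:10);         % subscripted reference, just like MATLAB
    d = double(C);             % pull results back to the CPU when needed
    clear A B C                % GPU variables clear like any other variable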
The MEX plugin aims to let you write GPU code using CUDA and then use that code from MATLAB.  Jacket aims to let you code for the GPU in M, the language underlying MATLAB.  An example of Jacket code would look something like this:

A = gones(5,5,100);
B = A;
C = gones(5,5);
gfor i = 1:100
    A(:,:,i) = B(:,:,i) * C;
gend

With Jacket’s GPU data constructors (such as “gones”) and Jacket’s new GPU for-loop (coming in Jacket v1.0), all of the multiplications inside the for-loop, across all iterations, are computed simultaneously on the GPU.  And with the exception of 4 extra g’s, this is standard MATLAB code.

Now that we understand what each does, let’s talk about incorporating custom CUDA code into Jacket.  Say, for instance, that Jacket does not support the “A = gangstr(M,tol)” function from the Optimization Toolbox (indeed, Jacket does not currently support “gangstr”).  But you really need that function, and you need it yesterday!  We provide an open interface, based on the MEX-plugin, that will allow you to write custom CUDA code and integrate it directly into Jacket.  To learn more about this, check out our MEX Example in the Jacket installation directory and also hosted here:  http://www.accelereyes.com/examples/mex_example.zip.  This example enables you to link your custom CUDA functions directly into the optimized Jacket runtime, using the MEX plugin to compile your CUDA function down to a MEX file.
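In outline, calling such a custom function might look like this (“gangstr_mex” is a hypothetical name for your compiled CUDA MEX file; the actual linkage into Jacket’s runtime follows the MEX Example referenced above):

    % Sketch only: gangstr_mex is a hypothetical custom CUDA MEX function
    % compiled with the MEX-plugin configuration and linked into Jacket.
    M = gsingle(rand(500));      % data already resident on the GPU
    A = gangstr_mex(M, 0.01);    % custom kernel operates on GPU memory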

So, hopefully this clarifies the synergy between the CUDA MEX plugin and Jacket.  And, if you end up figuring out how to crank out a custom CUDA MEX function for “gangstr”, we want to hire you!
