
Remote visualization using Paraview on GreatLakes

Using Paraview remotely avoids transferring large data files to your local machine. In addition, using several CPUs on the cluster makes manipulating large result files faster.

Using an interactive job

  • Install the same version of Paraview on your local machine as the one on GreatLakes (currently Paraview 5.7.0).

  • Log in to GreatLakes via ssh
  • Load the paraview module

    module load paraview

  • Allocate a node for your job (Note from HPC Support: "this does launch a job that is being billed for"):

    salloc --account=your_account --partition=standard --nodes=1 --ntasks-per-node=1 --cpus-per-task=yournumcores --mem-per-cpu=1g --time=24:00:00

The account can be, for example, jcaps1. This allocates the resources. Once the node is ready, you will see the following message:

    salloc: Nodes glXXXX are ready for job

where glXXXX is the allocated node.
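As a sketch, the allocation command can be assembled from shell variables so the account and core count are easy to change. The jcaps1 account and the core count below are example values, not requirements:

```shell
# Build the salloc command from variables; ACCOUNT and CORES are example values.
ACCOUNT=jcaps1
CORES=8
CMD="salloc --account=$ACCOUNT --partition=standard --nodes=1 \
--ntasks-per-node=1 --cpus-per-task=$CORES --mem-per-cpu=1g --time=24:00:00"
# Print the command so it can be inspected before running it on the cluster.
echo "$CMD"
```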

  • Run pvserver in your allocation ("This step uses the resources you allocated in the salloc step above").

    srun --cpus-per-task yournumcores pvserver --server-port=11111
The server port should be unique to each user so we don't all use the same one. (Note from HPC Support: "Do not use 5999.")
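One way to pick a personal port is to derive it from your numeric user id, which keeps ports distinct across users and stays away from 5999. This scheme is an illustration, not a site policy:

```shell
# Derive a per-user port in the 10000-14999 range from the numeric user id.
# (Assumption: any free port in this range is acceptable on the cluster.)
PORT=$((10000 + $(id -u) % 5000))
echo "Use --server-port=$PORT"
```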

  • Set up an ssh tunnel between your local machine and the pvserver running in your job (replace glXXXX by the node allocated):

    ssh -N -L 11111:glXXXX:11111 uniqname@greatlakes.arc-ts.umich.edu

  • Open paraview on your local machine.

Go to File, then Connect. Add a server. Change the port number (11111 in this example). Press Configure and Save, then press Connect.

(Screenshot: Paraview.png)

Using XQuartz

With this method, Paraview runs on the login node (not advised). Performance will vary with the current load on the node. You do not need Paraview installed on your local machine.

  • Install XQuartz on your local machine

  • Set up an ssh tunnel using the following command:

    ssh -Y uniqname@greatlakes.arc-ts.umich.edu

  • Load the paraview module:

    module load paraview

  • Open paraview by typing:

    paraview
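Before launching paraview this way, it can help to confirm that X11 forwarding is active. A minimal check, assuming a POSIX shell, is:

```shell
# Check whether an X11 display is forwarded; DISPLAY is set by ssh -Y on success.
if [ -n "${DISPLAY:-}" ]; then
    MSG="X11 forwarding OK: $DISPLAY"
else
    MSG="No DISPLAY set; reconnect with ssh -Y"
fi
echo "$MSG"
```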