Running bio3d from HPC

Issue #464 resolved
debra ragland created an issue

Hello,

I have some fairly large MD data to work with, and running R on my desktop won't suffice. Therefore, I am trying to run some simple bio3d commands and store their output on my school cluster (LSF). Does anyone have a sample script for using bio3d on a cluster? For instance, I have a submission script, protein_network.sh, to use with the bsub command:

#!/bin/bash

#BSUB -J Protein_Network
#BSUB -R "rusage[mem=4096]"
#BSUB -W 300:0
#BSUB -q long

module load R/3.1.0

cd /path/to   # directory containing protein_network.R

R CMD BATCH --no-save --quiet --slave protein_network.R prone.out

But I think my trouble is coming in with the actual R script, protein_network.R. Can I just combine the commands I need?

dcd <- read.dcd("wt.dcd")
pdb <- read.pdb("Everything_Frame0.pdb")
inds <- atom.select(pdb, resno = c(24:27, 85:90), elety = "CA")
trj <- fit.xyz(fixed = pdb$xyz, mobile = dcd, fixed.inds = inds$xyz, mobile.inds = inds$xyz)
cij <- dccm(trj)
net <- cna(cij)

write.csv(cij, "cijmat.csv")
write.csv(net, "netmat.csv")

Or are there elements missing?

Comments (2)

  1. Xinqiu Yao

    Hi,

    Running R code on a cluster should be no different from running any other command, as long as you have R and all the necessary packages installed properly. Yes, you can put all the commands you need in one file and execute it with Rscript or R CMD BATCH. What problem are you having here?
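
    For example, a complete protein_network.R could look something like the sketch below (it reuses the file names and residue selection from your post; the only real additions are loading bio3d at the top and saving the network as an R object rather than a CSV, since it is not a plain matrix):

    library(bio3d)

    # read trajectory and reference structure
    dcd <- read.dcd("wt.dcd")
    pdb <- read.pdb("Everything_Frame0.pdb")

    # select C-alpha atoms of the residues of interest and superpose the trajectory on them
    inds <- atom.select(pdb, resno = c(24:27, 85:90), elety = "CA")
    trj <- fit.xyz(fixed = pdb$xyz, mobile = dcd, fixed.inds = inds$xyz, mobile.inds = inds$xyz)

    # dynamic cross-correlation matrix and correlation network
    cij <- dccm(trj)
    net <- cna(cij)

    # the correlation matrix can go straight to CSV; the cna network is easier to keep as an .RData file
    write.csv(cij, "cijmat.csv")
    save(net, file = "net.RData")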

  2. debra ragland reporter

    No problems as of yet; I just didn't know whether more arguments were needed at the beginning of the R script. Thank you.
