BUG: read.ncdf() doesn't work for large trajectory file

Issue #64 (resolved)
Xinqiu Yao created an issue

Here is the report:

I converted the file to NetCDF from an AMBER ASCII trajectory using ptraj. It has 300000 frames and 2506 atoms. I am using the latest R (3.0.2) on Mac OS X 10.7.5 on a MacBook Air. The Bio3D version is the latest, version 2.0.

Here are the command and output on the screen:

> a.trj <- read.ncdf("cypa.apo.trajectory.ncdf", time = TRUE)
Loading required package: ncdf
[1] "Reading file cypa.apo.trajectory.ncdf"
[1] "Produced by program: sander"
[1] "File conventions AMBER version 1.0"
[1] "Frames: 300000"
[1] "Atoms: 2506"
Error in get.var.ncdf(nc, "coordinates", c(1, first.atom, ss), c(-1, count.atom,  :
  long vectors (argument 5) are not supported in .Fortran

This is most likely due to the old 32-bit vector length limit still used by the ncdf package: the full coordinate array here has 300000 x 2506 x 3, roughly 2.26e9, elements, which exceeds the 2^31 - 1 elements that a single .Fortran call can handle. It will be solved once ncdf is updated. Alternatively, we can add lines to read.ncdf() that check the size of the coordinate array and, if it exceeds the limit, split the trajectory and read it in pieces, although this will definitely make the code more complicated. A sketch of that workaround follows.
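For illustration only, here is a minimal sketch of the chunked-read idea. It reuses the same ncdf package calls that read.ncdf() already depends on (open.ncdf(), get.var.ncdf(), close.ncdf()); the helper name read.ncdf.chunked, the default chunk size, and the reliance on the AMBER NetCDF dimension names "frame", "atom", and "spatial" are assumptions, not part of the current Bio3D code.

library(ncdf)  # legacy ncdf package, as loaded by read.ncdf()

## Hypothetical helper: read the "coordinates" variable 'chunk' frames
## at a time so that no single get.var.ncdf() call requests more than
## 2^31 - 1 elements.
read.ncdf.chunked <- function(file, chunk = 10000) {
  nc <- open.ncdf(file)
  ## assumes AMBER NetCDF convention dimension names "atom" and "frame"
  natom  <- nc$dim$atom$len
  nframe <- nc$dim$frame$len
  xyz <- matrix(NA, nrow = nframe, ncol = 3 * natom)
  first <- 1
  while (first <= nframe) {
    n <- min(chunk, nframe - first + 1)
    ## coordinates is stored as (spatial = 3, atom, frame)
    co <- get.var.ncdf(nc, "coordinates",
                       start = c(1, 1, first), count = c(-1, -1, n))
    ## flatten each frame's 3 x natom block into one row of
    ## x1, y1, z1, x2, ... (the usual bio3d xyz ordering)
    xyz[first:(first + n - 1), ] <- t(matrix(co, nrow = 3 * natom))
    first <- first + n
  }
  close.ncdf(nc)
  xyz
}

With 2506 atoms, a 10000-frame chunk is only about 7.5e7 elements per call, comfortably below the limit, while the full 300000-frame trajectory is still returned as one matrix.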
