Memory error
Hi!
I am currently running eLSA on a matrix with 555150 rows and 7 columns (7 different samples from different timepoints, same system).
I am running the following command (inspired by the protocol in Weiss et al. (2016)):
lsa_compute input.file output.file -p theo -x 1000 -b 100 -n robustZ -r 1 -s 7
When doing so I get the following error:
lsa_compute (rev: 506cc2c) - copyright Li Charlie Xia, lixia@stanford.edu
firstData factorNum, repNum, spotNum = 555150, 1, 7
secondData factorNum, repNum, spotNum = 555150, 1, 7
calculating 308191522500 pairwise local similarity scores...
inside applyAnalysis...
Traceback (most recent call last):
  File "/services/tools/anaconda2/4.0.0/bin/lsa_compute", line 11, in <module>
    load_entry_point('lsa==1.0.2', 'console_scripts', 'lsa_compute')()
  File "/services/tools/anaconda2/4.0.0/lib/python2.7/site-packages/lsa/lsa_compute.py", line 335, in main
    secondFactorLabels=secondFactorLabels, qvalueMethod=qvalueMethod, progressive=progressive)
  File "/services/tools/anaconda2/4.0.0/lib/python2.7/site-packages/lsa/lsalib.py", line 960, in applyAnalysis
    pvalues = np.zeros(pairwiseNum, dtype='float')
MemoryError
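As a back-of-the-envelope check on my side (assuming NumPy's default 8-byte float64 for dtype='float'), the pvalues array in the last traceback line alone would need:

import numpy as np

n = 555150                                  # rows (factors) in my matrix
pairwise_num = n * n                        # matches the 308191522500 pairs reported above
bytes_needed = pairwise_num * np.dtype('float').itemsize  # 'float' -> float64, 8 bytes each
print("%.2f TiB" % (bytes_needed / 2.0 ** 40))             # ~2.24 TiB for this single array

So that one array alone would already take over 2 TiB of RAM, before any of the other per-pair arrays are allocated.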
I have looked at the manual and the FAQ, and they say memory is normally not an issue. Maybe my arguments are not right.
Thanks a lot!
Comments (3)
repo owner - changed status to closed
repo owner - changed status to invalid
Hi, I haven't tried ELSA pairwise with ~5.5x10^5 factors, which implies ~3x10^11 comparisons. One possibility is to use the par_ana.py script in the package (see the sketch below). Given,
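To illustrate the general divide-and-conquer idea only (this is a sketch, not the actual par_ana.py interface; the chunk size and job layout are assumptions), the factor table can be split into row chunks so that every pair of chunks becomes one memory-bounded pairwise job whose result tables are merged afterwards:

import itertools

def row_chunks(n_rows, chunk_size):
    # Yield (start, stop) row ranges covering the whole factor table.
    for start in range(0, n_rows, chunk_size):
        yield (start, min(start + chunk_size, n_rows))

n_rows, chunk_size = 555150, 5000              # chunk_size is an arbitrary example value
chunks = list(row_chunks(n_rows, chunk_size))  # 112 chunks of at most 5000 factors

# Every unordered pair of chunks (a chunk paired with itself included) is one job;
# a 5000 x 5000 block has 25 million scores, i.e. roughly 200 MB per float64 array.
jobs = list(itertools.combinations_with_replacement(range(len(chunks)), 2))
print("%d chunks -> %d block jobs" % (len(chunks), len(jobs)))  # 112 chunks -> 6328 block jobs

Each job then only allocates arrays for one block at a time, and the per-block result tables can be concatenated at the end.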