Error in filtering of fragments

Issue #21 resolved
Ilya Flyamer created an issue

Hi, I am getting an error while filtering duplicates in a HiCdataset (the code is very standard, so I don't think there is any need to post it here). Here is the traceback:

hello from new mapping
----> New dataset opened, genome fasta, filename = ../3T3/raw_results/fragment_dataset.hdf5
----->!!!File already exists! It will be deleted

     loading data from file ../3T3/raw_results/mapped_reads.hdf5 (assuming h5dict)
----->Semi-dangling end filter: remove guys who start 5 bp near the rsite
----->Filtering duplicates in DS reads: 
Sorting using 36 chunks
Traceback (most recent call last):
  File "/home/ilya/Документы/biology/Vienna single-cell Hi-C/Illumina/mapping_etc/02_fragment_filtering.py", line 22, in <module>
    fragments.filterDuplicates()
  File "/usr/local/lib/python2.7/dist-packages/hiclib/fragmentHiC.py", line 1390, in filterDuplicates
    numutils.externalMergeSort(dset, tempdset, chunkSize=30000)
  File "/usr/local/lib/python2.7/dist-packages/mirnylib/numutils.py", line 268, in externalMergeSort
    maxes = [i.max() for i in currentChunks]
  File "/usr/lib/python2.7/dist-packages/numpy/core/_methods.py", line 17, in _amax
    out=out, keepdims=keepdims)
TypeError: cannot perform reduce with flexible type
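For context, NumPy raises this particular TypeError when a reduction such as `max()` is applied to an array with a structured ("flexible") dtype rather than a plain numeric one, which suggests the chunks reaching `externalMergeSort` are not simple numeric arrays. A minimal reproduction of the error (the field names here are hypothetical, not taken from hiclib):

```python
import numpy as np

# A structured ("flexible") dtype array, e.g. paired read records.
# Field names "chrom" and "pos" are illustrative only.
arr = np.zeros(3, dtype=[("chrom", np.int64), ("pos", np.int64)])

try:
    arr.max()  # reductions are undefined for flexible dtypes
except TypeError as e:
    print("TypeError:", e)
```

Casting the chunk to a plain numeric dtype (or reducing over a single field, e.g. `arr["pos"].max()`) avoids the error.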

I think I'm using the latest hiclib from the default branch.

Comments (3)