Analysis Halted

Issue #11 resolved
Former user created an issue

Hi ROSE Team,

I tried ROSE with the example data sets and it worked fine.

I then ran some of my own data via:

python /Users/workdir/ROSE_main.py -g hg19 -i /Users/workdir/ROSE-SuperEnhancers/Test/Rep1_K27ac_Peaks_ROSE_3.gff -r /Users/workdir/ROSE-SuperEnhancers/Test/Rep2_H3K27Ac_final.bam -o /Users/workdir/ROSE-SuperEnhancers/Test/ t- 2000 -c /Users/workdir/ROSE-SuperEnhancers/Test/Rep3_Input_final.bam

As far as I can tell the analysis is running OK, then it stops abruptly at:

#SETTING NEGATIVE VALUES IN THE rankBy_vector to 0

rankBy_vector[rankBy_vector < 0] <- 0

#FIGURING OUT THE CUTOFF

cutoff_options <- calculate_cutoff(rankBy_vector, drawPlot=FALSE,xlab=paste(rankBy_factor,'_enhancers'),ylab=paste(rankByFactor,' Signal','- ',wceName),lwd=2,col=4)

Error in optimize(numPts_below_line, lower = 1, upper = length(inputVector), :
  'xmin' not less than 'xmax'
Calls: calculate_cutoff -> optimize
Execution halted

I do get the 'stitched enhancer region map.txt' file produced but nothing else.

I'm using different replicates for the .gff, BAM, and input. However, since these are from the same tissue and modifications, I'm not expecting dramatic differences in enrichment.

Any ideas?

Thanks!

Peter

Comments (5)

  1. charles_lin

    Hi there,

    It's easiest to debug if we can first confirm that your code is working on our example data with our example calls.

    Please download the example data and confirm that ROSE runs correctly.

    Example data can be obtained here: http://younglab.wi.mit.edu/super_enhancer_code.html

    Also, in your call you have 't- 2000' as opposed to '-t 2000'. That typo shouldn't cause the code to fail, but you never know.

    Also check whether your 'stitched enhancer region map.txt' file contains signal values for your rankby and control BAMs.
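    A quick sketch of that check (Python; the assumption that the signal columns come after the six fixed region fields is taken from the map file's header, so treat this as illustrative rather than part of ROSE):

    ```python
    import csv

    def count_zero_signal_regions(map_path):
        """Count stitched regions whose BAM signal columns are all zero.

        Assumes a tab-separated map whose first six columns are
        REGION_ID, CHROM, START, STOP, NUM_LOCI, CONSTITUENT_SIZE,
        with one signal column per BAM after that.
        """
        with open(map_path) as f:
            reader = csv.reader(f, delimiter="\t")
            header = next(reader)
            signal_cols = header[6:]  # one column per BAM
            zero_rows = total = 0
            for row in reader:
                total += 1
                if all(float(v) == 0 for v in row[6:]):
                    zero_rows += 1
            print(f"{zero_rows}/{total} regions have zero signal in {signal_cols}")
            return zero_rows, total
    ```

    If every region comes back zero, the problem is upstream of the R cutoff step (in the BAM signal extraction), not in calculate_cutoff itself.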

    Best,

  2. Peter McErlean

    Hi Charles,

    Thanks for the quick reply.

    I've tried ROSE with the test data and everything worked fine, tables and plot produced etc.

    The 'stitched enhancer region map.txt' file actually produced only one row:

    REGION_ID  CHROM  START  STOP  NUM_LOCI  CONSTITUENT_SIZE  Rep1_H3K27Ac_final.bam  Rep3_Input_final.bam
    1_Combined_BEC_H3K27Ac_rep_peak_1_lociStitched  chr1  713872  714036  1  165  0  0

    I've fixed the typo you spotted and re-ran the analysis of my samples, and got this:

    #FIGURING OUT THE CUTOFF

    cutoff_options <- calculate_cutoff(rankBy_vector, drawPlot=FALSE,xlab=paste(rankBy_factor,'_enhancers'),ylab=paste(rankByFactor,' Signal','- ',wceName),lwd=2,col=4)

    Error in optimize(numPts_below_line, lower = 1, upper = length(inputVector), :
      'xmin' not less than 'xmax'
    Calls: calculate_cutoff -> optimize
    In addition: Warning messages:
    1: In max(inputVector) : no non-missing arguments to max; returning -Inf
    2: In min(inputVector) : no non-missing arguments to min; returning Inf
    Execution halted

    This time the 'region map.txt' file produced only the headers.
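    I wonder if those warnings mean the signal vector reaching R is empty: calculate_cutoff calls optimize with lower = 1 and upper = length(inputVector), so a zero-length vector gives upper = 0, which is not greater than lower = 1 and would produce exactly this 'xmin' not less than 'xmax' error. A minimal Python illustration of that precondition (a sketch, not ROSE code):

    ```python
    def cutoff_bounds(signal):
        """Mimic the bounds check in R's optimize(numPts_below_line,
        lower = 1, upper = length(inputVector)): lower must be < upper."""
        lower, upper = 1, len(signal)
        if not lower < upper:
            # the same condition R reports as "'xmin' not less than 'xmax'"
            raise ValueError("'xmin' not less than 'xmax'")
        return lower, upper

    print(cutoff_bounds([5.2, 3.1, 0.0, 7.8]))  # → (1, 4)
    # cutoff_bounds([]) would raise: an empty vector makes upper = 0 < lower = 1
    ```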

    I can send you my files in case there is something inherently wrong with them.

    In the meantime, I'll reinstall and run the analysis again.

    Thanks for your help.

    Best

    Peter

  3. Peter McErlean

    Hi Charles,

    Sorry, I meant to include that after I run the example data I get returned to the terminal prompt. OK, fine, but then all of a sudden the analysis kicks off again with this:

    using a MMR value of 17.4141 using a MMR value of 17.4141 has chr has chr Number lines processed 0 Number lines processed 0 using a MMR value of 20.5167 using a MMR value of 20.5167 has chr has chr Number lines processed 0 Number lines processed 0 100 100 100 100 200 200 200 200 300 300 300 300 400 400 400 400 500 500 500 500 600 600 600 600 700 700 700 700 800 800 800 900 800 900 900 900

    Not sure if this matters also.

    Best

    Peter

  4. Peter McErlean

    Hi Charles,

    I have it sorted out now. The issue was my .gff file. I ended up having to send my .bed through Galaxy and export it as a .tabular file. ROSE picked that up and was able to run with it.
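    In case it helps anyone else: as I understand it, ROSE wants a 9-column .gff-style file (chrom, unique ID, blank, start, stop, blank, strand, blank, ID again), which a plain .bed doesn't match. A rough conversion sketch (Python; the +1 accounts for BED's 0-based starts, and the file names are hypothetical):

    ```python
    import csv

    def bed_to_rose_gff(bed_path, gff_path):
        """Convert a BED file to the 9-column .gff layout ROSE expects:
        chrom, unique ID, blank, start, stop, blank, strand, blank, ID.
        BED starts are 0-based; GFF starts are 1-based, hence start + 1."""
        with open(bed_path) as bed, open(gff_path, "w", newline="") as gff:
            writer = csv.writer(gff, delimiter="\t", lineterminator="\n")
            for i, line in enumerate(bed, start=1):
                if not line.strip() or line.startswith(("track", "browser", "#")):
                    continue  # skip headers and blank lines
                fields = line.rstrip("\n").split("\t")
                chrom, start, end = fields[0], int(fields[1]), int(fields[2])
                name = fields[3] if len(fields) > 3 else f"peak_{i}"  # fall back to input line number
                strand = fields[5] if len(fields) > 5 else "."
                writer.writerow([chrom, name, "", start + 1, end, "", strand, "", name])
    ```

    Then something like bed_to_rose_gff('peaks.bed', 'peaks.gff') and passing peaks.gff to -i should do it.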

    Thanks

    Peter
