PHP issue

Issue #101 closed
Nicolas Tromas created an issue

Hi Simon,

Hope all’s well! I’ve run into an issue with the PHP step and was wondering if you’ve seen it before:

Loading custom kmer counting

Processing S48C32.18337NC12965.fasta

Traceback (most recent call last):
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 138, in <module>
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 133, in main
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP_src/countKmer.py", line 50, in getKmer
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 645, in submit
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 584, in _start_queue_management_thread
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 608, in _adjust_process_count
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/process.py", line 121, in start
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/context.py", line 277, in _Popen
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
  File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/popen_fork.py", line 69, in _launch
OSError: [Errno 24] Too many open files

Any ideas?

Cheers!

Nico
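For reference, the limit that produces "OSError: [Errno 24]" is the per-process open-file limit (what `ulimit -n` reports). A minimal sketch of how to inspect it, and raise the soft limit up to the hard limit, from Python itself using the standard `resource` module (this is generic, not iPHoP code):

```python
# Sketch: inspect and raise the per-process open-file limit (RLIMIT_NOFILE).
# When this limit is exhausted, Python raises "OSError: [Errno 24] Too many open files".
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")

try:
    # An unprivileged process may raise its own soft limit, up to the hard limit.
    resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
except (ValueError, OSError):
    pass  # some platforms refuse this (e.g. an unlimited hard limit); keep the current value
```

Each forked worker process costs file descriptors (pipes and queues), so a large thread count combined with many input files can exhaust a low soft limit quickly.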

Comments (13)

  1. Nicolas Tromas reporter

    Reducing the number of threads in the command (stopping and rerunning with fewer threads) doesn’t change anything…

    Cheers,

    Nico

  2. Nicolas Tromas reporter

    Hi Simon,

    I have 73,308 files in “split_input”, and ulimit is already set to unlimited…

    Cheers,

    Nico

  3. Nicolas Tromas reporter

    Correction: ulimit -n was actually 1024. I increased it to 4096, but I may need to increase it further…

  4. Simon Roux repo owner

    Oh, then I think the solution is to split your input file (73k sequences is a lot to process at once). You can split it into batches of ~5k or ~10k sequences, which should work better (and likely be quicker / more efficient).

  5. Simon Roux repo owner

    OK, so the issue is not exactly the one I was thinking of. Can you try running iPHoP with the test dataset and the test database indicated in the readme? That would be my next step, to see whether this is an issue with the input file / database you are trying to run.

  6. Nicolas Tromas reporter

    OK, I just increased the ulimit to 20k and it seems to be working with the split data… Will keep you informed! Thanks Simon!

  7. Nicolas Tromas reporter

    No worries at all :) I really appreciate how quickly you answered all the issues!

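The batching Simon suggests in comment 4 can be done with a few lines of standard-library Python. A hypothetical sketch (not iPHoP code; the function name, output prefix, and default batch size are illustrative):

```python
# Sketch (hypothetical, not iPHoP code): split a multi-FASTA file into
# batches of at most `batch_size` sequences, written as <out_prefix>_0.fasta, etc.
def split_fasta(path, batch_size=5000, out_prefix="input_batch"):
    batch = []        # lines belonging to the batch currently being collected
    n_batches = 0     # number of batch files written so far
    n_seqs = 0        # total sequences seen so far
    written = []      # paths of the batch files produced

    def flush():
        nonlocal batch, n_batches
        if not batch:
            return
        out = f"{out_prefix}_{n_batches}.fasta"
        with open(out, "w") as fh:
            fh.writelines(batch)
        written.append(out)
        batch, n_batches = [], n_batches + 1

    with open(path) as fh:
        for line in fh:
            if line.startswith(">"):
                # Start a new output file every `batch_size` sequences.
                if n_seqs and n_seqs % batch_size == 0:
                    flush()
                n_seqs += 1
            batch.append(line)
    flush()  # write the final, possibly smaller, batch
    return written
```

Each resulting file can then be passed to iPHoP as a separate run, as suggested with batches of ~5k–10k sequences.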