PHP issue
Hi Simon,
Hope all’s well! I’ve run into an issue with PHP and was wondering if you’ve already seen it:
Loading custom kmer counting
Processing S48C32.18337NC12965.fasta
Traceback (most recent call last):
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 138, in <module>
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 133, in main
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP_src/countKmer.py", line 50, in getKmer
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 645, in submit
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 584, in _start_queue_management_thread
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/concurrent/futures/process.py", line 608, in _adjust_process_count
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/process.py", line 121, in start
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/context.py", line 277, in _Popen
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
File "/mfs/nicot/miniconda3/envs/iphop_env/lib/python3.8/multiprocessing/popen_fork.py", line 69, in _launch
OSError: [Errno 24] Too many open files
Any ideas?
Cheers!
Nico
Comments (13)
-
repo owner Hi Nico,
Sorry, I’ve never seen that one. Could you check how many files you can currently open at the same time, and maybe raise this limit with “ulimit” before running iPHoP again? (see https://www.tecmint.com/increase-set-open-file-limits-in-linux/). Hopefully that will fix it.
Best,
Simon
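For readers hitting the same error: checking and raising the per-session open-file limit looks roughly like this on Linux (the commands are standard shell builtins; the approach is illustrative, not an iPHoP-specific recommendation):

```shell
# Show the current soft limit on open file descriptors
ulimit -n

# Show the hard limit (the ceiling the soft limit can be raised to)
ulimit -Hn

# Raise the soft limit for this shell session, up to the hard limit;
# the change only affects processes started from this shell
ulimit -n "$(ulimit -Hn)"
ulimit -n
```

Note that this only affects the current shell session; for a persistent change, limits.conf or systemd settings are the usual route (see the link above).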
-
reporter Hi Simon,
I have 73308 files in “split_input”, and I already have ulimit set to unlimited…
Cheers,
Nico
-
reporter Correction: “ulimit -n” was giving 1024. I increased it to 4096, but I might increase it further…
-
repo owner Oh, then I think the solution is to split your input file (73k sequences is a lot to process at once). You can split it into batches of ~5k or ~10k sequences, and it should behave better (and likely run quicker / more efficiently).
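As an illustration of the batching idea, here is a minimal sketch of splitting a FASTA file into fixed-size batches in pure Python (the function and output file names are hypothetical; this is not part of iPHoP):

```python
from pathlib import Path

def split_fasta(in_path, batch_size, out_dir):
    """Split a FASTA file into chunks of at most `batch_size` sequences."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    batch, n_seqs, n_files = [], 0, 0
    with open(in_path) as handle:
        for line in handle:
            if line.startswith(">"):
                n_seqs += 1
                # Start a new output file every `batch_size` sequences
                if n_seqs > 1 and (n_seqs - 1) % batch_size == 0:
                    n_files += 1
                    (out_dir / f"batch_{n_files}.fasta").write_text("".join(batch))
                    batch = []
            batch.append(line)
    if batch:  # flush the last, possibly smaller, batch
        n_files += 1
        (out_dir / f"batch_{n_files}.fasta").write_text("".join(batch))
    return n_files
```

For example, `split_fasta("S48C32.18337NC12965.fasta", 10000, "batches/")` would produce `batch_1.fasta`, `batch_2.fasta`, … each holding at most 10k sequences.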
-
reporter I just did it (I split the fasta into 10 files) and got exactly the same issue…
-
repo owner That's strange… Can you check what happens if you include only 1 sequence in the input file?
-
reporter I just did it and got the same error…
-
repo owner Ok, so the issue is not exactly the one I was thinking of. Can you try running iPHoP with the test dataset and the test database indicated in the README? That would be my next step, to see whether this is an issue with the input file / database you are trying to run.
-
reporter Ok, I just increased the ulimit to 20k and it seems to be working with the split data… Will keep you informed! Thanks Simon!
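For completeness: the same limit can also be raised from inside a Python process via the standard `resource` module (Unix only). This is a sketch, not something iPHoP does itself; the 20000 target simply mirrors the value used above:

```python
import resource

# Example target for the soft limit on open file descriptors
TARGET = 20000  # illustrative value, not an official iPHoP recommendation

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"current soft={soft}, hard={hard}")

# A non-root process can raise the soft limit only up to the hard limit
new_soft = TARGET if hard == resource.RLIM_INFINITY else min(TARGET, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))
print(f"new soft limit: {resource.getrlimit(resource.RLIMIT_NOFILE)[0]}")
```

This only changes the limit for the current process and its children, which is why the shell-level `ulimit` approach is usually more convenient before launching a pipeline.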
-
repo owner Ok, fingers crossed, and sorry you had to go through all this trial-and-error!
-
reporter No worries at all :) I really appreciate how quickly you answered all the issues!
-
repo owner - changed status to closed
-
reporter Changing the number of threads in the command (if I stop and rerun with fewer threads) doesn’t change anything…
Cheers,
Nico