FileNotFoundError: [Errno 2] No such file or directory: '/project/pi_rbeinart_uri_edu/michelle/iphop/iphop_db2/Aug_2023_pub_rw/db_infos/gtdbtk.ar122.decorated.tree'
I first ran:
gtdbtk de_novo_wf --genome_dir /project/pi_rbeinart_uri_edu/michelle/iphop/symbiont_MAGs --bacteria --outgroup_taxon p__Patescibacteria --out_dir symbiont_MAGs_GTDB-tk_results2/ --cpus 32 --force --extension fasta
Then I ran:
iphop add_to_db --fna_dir symbiont_MAGs/ --gtdb_dir symbiont_MAGs_GTDB-tk_results/ --out_dir April_2024_pub_rw_w_MAG_hosts --db_dir /project/pi_rbeinart_uri_edu/michelle/iphop/iphop_db2/Aug_2023_pub_rw
However, it throws the error:
FileNotFoundError: [Errno 2] No such file or directory: '/project/pi_rbeinart_uri_edu/michelle/iphop/iphop_db2/Aug_2023_pub_rw/db_infos/gtdbtk.ar122.decorated.tree'
When I look in that directory, I can see why: it contains gtdbtk.ar53.decorated.tree instead of an ar122 version. What does this mean, and what do I have to change to make this work?
Thank you.
Comments (24)
-
repo owner -
reporter Hi there. Thank you, but I think something else might be going on too, unfortunately!
To clarify, in my iphop_db2/Aug_2023_pub_rw/db_infos/ I have a file gtdbtk.ar53.decorated.tree
I actually don’t have any file in my symbiont_MAGs_GTDB-tk_results/ directory with xxx.arYY.xxx. However, after I read your response, I figured this might have been because I skipped:
$ gtdbtk de_novo_wf --genome_dir Wetland_MAGs/ --archaea --outgroup_taxon p__Altiarchaeota --out_dir Wetland_MAGs_GTDB-tk_results/ --cpus 32 --force --extension fa
in your tutorial. I didn’t think I needed to do this step since I wasn’t adding any archaea to the db. However, assuming I needed to run this for the downstream analyses, I tried to run it, but got the following error:
[2024-04-15 00:52:32] INFO: GTDB-Tk v2.3.2
[2024-04-15 00:52:32] INFO: gtdbtk de_novo_wf --genome_dir /project/pi_rbeinart_uri_edu/michelle/iphop/symbiont_MAGs/ --archaea --outgroup_taxon p__Altiarchaeota --out_dir symbiont_MAGs_GTDB-tk_results2/ --cpus 32 --force --extension fasta
[2024-04-15 00:52:32] INFO: Using GTDB-Tk reference data version r207: /project/pi_rbeinart_uri_edu/Databases/gtdbtk-2.1.1
[2024-04-15 00:52:33] INFO: Identifying markers in 590 genomes with 32 threads.
[2024-04-15 00:52:33] TASK: Running Prodigal V2.6.3 to identify genes.
[2024-04-15 00:54:49] INFO: Completed 590 genomes in 2.26 minutes (260.61 genomes/minute).
[2024-04-15 00:54:49] TASK: Identifying TIGRFAM protein families.
[2024-04-15 00:56:02] INFO: Completed 590 genomes in 1.22 minutes (485.55 genomes/minute).
[2024-04-15 00:56:02] TASK: Identifying Pfam protein families.
[2024-04-15 00:56:07] INFO: Completed 590 genomes in 4.58 seconds (128.82 genomes/second).
[2024-04-15 00:56:07] INFO: Annotations done using HMMER 3.1b2 (February 2015).
[2024-04-15 00:56:07] TASK: Summarising identified marker genes.
[2024-04-15 00:56:17] INFO: Completed 590 genomes in 10.35 seconds (57.01 genomes/second).
[2024-04-15 00:56:18] INFO: Done.
[2024-04-15 00:56:26] INFO: Aligning markers in 590 genomes with 32 CPUs.
[2024-04-15 00:56:26] INFO: Processing 590 genomes identified as bacterial.
[2024-04-15 00:59:07] INFO: Read concatenated alignment for 62,291 GTDB genomes.
[2024-04-15 00:59:07] TASK: Generating concatenated alignment for each marker.
[2024-04-15 00:59:08] INFO: Completed 590 genomes in 0.57 seconds (1,041.06 genomes/second).
[2024-04-15 00:59:09] TASK: Aligning 120 identified markers using hmmalign 3.1b2 (February 2015).
[2024-04-15 00:59:18] INFO: Completed 120 markers in 6.89 seconds (17.42 markers/second).
[2024-04-15 00:59:18] TASK: Masking columns of bacterial multiple sequence alignment using canonical mask.
[2024-04-15 01:00:50] INFO: Completed 62,877 sequences in 1.54 minutes (40,904.12 sequences/minute).
[2024-04-15 01:00:50] INFO: Masked bacterial alignment from 41,084 to 5,036 AAs.
[2024-04-15 01:00:50] INFO: 12 bacterial user genomes have amino acids in <10.0% of columns in filtered MSA.
[2024-04-15 01:00:50] INFO: Creating concatenated alignment for 62,865 bacterial GTDB and user genomes.
[2024-04-15 01:01:06] INFO: Creating concatenated alignment for 574 bacterial user genomes.
[2024-04-15 01:01:06] INFO: Done.
[2024-04-15 01:01:06] ERROR: Input file does not exist: symbiont_MAGs_GTDB-tk_results2/align/gtdbtk.ar53.msa.fasta.gz
[2024-04-15 01:01:06] ERROR: Controlled exit resulting from an unrecoverable error or warning.
================================================================================
EXCEPTION: BioLibFileNotFound
MESSAGE: Input file does not exist: symbiont_MAGs_GTDB-tk_results2/align/gtdbtk.ar53.msa.fasta.gz
________________________________________________________________________________
Traceback (most recent call last):
  File "/project/pi_rbeinart_uri_edu/michelle/conda/gtdb_2.1.1/lib/python3.8/site-packages/gtdbtk/__main__.py", line 102, in main
    gt_parser.parse_options(args)
  File "/project/pi_rbeinart_uri_edu/michelle/conda/gtdb_2.1.1/lib/python3.8/site-packages/gtdbtk/main.py", line 1052, in parse_options
    self.infer(options)
  File "/project/pi_rbeinart_uri_edu/michelle/conda/gtdb_2.1.1/lib/python3.8/site-packages/gtdbtk/main.py", line 413, in infer
    check_file_exists(options.msa_file)
  File "/project/pi_rbeinart_uri_edu/michelle/conda/gtdb_2.1.1/lib/python3.8/site-packages/gtdbtk/biolib_lite/common.py", line 96, in check_file_exists
    raise BioLibFileNotFound('Input file does not exist: ' + input_file)
gtdbtk.biolib_lite.exceptions.BioLibFileNotFound: Input file does not exist: symbiont_MAGs_GTDB-tk_results2/align/gtdbtk.ar53.msa.fasta.gz
================================================================================
Please let me know if you have any insight! Thank you.
-
repo owner So if you don’t have any archaea to add, then iPHoP should be able to handle it by simply copying over the archaeal tree from the original database. I’m not 100% sure what is going wrong here in this process, but can you check which version of iPHoP you are using? I am wondering if this is due to the iPHoP version being older than the database Aug_2023_pub_rw, which would cause some issues.
-
reporter Thank you for the quick replies! I am using iPHoP 1.3.3
-
repo owner Oh ok, I think I see what is likely happening (and this is the same bug I was mentioning earlier, i.e. iPHoP getting confused between different GTDB-tk versions). I think the quickest fix is to go to the original iPHoP database folder (/project/pi_rbeinart_uri_edu/michelle/iphop/iphop_db2/Aug_2023_pub_rw/db_infos/) and make a copy of the archaeal tree “gtdbtk.ar53.decorated.tree” under the name “gtdbtk.ar122.decorated.tree” (i.e. “cp gtdbtk.ar53.decorated.tree gtdbtk.ar122.decorated.tree”). Then you can try “add_to_db” again, and this time it should be ok (fingers crossed).
-
reporter Thank you! That fixed the issue. However, I did end up with a new error that I’m not clear about:
Loading miniconda version 22.11.1-1
Looks like everything is now set up, we will first clean up the input file, and then we will start the host prediction steps themselves
[1/1/Run] Running blastn against genomes...
[1/3/Run] Get relevant blast matches...
[2/1/Run] Running blastn against CRISPR...
[2/2/Run] Get relevant crispr matches...
[3/1/Run] Running (recoded)WIsH...
[3/1/Run] Running WIsH extra database...
[3/2/Run] Get relevant WIsH hits...
[4/1/Run] Running VHM s2 similarities...
[4/2/Run] Get relevant VHM hits...
[5/1/Run] Running PHP...
[5/2/Run] Get relevant PHP hits...
Traceback (most recent call last):
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/bin/iphop", line 10, in <module>
    sys.exit(cli())
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/iphop.py", line 128, in cli
    args["func"](args)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/master_predict.py", line 92, in main
    php.run_and_parse_php(args)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/php.py", line 28, in run_and_parse_php
    get_php_results(args["fasta_file"],args["phprawresult"],args["phpparsed"],logger,args['messages'])
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/php.py", line 42, in get_php_results
    df_pred = pd.read_csv(pred_file,delimiter=',',quotechar='"', index_col=0)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/util/_decorators.py", line 311, in wrapper
    return func(*args, **kwargs)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 586, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 482, in _read
    parser = TextFileReader(filepath_or_buffer, **kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 811, in __init__
    self._engine = self._make_engine(self.engine)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 1040, in _make_engine
    return mapping[engine](self.f, **self.options)  # type: ignore[call-arg]
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/c_parser_wrapper.py", line 51, in __init__
    self._open_handles(src, kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/base_parser.py", line 222, in _open_handles
    self.handles = get_handle(
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/common.py", line 702, in get_handle
    handle = open(
FileNotFoundError: [Errno 2] No such file or directory: 'iphop_output/Wdir/php_results/php_db_Prediction_Allhost.csv'
-
repo owner Yes, seems like PHP had some issues. Can you check what is the content of the file “php.log” and “php.cmd” (these should be in “Wdir” in your output directory). Also, can you check the list of files available in “Wdir/php_results” ? Thanks !
-
reporter Yes!
The output of php.log starts by saying “processing XXXX.fasta”, but at the end it says:
Preparing output file
Traceback (most recent call last):
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 138, in <module>
    main()
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 134, in main
    predictVirusHost(scriptPath,bacteriaKmerDir,bacteriaKmerName,outFileDir,dicVirusSeqLength)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py", line 7, in predictVirusHost
    modelFullLength = joblib.load(scriptPath+'/PHP_src/FullLength/FullLength.m')
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/joblib/numpy_pickle.py", line 585, in load
    obj = _unpickle(fobj, filename, mmap_mode)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/joblib/numpy_pickle.py", line 504, in _unpickle
    obj = unpickler.load()
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/pickle.py", line 1212, in load
    dispatch[key[0]](self)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/pickle.py", line 1528, in load_global
    klass = self.find_class(module, name)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/pickle.py", line 1579, in find_class
    __import__(module, level=0)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/mixture/__init__.py", line 5, in <module>
    from ._gaussian_mixture import GaussianMixture
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/mixture/_gaussian_mixture.py", line 11, in <module>
    from ._base import BaseMixture, _check_shape
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/mixture/_base.py", line 13, in <module>
    from .. import cluster
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/cluster/__init__.py", line 6, in <module>
    from ._spectral import spectral_clustering, SpectralClustering
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/cluster/_spectral.py", line 16, in <module>
    from ..neighbors import kneighbors_graph, NearestNeighbors
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/neighbors/__init__.py", line 17, in <module>
    from ._nca import NeighborhoodComponentsAnalysis
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/neighbors/_nca.py", line 22, in <module>
    from ..decomposition import PCA
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/decomposition/__init__.py", line 17, in <module>
    from .dict_learning import dict_learning
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/decomposition/dict_learning.py", line 4, in <module>
    from . import _dict_learning
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/decomposition/_dict_learning.py", line 21, in <module>
    from ..linear_model import Lasso, orthogonal_mp_gram, LassoLars, Lars
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/linear_model/__init__.py", line 12, in <module>
    from ._least_angle import (Lars, LassoLars, lars_path, lars_path_gram, LarsCV,
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/sklearn/linear_model/_least_angle.py", line 30, in <module>
    method='lar', copy_X=True, eps=np.finfo(np.float).eps,
  File "/home/michellehauer_uri_edu/.local/lib/python3.8/site-packages/numpy/__init__.py", line 305, in __getattr__
    raise AttributeError(__former_attrs__[attr])
AttributeError: module 'numpy' has no attribute 'float'.
`np.float` was a deprecated alias for the builtin `float`. To avoid this error in existing code, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here. The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
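For reference, this failure mode is easy to reproduce outside iPHoP: `np.float` was deprecated in NumPy 1.20 and removed in 1.24, so old code like scikit-learn 0.22's `eps=np.finfo(np.float).eps` breaks on a newer NumPy. A minimal standalone sketch (not part of the thread):

```python
# Reproduce the alias removal and show the two replacements the
# error message itself recommends.
import numpy as np

try:
    np.float  # raises AttributeError on NumPy releases that removed the alias
    print("alias still present in numpy", np.__version__)
except AttributeError as err:
    print("numpy", np.__version__, "removed the alias:", err)

# Either replacement yields the same value the old sklearn code computed:
assert np.finfo(float).eps == np.finfo(np.float64).eps
```

This is why the same iPHoP/scikit-learn versions can work on one machine and fail on another: the outcome depends on which NumPy actually gets imported at runtime.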
php.cmd gives:
python3 /home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/utils/PHP.py --virusFastaFileDir iphop_output/Wdir/split_input/ --outFileDir iphop_output/Wdir/php_results/ --bacteriaKmerDir April_2024_pub_rw_w_MAG_hosts/db --bacteriaKmerName php_db > iphop_output/Wdir/php.log 2>&1
and lastly, the only file in Wdir/php_results is virusKmer
Thank you!
-
repo owner Oh, this looks like a version issue for sklearn vs numpy :-( These are annoying to fix, sorry. Can you check with “conda list -n iphop_env” what version of scikit-learn you are currently running?
-
reporter The version of scikit-learn is 0.22.2.post1
The full output for conda list is:
# Name Version Build Channel _libgcc_mutex 0.1 conda_forge conda-forge _openmp_mutex 4.5 2_gnu conda-forge _r-mutex 1.0.1 anacondar_1 conda-forge abseil-cpp 20210324.2 h9c3ff4c_0 conda-forge absl-py 2.1.0 pyhd8ed1ab_0 conda-forge aiohttp 3.9.3 py38h01eb140_0 conda-forge aiosignal 1.3.1 pyhd8ed1ab_0 conda-forge alsa-lib 1.2.8 h166bdaf_0 conda-forge archspec 0.2.3 pyhd8ed1ab_0 conda-forge argtable2 2.13 h14c3975_1001 conda-forge astunparse 1.6.3 pyhd8ed1ab_0 conda-forge async-timeout 4.0.3 pyhd8ed1ab_0 conda-forge attrs 23.2.0 pyh71513ae_0 conda-forge binutils_impl_linux-64 2.40 hf600244_0 conda-forge biopython 1.79 py38h0a891b7_3 conda-forge blas 1.1 openblas conda-forge blast 2.12.0 hf3cf87c_4 bioconda blast-legacy 2.2.26 h9ee0642_3 bioconda blinker 1.7.0 pyhd8ed1ab_0 conda-forge boltons 23.1.1 pyhd8ed1ab_0 conda-forge boost-cpp 1.74.0 h75c5d50_8 conda-forge bottleneck 1.3.8 py38h7f0c24c_0 conda-forge brotli-python 1.1.0 py38h17151c0_1 conda-forge bwidget 1.9.14 ha770c72_1 conda-forge bzip2 1.0.8 hd590300_5 conda-forge c-ares 1.27.0 hd590300_0 conda-forge ca-certificates 2024.2.2 hbcca054_0 conda-forge cached-property 1.5.2 hd8ed1ab_1 conda-forge cached_property 1.5.2 pyha770c72_1 conda-forge cachetools 4.2.4 pyhd8ed1ab_0 conda-forge cairo 1.16.0 hb05425b_5 certifi 2024.2.2 pyhd8ed1ab_0 conda-forge cffi 1.16.0 py38h6d47a40_0 conda-forge charset-normalizer 3.3.2 pyhd8ed1ab_0 conda-forge click 8.0.4 py38h578d9bd_0 conda-forge clustalo 1.2.4 hdbdd923_7 bioconda clustalw 2.1 h4ac6f70_9 bioconda colorama 0.4.6 pyhd8ed1ab_0 conda-forge conda 23.1.0 py38h578d9bd_0 conda-forge conda-libmamba-solver 23.1.0 pyhd8ed1ab_0 conda-forge conda-package-handling 2.2.0 pyh38be061_0 conda-forge conda-package-streaming 0.9.0 pyhd8ed1ab_0 conda-forge crisper_recognition_tool 1.2 hdfd78af_2 bioconda cryptography 39.0.0 py38h1724139_0 conda-forge curl 7.86.0 h7bff187_1 conda-forge diamond 2.0.15 hb97b32f_1 bioconda distro 1.9.0 pyhd8ed1ab_0 conda-forge entrez-direct 21.6 he881be0_0 
bioconda expat 2.6.2 h59595ed_0 conda-forge fmt 9.1.0 h924138e_0 conda-forge font-ttf-dejavu-sans-mono 2.37 hab24e00_0 conda-forge font-ttf-inconsolata 3.000 h77eed37_0 conda-forge font-ttf-source-code-pro 2.038 h77eed37_0 conda-forge font-ttf-ubuntu 0.83 h77eed37_1 conda-forge fontconfig 2.14.2 h14ed4e7_0 conda-forge fonts-conda-ecosystem 1 0 conda-forge fonts-conda-forge 1 0 conda-forge freetype 2.12.1 h267a509_2 conda-forge fribidi 1.0.10 h36c2ea0_0 conda-forge frozenlist 1.4.1 py38h01eb140_0 conda-forge gast 0.4.0 pyh9f0ad1d_0 conda-forge gawk 5.3.0 ha916aea_0 conda-forge gcc_impl_linux-64 13.2.0 h338b0a0_5 conda-forge gettext 0.21.1 h27087fc_0 conda-forge gfortran_impl_linux-64 13.2.0 h76e1118_5 conda-forge giflib 5.2.1 h0b41bf4_3 conda-forge glib 2.78.1 hfc55251_0 conda-forge glib-tools 2.78.1 hfc55251_0 conda-forge gmp 6.3.0 h59595ed_1 conda-forge google-auth 1.35.0 pyh6c4a22f_0 conda-forge google-auth-oauthlib 0.4.6 pyhd8ed1ab_0 conda-forge google-pasta 0.2.0 pyh8c360ce_0 conda-forge graphite2 1.3.13 h58526e2_1001 conda-forge grpc-cpp 1.43.2 h9e046d8_3 conda-forge grpcio 1.43.0 py38hdd6454d_0 conda-forge gsl 2.7 he838d99_0 conda-forge gxx_impl_linux-64 13.2.0 h338b0a0_5 conda-forge h5py 3.8.0 nompi_py38hd5fa8ee_100 conda-forge harfbuzz 6.0.0 h8e241bc_0 conda-forge hdf5 1.12.2 nompi_h2386368_100 conda-forge hmmer 3.3.2 hdbdd923_4 bioconda icu 70.1 h27087fc_0 conda-forge idna 3.6 pyhd8ed1ab_0 conda-forge importlib-metadata 7.0.2 pyha770c72_0 conda-forge iphop 1.3.3 pyhdfd78af_0 bioconda joblib 1.0.1 pyhd8ed1ab_0 conda-forge jpeg 9e h0b41bf4_3 conda-forge jsonpatch 1.33 pyhd8ed1ab_0 conda-forge jsonpointer 2.4 py38h578d9bd_3 conda-forge keras 2.7.0 pyhd8ed1ab_0 conda-forge keras-preprocessing 1.1.2 pyhd8ed1ab_0 conda-forge kernel-headers_linux-64 2.6.32 he073ed8_17 conda-forge keyutils 1.6.1 h166bdaf_0 conda-forge krb5 1.19.3 h3790be6_0 conda-forge lcms2 2.14 h6ed2654_0 conda-forge ld_impl_linux-64 2.40 h41732ed_0 conda-forge lerc 4.0.0 h27087fc_0 conda-forge 
libarchive 3.6.2 hc8874e4_0 conda-forge libblas 3.9.0 21_linux64_openblas conda-forge libcblas 3.9.0 21_linux64_openblas conda-forge libcups 2.3.3 h3e49a29_2 conda-forge libcurl 7.86.0 h7bff187_1 conda-forge libdb 6.2.32 h9c3ff4c_0 conda-forge libdeflate 1.14 h166bdaf_0 conda-forge libedit 3.1.20191231 he28a2e2_2 conda-forge libev 4.33 hd590300_2 conda-forge libexpat 2.6.2 h59595ed_0 conda-forge libffi 3.4.2 h7f98852_5 conda-forge libgcc-devel_linux-64 13.2.0 ha9c7c90_105 conda-forge libgcc-ng 13.2.0 h807b86a_5 conda-forge libgfortran 3.0.0 1 conda-forge libgfortran-ng 13.2.0 h69a702a_5 conda-forge libgfortran5 13.2.0 ha4646dd_5 conda-forge libglib 2.78.1 hebfc3b9_0 conda-forge libgomp 13.2.0 h807b86a_5 conda-forge libiconv 1.17 hd590300_2 conda-forge libidn2 2.3.7 hd590300_0 conda-forge liblapack 3.9.0 21_linux64_openblas conda-forge libmamba 1.1.0 h83d9b23_3 conda-forge libmambapy 1.1.0 py38h1f54a8e_3 conda-forge libnghttp2 1.51.0 hdcd2b5c_0 conda-forge libnsl 2.0.1 hd590300_0 conda-forge libopenblas 0.3.26 pthreads_h413a1c8_0 conda-forge libpng 1.6.43 h2797004_0 conda-forge libprotobuf 3.19.6 h3eb15da_0 conda-forge libsanitizer 13.2.0 h7e041cc_5 conda-forge libsolv 0.7.28 hfc55251_0 conda-forge libsqlite 3.45.2 h2797004_0 conda-forge libssh2 1.10.0 haa6b8db_3 conda-forge libstdcxx-devel_linux-64 13.2.0 ha9c7c90_105 conda-forge libstdcxx-ng 13.2.0 h7e041cc_5 conda-forge libtiff 4.4.0 h82bc61c_5 conda-forge libunistring 0.9.10 h7f98852_0 conda-forge libuuid 2.38.1 h0b41bf4_0 conda-forge libwebp-base 1.3.2 hd590300_0 conda-forge libxcb 1.15 h0b41bf4_0 conda-forge libxcrypt 4.4.36 hd590300_1 conda-forge libxml2 2.10.3 hca2bb57_4 conda-forge libzlib 1.2.13 hd590300_5 conda-forge lz4-c 1.9.4 hcb278e6_0 conda-forge lzo 2.10 h516909a_1000 conda-forge mafft 7.525 h031d066_0 bioconda make 4.3 hd18ef5c_1 conda-forge mamba 1.1.0 py38h1abaa86_3 conda-forge markdown 3.5.2 pyhd8ed1ab_0 conda-forge markupsafe 2.1.5 py38h01eb140_0 conda-forge menuinst 2.0.2 py38h578d9bd_0 
conda-forge mpfr 4.2.1 h9458935_0 conda-forge multidict 6.0.5 py38h01eb140_0 conda-forge muscle 5.1 h4ac6f70_3 bioconda ncurses 6.4 h59595ed_2 conda-forge nomkl 1.0 h5ca1d4c_0 conda-forge numexpr 2.8.4 py38hb2af0cf_101 conda-forge numpy 1.23.5 py38h7042d01_0 conda-forge oauthlib 3.2.2 pyhd8ed1ab_0 conda-forge openblas 0.3.26 pthreads_h7a3da1a_0 conda-forge openjdk 17.0.3 h58dac75_5 conda-forge openssl 1.1.1w hd590300_0 conda-forge opt_einsum 3.3.0 pyhc1e730c_2 conda-forge packaging 24.0 pyhd8ed1ab_0 conda-forge paml 4.10.7 h031d066_0 bioconda pandas 1.3.5 py38h8c16a72_0 pango 1.50.14 hd33c08f_0 conda-forge pcre 8.45 h9c3ff4c_0 conda-forge pcre2 10.40 hc3806b6_0 conda-forge perl 5.32.1 7_hd590300_perl5 conda-forge perl-algorithm-diff 1.201 pl5321hd8ed1ab_0 conda-forge perl-archive-tar 2.40 pl5321hdfd78af_0 bioconda perl-base 2.23 pl5321hd8ed1ab_0 conda-forge perl-bio-asn1-entrezgene 1.73 pl5321hdfd78af_3 bioconda perl-bio-coordinate 1.007001 pl5321hdfd78af_3 bioconda perl-bio-featureio 1.6.905 pl5321hdfd78af_4 bioconda perl-bio-samtools 1.43 pl5321he4a0461_4 bioconda perl-bio-searchio-hmmer 1.7.3 pl5321hdfd78af_0 bioconda perl-bio-tools-phylo-paml 1.7.3 pl5321hdfd78af_3 bioconda perl-bio-tools-run-alignment-clustalw 1.7.4 pl5321hdfd78af_3 bioconda perl-bio-tools-run-alignment-tcoffee 1.7.4 pl5321hdfd78af_5 bioconda perl-bioperl 1.7.8 hdfd78af_1 bioconda perl-bioperl-core 1.7.8 pl5321hdfd78af_1 bioconda perl-bioperl-run 1.007003 pl5321hdfd78af_0 bioconda perl-business-isbn 3.007 pl5321hd8ed1ab_0 conda-forge perl-business-isbn-data 20210112.006 pl5321hd8ed1ab_0 conda-forge perl-carp 1.50 pl5321hd8ed1ab_0 conda-forge perl-class-data-inheritable 0.09 pl5321ha770c72_0 conda-forge perl-common-sense 3.75 pl5321hd8ed1ab_0 conda-forge perl-compress-raw-bzip2 2.201 pl5321h166bdaf_0 conda-forge perl-compress-raw-zlib 2.202 pl5321h166bdaf_0 conda-forge perl-constant 1.33 pl5321hd8ed1ab_0 conda-forge perl-data-dumper 2.183 pl5321h166bdaf_0 conda-forge perl-db_file 1.858 
pl5321h166bdaf_0 conda-forge perl-devel-stacktrace 2.04 pl5321ha770c72_0 conda-forge perl-digest-hmac 1.04 pl5321hdfd78af_0 bioconda perl-digest-md5 2.58 pl5321h166bdaf_0 conda-forge perl-encode 3.19 pl5321h166bdaf_0 conda-forge perl-encode-locale 1.05 pl5321hdfd78af_7 bioconda perl-exception-class 1.45 pl5321hdfd78af_0 bioconda perl-exporter 5.74 pl5321hd8ed1ab_0 conda-forge perl-exporter-tiny 1.002002 pl5321hd8ed1ab_0 conda-forge perl-extutils-makemaker 7.70 pl5321hd8ed1ab_0 conda-forge perl-file-listing 6.16 pl5321hdfd78af_0 bioconda perl-file-slurp-tiny 0.004 pl5321hdfd78af_2 bioconda perl-file-sort 1.01 pl5321hdfd78af_3 bioconda perl-file-spec 3.48_01 pl5321hdfd78af_2 bioconda perl-getopt-long 2.54 pl5321hdfd78af_0 bioconda perl-html-parser 3.81 pl5321h4ac6f70_1 bioconda perl-html-tagset 3.20 pl5321hdfd78af_4 bioconda perl-http-cookies 6.10 pl5321hdfd78af_0 bioconda perl-http-daemon 6.16 pl5321hdfd78af_0 bioconda perl-http-date 6.06 pl5321hdfd78af_0 bioconda perl-http-message 6.36 pl5321hdfd78af_0 bioconda perl-http-negotiate 6.01 pl5321hdfd78af_4 bioconda perl-inc-latest 0.500 pl5321ha770c72_0 conda-forge perl-io-compress 2.201 pl5321hdbdd923_2 bioconda perl-io-html 1.004 pl5321hdfd78af_0 bioconda perl-io-socket-ssl 2.075 pl5321hd8ed1ab_0 conda-forge perl-io-string 1.08 pl5321hdfd78af_4 bioconda perl-io-tty 1.16 pl5321h166bdaf_0 conda-forge perl-io-zlib 1.14 pl5321hdfd78af_0 bioconda perl-ipc-run 20200505.0 pl5321hdfd78af_0 bioconda perl-json 4.10 pl5321hdfd78af_0 bioconda perl-json-xs 2.34 pl5321h4ac6f70_6 bioconda perl-libwww-perl 6.67 pl5321hdfd78af_0 bioconda perl-libxml-perl 0.08 pl5321hdfd78af_3 bioconda perl-list-moreutils 0.430 pl5321hdfd78af_0 bioconda perl-list-moreutils-xs 0.430 pl5321h031d066_2 bioconda perl-lwp-mediatypes 6.04 pl5321hdfd78af_1 bioconda perl-mime-base64 3.16 pl5321h166bdaf_0 conda-forge perl-module-build 0.4234 pl5321ha770c72_0 conda-forge perl-net-http 6.22 pl5321hdfd78af_0 bioconda perl-net-ssleay 1.92 pl5321haa6b8db_1 
conda-forge perl-ntlm 1.09 pl5321hdfd78af_5 bioconda perl-parent 0.241 pl5321hd8ed1ab_0 conda-forge perl-pathtools 3.75 pl5321h166bdaf_0 conda-forge perl-scalar-list-utils 1.63 pl5321h166bdaf_0 conda-forge perl-socket 2.027 pl5321h031d066_4 bioconda perl-storable 3.15 pl5321h166bdaf_0 conda-forge perl-sub-uplevel 0.2800 pl5321h166bdaf_0 conda-forge perl-test-deep 1.130 pl5321hd8ed1ab_0 conda-forge perl-test-differences 0.71 pl5321ha770c72_0 conda-forge perl-test-exception 0.43 pl5321hd8ed1ab_0 conda-forge perl-test-fatal 0.016 pl5321ha770c72_0 conda-forge perl-test-most 0.38 pl5321hdfd78af_0 bioconda perl-test-warn 0.37 pl5321hd8ed1ab_0 conda-forge perl-test-warnings 0.031 pl5321ha770c72_0 conda-forge perl-text-diff 1.45 pl5321hd8ed1ab_0 conda-forge perl-time-local 1.35 pl5321hdfd78af_0 bioconda perl-timedate 2.33 pl5321hdfd78af_2 bioconda perl-tree-dag_node 1.32 pl5321hdfd78af_0 bioconda perl-try-tiny 0.31 pl5321ha770c72_0 conda-forge perl-types-serialiser 1.01 pl5321hdfd78af_0 bioconda perl-uri 5.17 pl5321ha770c72_0 conda-forge perl-url-encode 0.03 pl5321h9ee0642_0 bioconda perl-www-robotrules 6.02 pl5321hdfd78af_4 bioconda perl-xml-dom 1.46 pl5321hdfd78af_1 bioconda perl-xml-dom-xpath 0.14 pl5321hdfd78af_2 bioconda perl-xml-parser 2.44_01 pl5321hc3e0081_1003 conda-forge perl-xml-regexp 0.04 pl5321hdfd78af_3 bioconda perl-xml-xpathengine 0.14 pl5321hdfd78af_3 bioconda piler-cr 1.06 h4ac6f70_4 bioconda pip 24.0 pyhd8ed1ab_0 conda-forge pixman 0.43.2 h59595ed_0 conda-forge platformdirs 4.2.0 pyhd8ed1ab_0 conda-forge pluggy 1.4.0 pyhd8ed1ab_0 conda-forge poa 2.0 h031d066_5 bioconda pooch 1.8.1 pyhd8ed1ab_0 conda-forge prodigal 2.6.3 h031d066_7 bioconda protobuf 3.19.6 py38h8dc9893_0 conda-forge pthread-stubs 0.4 h36c2ea0_1001 conda-forge pyasn1 0.5.1 pyhd8ed1ab_0 conda-forge pyasn1-modules 0.3.0 pyhd8ed1ab_0 conda-forge pybind11-abi 4 hd8ed1ab_3 conda-forge pycosat 0.6.6 py38h01eb140_0 conda-forge pycparser 2.21 pyhd8ed1ab_0 conda-forge pyjwt 2.8.0 pyhd8ed1ab_1 
conda-forge pyopenssl 23.2.0 pyhd8ed1ab_1 conda-forge pysocks 1.7.1 pyha2e5f31_6 conda-forge python 3.8.15 h257c98d_0_cpython conda-forge python-dateutil 2.9.0 pyhd8ed1ab_0 conda-forge python-flatbuffers 2.0 pyhd8ed1ab_0 conda-forge python_abi 3.8 4_cp38 conda-forge pytz 2024.1 pyhd8ed1ab_0 conda-forge pyu2f 0.1.5 pyhd8ed1ab_0 conda-forge r-base 4.0.5 hb87df5d_8 conda-forge r-lattice 0.20_45 r40hcfec24a_0 conda-forge r-matrix 1.4_1 r40h0154571_0 conda-forge r-ranger 0.13.1 r40h03ef668_0 conda-forge r-rcpp 1.0.9 r40h7525677_1 conda-forge r-rcppeigen 0.3.3.9.2 r40h43535f1_0 conda-forge re2 2022.02.01 h9c3ff4c_0 conda-forge readline 8.2 h8228510_1 conda-forge reproc 14.2.4.post0 hd590300_1 conda-forge reproc-cpp 14.2.4.post0 h59595ed_1 conda-forge requests 2.31.0 pyhd8ed1ab_0 conda-forge requests-oauthlib 1.4.0 pyhd8ed1ab_0 conda-forge rsa 4.9 pyhd8ed1ab_0 conda-forge ruamel.yaml 0.17.40 py38h01eb140_0 conda-forge ruamel.yaml.clib 0.2.8 py38h01eb140_0 conda-forge scikit-learn 0.22.2.post1 py38hcdab131_0 conda-forge scipy 1.10.1 py38h32ae08f_1 sed 4.8 he412f7d_0 conda-forge setuptools 69.2.0 pyhd8ed1ab_0 conda-forge six 1.16.0 pyh6c4a22f_0 conda-forge snappy 1.1.10 h9fff704_0 conda-forge sqlite 3.45.2 h2c6b66d_0 conda-forge sysroot_linux-64 2.12 he073ed8_17 conda-forge t-coffee 12.00.7fb08c2 h26a2512_0 bioconda tensorboard 2.6.0 pyhd8ed1ab_1 conda-forge tensorboard-data-server 0.6.1 py38h2b5fc30_4 conda-forge tensorboard-plugin-wit 1.8.1 pyhd8ed1ab_0 conda-forge tensorflow 2.7.0 pypi_0 pypi tensorflow-base 2.7.1 cpu_py38ha28dbe6_0 conda-forge tensorflow-decision-forests 0.2.2 pypi_0 pypi tensorflow-estimator 2.7.1 cpu_py38h4e23bc6_0 conda-forge termcolor 2.4.0 pyhd8ed1ab_0 conda-forge tk 8.6.13 noxft_h4845f30_101 conda-forge tktable 2.10 h0c5db8f_5 conda-forge toolz 0.12.1 pyhd8ed1ab_0 conda-forge tqdm 4.66.2 pyhd8ed1ab_0 conda-forge typing-extensions 4.10.0 hd8ed1ab_0 conda-forge typing_extensions 4.10.0 pyha770c72_0 conda-forge urllib3 2.2.1 pyhd8ed1ab_0 conda-forge 
viennarna 2.6.4 py38pl5321h5cf8b27_0 bioconda werkzeug 3.0.1 pyhd8ed1ab_0 conda-forge wget 1.20.3 ha56f1ee_1 conda-forge wheel 0.42.0 pyhd8ed1ab_0 conda-forge wrapt 1.16.0 py38h01eb140_0 conda-forge xorg-fixesproto 5.0 h7f98852_1002 conda-forge xorg-inputproto 2.3.2 h7f98852_1002 conda-forge xorg-kbproto 1.0.7 h7f98852_1002 conda-forge xorg-libice 1.1.1 hd590300_0 conda-forge xorg-libsm 1.2.4 h7391055_0 conda-forge xorg-libx11 1.8.7 h8ee46fc_0 conda-forge xorg-libxau 1.0.11 hd590300_0 conda-forge xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge xorg-libxext 1.3.4 h0b41bf4_2 conda-forge xorg-libxfixes 5.0.3 h7f98852_1004 conda-forge xorg-libxi 1.7.10 h7f98852_0 conda-forge xorg-libxrender 0.9.11 hd590300_0 conda-forge xorg-libxt 1.3.0 hd590300_1 conda-forge xorg-libxtst 1.2.3 h7f98852_1002 conda-forge xorg-recordproto 1.14.2 h7f98852_1002 conda-forge xorg-renderproto 0.11.1 h7f98852_1002 conda-forge xorg-xextproto 7.3.0 h0b41bf4_1003 conda-forge xorg-xproto 7.0.31 h7f98852_1007 conda-forge xz 5.2.6 h166bdaf_0 conda-forge yaml-cpp 0.7.0 h59595ed_3 conda-forge yarl 1.9.4 py38h01eb140_0 conda-forge zipp 3.17.0 pyhd8ed1ab_0 conda-forge zlib 1.2.13 hd590300_5 conda-forge zstandard 0.22.0 py38ha98ab4e_0 conda-forge zstd 1.5.5 hfc55251_0 conda-forge
-
repo owner From what I can see it looks good... I’m really confused about why you get the error, because on my side I’m using the same versions of numpy and scikit-learn and it does not complain.
Can you maybe try “pip show scikit-learn”, in case there are multiple versions of scikit-learn installed?
-
reporter hmmm, it says:
(iphop_env) michellehauer_uri_edu@login1:~$ pip show scikit-learn
Name: scikit-learn
Version: 0.22.2.post1
Summary: A set of python modules for machine learning and data mining
Home-page: http://scikit-learn.org
Author:
Author-email:
License: new BSD
Location: /home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages
Requires: joblib, numpy, scipy
Required-by: iphop
-
repo owner Ok, so nothing weird... And you got this error with both the default database and the custom database?
-
reporter Hi there,
I just tried running it on the default database – it ran for a few days before finishing with a different error than what we see above:
Loading miniconda version 22.11.1-1
Looks like everything is now set up, we will first clean up the input file, and then we will start the host prediction steps themselves
[1/1/Run] Running blastn against genomes...
[1/3/Run] Get relevant blast matches...
[2/1/Run] Running blastn against CRISPR...
[2/2/Run] Get relevant crispr matches...
[3/1/Run] Running (recoded)WIsH...
[3/2/Run] Get relevant WIsH hits...
[4/1/Run] Running VHM s2 similarities...
[4/2/Run] Get relevant VHM hits...
[5/1/Run] Running PHP...
[5/2/Run] Get relevant PHP hits...
Traceback (most recent call last):
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/bin/iphop", line 10, in <module>
    sys.exit(cli())
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/iphop.py", line 128, in cli
    args["func"](args)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/master_predict.py", line 92, in main
    php.run_and_parse_php(args)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/php.py", line 28, in run_and_parse_php
    get_php_results(args["fasta_file"],args["phprawresult"],args["phpparsed"],logger,args['messages'])
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/iphop/modules/php.py", line 42, in get_php_results
    df_pred = pd.read_csv(pred_file,delimiter=',',quotechar='"', index_col=0)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/util/_decorators.py", line 311, in wrapper
    return func(*args, **kwargs)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 586, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 482, in _read
    parser = TextFileReader(filepath_or_buffer, **kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 811, in __init__
    self._engine = self._make_engine(self.engine)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/readers.py", line 1040, in _make_engine
    return mapping[engine](self.f, **self.options)  # type: ignore[call-arg]
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/c_parser_wrapper.py", line 51, in __init__
    self._open_handles(src, kwds)
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/parsers/base_parser.py", line 222, in _open_handles
    self.handles = get_handle(
  File "/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages/pandas/io/common.py", line 702, in get_handle
    handle = open(
FileNotFoundError: [Errno 2] No such file or directory: 'iphop_test_results/test_input_phages_iphop/Wdir/php_results/php_db_Prediction_Allhost.csv'
Thanks!
-
repo owner Hi,
Ok I think this is the same PHP error, and if you look in php.log you’ll likely see the same “`np.float` was a deprecated alias for the builtin `float`. To avoid this error in existing code, use `float` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.float64` here. The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:” as we saw previously. I think I’m starting to see why though. You can see at the end of this file the error is raised by:
File "/home/michellehauer_uri_edu/.local/lib/python3.8/site-packages/numpy/__init__.py", line 305, in __getattr__ raise AttributeError(__former_attrs__[attr])
This numpy path is in your local install, not the conda environment, so it may not be the right version of numpy that iphop needs. Could you try
pip show numpy
after loading the iphop environment with conda ?
-
reporter Hi there,
Thank you, here is the output after loading the iphop env – looks like the same path as above.
michellehauer_uri_edu@login1:/project/pi_rbeinart_uri_edu/michelle/iphop$ conda activate iphop_env
(iphop_env) michellehauer_uri_edu@login1:/project/pi_rbeinart_uri_edu/michelle/iphop$ pip show numpy
Name: numpy
Version: 1.24.4
Summary: Fundamental package for array computing in Python
Home-page: https://www.numpy.org
Author: Travis E. Oliphant et al.
Author-email:
License: BSD-3-Clause
Location: /home/michellehauer_uri_edu/.local/lib/python3.8/site-packages
Requires:
Required-by: biopython, Bottleneck, h5py, Keras-Preprocessing, numexpr, opt-einsum, pandas, scikit-learn, scipy, tensorboard, tensorflow, tensorflow-decision-forests
-
repo owner Right, this is the issue: you can see this is the numpy package from your home local folder, and the version is 1.24.4. The numpy package installed with iphop via conda is:
numpy 1.23.5 py38h7042d01_0 conda-forge
So I suspect 1.23 is fine, but 1.24 is not compatible with the scikit-learn version from the iphop environment.
Can you check the content of PYTHONPATH after you load the conda environment, i.e.:
echo $PYTHONPATH
It may be that we can simply remove the reference to /home/michellehauer_uri_edu/.local/lib/python3.8/, and then the correct version of numpy would be picked up.
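As a complementary check, you can ask the interpreter directly where `import numpy` would resolve from, and where the per-user site-packages directory lives. This is a generic sketch (not an iPHoP command; `python3` is used here for portability, and it only prints paths):

```shell
# Path of the numpy that "import numpy" would load (if any);
# a path under ~/.local/lib means the user-level install shadows conda's.
python3 -c "import importlib.util as u; s = u.find_spec('numpy'); print(s.origin if s else 'numpy not found')"

# The per-user site-packages directory Python searches by default.
python3 -c "import site; print(site.getusersitepackages())"
```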
-
reporter Thank you again for your quick replies!
echo $PYTHONPATH actually does not return anything when the iphop_env environment is activated
-
repo owner Oh, that’s weird. What is the output of:
which python
and
$ python
>>> import sys
>>> sys.path
?
-
reporter Here is the output:
(iphop_env) michellehauer_uri_edu@login1:~$ which python
/home/michellehauer_uri_edu/.conda/envs/iphop_env/bin/python
(iphop_env) michellehauer_uri_edu@login1:~$ python
Python 3.8.15 | packaged by conda-forge | (default, Nov 22 2022, 08:46:39) [GCC 10.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.path
['', '/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python38.zip', '/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8', '/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/lib-dynload', '/home/michellehauer_uri_edu/.local/lib/python3.8/site-packages', '/home/michellehauer_uri_edu/.conda/envs/iphop_env/lib/python3.8/site-packages']
-
repo owner Right, so the sys path includes both the conda environment and your local pip install, unfortunately :-(
Do you know if you need numpy 1.24 in your local environment, i.e. before loading any conda? If not, maybe this version of numpy could be downgraded to 1.23.5 (sorry that we are starting to mess with your install outside of the conda environment, but that would be a quick “fix” to at least get these runs to work).
Another thing to try may be the docker container (https://hub.docker.com/r/simroux/iphop), as it may avoid using your locally installed python packages (since the user is different).
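For reference, the user-site shadowing can also be bypassed per run without modifying any installed packages: CPython honours the PYTHONNOUSERSITE environment variable (equivalent to `python -s`), which keeps ~/.local/lib/pythonX.Y/site-packages off sys.path. A minimal sketch (the commented-out downgrade is the fix actually discussed in this thread; the last line is just a demonstration that the variable is honoured):

```shell
# Fix discussed in this thread (modifies the user-level install):
#   pip install --user "numpy==1.23.5"

# Non-destructive alternative: skip user site-packages for one run.
# sys.flags.no_user_site prints 1 when PYTHONNOUSERSITE is honoured.
PYTHONNOUSERSITE=1 python3 -c "import sys; print(sys.flags.no_user_site)"
```

The same variable can be exported before launching iphop, so that the conda environment's numpy is the only copy visible to the interpreter.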
-
reporter Thank you so much! This solved the issue.
-
repo owner Oh great! I’m sorry this meant we had to change something in your local environment (we shouldn’t have to do that if everything worked well...), but glad that it seems to work now! I’ll close this issue; feel free to open another one if you encounter another problem.
-
repo owner - changed status to closed
Issue was a conflict between multiple versions of numpy (local pip install vs the iphop conda environment), solved by downgrading the local version.
Hi !
Sorry, this is a known bug in the current iPHoP that does not adjust correctly to different GTDB-tk versions. You can fix it by copying every file in symbiont_MAGs_GTDB-tk_results/ that is named xxx.ar122.xxx to xxx.ar53.xxx (i.e. “cp gtdbtk.ar122.decorated.tree gtdbtk.ar53.decorated.tree”, etc). add_to_db should run afterwards (but let me know if there are other issues).
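That renaming can be scripted; the sketch below is hedged, not an official iPHoP command: the results directory and placeholder file are stand-ins created locally so the snippet is self-contained, and `sed` does the name substitution.

```shell
# Stand-in for the GTDB-tk results directory from the thread,
# created here only so the sketch runs on its own.
results_dir="symbiont_MAGs_GTDB-tk_results"
mkdir -p "$results_dir"
touch "$results_dir/gtdbtk.ar122.decorated.tree"   # placeholder file

# Copy every *ar122* file to an *ar53*-named twin, as described above.
for f in "$results_dir"/*ar122*; do
  [ -e "$f" ] || continue                # nothing matched the glob
  cp "$f" "$(printf '%s' "$f" | sed 's/ar122/ar53/g')"
done
ls "$results_dir"
```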
Best,
Simon