MetaCompass: Reference-guided Assembly of Metagenomes
Home Page: https://github.com/marbl/MetaCompass/wiki
License: Other
Hello, I am running the following command:
#!/bin/bash
#SBATCH -J metacom_CJ_Car
#SBATCH -t UNLIMITED
#SBATCH -p Node2
#SBATCH -n 12
python3 /home/geonho/tools/MetaCompass-2.0-beta/go_metacompass.py -r /home/geonho/WGS/REF_ACP/REF_GENOMES/C_rudii_JRPAMB4_complete_genome_NZ_CP041245/ACP_C_rudii.ref.fas -1 /home/geonho/WGS/trimmed/CJ_S7_L003_R1_001.fastq.gz -2 /home/geonho/WGS/trimmed/CJ_S7_L003_R2_001.fastq.gz -o /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ -t 12 -y 12
and I get the following error:
[Sat Oct 16 17:25:58 2021] Building DAG of jobs...
[Sat Oct 16 17:25:58 2021] Provided cores: 12
[Sat Oct 16 17:25:58 2021] Rules claiming more threads will be scaled down.
[Sat Oct 16 17:25:58 2021] Job counts:
[Sat Oct 16 17:25:58 2021] count jobs
[Sat Oct 16 17:25:58 2021] 1 all
[Sat Oct 16 17:25:58 2021] 1 assemble_unmapped
[Sat Oct 16 17:25:58 2021] 1 assembled_references
[Sat Oct 16 17:25:58 2021] 1 bam_sort
[Sat Oct 16 17:25:58 2021] 1 bowtie2_map
[Sat Oct 16 17:25:58 2021] 1 build_contigs
[Sat Oct 16 17:25:58 2021] 1 create_tsv
[Sat Oct 16 17:25:58 2021] 1 join_contigs
[Sat Oct 16 17:25:58 2021] 1 mapping_stats
[Sat Oct 16 17:25:58 2021] 1 pilon_contigs
[Sat Oct 16 17:25:58 2021] 1 pilon_map
[Sat Oct 16 17:25:58 2021] 1 sam_to_bam
[Sat Oct 16 17:25:58 2021] 1 stats_all
[Sat Oct 16 17:25:58 2021] 1 stats_genome
[Sat Oct 16 17:25:58 2021] 14
[Sat Oct 16 17:25:58 2021] Job 10: ---Build index .
[Sat Oct 16 17:25:58 2021] bowtie2-build -o 3 --threads 12 -q /home/geonho/WGS/REF_ACP/REF_GENOMES/C_rudii_JRPAMB4_co$
[Sat Oct 16 17:46:20 2021] Finished job 10.
[Sat Oct 16 17:46:20 2021] 1 of 14 steps (7%) done
[Sat Oct 16 17:46:20 2021] Job 7: ---Build contigs .
[Sat Oct 16 17:46:20 2021] /home/geonho/tools/MetaCompass-2.0-beta/bin/buildcontig -r /home/geonho/WGS/REF_ACP/REF_GE$
[Sat Oct 16 17:46:24 2021] Finished job 7.
[Sat Oct 16 17:46:24 2021] 2 of 14 steps (14%) done
[Sat Oct 16 17:46:24 2021] Job 11: ---Map reads for pilon polishing.
[Sat Oct 16 17:46:24 2021] bowtie2-build --threads 12 -q /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/assembly/$
[Sat Oct 16 18:34:32 2021] Finished job 11.
[Sat Oct 16 18:34:32 2021] 3 of 14 steps (21%) done
[Sat Oct 16 18:34:32 2021] Job 9: ---Assembled references .
[Sat Oct 16 18:34:33 2021] grep '>' /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/assembly/contigs.fasta |rev| c$
[Sat Oct 16 18:34:33 2021] Job 13: ---Convert sam to bam .
[Sat Oct 16 18:34:33 2021] samtools view -bS /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/error_correction/mc.s$
[Sat Oct 16 18:34:41 2021] Finished job 9.
[Sat Oct 16 18:34:41 2021] 4 of 14 steps (29%) done
[Sat Oct 16 18:34:41 2021] Job 3: ---mapping stats per genome in reference-guided contigs
[Sat Oct 16 18:34:44 2021] grep '>' /home/geonho/WGS/REF_ACP/REF_GENOMES/C_rudii_JRPAMB4_complete_genome_NZ_CP041245/$
[Sat Oct 16 18:34:53 2021] Finished job 13.
[Sat Oct 16 18:34:53 2021] 5 of 14 steps (36%) done
[Sat Oct 16 18:35:27 2021] Finished job 3.
[Sat Oct 16 18:35:27 2021] 6 of 14 steps (43%) done
[Sat Oct 16 18:35:27 2021] Job 12: ---Sort bam .
[Sat Oct 16 18:35:27 2021] samtools sort -@ 12 /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/error_correction/mc$
[Sat Oct 16 18:36:01 2021] Finished job 12.
[Sat Oct 16 18:36:02 2021] 7 of 14 steps (50%) done
[Sat Oct 16 18:36:02 2021] Job 5: ---Assemble unmapped reads .
[Sat Oct 16 18:36:03 2021] if [[ -s /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/error_correction/mc.sam.unmapp$
[Mon Oct 18 17:16:58 2021] Finished job 5.
[Mon Oct 18 17:16:58 2021] 8 of 14 steps (57%) done
[Mon Oct 18 17:16:58 2021] Job 6: ---Pilon polish contigs .
[Mon Oct 18 17:16:58 2021] java -Xmx12G -jar /home/geonho/tools/MetaCompass-2.0-beta/bin/pilon-1.23.jar --flank 5 --t$
[Mon Oct 18 17:16:58 2021] Error in rule pilon_contigs:
[Mon Oct 18 17:16:58 2021] jobid: 6
[Mon Oct 18 17:16:58 2021] output: /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/error_correction/contigs.pi$
[Mon Oct 18 17:16:58 2021] log: /home/geonho/WGS/trimmed/metacompass/Carsonella/CJ/logs/pilon.log
[Mon Oct 18 17:16:58 2021] RuleException:
[Mon Oct 18 17:16:58 2021] CalledProcessError in line 130 of /home/geonho/tools/MetaCompass-2.0-beta/snakemake/metaco$
[Mon Oct 18 17:16:58 2021] Command ' set -euo pipefail; java -Xmx12G -jar /home/geonho/tools/MetaCompass-2.0-beta/bi$
[Mon Oct 18 17:16:58 2021] File "/home/geonho/tools/MetaCompass-2.0-beta/snakemake/metacompass.ref.paired.py", line$
[Mon Oct 18 17:16:58 2021] File "/home/geonho/conda/metacompass/lib/python3.6/concurrent/futures/thread.py", line 5$
[Mon Oct 18 17:16:58 2021] Will exit after finishing currently running jobs.
[Mon Oct 18 17:16:58 2021] Exiting because a job execution failed. Look above for error message
[Mon Oct 18 17:16:58 2021] Complete log: .snakemake/log/2021-10-16T172558.569258.snakemake.log
Here's the log file pilon.log:
/bin/bash: java: command not found
Do you have any solution for this problem?
Thanks : )
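For context, pilon.log shows the `java` binary is simply not on the compute node's PATH, so the Pilon jar can never be launched; installing a JRE into the same environment (e.g. `conda install -c conda-forge openjdk`) or loading the cluster's Java module would typically resolve it. A minimal pre-flight sketch (the tool list below is illustrative, not MetaCompass' official dependency list) can catch this before the SLURM job is submitted:

```python
# Pre-flight check (sketch): verify the external tools the pipeline
# shells out to are actually on PATH before submitting the batch job.
# The tool list here is an assumption, not the official dependency list.
import shutil

def missing_tools(tools=("java", "bowtie2", "samtools", "megahit")):
    """Return the subset of `tools` that cannot be found on PATH."""
    return [t for t in tools if shutil.which(t) is None]

# In a batch script this could run before go_metacompass.py:
for tool in missing_tools():
    print("WARNING: not on PATH:", tool)
```

Running this inside the sbatch script (before the `python3 go_metacompass.py` line) makes PATH problems fail fast instead of two days into the workflow.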
Hi,
I was running the tutorial example:
$ python3 go_metacompass.py -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -1 tutorial/thao2000.1.fq -2 tutorial/thao2000.2.fq -l 150 -o tutorial/example1_output -t 30 --notimestamps
but it failed:
REFERENCE genome file provided. Reference Selection step will be skipped.
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 30
Rules claiming more threads will be scaled down.
Job stats:
job count
-------------------- -------
all 1
assemble_unmapped 1
assembled_references 1
bowtie2_map 1
build_contigs 1
create_tsv 1
join_contigs 1
mapping_stats 1
polish_contigs 1
polish_map 1
stats_all 1
stats_genome 1
total 12
Select jobs to execute...
[Mon Dec 18 15:14:24 2023]
Job 5: ---Build index .
Reason: Missing output files: tutorial/example1_output/logs/bowtie2map.log, tutorial/example1_output/assembly/mc.sam
bowtie2-build -o 3 --threads 30 -q tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta tutorial/example1_output/assembly/mc.index 1>> tutorial/example1_output/assembly/mc.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 30 -x tutorial/example1_output/assembly/mc.index -q -U tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -S tutorial/example1_output/assembly/mc.sam.all > tutorial/example1_output/logs/bowtie2map.log 2>&1; python3 /media/5c679734-9376-4617-815c-d4bd4177b8b2/leon/projects/01/soft/MetaCompass/bin/best_strata.py tutorial/example1_output/assembly/mc.sam.all tutorial/example1_output/assembly/mc.sam; rm tutorial/example1_output/assembly/mc.sam.all && touch tutorial/example1_output/assembly/.run1.ok
[Mon Dec 18 15:14:30 2023]
Finished job 5.
1 of 12 steps (8%) done
Select jobs to execute...
[Mon Dec 18 15:14:30 2023]
Job 4: ---Build contigs .
Reason: Missing output files: tutorial/example1_output/assembly/selected_maps.sam, tutorial/example1_output/assembly/contigs.fasta; Input files updated by another job: tutorial/example1_output/assembly/mc.sam
/media/5c679734-9376-4617-815c-d4bd4177b8b2/leon/projects/01/soft/MetaCompass/bin/buildcontig -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -s tutorial/example1_output/assembly/mc.sam -o tutorial/example1_output/assembly -c 1 -l 1 -n F -b F -u F -k breadth 1>> tutorial/example1_output/logs/buildcontigs.log 2>&1 && touch tutorial/example1_output/assembly/.run2.ok
[Mon Dec 18 15:14:31 2023]
Finished job 4.
2 of 12 steps (17%) done
Select jobs to execute...
[Mon Dec 18 15:14:32 2023]
Job 3: ---ntEDit polish contigs .
Reason: Missing output files: tutorial/example1_output/error_correction/contigs_edited.fa; Input files updated by another job: tutorial/example1_output/assembly/contigs.fasta
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 30
Rules claiming more threads will be scaled down.
Select jobs to execute...
touch tutorial/example1_output/error_correction/contigs_edited.fa
/usr/bin/time -v -o tutorial/example1_output/error_correction/solidBF_c1.time nthits -c1 -b 36 -k 25 -t16 --outbloom tutorial/thao2000.1.fq tutorial/thao2000.2.fq -p tutorial/example1_output/error_correction/solidBF_c1> tutorial/example1_output/logs/polish.log 2>&1
[Mon Dec 18 15:14:33 2023]
Error in rule polish_contigs:
jobid: 0
input: tutorial/example1_output/assembly/contigs.fasta, tutorial/thao2000.1.fq, tutorial/thao2000.2.fq
output: tutorial/example1_output/error_correction/contigs_edited.fa
log: tutorial/example1_output/logs/polish.log (check log file(s) for error details)
RuleException:
CalledProcessError in file /media/5c679734-9376-4617-815c-d4bd4177b8b2/leon/projects/01/soft/MetaCompass/snakemake/metacompass.ref.paired.py, line 134:
Command 'set -euo pipefail; /usr/bin/time -v -o tutorial/example1_output/error_correction/solidBF_c1.time nthits -c1 -b 36 -k 25 -t16 --outbloom tutorial/thao2000.1.fq tutorial/thao2000.2.fq -p tutorial/example1_output/error_correction/solidBF_c1> tutorial/example1_output/logs/polish.log 2>&1' returned non-zero exit status 1.
File "/media/5c679734-9376-4617-815c-d4bd4177b8b2/leon/projects/01/soft/MetaCompass/snakemake/metacompass.ref.paired.py", line 134, in __rule_polish_contigs
File "/home/leon/miniconda3/envs/metacompass_env/lib/python3.10/concurrent/futures/thread.py", line 58, in run
Removing output files of failed job polish_contigs since they might be corrupted:
tutorial/example1_output/error_correction/contigs_edited.fa
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: .snakemake/log/2023-12-18T151423.726751.snakemake.log
ERROR: snakemake command failed; exiting..
tutorial/example1_output/logs/polish.log
Unknown argument: -c1
Usage: ntHits --frequencies VAR --out-file VAR [--min-count VAR] [--max-count VAR] [--kmer-length VAR] [--seeds VAR] [-h] [--error-rate VAR] [--threads VAR] [--solid] [--long-mode] out_type files
Filters k-mers based on counts (cmin <= count <= cmax) in input files
Positional arguments:
out_type Output format: Bloom filter 'bf', counting Bloom filter ('cbf'), or table ('table') [required]
files Input files [nargs: 0 or more] [required]
Optional arguments:
-f, --frequencies Frequency histogram file (e.g. from ntCard) [required]
-o, --out-file Output file's name [required]
-cmin, --min-count Minimum k-mer count (>=1), ignored if using --solid [default: 1]
-cmax, --max-count Maximum k-mer count (<=254) [default: 254]
-k, --kmer-length k-mer length, ignored if using spaced seeds (-s) [default: 64]
-s, --seeds If specified, use spaced seeds (separate with commas, e.g. 10101,11011)
-h, --num-hashes Number of hashes to generate per k-mer/spaced seed [default: 3]
-p, --error-rate Target Bloom filter error rate [default: 0.0001]
-t, --threads Number of parallel threads [default: 4]
--solid Automatically tune 'cmin' to filter out erroneous k-mers
--long-mode Optimize data reader for long sequences (>5kbp)
-v Level of details printed to stdout (-v: normal, -vv detailed)
Copyright 2023 Canada's Michael Smith Genome Science Centre
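The usage text above is from the rewritten ntHits 1.x command line, which no longer accepts the old `-c`/`--outbloom` style flags that MetaCompass 2.0-beta generates. Until a compatible release is pinned, a guard along these lines could at least fail with a clear message. This is a sketch with two baked-in assumptions: that ntHits versioning is semantic, and that the pre-1.0 series is the one MetaCompass was written against.

```python
# Sketch: detect the ntHits 1.x CLI rewrite before a pipeline emits
# old-style flags such as `-c1`. Assumes semantic versioning and that
# the pre-1.0 series is the compatible one -- both are assumptions.
def is_legacy_nthits(version_string: str) -> bool:
    """True if this ntHits release predates the 1.x CLI rewrite."""
    major = int(version_string.split(".")[0])
    return major < 1

def cli_check(version_string: str) -> str:
    if is_legacy_nthits(version_string):
        return "ok: legacy CLI (accepts -c1 style flags)"
    return "mismatch: ntHits >= 1.0 expects --min-count/--max-count"

print(cli_check("1.0.2"))  # the version reported by `conda list` above
```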
Installed the dependencies in a conda environment:
mamba create -n metacompass_env -c conda-forge -c bioconda "python>=3.1" biopython "snakemake>=3.7.1" "blast>=2.4.0" "bowtie2>=2.2.9" "mash>=2.1" "samtools>=1.2.13" "megahit>=1.0.6" nthits ntedit meryl
here are the versions:
# Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge
_openmp_mutex 4.5 2_gnu conda-forge
aioeasywebdav 2.4.0 pyha770c72_0 conda-forge
aiohttp 3.9.1 py310h2372a71_0 conda-forge
aiosignal 1.3.1 pyhd8ed1ab_0 conda-forge
amply 0.1.6 pyhd8ed1ab_0 conda-forge
appdirs 1.4.4 pyh9f0ad1d_0 conda-forge
async-timeout 4.0.3 pyhd8ed1ab_0 conda-forge
attmap 0.13.2 pyhd8ed1ab_0 conda-forge
attrs 23.1.0 pyh71513ae_1 conda-forge
bcrypt 4.1.2 py310hcb5633a_0 conda-forge
**biopython 1.81 py310h2372a71_1 conda-forge**
**blast 2.15.0 pl5321h6f7f691_1 bioconda**
boto3 1.34.2 pyhd8ed1ab_0 conda-forge
botocore 1.34.2 pyhd8ed1ab_0 conda-forge
**bowtie2 2.5.2 py310ha0a81b8_0 bioconda**
brotli-python 1.1.0 py310hc6cd4ac_1 conda-forge
btllib 1.7.0 py310h0dbaff4_0 bioconda
bzip2 1.0.8 hd590300_5 conda-forge
c-ares 1.24.0 hd590300_0 conda-forge
ca-certificates 2023.11.17 hbcca054_0 conda-forge
cachetools 5.3.2 pyhd8ed1ab_0 conda-forge
capnproto 0.9.1 ha19adfc_4 conda-forge
certifi 2023.11.17 pyhd8ed1ab_0 conda-forge
cffi 1.16.0 py310h2fee648_0 conda-forge
charset-normalizer 3.3.2 pyhd8ed1ab_0 conda-forge
coin-or-cbc 2.10.10 h9002f0b_0 conda-forge
coin-or-cgl 0.60.7 h516709c_0 conda-forge
coin-or-clp 1.17.8 h1ee7a9c_0 conda-forge
coin-or-osi 0.108.8 ha2443b9_0 conda-forge
coin-or-utils 2.11.9 hee58242_0 conda-forge
coincbc 2.10.10 0_metapackage conda-forge
colorama 0.4.6 pyhd8ed1ab_0 conda-forge
configargparse 1.7 pyhd8ed1ab_0 conda-forge
connection_pool 0.0.3 pyhd3deb0d_0 conda-forge
cryptography 41.0.7 py310hb8475ec_1 conda-forge
curl 8.5.0 hca28451_0 conda-forge
datrie 0.8.2 py310h2372a71_7 conda-forge
defusedxml 0.7.1 pyhd8ed1ab_0 conda-forge
docutils 0.20.1 py310hff52083_3 conda-forge
dpath 2.1.6 pyha770c72_0 conda-forge
dropbox 11.36.2 pyhd8ed1ab_0 conda-forge
eido 0.2.2 pyhd8ed1ab_0 conda-forge
entrez-direct 16.2 he881be0_1 bioconda
exceptiongroup 1.2.0 pyhd8ed1ab_0 conda-forge
filechunkio 1.8 py_2 conda-forge
frozenlist 1.4.1 py310h2372a71_0 conda-forge
ftputil 5.0.4 pyhd8ed1ab_0 conda-forge
gettext 0.21.1 h27087fc_0 conda-forge
gitdb 4.0.11 pyhd8ed1ab_0 conda-forge
gitpython 3.1.40 pyhd8ed1ab_0 conda-forge
google-api-core 2.15.0 pyhd8ed1ab_0 conda-forge
google-api-python-client 2.111.0 pyhd8ed1ab_0 conda-forge
google-auth 2.25.2 pyhca7485f_0 conda-forge
google-auth-httplib2 0.2.0 pyhd8ed1ab_0 conda-forge
google-cloud-core 2.4.1 pyhd8ed1ab_0 conda-forge
google-cloud-storage 2.14.0 pyhca7485f_0 conda-forge
google-crc32c 1.1.2 py310hc5c09a0_5 conda-forge
google-resumable-media 2.7.0 pyhd8ed1ab_0 conda-forge
googleapis-common-protos 1.62.0 pyhd8ed1ab_0 conda-forge
grpcio 1.60.0 py310h1b8f574_0 conda-forge
gsl 2.7 he838d99_0 conda-forge
gzip 1.13 hd590300_0 conda-forge
htslib 1.19 h81da01d_0 bioconda
httplib2 0.22.0 pyhd8ed1ab_0 conda-forge
humanfriendly 10.0 pyhd8ed1ab_6 conda-forge
icu 73.2 h59595ed_0 conda-forge
idna 3.6 pyhd8ed1ab_0 conda-forge
importlib_resources 6.1.1 pyhd8ed1ab_0 conda-forge
iniconfig 2.0.0 pyhd8ed1ab_0 conda-forge
jinja2 3.1.2 pyhd8ed1ab_1 conda-forge
jmespath 1.0.1 pyhd8ed1ab_0 conda-forge
jsonschema 4.20.0 pyhd8ed1ab_0 conda-forge
jsonschema-specifications 2023.11.2 pyhd8ed1ab_0 conda-forge
jupyter_core 5.5.0 py310hff52083_0 conda-forge
keyutils 1.6.1 h166bdaf_0 conda-forge
krb5 1.21.2 h659d440_0 conda-forge
ld_impl_linux-64 2.40 h41732ed_0 conda-forge
libabseil 20230802.1 cxx17_h59595ed_0 conda-forge
libblas 3.9.0 20_linux64_openblas conda-forge
libcblas 3.9.0 20_linux64_openblas conda-forge
libcrc32c 1.1.2 h9c3ff4c_0 conda-forge
libcurl 8.5.0 hca28451_0 conda-forge
libdeflate 1.19 hd590300_0 conda-forge
libedit 3.1.20191231 he28a2e2_2 conda-forge
libev 4.33 hd590300_2 conda-forge
libffi 3.4.2 h7f98852_5 conda-forge
libgcc 7.2.0 h69d50b8_2 conda-forge
libgcc-ng 13.2.0 h807b86a_3 conda-forge
libgfortran-ng 13.2.0 h69a702a_3 conda-forge
libgfortran5 13.2.0 ha4646dd_3 conda-forge
libgomp 13.2.0 h807b86a_3 conda-forge
libgrpc 1.60.0 hd6c4280_0 conda-forge
libhwloc 2.9.3 default_h554bfaf_1009 conda-forge
libiconv 1.17 hd590300_2 conda-forge
libidn2 2.3.4 h166bdaf_0 conda-forge
liblapack 3.9.0 20_linux64_openblas conda-forge
liblapacke 3.9.0 20_linux64_openblas conda-forge
libnghttp2 1.58.0 h47da74e_1 conda-forge
libnsl 2.0.1 hd590300_0 conda-forge
libopenblas 0.3.25 pthreads_h413a1c8_0 conda-forge
libprotobuf 4.24.4 hf27288f_0 conda-forge
libre2-11 2023.06.02 h7a70373_0 conda-forge
libsodium 1.0.18 h36c2ea0_1 conda-forge
libsqlite 3.44.2 h2797004_0 conda-forge
libssh2 1.11.0 h0841786_0 conda-forge
libstdcxx-ng 13.2.0 h7e041cc_3 conda-forge
libunistring 0.9.10 h7f98852_0 conda-forge
libuuid 2.38.1 h0b41bf4_0 conda-forge
libxml2 2.11.6 h232c23b_0 conda-forge
libzlib 1.2.13 hd590300_5 conda-forge
logmuse 0.2.6 pyh8c360ce_0 conda-forge
lrzip 0.621 hedc9cd1_7 bioconda
lzo 2.10 h516909a_1000 conda-forge
markdown-it-py 3.0.0 pyhd8ed1ab_0 conda-forge
markupsafe 2.1.3 py310h2372a71_1 conda-forge
**mash 2.3 ha9a2dd8_3 bioconda**
mdurl 0.1.0 pyhd8ed1ab_0 conda-forge
**megahit 1.2.9 h43eeafb_4 bioconda**
meryl 2013 0 bioconda
multidict 6.0.4 py310h2372a71_1 conda-forge
nbformat 5.9.2 pyhd8ed1ab_0 conda-forge
ncbi-vdb 3.0.9 hdbdd923_0 bioconda
ncurses 6.4 h59595ed_2 conda-forge
**ntedit 1.3.5 hd03093a_1 bioconda**
**nthits 1.0.2 h4ac6f70_0 bioconda**
numpy 1.26.2 py310hb13e2d6_0 conda-forge
oauth2client 4.1.3 py_0 conda-forge
openssl 3.2.0 hd590300_1 conda-forge
ossuuid 1.6.2 hf484d3e_1000 conda-forge
packaging 23.2 pyhd8ed1ab_0 conda-forge
pandas 2.1.4 py310hcc13569_0 conda-forge
paramiko 3.3.1 pyhd8ed1ab_0 conda-forge
pcre 8.45 h9c3ff4c_0 conda-forge
peppy 0.35.7 pyhd8ed1ab_0 conda-forge
perl 5.22.2.1 0 conda-forge
perl-archive-tar 2.18 1 bioconda
perl-common-sense 3.74 0 bioconda
perl-exporter-tiny 0.042 1 bioconda
perl-json 2.90 1 bioconda
perl-json-xs 2.34 0 bioconda
perl-list-moreutils 0.413 1 bioconda
perl-threaded 5.32.1 hdfd78af_1 bioconda
perl-uri 1.71 0 bioconda
perl-xml-libxml 2.0124 0 bioconda
perl-xml-namespacesupport 1.11 0 bioconda
perl-xml-sax 0.99 0 bioconda
perl-xml-sax-base 1.08 0 bioconda
pigz 2.8 h2797004_0 conda-forge
pip 23.3.2 pyhd8ed1ab_0 conda-forge
pkgutil-resolve-name 1.3.10 pyhd8ed1ab_1 conda-forge
plac 1.4.2 pyhd8ed1ab_0 conda-forge
platformdirs 4.1.0 pyhd8ed1ab_0 conda-forge
pluggy 1.3.0 pyhd8ed1ab_0 conda-forge
ply 3.11 py_1 conda-forge
prettytable 3.9.0 pyhd8ed1ab_0 conda-forge
protobuf 4.24.4 py310h620c231_0 conda-forge
psutil 5.9.7 py310h2372a71_0 conda-forge
pulp 2.7.0 py310hff52083_1 conda-forge
pyasn1 0.5.1 pyhd8ed1ab_0 conda-forge
pyasn1-modules 0.3.0 pyhd8ed1ab_0 conda-forge
pycparser 2.21 pyhd8ed1ab_0 conda-forge
pygments 2.17.2 pyhd8ed1ab_0 conda-forge
pynacl 1.5.0 py310h2372a71_3 conda-forge
pyopenssl 23.3.0 pyhd8ed1ab_0 conda-forge
pyparsing 3.1.1 pyhd8ed1ab_0 conda-forge
pysftp 0.2.9 py_1 conda-forge
pysocks 1.7.1 pyha2e5f31_6 conda-forge
pytest 7.4.3 pyhd8ed1ab_0 conda-forge
**python 3.10.13 hd12c33a_0_cpython conda-forge**
python-dateutil 2.8.2 pyhd8ed1ab_0 conda-forge
python-fastjsonschema 2.19.0 pyhd8ed1ab_0 conda-forge
python-irodsclient 1.1.9 pyhd8ed1ab_0 conda-forge
python-tzdata 2023.3 pyhd8ed1ab_0 conda-forge
python_abi 3.10 4_cp310 conda-forge
pytz 2023.3.post1 pyhd8ed1ab_0 conda-forge
pyu2f 0.1.5 pyhd8ed1ab_0 conda-forge
pyyaml 6.0.1 py310h2372a71_1 conda-forge
re2 2023.06.02 h2873b5e_0 conda-forge
readline 8.2 h8228510_1 conda-forge
referencing 0.32.0 pyhd8ed1ab_0 conda-forge
requests 2.31.0 pyhd8ed1ab_0 conda-forge
reretry 0.11.8 pyhd8ed1ab_0 conda-forge
rich 13.7.0 pyhd8ed1ab_0 conda-forge
rpds-py 0.13.2 py310hcb5633a_0 conda-forge
rsa 4.9 pyhd8ed1ab_0 conda-forge
s3transfer 0.9.0 pyhd8ed1ab_0 conda-forge
**samtools 1.19 h50ea8bc_0 bioconda**
setuptools 68.2.2 pyhd8ed1ab_0 conda-forge
setuptools-scm 8.0.4 pyhd8ed1ab_0 conda-forge
six 1.16.0 pyh6c4a22f_0 conda-forge
slacker 0.14.0 py_0 conda-forge
smart_open 6.4.0 pyhd8ed1ab_0 conda-forge
smmap 5.0.0 pyhd8ed1ab_0 conda-forge
**snakemake 7.32.4 hdfd78af_1 bioconda**
snakemake-minimal 7.32.4 pyhdfd78af_1 bioconda
stone 3.3.1 pyhd8ed1ab_0 conda-forge
stopit 1.1.2 py_0 conda-forge
tabulate 0.9.0 pyhd8ed1ab_1 conda-forge
tar 1.34 hb2e2bae_1 conda-forge
tbb 2021.11.0 h00ab1b0_0 conda-forge
throttler 1.2.2 pyhd8ed1ab_0 conda-forge
tk 8.6.13 noxft_h4845f30_101 conda-forge
tomli 2.0.1 pyhd8ed1ab_0 conda-forge
toposort 1.10 pyhd8ed1ab_0 conda-forge
traitlets 5.14.0 pyhd8ed1ab_0 conda-forge
typing-extensions 4.9.0 hd8ed1ab_0 conda-forge
typing_extensions 4.9.0 pyha770c72_0 conda-forge
tzdata 2023c h71feb2d_0 conda-forge
ubiquerg 0.6.3 pyhd8ed1ab_0 conda-forge
uritemplate 4.1.1 pyhd8ed1ab_0 conda-forge
urllib3 1.26.18 pyhd8ed1ab_0 conda-forge
veracitools 0.1.3 py_0 conda-forge
wcwidth 0.2.12 pyhd8ed1ab_0 conda-forge
wget 1.20.3 ha35d2d1_1 conda-forge
wheel 0.42.0 pyhd8ed1ab_0 conda-forge
wrapt 1.16.0 py310h2372a71_0 conda-forge
xz 5.2.6 h166bdaf_0 conda-forge
yaml 0.2.5 h7f98852_2 conda-forge
yarl 1.9.3 py310h2372a71_0 conda-forge
yte 1.5.4 pyha770c72_0 conda-forge
zip 3.0 hd590300_3 conda-forge
zipp 3.17.0 pyhd8ed1ab_0 conda-forge
zlib 1.2.13 hd590300_5 conda-forge
zstd 1.5.5 hfc55251_0 conda-forge
Is there a version requirement for nthits (because it isn't specified), or where does the error come from? Also, what is go_metacompass2.py, and how does it differ from go_metacompass.py?
Thanks!
Hello,
I'm trying to run the tutorial example with the new release (https://github.com/marbl/MetaCompass/releases/tag/paper-v1.0). After checking dependencies and installing, I ran:
python go_metacompass.py -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -P tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -o example1_output -m 1 -t 4
the output is
confirming file containing reference genomes exists..
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake)
Bowtie2--->[OK]
/usr/bin/blastn
Blast+--->[OK]
/home/talex/apps/MetaCompass-paper-v1.0/bin/kmer-mask
kmer-mask--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/snakemake
Snakemake--->[OK]
Full Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/__init__.py", line 296, in snakemake
print_compilation=print_compilation)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/workflow.py", line 522, in include
exec(compile(code, snakefile, "exec"), self.globals)
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 314, in <module>
TypeError: %d format: a number is required, not str
TypeError in line 133 of /home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py:
%d format: a number is required, not str
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 133, in <module>
ERROR: snakemake command failed; exiting..
touch: cannot touch 'example1_output/thao2000.0.assembly.out/run.fail': No such file or directory
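For what it's worth, this TypeError most likely comes from formatting Snakemake config values, which arrive from the command line as strings, with `%d`. A minimal reproduction (the config values here are illustrative) together with the two obvious fixes:

```python
# Snakemake passes --config values through as strings; formatting one
# with %d reproduces the error seen above. Values are illustrative.
config = {"mincov": "3"}

try:
    arg = "--min-count %d" % config["mincov"]
except TypeError:
    # e.g. "%d format: a number is required, not str"
    # (exact wording varies across Python versions)
    arg = "--min-count %d" % int(config["mincov"])  # fix 1: coerce to int

# Fix 2: format with %s, which accepts any value:
assert arg == "--min-count 3"
assert "--min-count %s" % config["mincov"] == "--min-count 3"
```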
I tried modifying line 133 of MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py
as either
shell:"rm -rf %s/%s.0.assembly.out/%s.megahit; megahit -o %s/%s.0.assembly.out/%s.megahit --min-count %s --min-contig-len %s --presets meta-sensitive -t {threads} -1 {input.r1} -2 {input.r2} 1>> {log} 2>&1"%(config['prefix'],config['sample'],config['sample'],config['prefix'],config['sample'],config['sample'],config['mincov'],config['minlen'])
shell:"rm -rf %s/%s.0.assembly.out/%s.megahit; megahit -o %s/%s.0.assembly.out/%s.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t {threads} -1 {input.r1} -2 {input.r2} 1>> {log} 2>&1"%(config['prefix'],config['sample'],config['sample'],config['prefix'],config['sample'],config['sample'])
Both modifications resolve the immediate TypeError but produce the following error instead:
confirming file containing reference genomes exists..
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake)
Bowtie2--->[OK]
/usr/bin/blastn
Blast+--->[OK]
/home/talex/apps/MetaCompass-paper-v1.0/bin/kmer-mask
kmer-mask--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/snakemake
Snakemake--->[OK]
Provided cores: 4
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 assemble_unmapped
1 bam_sort
1 bowtie2_map
1 build_contigs
1 join_contigs
1 merge_reads
1 pilon_contigs
1 pilon_map
1 sam_to_bam
9
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
merge_reads
Selected jobs (1):
merge_reads
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
---merge fastq reads
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
1 of 9 steps (11%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
bowtie2_map
Selected jobs (1):
bowtie2_map
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
---Build index .
Releasing 4 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
2 of 9 steps (22%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
build_contigs
Selected jobs (1):
build_contigs
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
---Build contigs .
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
3 of 9 steps (33%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
pilon_map
Selected jobs (1):
pilon_map
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
---Map reads for pilon polishing.
Releasing 4 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
4 of 9 steps (44%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (2):
assemble_unmapped
sam_to_bam
Selected jobs (1):
sam_to_bam
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
---Convert sam to bam .
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
5 of 9 steps (56%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (2):
bam_sort
assemble_unmapped
Selected jobs (1):
bam_sort
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
---Sort bam .
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
6 of 9 steps (67%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (2):
assemble_unmapped
pilon_contigs
Selected jobs (1):
pilon_contigs
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
---Pilon polish contigs .
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
7 of 9 steps (78%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
assemble_unmapped
Selected jobs (1):
assemble_unmapped
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
---Assemble unmapped reads .
Full Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/executors.py", line 784, in run_wrapper
version)
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 349, in __rule_assemble_unmapped
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/shell.py", line 74, in __new__
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'rm -rf example1_output/thao2000.0.assembly.out/thao2000.megahit; megahit -o example1_output/thao2000.0.assembly.out/thao2000.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.1.fq -2 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.2.fq 1>> example1_output/thao2000.0.megahit.log 2>&1' returned non-zero exit status 1
Error in job assemble_unmapped while creating output file example1_output/thao2000.0.assembly.out/thao2000.megahit/final.contigs.fa.
Full Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/executors.py", line 784, in run_wrapper
version)
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 349, in __rule_assemble_unmapped
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/shell.py", line 74, in __new__
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'rm -rf example1_output/thao2000.0.assembly.out/thao2000.megahit; megahit -o example1_output/thao2000.0.assembly.out/thao2000.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.1.fq -2 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.2.fq 1>> example1_output/thao2000.0.megahit.log 2>&1' returned non-zero exit status 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/executors.py", line 247, in _callback
raise ex
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/concurrent/futures/thread.py", line 55, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/executors.py", line 798, in run_wrapper
show_traceback=True))
snakemake.exceptions.RuleException: CalledProcessError in line 133 of /home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py:
Command 'rm -rf example1_output/thao2000.0.assembly.out/thao2000.megahit; megahit -o example1_output/thao2000.0.assembly.out/thao2000.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.1.fq -2 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.2.fq 1>> example1_output/thao2000.0.megahit.log 2>&1' returned non-zero exit status 1
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 133, in __rule_assemble_unmapped
RuleException:
CalledProcessError in line 133 of /home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py:
Command 'rm -rf example1_output/thao2000.0.assembly.out/thao2000.megahit; megahit -o example1_output/thao2000.0.assembly.out/thao2000.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.1.fq -2 example1_output/thao2000.0.assembly.out/thao2000.mc.sam.unmapped.2.fq 1>> example1_output/thao2000.0.megahit.log 2>&1' returned non-zero exit status 1
File "/home/talex/apps/MetaCompass-paper-v1.0/snakemake/metacompass.iter0.ref.py", line 133, in __rule_assemble_unmapped
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/concurrent/futures/thread.py", line 55, in run
Releasing 4 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
Will exit after finishing currently running jobs.
Exiting because a job execution failed. Look above for error message
unlocking
removing lock
removing lock
removed all locks
ERROR: snakemake command failed; exiting..
I thought maybe it was just an error in the MEGAHIT call (using version 1.1.3), but the only possible issue I can see is that --presets meta-sensitive
overrides --min-count
(though removing the --min-count argument from line 133 has no effect).
My only guess is an issue with Snakemake (with which I have no experience) or with the object types in the initial construction of the config object (which throws the original TypeError). At least the previous issue I raised (#5) seems to be resolved. I am looking forward to using this program at some point.
Thanks,
Alex
Apologies, it crashed at the final stages. I proceeded as follows:
I. $ git pull https://github.com/marbl/MetaCompass.git
remote: Enumerating objects: 48, done.
remote: Counting objects: 100% (48/48), done.
remote: Compressing objects: 100% (34/34), done.
remote: Total 40 (delta 30), reused 8 (delta 6), pack-reused 0
Unpacking objects: 100% (40/40), done.
From https://github.com/marbl/MetaCompass
II. ERROR (Partial text from screen)
Finished job 7.
9 of 14 steps (64%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (2):
pilon_contigs
assemble_unmapped
Selected jobs (1):
assemble_unmapped
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
Job 4: ---Assemble unmapped reads .
Reason: Missing output files: CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa; Input files updated by another job: CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq, CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq
Full Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 923, in run_wrapper
log, version, rule, conda_env, None)
File "/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py", line 598, in __rule_assemble_unmapped
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/shell.py", line 88, in new
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'if [[ -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq || -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq ]]; then rm -rf CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit; megahit -o CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq -2 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq 1>> CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log 2>&1; else touch CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log; echo 'No unmapped reads to run de novo assembly' >CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log ;fi' returned non-zero exit status 250.
Error in job assemble_unmapped while creating output file CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa.
Full Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 923, in run_wrapper
log, version, rule, conda_env, None)
File "/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py", line 598, in __rule_assemble_unmapped
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/shell.py", line 88, in new
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'if [[ -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq || -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq ]]; then rm -rf CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit; megahit -o CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq -2 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq 1>> CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log 2>&1; else touch CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log; echo 'No unmapped reads to run de novo assembly' >CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log ;fi' returned non-zero exit status 250.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 326, in _callback
raise ex
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/concurrent/futures/thread.py", line 56, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 935, in run_wrapper
show_traceback=True))
snakemake.exceptions.RuleException: CalledProcessError in line 206 of /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py:
Command 'if [[ -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq || -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq ]]; then rm -rf CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit; megahit -o CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq -2 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq 1>> CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log 2>&1; else touch CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log; echo 'No unmapped reads to run de novo assembly' >CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log ;fi' returned non-zero exit status 250.
File "/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py", line 206, in __rule_assemble_unmapped
RuleException:
CalledProcessError in line 206 of /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py:
Command 'if [[ -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq || -s CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq ]]; then rm -rf CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit; megahit -o CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit --min-count 3 --min-contig-len 300 --presets meta-sensitive -t 4 -1 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq -2 CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq 1>> CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log 2>&1; else touch CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/final.contigs.fa CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log; echo 'No unmapped reads to run de novo assembly' >CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.megahit.log ;fi' returned non-zero exit status 250.
File "/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/snakemake/metacompass.iter0.paired.py", line 206, in __rule_assemble_unmapped
File "/home/bharat/opt/anaconda3/envs/metacompass/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Releasing 4 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
Will exit after finishing currently running jobs.
Exiting because a job execution failed. Look above for error message
unlocking
removing lock
removing lock
removed all locks
ERROR: snakemake command failed; exiting..
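For reference, the failing shell command only invokes megahit when at least one of the two unmapped-read files is non-empty; that `[[ -s ... ]]` gate can be exercised in isolation (file names below are hypothetical stand-ins):

```shell
# mimic the rule's gate: run the assembler only if either mate file has content
printf '@r1\nACGT\n+\nIIII\n' > unmapped.1.fq   # non-empty mate 1
: > unmapped.2.fq                               # empty mate 2
if [[ -s unmapped.1.fq || -s unmapped.2.fq ]]; then
  echo "would run megahit"
else
  echo "No unmapped reads to run de novo assembly"
fi
```

Here the gate passes, so the exit status 250 must come from megahit itself, which is consistent with the megahit log shown in III below.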
III. megahit log:
(metacompass) bharat@bharat-T3500:~/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit$ more log
2019-07-27 11:59:31 - MEGAHIT v1.2.6
2019-07-27 11:59:31 - Using megahit_core without POPCNT and BMI2 support, because the features not detected by CPUID
2019-07-27 11:59:31 - Convert reads to binary library
2019-07-27 11:59:31 - command /home/bharat/opt/anaconda3/envs/metacompass/bin/megahit_core_no_hw_accel buildlib /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/reads.lib /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/reads.lib
2019-07-27 12:03:04 - b'INFO sequence/io/sequence_lib.cpp : 77 - Lib 0 (/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.1.fq,/home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.mc.sam.unmapped.2.fq): pe, 16018132 reads, 300 max length'
2019-07-27 12:03:04 - b'INFO utils/utils.h : 152 - Real: 212.7322\tuser: 31.6104\tsys: 5.8834\tmaxrss: 235248'
2019-07-27 12:03:04 - k-max reset to: 141
2019-07-27 12:03:04 - Start assembly. Number of CPU threads 4
2019-07-27 12:03:04 - k list: 21,29,39,49,59,69,79,89,99,109,119,129,141
2019-07-27 12:03:04 - Memory used: 11330378956
2019-07-27 12:03:04 - Extracting solid (k+1)-mers and building sdbg for k = 21
2019-07-27 12:03:04 - command /home/bharat/opt/anaconda3/envs/metacompass/bin/megahit_core_no_hw_accel read2sdbg -k 21 -m 1 --host_mem 11330378956 --mem_flag 1 --output_prefix /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/k21/21 --num_cpu_threads 4 --read_lib_file /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/reads.lib
2019-07-27 12:03:04 - b'INFO sequence/io/sequence_lib.cpp : 115 - Before reading, sizeof seq_package: 1328551936'
2019-07-27 12:03:13 - b'INFO sequence/io/sequence_lib.cpp : 117 - After reading, sizeof seq_package: 1328551936'
2019-07-27 12:03:13 - b'INFO sorting/read_to_sdbg_s1.cpp : 105 - 16018132 reads, 300 max read length, 4801627481 total bases'
2019-07-27 12:03:13 - b'INFO sorting/read_to_sdbg_s1.cpp : 109 - 2 words per substring'
2019-07-27 12:03:13 - b'INFO sorting/read_to_sdbg_s1.cpp : 124 - Number of files for mercy candidate reads: 2'
2019-07-27 12:03:13 - b'INFO sorting/base_engine.cpp : 140 - Preparing data...'
2019-07-27 12:03:13 - b'INFO sorting/read_to_sdbg_s2.cpp : 103 - 2 words per substring, words per dummy node ($v): 2'
2019-07-27 12:03:13 - b'INFO sorting/base_engine.cpp : 145 - Preparing data... Done. Time elapsed: 0.0039'
2019-07-27 12:03:13 - b'INFO sorting/base_engine.cpp : 148 - Preparing partitions and calculating bucket sizes...'
2019-07-27 12:03:34 - b'INFO sorting/base_engine.cpp : 130 - Lv1 items: 1199279870, Lv2 items: 3260984'
2019-07-27 12:03:34 - b'INFO sorting/base_engine.cpp : 132 - Memory of derived class: 1366064680, Memory for Lv1+Lv2: 4823207352'
2019-07-27 12:03:34 - b'INFO sorting/base_engine.cpp : 160 - Preparing partitions and calculating bucket sizes... Done. Time elapsed: 20.2475'
2019-07-27 12:03:34 - b'INFO sorting/base_engine.cpp : 164 - Start main loop...'
2019-07-27 12:03:34 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 0 to 1317'
2019-07-27 12:04:00 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 26.2043'
2019-07-27 12:04:54 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 54.4853'
2019-07-27 12:04:54 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 1317 to 3694'
2019-07-27 12:05:23 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 28.7587'
2019-07-27 12:06:28 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 64.8290'
2019-07-27 12:06:28 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 3694 to 7217'
2019-07-27 12:06:58 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 30.5164'
2019-07-27 12:08:07 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 68.4976'
2019-07-27 12:08:07 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 7217 to 12266'
2019-07-27 12:08:38 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 31.2060'
2019-07-27 12:09:50 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 72.0028'
2019-07-27 12:09:50 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 12266 to 19408'
2019-07-27 12:10:22 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 32.3821'
2019-07-27 12:11:36 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 73.4038'
2019-07-27 12:11:36 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 19408 to 30035'
2019-07-27 12:12:10 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 33.6176'
2019-07-27 12:13:28 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 78.7099'
2019-07-27 12:13:28 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 30035 to 47361'
2019-07-27 12:14:06 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 37.5211'
2019-07-27 12:15:30 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 84.4306'
2019-07-27 12:15:30 - b'INFO sorting/base_engine.cpp : 178 - Lv1 scanning from bucket 47361 to 65536'
2019-07-27 12:15:58 - b'INFO sorting/base_engine.cpp : 185 - Lv1 scanning done. Large diff: 0. Time elapsed: 28.1617'
2019-07-27 12:16:43 - b'INFO sorting/base_engine.cpp : 191 - Lv1 fetching & sorting done. Time elapsed: 44.8233'
2019-07-27 12:16:43 - b'INFO sorting/base_engine.cpp : 196 - Main loop done. Time elapsed: 789.5767'
2019-07-27 12:16:43 - b'INFO sorting/base_engine.cpp : 199 - Postprocessing...'
2019-07-27 12:16:43 - b'INFO sorting/read_to_sdbg_s2.cpp : 618 - Number of $ A C G T A- C- G- T-:'
2019-07-27 12:16:43 - b'INFO sorting/read_to_sdbg_s2.cpp : 619 - 9553149 476345632 831616175 823988055 478011054 8668645 18122348 18314576 8375755'
2019-07-27 12:16:43 - b'INFO sorting/read_to_sdbg_s2.cpp : 624 - Total number of edges: 2672995389'
2019-07-27 12:16:43 - b'INFO sorting/read_to_sdbg_s2.cpp : 625 - Total number of ONEs: 2609960916'
2019-07-27 12:16:43 - b'INFO sorting/read_to_sdbg_s2.cpp : 627 - Total number of $v edges: 9553149'
2019-07-27 12:16:43 - b'INFO sorting/base_engine.cpp : 202 - Postprocess done. Time elapsed: 0.0878'
2019-07-27 12:16:44 - b'INFO utils/utils.h : 152 - Real: 819.1114\tuser: 3197.4817\tsys: 11.6779\tmaxrss: 6072408'
2019-07-27 12:16:45 - Assemble contigs from SdBG for k = 21
2019-07-27 12:16:45 - command /home/bharat/opt/anaconda3/envs/metacompass/bin/megahit_core_no_hw_accel assemble -s /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/k21/21 -o /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/intermediate_contigs/k21 -t 4 --min_standalone 422 --prune_level 2 --merge_len 20 --merge_similar 0.95 --cleaning_rounds 5 --disconnect_ratio 0.1 --low_local_ratio 0.2 --cleaning_rounds 5 --min_depth 2 --bubble_level 2 --max_tip_len -1 --careful_bubble
2019-07-27 12:17:59 - b'INFO main_assemble.cpp : 129 - Loading succinct de Bruijn graph: /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/k21/21Done. Time elapsed: 74.428029'
2019-07-27 12:17:59 - b'INFO main_assemble.cpp : 133 - Number of Edges: 2672995389; K value: 21'
2019-07-27 12:17:59 - b'INFO main_assemble.cpp : 140 - Number of CPU threads: 4'
2019-07-27 12:20:52 - b'INFO assembly/sdbg_pruning.cpp : 160 - Removing tips with length less than 2; Accumulated tips removed: 428748; time elapsed: 8.9007'
2019-07-27 12:21:06 - b'INFO assembly/sdbg_pruning.cpp : 160 - Removing tips with length less than 4; Accumulated tips removed: 1228136; time elapsed: 13.9837'
2019-07-27 12:21:34 - b'INFO assembly/sdbg_pruning.cpp : 160 - Removing tips with length less than 8; Accumulated tips removed: 2655594; time elapsed: 28.2101'
2019-07-27 12:22:30 - b'INFO assembly/sdbg_pruning.cpp : 160 - Removing tips with length less than 16; Accumulated tips removed: 5100116; time elapsed: 56.3092'
2019-07-27 12:24:00 - b'INFO assembly/sdbg_pruning.cpp : 160 - Removing tips with length less than 32; Accumulated tips removed: 8024050; time elapsed: 89.6935'
2019-07-27 12:25:35 - b'INFO assembly/sdbg_pruning.cpp : 169 - Removing tips with length less than 42; Accumulated tips removed: 8927490; time elapsed: 95.0219'
2019-07-27 12:25:35 - b'INFO main_assemble.cpp : 158 - Tips removal done! Time elapsed(sec): 455.901'
2019-07-27 13:01:51 - b'INFO assembly/unitig_graph.cpp : 84 - Graph size without loops: 64623603, palindrome: 4301'
2019-07-27 13:01:53 - b"terminate called after throwing an instance of 'std::bad_alloc'"
2019-07-27 13:01:53 - b' what(): std::bad_alloc'
2019-07-27 13:01:56 - Error occurs, please refer to /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/log for detail
2019-07-27 13:01:56 - Command: /home/bharat/opt/anaconda3/envs/metacompass/bin/megahit_core_no_hw_accel assemble -s /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/tmp/k21/21 -o /home/bharat/opt/anaconda3/envs/metacompass/MetaCompass/CS1BS.output_270719/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/CS1BS_ATBW8_GCCAAT_L001_R1.megahit/intermediate_contigs/k21 -t 4 --min_standalone 422 --prune_level 2 --merge_len 20 --merge_similar 0.95 --cleaning_rounds 5 --disconnect_ratio 0.1 --low_local_ratio 0.2 --cleaning_rounds 5 --min_depth 2 --bubble_level 2 --max_tip_len -1 --careful_bubble; Exit code -6
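"Exit code -6" means megahit_core_no_hw_accel died on signal 6 (SIGABRT), which is exactly what the std::bad_alloc a few lines earlier produces: the C++ runtime aborts when an allocation fails, i.e. the unitig-graph stage ran out of memory. The signal name can be confirmed with:

```shell
# signal 6 is SIGABRT, raised when the C++ runtime aborts on an unhandled std::bad_alloc
kill -l 6
```

A plausible mitigation (untested here) is to give megahit an explicit memory cap via its -m option, or to run on a machine with more RAM; the 16M read pairs at k=21 produced roughly 2.7 billion edges in the log above.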
Encountered a lock error during the STAMPS MBL 2019 tutorial. Can you disable locking, or enable --unlock, so we can unlock without deleting .snakemake?
MissingInputException in line 42 of /opt/MetaCompass/snakemake/metacompass.iter0.paired.py:
Missing input files for rule merge_reads:
SRR606249_subset10_1.fq
SRR606249_subset10_2.fq
unlocking
removed all locks
Error: Directory cannot be locked. Please make sure that no other Snakemake process is trying to create the same files in the following directory:
/output
If you are sure that no other instances of snakemake are running on this directory, the remaining lock was likely caused by a kill signal or a power loss. It can be removed with the --unlock argument.
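As the message says, the leftover lock from a killed run can be cleared with snakemake --unlock in the output directory; when the MetaCompass wrapper does not expose that flag, deleting the lock files by hand is equivalent. A sketch using a mock directory (safe only if no other Snakemake instance is active there):

```shell
# simulate the stale lock a killed run leaves behind, then clear it as --unlock would
mkdir -p output/.snakemake/locks
touch output/.snakemake/locks/0.input.lock   # stale lock file
rm -rf output/.snakemake/locks               # clears the lock; .snakemake itself is kept
```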
Hi,
I am running the following command:
python3 go_metacompass2.py -1 /home1/pbravakos/Software/MetaCompass/tutorial/thao2000.1.fq -2 /home1/pbravakos/Software/MetaCompass/tutorial/thao2000.2.fq --memory 223 --threads 19 --verbose --mincov 500 -o example2_output
And I get the following error:
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 19
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 assemble_unmapped
1 assembled_references
1 bam_sort
1 bowtie2_map
1 build_contigs
1 create_tsv
1 fastq2fasta
1 join_contigs
1 kmer_mask
1 mapping_stats
1 pilon_contigs
1 pilon_map
1 reference_selection
1 sam_to_bam
1 stats_all
1 stats_genome
17
Resources before job selection: {'_cores': 19, '_nodes': 9223372036854775807}
Ready jobs (1):
kmer_mask
Selected jobs (1):
kmer_mask
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
[Mon Aug 3 13:37:33 2020]
Job 16: ---kmer-mask fastq
Job counts:
count jobs
1 kmer_mask
1
touch example2_output/reference_selection/marker.match.1.fastq;kmer-mask -ms 28 -mdb /home1/pbravakos/Software/MetaCompass/refseq/kmer-mask_db/markers.mdb -1 /home1/pbravakos/Software/MetaCompass/tutorial/thao2000.1.fq -clean 0.0 -match 0.01 -nomasking -t 19 -l 103 -o example2_output/reference_selection/markertmp$RANDOM 1>> example2_output/logs/kmermask.log 2>&1
[Mon Aug 3 13:38:08 2020]
Error in rule kmer_mask:
jobid: 0
output: example2_output/reference_selection/marker.match.1.fastq
log: example2_output/logs/kmermask.log (check log file(s) for error message)
RuleException:
CalledProcessError in line 49 of /home1/pbravakos/Software/MetaCompass/snakemake/metacompass.paired.py:
Command 'set -euo pipefail; touch example2_output/reference_selection/marker.match.1.fastq;kmer-mask -ms 28 -mdb /home1/pbravakos/Software/MetaCompass/refseq/kmer-mask_db/markers.mdb -1 /home1/pbravakos/Software/MetaCompass/tutorial/thao2000.1.fq -clean 0.0 -match 0.01 -nomasking -t 19 -l 103 -o example2_output/reference_selection/markertmp$RANDOM 1>> example2_output/logs/kmermask.log 2>&1' returned non-zero exit status 1.
File "/home1/pbravakos/Software/miniconda3/envs/snakemake/lib/python3.8/site-packages/snakemake/executors/__init__.py", line 2168, in run_wrapper
File "/home1/pbravakos/Software/MetaCompass/snakemake/metacompass.paired.py", line 49, in __rule_kmer_mask
File "/home1/pbravakos/Software/miniconda3/envs/snakemake/lib/python3.8/site-packages/snakemake/executors/__init__.py", line 529, in _callback
File "/home1/pbravakos/Software/miniconda3/envs/snakemake/lib/python3.8/concurrent/futures/thread.py", line 57, in run
File "/home1/pbravakos/Software/miniconda3/envs/snakemake/lib/python3.8/site-packages/snakemake/executors/__init__.py", line 515, in cached_or_run
File "/home1/pbravakos/Software/miniconda3/envs/snakemake/lib/python3.8/site-packages/snakemake/executors/__init__.py", line 2199, in run_wrapper
Exiting because a job execution failed. Look above for error message
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /home1/pbravakos/OSD_Metagenome/MetaCompass/.snakemake/log/2020-08-03T133733.155741.snakemake.log
unlocking
removing lock
removing lock
removed all locks
Do you have any idea why this is happening?
Thanks
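Beyond reading example2_output/logs/kmermask.log itself, one quick sanity check when kmer-mask exits non-zero is whether the -l 103 passed to it matches the data's read length; the longest read in a FASTQ can be found with awk (the demo file here is made up):

```shell
# print the maximum read length in a FASTQ; compare it against the -l value given to kmer-mask
printf '@r1\nACGTACGTAC\n+\nIIIIIIIIII\n@r2\nACGT\n+\nIIII\n' > demo.fq
awk 'NR % 4 == 2 { if (length($0) > max) max = length($0) } END { print max }' demo.fq   # prints 10
```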
I ran the example:
go_metacompass.py -1 SRS044742/SRS044742.denovo_duplicates_marked.trimmed.1.fastq -2 SRS044742/SRS044742.denovo_duplicates_marked.trimmed.2.fastq -U SRS044742/SRS044742.denovo_duplicates_marked.trimmed.singleton.fastq -l 100 -o example_output -k
It failed at the reference_selection step. When I checked the log, I found this: Could not open reference sequences file /refseq/markers/genome2markers.length. When I checked the markers folder under my refseq folder, I cannot find genome2markers.length. Did I download the wrong refseq database? I followed install.sh.
How should I fix this issue?
Thanks!
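Before re-running install.sh, it may help to confirm what actually landed in the markers directory and whether the file the log complains about exists (paths assume the default refseq layout next to the MetaCompass checkout):

```shell
# list the marker files that were downloaded and test for the one the log says is missing
ls refseq/markers/ 2>/dev/null
test -f refseq/markers/genome2markers.length && echo present || echo missing
```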
Installed MetaCompass v1.3 (GitHub version, in a conda py3.5 env).
Tested the installation using: Reference-guided assembly with known reference genomes (no reference selection). Completed without errors.
Ran reference-guided assembly with reference selection on my own samples, copied to the tutorial dir
(8010015 seqs, length 300):
$ python3 go_metacompass.py -P tutorial/CS1BS_ATBW8_GCCAAT_L001_R1.fastq,tutorial/CS1BS_ATBW8_GCCAAT_L001_R2.fastq -l 300 -o CS1BS.output
This produced the error below. I tried a few things (e.g. downgrading snakemake to 3.11) but none worked. Assistance in troubleshooting appreciated.
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
/home/bharat/opt/anaconda3/envs/py35/bin/snakemake
Snakemake--->[OK]
Bowtie2--->[OK]
/home/bharat/opt/anaconda3/envs/py35/bin/blastn
Blast+--->[OK]
/home/bharat/opt/anaconda3/envs/py35/bin/kmer-mask
kmer-mask--->[OK]
/usr/bin/mash
mash--->[OK]
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 assemble_unmapped
1 bam_sort
1 bowtie2_map
1 build_contigs
1 create_tsv
1 fastq2fasta
1 join_contigs
1 kmer_mask
1 merge_reads
1 pilon_contigs
1 pilon_map
1 reference_recruitment
1 sam_to_bam
14
Resources before job selection: {'_nodes': 9223372036854775807, '_cores': 1}
Ready jobs (1):
merge_reads
Selected jobs (1):
merge_reads
Resources after job selection: {'_nodes': 9223372036854775806, '_cores': 0}
---merge fastq reads
Reason: Missing output files: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq
Releasing 1 _nodes (now 9223372036854775807).
Releasing 1 _cores (now 1).
Finished job 10.
1 of 14 steps (7%) done
Resources before job selection: {'_nodes': 9223372036854775807, '_cores': 1}
Ready jobs (1):
kmer_mask
Selected jobs (1):
kmer_mask
Resources after job selection: {'_nodes': 9223372036854775806, '_cores': 0}
---kmer-mask fastq
Reason: Missing output files: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.marker.match.1.fastq; Input files updated by another job: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq
Releasing 1 _nodes (now 9223372036854775807).
Releasing 1 _cores (now 1).
Finished job 13.
2 of 14 steps (14%) done
Resources before job selection: {'_nodes': 9223372036854775807, '_cores': 1}
Ready jobs (1):
fastq2fasta
Selected jobs (1):
fastq2fasta
Resources after job selection: {'_nodes': 9223372036854775806, '_cores': 0}
---Converting fastq to fasta.
Reason: Missing output files: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta; Input files updated by another job: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.marker.match.1.fastq
Releasing 1 _nodes (now 9223372036854775807).
Releasing 1 _cores (now 1).
Finished job 11.
3 of 14 steps (21%) done
Resources before job selection: {'_nodes': 9223372036854775807, '_cores': 1}
Ready jobs (1):
reference_recruitment
Selected jobs (1):
reference_recruitment
Resources after job selection: {'_nodes': 9223372036854775806, '_cores': 0}
---reference recruitment.
Reason: Missing output files: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/mc.refseq.fna; Input files updated by another job: CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq, CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta
Full Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/executors.py", line 848, in run_wrapper
version, rule, conda_env)
File "/home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py", line 168, in __rule_reference_recruitment
sam=rules.pilon_map.output.sam
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/shell.py", line 80, in new
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'mkdir -p CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out; python3 /home/bharat/opt/anaconda3/envs/py35/MetaCompass/bin/select_references.py tax CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out 1 2 1>> CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.reference_recruitement.log 2>&1' returned non-zero exit status 1
Error in job reference_recruitment while creating output files CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/mc.refseq.ids, CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out, CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/mc.refseq.fna.
Full Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/executors.py", line 848, in run_wrapper
version, rule, conda_env)
File "/home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py", line 168, in __rule_reference_recruitment
sam=rules.pilon_map.output.sam
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/shell.py", line 80, in new
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'mkdir -p CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out; python3 /home/bharat/opt/anaconda3/envs/py35/MetaCompass/bin/select_references.py tax CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out 1 2 1>> CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.reference_recruitement.log 2>&1' returned non-zero exit status 1
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/executors.py", line 300, in _callback
raise ex
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/concurrent/futures/thread.py", line 55, in run
result = self.fn(*self.args, **self.kwargs)
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/site-packages/snakemake/executors.py", line 862, in run_wrapper
show_traceback=True))
snakemake.exceptions.RuleException: CalledProcessError in line 82 of /home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py:
Command 'mkdir -p CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out; python3 /home/bharat/opt/anaconda3/envs/py35/MetaCompass/bin/select_references.py tax CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out 1 2 1>> CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.reference_recruitement.log 2>&1' returned non-zero exit status 1
File "/home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py", line 82, in __rule_reference_recruitment
RuleException:
CalledProcessError in line 82 of /home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py:
Command 'mkdir -p CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out; python3 /home/bharat/opt/anaconda3/envs/py35/MetaCompass/bin/select_references.py tax CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.fasta CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.merged.fq CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out 1 2 1>> CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.reference_recruitement.log 2>&1' returned non-zero exit status 1
File "/home/bharat/opt/anaconda3/envs/py35/MetaCompass/snakemake/metacompass.iter0.paired.py", line 82, in __rule_reference_recruitment
File "/home/bharat/opt/anaconda3/envs/py35/lib/python3.5/concurrent/futures/thread.py", line 55, in run
Removing output files of failed job reference_recruitment since they might be corrupted:
CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out
Releasing 1 _nodes (now 9223372036854775807).
Releasing 1 _cores (now 1).
Will exit after finishing currently running jobs.
Exiting because a job execution failed. Look above for error message
unlocking
removing lock
removing lock
removed all locks
ERROR: snakemake command failed; exiting..
touch: cannot touch 'CS1BS.output/CS1BS_ATBW8_GCCAAT_L001_R1.0.assembly.out/run.fail': No such file or directory
I'm trying to run MetaCompass, and it runs fine if I specify a single reference .fasta genome with -r, but it cannot recognize multiple .fasta genomes, and it crashes if I provide a .fasta file containing all the reference genomes in a single file. Is it possible to run MetaCompass with multiple .fasta reference genomes?
python3 go_metacompass.py -1 test_for.fastq -2 test_rev.fastq -l 250 -o MC_assembly -m 2 -t 10 -y 100 --clobber
###OUTPUT###
MetaCompass metagenome assembler version 2.0.0 by Victoria Cepeda ([email protected])
usage: snakemake [-h] [--dryrun] [--profile PROFILE] [--snakefile FILE]
[--cores [N]] [--local-cores N]
[--resources [NAME=INT [NAME=INT ...]]]
[--config [KEY=VALUE [KEY=VALUE ...]]] [--configfile FILE]
[--directory DIR] [--touch] [--keep-going] [--force]
[--forceall] [--forcerun [TARGET [TARGET ...]]]
[--prioritize TARGET [TARGET ...]]
[--until TARGET [TARGET ...]]
[--omit-from TARGET [TARGET ...]] [--rerun-incomplete]
[--shadow-prefix DIR] [--report HTMLFILE] [--export-cwl FILE]
[--list] [--list-target-rules] [--dag] [--rulegraph]
[--d3dag] [--summary] [--detailed-summary] [--archive FILE]
[--cleanup-metadata FILE [FILE ...]] [--cleanup-shadow]
[--unlock] [--list-version-changes] [--list-code-changes]
[--list-input-changes] [--list-params-changes]
[--list-untracked] [--delete-all-output]
[--delete-temp-output] [--bash-completion] [--version]
[--reason] [--gui [PORT]] [--printshellcmds] [--debug-dag]
[--stats FILE] [--nocolor] [--quiet] [--print-compilation]
[--verbose] [--force-use-threads] [--allow-ambiguity]
[--nolock] [--ignore-incomplete] [--latency-wait SECONDS]
[--wait-for-files [FILE [FILE ...]]] [--notemp]
[--keep-remote] [--keep-target-files]
[--allowed-rules ALLOWED_RULES [ALLOWED_RULES ...]]
[--max-jobs-per-second MAX_JOBS_PER_SECOND]
[--max-status-checks-per-second MAX_STATUS_CHECKS_PER_SECOND]
[--restart-times RESTART_TIMES] [--attempt ATTEMPT]
[--wrapper-prefix WRAPPER_PREFIX]
[--default-remote-provider {S3,GS,FTP,SFTP,S3Mocked,gfal,gridftp,iRODS}]
[--default-remote-prefix DEFAULT_REMOTE_PREFIX]
[--no-shared-fs] [--greediness GREEDINESS] [--no-hooks]
[--overwrite-shellcmd OVERWRITE_SHELLCMD] [--debug]
[--runtime-profile FILE] [--mode {0,1,2}]
[--cluster CMD | --cluster-sync CMD | --drmaa [ARGS]]
[--cluster-config FILE] [--immediate-submit]
[--jobscript SCRIPT] [--jobname NAME]
[--cluster-status CLUSTER_STATUS] [--drmaa-log-dir DIR]
[--kubernetes [NAMESPACE]]
[--kubernetes-env ENVVAR [ENVVAR ...]]
[--container-image IMAGE] [--use-conda] [--list-conda-envs]
[--cleanup-conda] [--conda-prefix DIR] [--create-envs-only]
[--use-singularity] [--singularity-prefix DIR]
[--singularity-args ARGS]
[target [target ...]]
snakemake: error: unrecognized arguments: -T
ERROR: snakemake command failed; exiting..
$ snakemake -v
5.4.0
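The `unrecognized arguments: -T` error suggests a version mismatch: the short `-T` flag that go_metacompass.py passes to snakemake does not exist in this snakemake release (the usage dump above only lists `--restart-times`). A hedged workaround, assuming your copy of go_metacompass.py builds the snakemake command with a bare `-T` (inspect it first, e.g. `grep -n -- '-T' go_metacompass.py`), is to rewrite the short flag to the long form:

```shell
# Demonstrate the rewrite on a sample command line; to patch for real you
# would apply the same sed expression to the line in go_metacompass.py that
# assembles the snakemake invocation. Flag spelling here is an assumption.
cmd='snakemake --cores 4 -T'
printf '%s\n' "$cmd" | sed 's/ -T$/ --restart-times 1/'
```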
Hi,
I'm using MetaCompass v2.0-beta. Example 1 worked, but example 2 didn't; it ended with
Error in job reference_selection while creating output files example2_output/reference_selection/mc.refseq.fna, example2_output/reference_selection/mc.refseq.ids.
MissingOutputException in line 64 of /home/mpg02/MMTM/yiming.shi/metacompass/snakemake/metacompass.py:
Missing files after 5 seconds:
example2_output/reference_selection/mc.refseq.ids
This might be due to filesystem latency. If that is the case, consider to increase the wait time with --latency-wait.
Exiting because a job execution failed. Look above for error message
[Thu Sep 30 18:14:47 2021] Will exit after finishing currently running jobs.
[Thu Sep 30 18:14:47 2021] Exiting because a job execution failed. Look above for error message
Any idea how to fix the problem? Thanks in advance.
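When a MissingOutputException like this appears, the latency warning is often a red herring: the reference_selection step usually failed for a reason of its own, and the real message sits in that step's log rather than in the snakemake summary. A guarded look at the output directory (paths taken from the report above; the fallback echo fires if the run never created it):

```shell
# List what reference_selection actually produced, if anything, and scan
# its logs for errors; both commands are safe to run even when nothing exists.
ls -l example2_output/reference_selection/ 2>/dev/null \
  || echo "output dir missing"
grep -il error example2_output/reference_selection/*.log 2>/dev/null || true
```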
When I try to run MetaCompass on my single-end reads, I get the following error:
snakemake: error: unrecognized arguments: -T
I use my own reference genome, and the command I use to run MetaCompass is as follows:
./MetaCompass/go_metacompass.py -U sample.fastq -r reference.fasta -o workdir/assembly/metacompass
Hello,
I found this error when running MetaCompass.
Please help me understand what this error means and how to solve it.
usage: snakemake [-h] [--dry-run] [--profile PROFILE] [--cache [RULE ...]] [--snakefile FILE] [--cores [N]] [--jobs [N]] [--local-cores N]
[--resources [NAME=INT ...]] [--set-threads RULE=THREADS [RULE=THREADS ...]] [--max-threads MAX_THREADS]
[--set-resources RULE:RESOURCE=VALUE [RULE:RESOURCE=VALUE ...]] [--set-scatter NAME=SCATTERITEMS [NAME=SCATTERITEMS ...]]
[--default-resources [NAME=INT ...]] [--preemption-default PREEMPTION_DEFAULT]
[--preemptible-rules PREEMPTIBLE_RULES [PREEMPTIBLE_RULES ...]] [--config [KEY=VALUE ...]] [--configfile FILE [FILE ...]]
[--envvars VARNAME [VARNAME ...]] [--directory DIR] [--touch] [--keep-going] [--force] [--forceall] [--forcerun [TARGET ...]]
[--prioritize TARGET [TARGET ...]] [--batch RULE=BATCH/BATCHES] [--until TARGET [TARGET ...]] [--omit-from TARGET [TARGET ...]]
[--rerun-incomplete] [--shadow-prefix DIR] [--scheduler [{ilp,greedy}]] [--wms-monitor [WMS_MONITOR]] [--wms-monitor-arg [NAME=VALUE ...]]
[--scheduler-ilp-solver {COIN_CMD}] [--scheduler-solver-path SCHEDULER_SOLVER_PATH] [--conda-base-path CONDA_BASE_PATH]
[--no-subworkflows] [--groups GROUPS [GROUPS ...]] [--group-components GROUP_COMPONENTS [GROUP_COMPONENTS ...]] [--report [FILE]]
[--report-stylesheet CSSFILE] [--draft-notebook TARGET] [--edit-notebook TARGET] [--notebook-listen IP:PORT] [--lint [{text,json}]]
[--generate-unit-tests [TESTPATH]] [--containerize] [--export-cwl FILE] [--list] [--list-target-rules] [--dag] [--rulegraph] [--filegraph]
[--d3dag] [--summary] [--detailed-summary] [--archive FILE] [--cleanup-metadata FILE [FILE ...]] [--cleanup-shadow]
[--skip-script-cleanup] [--unlock] [--list-version-changes] [--list-code-changes] [--list-input-changes] [--list-params-changes]
[--list-untracked] [--delete-all-output] [--delete-temp-output] [--bash-completion] [--keep-incomplete] [--drop-metadata] [--version]
[--reason] [--gui [PORT]] [--printshellcmds] [--debug-dag] [--stats FILE] [--nocolor] [--quiet] [--print-compilation] [--verbose]
[--force-use-threads] [--allow-ambiguity] [--nolock] [--ignore-incomplete] [--max-inventory-time SECONDS] [--latency-wait SECONDS]
[--wait-for-files [FILE ...]] [--wait-for-files-file FILE] [--notemp] [--all-temp] [--keep-remote] [--keep-target-files]
[--allowed-rules ALLOWED_RULES [ALLOWED_RULES ...]] [--local-groupid LOCAL_GROUPID] [--max-jobs-per-second MAX_JOBS_PER_SECOND]
[--max-status-checks-per-second MAX_STATUS_CHECKS_PER_SECOND] [-T RESTART_TIMES] [--attempt ATTEMPT] [--wrapper-prefix WRAPPER_PREFIX]
[--default-remote-provider {S3,GS,FTP,SFTP,S3Mocked,gfal,gridftp,iRODS,AzBlob,XRootD}] [--default-remote-prefix DEFAULT_REMOTE_PREFIX]
[--no-shared-fs] [--greediness GREEDINESS] [--no-hooks] [--overwrite-shellcmd OVERWRITE_SHELLCMD] [--debug] [--runtime-profile FILE]
[--mode {0,1,2}] [--show-failed-logs] [--log-handler-script FILE] [--log-service {none,slack,wms}]
[--cluster CMD | --cluster-sync CMD | --drmaa [ARGS]] [--cluster-config FILE] [--immediate-submit] [--jobscript SCRIPT] [--jobname NAME]
[--cluster-status CLUSTER_STATUS] [--cluster-cancel CLUSTER_CANCEL] [--cluster-cancel-nargs CLUSTER_CANCEL_NARGS]
[--cluster-sidecar CLUSTER_SIDECAR] [--drmaa-log-dir DIR] [--kubernetes [NAMESPACE]] [--container-image IMAGE] [--tibanna]
[--tibanna-sfn TIBANNA_SFN] [--precommand PRECOMMAND] [--tibanna-config TIBANNA_CONFIG [TIBANNA_CONFIG ...]] [--google-lifesciences]
[--google-lifesciences-regions GOOGLE_LIFESCIENCES_REGIONS [GOOGLE_LIFESCIENCES_REGIONS ...]]
[--google-lifesciences-location GOOGLE_LIFESCIENCES_LOCATION] [--google-lifesciences-keep-cache] [--tes URL] [--use-conda]
[--conda-not-block-search-path-envvars] [--list-conda-envs] [--conda-prefix DIR] [--conda-cleanup-envs]
[--conda-cleanup-pkgs [{tarballs,cache}]] [--conda-create-envs-only] [--conda-frontend {conda,mamba}] [--use-singularity]
[--singularity-prefix DIR] [--singularity-args ARGS] [--use-envmodules]
[target ...]
snakemake: error: argument -T/--restart-times: expected one argument
snakemake -v
7.3.4
I got the following Bowtie2 error:
Error in rule bowtie2_map:
jobid: 8
output: ctrl_mc/assembly/mc.index, ctrl_mc/assembly/mc.index, ctrl_mc/assembly/mc.sam, ctrl_mc/logs/bowtie2map.log
shell:
bowtie2-build -o 3 --threads 18 -q ctrl_mc/reference_selection/mc.refseq.fna ctrl_mc/assembly/mc.index 1>> ctrl_mc/assembly/mc.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 18 -x ctrl_mc/assembly/mc.index -q -U ctrl_R1_paired.fastq,ctrl_R2_paired.fastq,ctrl_unpaired.fastq -S ctrl_mc/assembly/mc.sam.all > ctrl_mc/logs/bowtie2map.log 2>&1; python3 /home/modesto/Applications/MetaCompass-2.0-beta/bin/best_strata.py ctrl_mc/assembly/mc.sam.all ctrl_mc/assembly/mc.sam; rm ctrl_mc/assembly/mc.sam.all && touch ctrl_mc/assembly/.run1.ok
(one of the commands exited with non-zero exit code; note that snakemake uses bash strict mode!)
I updated all dependencies just before running. I tried both the 1.12 and 2.0-beta releases and got the same error. Running on Ubuntu 20.04.1 LTS.
Thanks
Hi,
I got some errors during installation on macOS Catalina 10.15.7, related to the cstdalign header. I'm copying the output below. Any ideas?
Thanks
In file included from ./src/utils/breadth.cpp:2:
./src/utils/stdc++.h:55:10: fatal error: 'cstdalign' file not found
#include <cstdalign>
         ^~~~~~~~~~~
1 error generated.
g++ -Wall -W -O2 -o ./bin/extractContigs ./src/utils/extractContigs.cpp -std=gnu++11
g++ -W -O2 -o ./bin/processmash ./src/utils/processmash.cpp -std=gnu++11
In file included from ./src/utils/processmash.cpp:2:
./src/utils/stdc++.h:55:10: fatal error: 'cstdalign' file not found
#include <cstdalign>
         ^~~~~~~~~~~
1 error generated.
g++ -Wall -W -O2 -o ./bin/fq2fa ./src/utils/fq2fa.cpp -std=gnu++11
In file included from ./src/utils/fq2fa.cpp:2:
./src/utils/stdc++.h:55:10: fatal error: 'cstdalign' file not found
#include <cstdalign>
         ^~~~~~~~~~~
1 error generated.
I already installed datrie with pip, but this error still pops up and the assembly exits.
python3.7 go_metacompass.py -r ~/Desktop/MGS\ ref/G.Vaginalis/g.vaginalis.fna -P /Volumes/G-DRIVE\ USB-C/Binned_reads/DN00207A_10/DN00297A_host_removed_r1.fastq,/Volumes/G-DRIVE\ USB-C/Binned_reads/DN00207A_10/DN00297A_host_removed_r2.fastq -o example1_output -m 1 -t 4
confirming file containing reference genomes exists..
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file ... target_directory
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
Bowtie2--->[OK]
/Users/bhanuprakashchowdarysakhamuri/miniconda2/bin//blastn
Blast+--->[OK]
/Users/bhanuprakashchowdarysakhamuri/MetaCompass/bin//kmer-mask
kmer-mask--->[OK]
/usr/local/bin/snakemake
Snakemake--->[OK]
Traceback (most recent call last):
File "/usr/local/bin/snakemake", line 6, in <module>
from pkg_resources import load_entry_point
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 3095, in <module>
@_call_aside
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 3079, in _call_aside
f(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 3108, in _initialize_master_working_set
working_set = WorkingSet._build_master()
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 570, in _build_master
ws.require(__requires__)
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 888, in require
needed = self.resolve(parse_requirements(requirements))
File "/usr/local/lib/python3.7/site-packages/pkg_resources/__init__.py", line 774, in resolve
raise DistributionNotFound(req, requirers)
pkg_resources.DistributionNotFound: The 'datrie' distribution was not found and is required by snakemake
ERROR: snakemake command failed; exiting..
touch: example1_output/DN00297A_host_removed_r1.0.assembly.out/run.fail: No such file or directory
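The traceback shows snakemake at /usr/local/bin running under the /usr/local Python 3.7, while the earlier output mentions a miniconda2 environment, so `pip install datrie` may well have gone into a different interpreter than the one snakemake uses. A quick way to see which interpreter a script will actually run under (the install path in the comment is illustrative, not prescriptive):

```shell
# The shebang on the first line of the snakemake launcher names the interpreter
# that must have datrie installed; install with that interpreter's own pip, e.g.
#   /usr/local/bin/python3.7 -m pip install datrie
if snakepath=$(command -v snakemake); then head -1 "$snakepath"; fi
python3 -c 'import sys; print(sys.executable)'
```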
Hi,
I have an error with the kmer-mask command
here is the error :
Full Traceback (most recent call last):
File "/home/michoug/.conda/envs/my_root/lib/python3.5/site-packages/snakemake/executors.py", line 848, in run_wrapper
version, rule, conda_env)
File "/ibex/scratch/michoug/MetaCompass/snakemake/metacompass.iter0.paired.py", line 95, in __rule_kmer_mask
shell:"echo {params.mfilter};python3 %s/bin/mash_filter.py {input.r1} {input.g1} {output.reffile} {params.mfilter} 1>> {log} 2>&1"%(config["mcdir"])
File "/home/michoug/.conda/envs/my_root/lib/python3.5/site-packages/snakemake/shell.py", line 80, in __new__
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'kmer-mask -ms 28 -mdb /ibex/scratch/michoug/MetaCompass/refseq/kmer-mask_db/markers.mdb -1 SRR7287240/SRR7287240_1_val_1.merged.fq -clean 0.0 -match 0.01 -nomasking -t 32 -l 515 -o SRR7287240/SRR7287240_1_val_1.marker 1>> SRR7287240/SRR7287240_1_val_1.0.kmermask.log 2>&1' returned non-zero exit status 1
and the log of the software
ERROR: -l too small for reads:
a = '๎=??ยก?>?+&=??ฮฉ???$6?`."???P?'
b = ''
I think it's because the file SRR7287240_1_val_1.merged.fq is in fact compressed.
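That hypothesis is easy to confirm: gzip files start with the magic bytes `1f 8b`, and kmer-mask reading them as plain text would produce exactly this kind of garbage. A quick check, demonstrated here on synthetic gzip data (run the commented line on the real file from the log above):

```shell
# Print the first two bytes as hex; " 1f 8b" means the .fq is really gzipped
# and should be renamed to .fq.gz and gunzip'ed before kmer-mask sees it.
# For the real file:  head -c2 SRR7287240/SRR7287240_1_val_1.merged.fq | od -An -tx1
printf 'dummy\n' | gzip -c | head -c2 | od -An -tx1
```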
Hi,
What does this mean? Is it that the quality of my reads is bad, or that I don't have any MAG related to the reference genome I'm searching for? Thanks!
Job 2: ---assembly stats reference-guided contigs
python3 /gpfs/software/ada/build/MetaCompass-2.0-beta/bin/assembly_stats.py metacompass_heavyadmintest/metacompass_output/metacompass.final.ctg.fa 1 > metacompass_heavyadmintest/metacompass_output/metacompass_assembly_stats.tsv && touch metacompass_heavyadmintest/metacompass_output/.run2.ok
Error in rule stats_all:
jobid: 2
output: metacompass_heavyadmintest/metacompass_output/metacompass_assembly_stats.tsv
shell:
python3 /gpfs/software/ada/build/MetaCompass-2.0-beta/bin/assembly_stats.py metacompass_heavyadmintest/metacompass_output/metacompass.final.ctg.fa 1 > metacompass_heavyadmintest/metacompass_output/metacompass_assembly_stats.tsv && touch metacompass_heavyadmintest/metacompass_output/.run2.ok
(one of the commands exited with non-zero exit code; note that snakemake uses bash strict mode!)
Removing output files of failed job stats_all since they might be corrupted:
metacompass_heavyadmintest/metacompass_output/metacompass_assembly_stats.tsv
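Since stats_all is one of the final steps, a failure here can simply mean the final contig FASTA is empty, i.e. nothing assembled against the chosen reference, rather than a bug in the stats script. That is a hypothesis to check, not a confirmed cause; a guarded size check (path from the log above):

```shell
# -s is true only for a file that exists and is non-empty; an empty or
# missing contig file would explain both the stats failure and the question
# about whether any MAG matches the reference.
f=metacompass_heavyadmintest/metacompass_output/metacompass.final.ctg.fa
if [ -s "$f" ]; then echo "contigs present"; else echo "contig file empty or missing"; fi
```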
Just a feature suggestion rather than an issue: an option to install without downloading the huge RefSeq database would be very useful. I only plan to use MetaCompass without reference selection, to assemble a target genome, and skipping the RefSeq download would save a huge amount of time.
Thanks for creating this tool.
I'm trying to run the MetaCompass examples listed in the README, and I've run into errors for both.
conda create -n metacompass bioconda::snakemake bioconda::blast bioconda::meryl bioconda::samtools bioconda::megahit bioconda::pilon
source activate metacompass
git clone https://github.com/marbl/MetaCompass.git
cd MetaCompass
./install.sh
./go_metacompass.py -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -P tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -o example1_output -m 3 -t 4
$ ./go_metacompass.py -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -P tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -o example1_output -m 3 -t 4 --clobber
confirming file containing reference genomes exists..
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
Bowtie2--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/blastn
Blast+--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/kmer-mask
kmer-mask--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/snakemake
Snakemake--->[OK]
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 4
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 assemble_unmapped
1 bam_sort
1 bowtie2_map
1 build_contigs
1 create_tsv
1 join_contigs
1 merge_reads
1 pilon_contigs
1 pilon_map
1 sam_to_bam
11
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
merge_reads
Selected jobs (1):
merge_reads
Resources after job selection: {'_cores': 3, '_nodes': 9223372036854775806}
Job 10: ---merge fastq reads
Reason: Missing output files: example1_output/thao2000.merged.fq
Building DAG of jobs...
Using shell: /bin/bash
Job counts:
count jobs
1 merge_reads
1
Complete log: /ebio/abt3_projects/software/dev/tmp/MetaCompass/.snakemake/log/2018-05-11T164650.403818.snakemake.log
Releasing 1 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
Finished job 10.
1 of 11 steps (9%) done
Resources before job selection: {'_cores': 4, '_nodes': 9223372036854775807}
Ready jobs (1):
bowtie2_map
Selected jobs (1):
bowtie2_map
Resources after job selection: {'_cores': 0, '_nodes': 9223372036854775806}
Job 9: ---Build index .
Reason: Missing output files: example1_output/thao2000.0.assembly.out/thao2000.sam; Input files updated by another job: example1_output/thao2000.merged.fq
Full Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 1324, in run_wrapper
singularity_args, use_singularity, None)
File "/ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py", line 88, in __rule_bowtie2_map
rule sam_to_bam:
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/shell.py", line 110, in __new__
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command ' set -euo pipefail; bowtie2-build -o 3 --threads 4 -q tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta example1_output/thao2000.0.assembly.out/thao2000.index 1>> example1_output/thao2000.0.assembly.out/thao2000.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 4 -x example1_output/thao2000.0.assembly.out/thao2000.index -q -U example1_output/thao2000.merged.fq -S example1_output/thao2000.0.assembly.out/thao2000.sam.all > example1_output/thao2000.0.bowtie2map.log 2>&1; /ebio/abt3_projects/software/dev/tmp/MetaCompass/bin/best_strata.py example1_output/thao2000.0.assembly.out/thao2000.sam.all example1_output/thao2000.0.assembly.out/thao2000.sam; rm example1_output/thao2000.0.assembly.out/thao2000.sam.all ' returned non-zero exit status 1.
Error in rule bowtie2_map:
jobid: 9
output: example1_output/thao2000.0.assembly.out/thao2000.index, example1_output/thao2000.0.assembly.out/thao2000.index, example1_output/thao2000.0.assembly.out/thao2000.sam
log: example1_output/thao2000.0.bowtie2map.log
Full Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 1324, in run_wrapper
singularity_args, use_singularity, None)
File "/ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py", line 88, in __rule_bowtie2_map
rule sam_to_bam:
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/shell.py", line 110, in __new__
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command ' set -euo pipefail; bowtie2-build -o 3 --threads 4 -q tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta example1_output/thao2000.0.assembly.out/thao2000.index 1>> example1_output/thao2000.0.assembly.out/thao2000.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 4 -x example1_output/thao2000.0.assembly.out/thao2000.index -q -U example1_output/thao2000.merged.fq -S example1_output/thao2000.0.assembly.out/thao2000.sam.all > example1_output/thao2000.0.bowtie2map.log 2>&1; /ebio/abt3_projects/software/dev/tmp/MetaCompass/bin/best_strata.py example1_output/thao2000.0.assembly.out/thao2000.sam.all example1_output/thao2000.0.assembly.out/thao2000.sam; rm example1_output/thao2000.0.assembly.out/thao2000.sam.all ' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 378, in _callback
raise ex
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/concurrent/futures/thread.py", line 56, in run
result = self.fn(*self.args, **self.kwargs)
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/executors.py", line 1336, in run_wrapper
show_traceback=True))
snakemake.exceptions.RuleException: CalledProcessError in line 50 of /ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py:
Command ' set -euo pipefail; bowtie2-build -o 3 --threads 4 -q tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta example1_output/thao2000.0.assembly.out/thao2000.index 1>> example1_output/thao2000.0.assembly.out/thao2000.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 4 -x example1_output/thao2000.0.assembly.out/thao2000.index -q -U example1_output/thao2000.merged.fq -S example1_output/thao2000.0.assembly.out/thao2000.sam.all > example1_output/thao2000.0.bowtie2map.log 2>&1; /ebio/abt3_projects/software/dev/tmp/MetaCompass/bin/best_strata.py example1_output/thao2000.0.assembly.out/thao2000.sam.all example1_output/thao2000.0.assembly.out/thao2000.sam; rm example1_output/thao2000.0.assembly.out/thao2000.sam.all ' returned non-zero exit status 1.
File "/ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py", line 50, in __rule_bowtie2_map
RuleException:
CalledProcessError in line 50 of /ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py:
Command ' set -euo pipefail; bowtie2-build -o 3 --threads 4 -q tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta example1_output/thao2000.0.assembly.out/thao2000.index 1>> example1_output/thao2000.0.assembly.out/thao2000.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 4 -x example1_output/thao2000.0.assembly.out/thao2000.index -q -U example1_output/thao2000.merged.fq -S example1_output/thao2000.0.assembly.out/thao2000.sam.all > example1_output/thao2000.0.bowtie2map.log 2>&1; /ebio/abt3_projects/software/dev/tmp/MetaCompass/bin/best_strata.py example1_output/thao2000.0.assembly.out/thao2000.sam.all example1_output/thao2000.0.assembly.out/thao2000.sam; rm example1_output/thao2000.0.assembly.out/thao2000.sam.all ' returned non-zero exit status 1.
File "/ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.ref.py", line 50, in __rule_bowtie2_map
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Removing output files of failed job bowtie2_map since they might be corrupted:
example1_output/thao2000.0.assembly.out/thao2000.index, example1_output/thao2000.0.assembly.out/thao2000.index
Releasing 4 _cores (now 4).
Releasing 1 _nodes (now 9223372036854775807).
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /ebio/abt3_projects/software/dev/tmp/MetaCompass/.snakemake/log/2018-05-11T164647.579389.snakemake.log
unlocking
removing lock
removing lock
removed all locks
ERROR: snakemake command failed; exiting..
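Note that in this trace best_strata.py is executed directly, without a `python3` prefix, so it must carry a valid shebang and the execute bit; if the clone or install step lost the execute bit, the chain dies at exactly that point. A guarded check (the repo path is illustrative, adjust to your clone):

```shell
# If the script is executable, show its shebang (the interpreter it will use);
# otherwise report that it cannot be invoked directly.
s=MetaCompass/bin/best_strata.py
if [ -x "$s" ]; then head -1 "$s"; else echo "best_strata.py not executable or not found"; fi
```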
wget ftp://public-ftp.hmpdacc.org/Illumina/posterior_fornix/SRS044742.tar.bz2
tar xvjf SRS044742.tar.bz2
./go_metacompass.py -P SRS044742/SRS044742.denovo_duplicates_marked.trimmed.1.fastq,SRS044742/SRS044742.denovo_duplicates_marked.trimmed.2.fastq -U SRS044742/SRS044742.denovo_duplicates_marked.trimmed.singleton.fastq -o example2_output
$ ./go_metacompass.py -P SRS044742/SRS044742.denovo_duplicates_marked.trimmed.1.fastq,SRS044742/SRS044742.denovo_duplicates_marked.trimmed.2.fastq -U SRS044742/SRS044742.denovo_duplicates_marked.trimmed.singleton.fastq -o example2_output
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
Bowtie2--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/blastn
Blast+--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/kmer-mask
kmer-mask--->[OK]
/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/bin/snakemake
Snakemake--->[OK]
Full Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 646, in block_content
rulename=self.rulename).consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 170, in block
raise StopAutomaton(token)
snakemake.parser.StopAutomaton: TokenInfo(type=1 (NAME), string='output', start=(78, 4), end=(78, 10), line=" output:expand('{prefix}/{sample}.fasta',prefix=config['prefix'],sample=config['sample'])\n")
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 646, in block_content
rulename=self.rulename).consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 170, in block
raise StopAutomaton(token)
snakemake.parser.StopAutomaton: TokenInfo(type=1 (NAME), string='message', start=(79, 4), end=(79, 11), line=' message: """---Converting fastq to fasta."""\n')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 646, in block_content
rulename=self.rulename).consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 170, in block
raise StopAutomaton(token)
snakemake.parser.StopAutomaton: TokenInfo(type=1 (NAME), string='shell', start=(80, 4), end=(80, 9), line=' shell : "perl %s/bin/fq2fa.pl -i {input} -o {output}"%(config["mcdir"])\n')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 646, in block_content
rulename=self.rulename).consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 170, in block
raise StopAutomaton(token)
snakemake.parser.StopAutomaton: TokenInfo(type=1 (NAME), string='log', start=(81, 4), end=(81, 7), line=" log:'%s/%s.%s.fastq2fasta.log'%(config['prefix'],config['sample'],config['iter'])\n")
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/__init__.py", line 382, in snakemake
print_compilation=print_compilation)
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/workflow.py", line 652, in include
rulecount=self._rulecount)
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 766, in parse
for t, orig_token in automaton.consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 710, in python
for t in self.subautomaton(token.string).consume():
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 105, in consume
for t, orig in self.state(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 179, in block
for t in self.block_content(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 653, in block_content
for t in self.block(e.token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 179, in block
for t in self.block_content(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 653, in block_content
for t in self.block(e.token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 179, in block
for t in self.block_content(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 653, in block_content
for t in self.block(e.token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 179, in block
for t in self.block_content(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 653, in block_content
for t in self.block(e.token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 179, in block
for t in self.block_content(token):
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 644, in block_content
"rule {}.".format(self.rulename), token)
File "/ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass/lib/python3.6/site-packages/snakemake/parser.py", line 117, in error
(self.snakefile.path, lineno(token), None, None))
File "/ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.py", line 81
SyntaxError: No rule keywords allowed after run/shell/script/wrapper/cwl in rule fastq2fasta.
SyntaxError in line 81 of /ebio/abt3_projects/software/dev/tmp/MetaCompass/snakemake/metacompass.iter0.py:
No rule keywords allowed after run/shell/script/wrapper/cwl in rule fastq2fasta. (metacompass.iter0.py, line 81)
ERROR: snakemake command failed; exiting..
touch: cannot touch 'example2_output/SRS044742.0.assembly.out/run.fail': No such file or directory
# packages in environment at /ebio/abt3_projects/software/dev/miniconda3_dev/envs/metacompass:
#
# Name Version Build Channel
aioeasywebdav 2.2.0 py36_0 conda-forge
aiohttp 3.2.1 py36_0 conda-forge
appdirs 1.4.3 py_0 conda-forge
asn1crypto 0.24.0 py36_0 conda-forge
async-timeout 2.0.1 py36_0 conda-forge
attrs 18.1.0 py_0 conda-forge
bcrypt 3.1.4 py36h621fe67_0
blast 2.7.1 boost1.64_3 bioconda
boost 1.64.0 py36_4 conda-forge
boost-cpp 1.64.0 1 conda-forge
boto3 1.7.19 py_0 conda-forge
botocore 1.10.19 py_0 conda-forge
bzip2 1.0.6 1 conda-forge
ca-certificates 2018.4.16 0 conda-forge
cachetools 2.0.1 py_0 conda-forge
certifi 2018.4.16 py36_0 conda-forge
cffi 1.11.5 py36_0 conda-forge
chardet 3.0.4 py36_0 conda-forge
configargparse 0.12.0 py36_0 conda-forge
cryptography 2.2.1 py36_0 conda-forge
curl 7.59.0 1 conda-forge
datrie 0.7.1 py36_0
docutils 0.14 py36_0 conda-forge
dropbox 8.8.1 py_0 conda-forge
filechunkio 1.8 py36_1 conda-forge
ftputil 3.4 py_0 conda-forge
gmp 6.1.2 0 conda-forge
gnutls 3.5.17 0 conda-forge
google-auth 1.4.1 py_0 conda-forge
google-auth-httplib2 0.0.2 py36_0 conda-forge
google-cloud-core 0.24.1 py36_0 conda-forge
google-cloud-storage 1.1.1 py36_0 conda-forge
google-resumable-media 0.0.2 py36_0 conda-forge
googleapis-common-protos 1.5.3 py36_0 conda-forge
httplib2 0.11.1 py36_0 conda-forge
icu 58.2 0 conda-forge
idna 2.6 py36_1 conda-forge
idna_ssl 1.0.0 0 conda-forge
intel-openmp 2018.0.0 8
jmespath 0.9.3 py36_0 conda-forge
krb5 1.14.6 0 conda-forge
libdeflate 0.8 0 bioconda
libffi 3.2.1 3 conda-forge
libgcc 7.2.0 h69d50b8_2
libgcc-ng 7.2.0 hdf63c60_3
libgfortran-ng 7.2.0 hdf63c60_3
libidn11 1.33 0 conda-forge
libprotobuf 3.5.2 0 conda-forge
libssh2 1.8.0 2 conda-forge
libstdcxx-ng 7.2.0 hdf63c60_3
megahit 1.1.2 py36_1 bioconda
meryl 2013 0 bioconda
mkl 2018.0.2 1
mkl_fft 1.0.2 py36_0 conda-forge
mkl_random 1.0.1 py36_0 conda-forge
multidict 4.3.1 py36_0 conda-forge
ncurses 5.9 10 conda-forge
nettle 3.3 0 conda-forge
numpy 1.14.3 py36h14a74c5_0
numpy-base 1.14.3 py36hdbf6ddf_0
openjdk 8.0.121 1
openssl 1.0.2o 0 conda-forge
pandas 0.22.0 py36_1 conda-forge
paramiko 2.4.1 py36_0 conda-forge
pcre 8.41 1 conda-forge
perl 5.22.0.1 0 conda-forge
perl-app-cpanminus 1.7043 pl5.22.0_0 bioconda
perl-archive-tar 2.18 pl5.22.0_2 bioconda
perl-carp 1.38 pl5.22.0_0 bioconda
perl-compress-raw-bzip2 2.069 1 bioconda
perl-compress-raw-zlib 2.069 3 bioconda
perl-data-dumper 2.161 pl5.22.0_0 bioconda
perl-exporter 5.72 pl5.22.0_0 bioconda
perl-exporter-tiny 0.042 1 bioconda
perl-extutils-makemaker 7.24 pl5.22.0_1 bioconda
perl-io-compress 2.069 pl5.22.0_2 bioconda
perl-io-zlib 1.10 1 bioconda
perl-list-moreutils 0.428 pl5.22.0_0 bioconda
perl-pathtools 3.73 0 bioconda
perl-scalar-list-utils 1.45 2 bioconda
perl-test-more 1.001002 pl5.22.0_0 bioconda
perl-threaded 5.22.0 pl5.22.0_12 bioconda
pilon 1.22 py36_0 bioconda
pip 9.0.3 py36_0 conda-forge
prettytable 0.7.2 py36_1 conda-forge
protobuf 3.5.2 py36_0 conda-forge
psutil 5.4.5 py36_0 conda-forge
pyasn1 0.4.2 py_0 conda-forge
pyasn1-modules 0.2.1 py_0 conda-forge
pycparser 2.18 py36_0 conda-forge
pynacl 1.1.2 py36_0 conda-forge
pyopenssl 17.5.0 py36_1 conda-forge
pysftp 0.2.9 py36_0 conda-forge
pysocks 1.6.8 py36_1 conda-forge
python 3.6.5 1 conda-forge
python-dateutil 2.7.2 py_0 conda-forge
python-irodsclient 0.7.0 py_0 conda-forge
pytz 2018.4 py_0 conda-forge
pyyaml 3.12 py36_1 conda-forge
ratelimiter 1.2.0 py36_0 conda-forge
readline 7.0 0 conda-forge
requests 2.18.4 py36_1 conda-forge
rsa 3.4.2 py36_0 conda-forge
s3transfer 0.1.13 py36_0 conda-forge
samtools 1.8 3 bioconda
setuptools 39.1.0 py36_0 conda-forge
six 1.11.0 py36_1 conda-forge
snakemake 4.8.1 py36_0 bioconda
sqlite 3.20.1 2 conda-forge
tk 8.6.7 0 conda-forge
urllib3 1.22 py36_0 conda-forge
wheel 0.31.0 py36_0 conda-forge
wrapt 1.10.11 py36_0 conda-forge
xmlrunner 1.7.7 py_0 conda-forge
xz 5.2.3 0 conda-forge
yaml 0.1.7 0 conda-forge
yarl 1.2.3 py36_0 conda-forge
zlib 1.2.11 0 conda-forge
Hello,
I've been trying to check my installation of MetaCompass before running it on my own data, and I've hit at least one issue (possibly two, possibly related).
First, I installed the dependencies and made sure they are on my PATH. Then I ran the first tutorial example:
python go_metacompass.py -r tutorial/Candidatus_Carsonella_ruddii_HT_Thao2000.fasta -P tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -o example1_output -m 1 -t 4
and got the following output
confirming file containing reference genomes exists..
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
Bowtie2--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/blastn
Blast+--->[OK]
/home/talex/apps/MetaCompass/bin/kmer-mask
kmer-mask--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/snakemake
Snakemake--->[OK]
Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/io.py", line 697, in _load_configfile
return yaml.load(f)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/__init__.py", line 72, in load
return loader.get_single_data()
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/constructor.py", line 35, in get_single_data
node = self.get_single_node()
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/composer.py", line 36, in get_single_node
document = self.compose_document()
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/composer.py", line 55, in compose_document
node = self.compose_node(None, None)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/composer.py", line 84, in compose_node
node = self.compose_mapping_node(anchor)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/composer.py", line 127, in compose_mapping_node
while not self.check_event(MappingEndEvent):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/parser.py", line 98, in check_event
self.current_event = self.state()
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/yaml/parser.py", line 550, in parse_flow_mapping_key
"expected ',' or '}', but got %r" % token.id, token.start_mark)
yaml.parser.ParserError: while parsing a flow mapping
in "/home/talex/apps/MetaCompass/snakemake/config.json", line 1, column 1
expected ',' or '}', but got '<scalar>'
in "/home/talex/apps/MetaCompass/snakemake/config.json", line 17, column 14
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/bin/snakemake", line 6, in <module>
sys.exit(snakemake.main())
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/__init__.py", line 1017, in main
max_jobs_per_second=args.max_jobs_per_second)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/__init__.py", line 270, in snakemake
overwrite_config.update(load_configfile(configfile))
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/io.py", line 708, in load_configfile
config = _load_configfile(configpath)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/io.py", line 699, in _load_configfile
raise WorkflowError("Config file is not valid JSON or YAML. "
snakemake.exceptions.WorkflowError: Config file is not valid JSON or YAML. In case of YAML, make sure to not mix whitespace and tab indentation.
ERROR: snakemake command failed; exiting..
touch: cannot touch 'example1_output/thao2000.0.assembly.out/run.fail': No such file or directory
I investigated /home/talex/apps/MetaCompass/snakemake/config.json
and noticed what I think is incorrect JSON formatting (I'm no expert, so this is partly a guess):
{
"sample" : "",
"r1" : [""],
"r2" : [""],
"ru" : [""],
"reads" : [""],
"reference":[""],
"pickref":"breadth",
"length":100,
"prefix":".",
"memory": 50,
"nthreads": 64,
"iter": 1,
"mincov" : 3,
"minlen" : 100,
"mfilter" : 0.00005,
"cogcov" = 10,
"mcdir": "."
}
I think "cogcov" = 10,
should be "cogcov" : 10,
so I updated it and re-ran the example. I did indeed get a different error:
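For anyone hitting the same parse error: Python's own json module pinpoints the offending token, so you can confirm the fix without re-running the whole pipeline. A quick sketch contrasting the broken and corrected fragments from config.json:

```python
import json

# The fragment shipped in snakemake/config.json uses '=' instead of ':':
broken = '{"cogcov" = 10}'
fixed = '{"cogcov": 10}'

for text in (broken, fixed):
    try:
        print("parsed:", json.loads(text))
    except json.JSONDecodeError as err:
        # err.lineno / err.colno give the same "line N, column M" location
        # style as the snakemake error message above
        print(f"rejected at line {err.lineno}, column {err.colno}: {err.msg}")
```

The same check can be pointed at the whole file with `json.load(open(path))` before launching the pipeline, which avoids the less direct YAML error snakemake reports.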
confirming file containing reference genomes exists..
[OK]
checking for dependencies (Bowtie2, Blast, kmermask, Snakemake, etc)
Bowtie2--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/blastn
Blast+--->[OK]
/home/talex/apps/MetaCompass/bin/kmer-mask
kmer-mask--->[OK]
/home/talex/.pyenv/versions/miniconda3-latest/bin/snakemake
Snakemake--->[OK]
Full Traceback (most recent call last):
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/__init__.py", line 399, in snakemake
no_hooks=no_hooks)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/workflow.py", line 308, in execute
dag.init()
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 115, in init
job = self.update([job])
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 436, in update
raise exceptions[0]
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 408, in update
skip_until_dynamic=skip_until_dynamic)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 468, in update_
raise ex
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 462, in update_
job.dynamic_input)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 436, in update
raise exceptions[0]
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 408, in update
skip_until_dynamic=skip_until_dynamic)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 468, in update_
raise ex
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 462, in update_
job.dynamic_input)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 436, in update
raise exceptions[0]
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 408, in update
skip_until_dynamic=skip_until_dynamic)
File "/home/talex/.pyenv/versions/miniconda3-latest/lib/python3.5/site-packages/snakemake/dag.py", line 477, in update_
raise MissingInputException(job.rule, missing_input)
snakemake.exceptions.MissingInputException: Missing input files for rule pilon_map:
MissingInputException in line 72 of /home/talex/apps/MetaCompass/snakemake/metacompass.iter0.ref.py:
Missing input files for rule pilon_map:
unlocking
removed all locks
ERROR: snakemake command failed; exiting..
touch: cannot touch 'example1_output/thao2000.0.assembly.out/run.fail': No such file or directory
Looking at /home/talex/apps/MetaCompass/snakemake/metacompass.iter0.ref.py, I figured it could have something to do with bowtie2.
I found issue #3, so I uninstalled MetaCompass, installed bowtie2 2.2.9, created an alias in my .bash_profile to override the system version, and reinstalled MetaCompass. I got the exact same series of errors (both before and after updating the config.json file).
here's some info on my dependency installations
pyenv versions
system
2.7.12
2.7.9
3.1
3.5.1
3.6.1
miniconda2-latest
* miniconda3-latest (set by /home/talex/.pyenv/version)
blastn -version
blastn: 2.6.0+
snakemake --version
3.7.1
bowtie2 -version
Bowtie 2 version 2.2.9 by Ben Langmead ([email protected], www.cs.jhu.edu/~langmea)
samtools --version
samtools 1.6
megahit -version
megahit: MEGAHIT v1.1.3
java -version
openjdk version "1.8.0_181"
OpenJDK Runtime Environment (build 1.8.0_181-8u181-b13-0ubuntu0.16.04.1-b13)
OpenJDK 64-Bit Server VM (build 25.181-b13, mixed mode)
plenv versions
system
* 5.18.0 (set by /home/talex/.plenv/version)
I've also tried installing kmer-mask via the meryl distribution from https://sourceforge.net/p/kmer/code/HEAD/tree/trunk/, and also using the copy bundled with the MetaCompass installation. The errors above were identical in each case. Any help getting this to work would be appreciated.
Thanks!
Hi,
I got the following errors while trying to install MetaCompass; I guess the download links have been removed? Could you please update them?
Thanks a lot,
Shengwei
./install.sh
#Installing MetaCompass
g++ -Wall -W -O2 -o ./bin/extractSeq ./src/utils/extractSeq.cpp
g++ -Wall -W -O2 -o ./bin/formatFASTA ./src/utils/formatFASTA.cpp
g++ -Wall -W -O2 -o ./bin/buildcontig ./src/buildcontig/buildcontig.cpp ./src/buildcontig/cmdoptions.cpp ./src/buildcontig/memory.cpp ./src/buildcontig/procmaps.cpp ./src/buildcontig/outputfiles.cpp
wget --no-check-certificate https://gembox.cbcb.umd.edu/metacompass/markers.tar.gz -P ./src
--2018-10-13 01:41:48-- https://gembox.cbcb.umd.edu/metacompass/markers.tar.gz
Resolving gembox.cbcb.umd.edu (gembox.cbcb.umd.edu)... 128.8.120.25
Connecting to gembox.cbcb.umd.edu (gembox.cbcb.umd.edu)|128.8.120.25|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2018-10-13 01:41:49 ERROR 404: Not Found.
cd ./src/metaphyler/
tar -xzvf ./markers.tar.gz
tar (child): ./markers.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
cd ../../
wget --no-check-certificate https://gembox.cbcb.umd.edu/metacompass/refseq.tar.gz
--2018-10-13 01:41:49-- https://gembox.cbcb.umd.edu/metacompass/refseq.tar.gz
Resolving gembox.cbcb.umd.edu (gembox.cbcb.umd.edu)... 128.8.120.25
Connecting to gembox.cbcb.umd.edu (gembox.cbcb.umd.edu)|128.8.120.25|:443... connected.
HTTP request sent, awaiting response... 404 Not Found
2018-10-13 01:41:49 ERROR 404: Not Found.
tar -xzvf refseq.tar.gz
tar (child): refseq.tar.gz: Cannot open: No such file or directory
tar (child): Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error is not recoverable: exiting now
rm: cannot remove 'refseq.tar.gz': No such file or directory
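Part of what makes this failure noisy is that install.sh hands the (missing) archive straight to tar, so the 404 cascades into tar and rm errors. A defensive pre-check could verify the file actually looks like a gzip archive before extracting; this is an illustrative helper, not part of install.sh:

```python
import os
import tarfile

def looks_like_targz(path):
    """True only if path exists, is non-empty, and starts with the gzip magic bytes."""
    if not os.path.isfile(path) or os.path.getsize(path) == 0:
        return False
    with open(path, "rb") as fh:
        magic = fh.read(2)
    return magic == b"\x1f\x8b"  # gzip magic number

def safe_extract(path, dest="."):
    """Extract only after the sanity check, so a 404 page or empty file never reaches tar."""
    if not looks_like_targz(path):
        raise RuntimeError(f"{path} is missing or not a gzip archive; re-download it")
    with tarfile.open(path, "r:gz") as tar:
        tar.extractall(dest)
```

With a check like this, the install would stop with one clear message ("re-download markers.tar.gz") instead of the cascade of tar and rm errors above.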
Hi,
I am running the following command:
python3 go_metacompass.py -r NC_040570.1.fasta -1 ERR2040215_1fp.fq -2 ERR2040215_2fp.fq -t 10
And I get the following error:
[Tue Mar 8 10:02:21 2022] Job 7: ---Build index .
[Tue Mar 8 10:02:30 2022] bowtie2-build -o 3 --threads 10 -q NC_040570.1.fasta metacompass_assembly/assembly/mc.index 1>> metacompass_assembly/assembly/mc.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 10 -x metacompass_assembly/assembly/mc.index -q -U ../fp/ERR2040215_1fp.fq,../fp/ERR2040215_2fp.fq -S metacompass_assembly/assembly/mc.sam.all > metacompass_assembly/logs/bowtie2map.log 2>&1; python3 /data/12T/fp/software/MetaCompass/bin/best_strata.py metacompass_assembly/assembly/mc.sam.all metacompass_assembly/assembly/mc.sam; rm metacompass_assembly/assembly/mc.sam.all && touch metacompass_assembly/assembly/.run1.ok
Traceback (most recent call last):
File "/data/12T/fp/software/MetaCompass/bin/best_strata.py", line 13, in
while line[0]=="@":
IndexError: string index out of range
[Tue Mar 8 10:05:49 2022] Error in rule bowtie2_map:
[Tue Mar 8 10:05:49 2022] jobid: 7
[Tue Mar 8 10:05:49 2022] output: metacompass_assembly/assembly/mc.index, metacompass_assembly/assembly/mc.index, metacompass_assembly/assembly/mc.sam, metacompass_assembly/logs/bowtie2map.log
[Tue Mar 8 10:05:49 2022] RuleException:
[Tue Mar 8 10:05:49 2022] CalledProcessError in line 42 of /data/12T/fp/software/MetaCompass/snakemake/metacompass.ref.paired.py:
[Tue Mar 8 10:05:49 2022] Command ' set -euo pipefail; bowtie2-build -o 3 --threads 10 -q NC_040570.1.fasta metacompass_assembly/assembly/mc.index 1>> metacompass_assembly/assembly/mc.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 10 -x metacompass_assembly/assembly/mc.index -q -U ../fp/ERR2040215_1fp.fq,../fp/ERR2040215_2fp.fq -S metacompass_assembly/assembly/mc.sam.all > metacompass_assembly/logs/bowtie2map.log 2>&1; python3 /data/12T/fp/software/MetaCompass/bin/best_strata.py metacompass_assembly/assembly/mc.sam.all metacompass_assembly/assembly/mc.sam; rm metacompass_assembly/assembly/mc.sam.all && touch metacompass_assembly/assembly/.run1.ok ' returned non-zero exit status 1.
[Tue Mar 8 10:05:49 2022] File "/data/12T/fp/software/MetaCompass/snakemake/metacompass.ref.paired.py", line 42, in __rule_bowtie2_map
[Tue Mar 8 10:05:49 2022] File "/home/root640/software/miniconda3/envs/fp/lib/python3.6/concurrent/futures/thread.py", line 56, in run
[Tue Mar 8 10:05:49 2022] Removing output files of failed job bowtie2_map since they might be corrupted:
[Tue Mar 8 10:05:49 2022] metacompass_assembly/assembly/mc.index, metacompass_assembly/assembly/mc.index, metacompass_assembly/assembly/mc.sam, metacompass_assembly/logs/bowtie2map.log
[Tue Mar 8 10:05:49 2022] Will exit after finishing currently running jobs.
[Tue Mar 8 10:05:49 2022] Exiting because a job execution failed. Look above for error message
[Tue Mar 8 10:05:49 2022] Complete log: .snakemake/log/2022-03-08T100221.342749.snakemake.log
Do you have any idea why this is happening?
Thanks
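The IndexError in best_strata.py fires when mc.sam.all is empty: reading past the end of the file yields "", so line[0] is out of range. That usually means the bowtie2 step before it produced no alignments, so metacompass_assembly/logs/bowtie2map.log is the place to look for the root cause. A defensive version of the header skip (a sketch, not the shipped code) would tolerate an empty file:

```python
import io

def skip_sam_header(fh):
    """Advance past SAM @-header lines; return the first alignment line, or None if there is none."""
    for line in fh:
        if not line.startswith("@"):
            return line
    return None  # empty or header-only SAM: caller can report a clear error

# An empty SAM no longer crashes with IndexError:
print(skip_sam_header(io.StringIO("")))                      # None
print(skip_sam_header(io.StringIO("@HD\tVN:1.0\nr1\t0\n")))  # returns 'r1\t0\n'
```

Returning None lets the caller print something actionable ("no alignments in mc.sam.all, check bowtie2map.log") instead of a bare traceback.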
Hi @vcepeda,
thanks for this very interesting new tool!
I am about to install MetaCompass for the first time, and I have a question about the reference database (refseq), which is downloaded by default in the install.sh script. Is it possible to supply a different reference database, e.g. gtdb? Can the reference database be located outside the install directory (to save disk space by avoiding file duplication if the database is also used for other programs)?
Thanks!
Cheers,
Christiane
From the example data:
-- I have a set of metagenomic reads, and want to perform reference-guided assembly.
python3 go_metacompass.py -P [read1.fq,read2.fq] -l [max read length] -o [output_folder] -m [min coverage] -t [ncpu]
python3 go_metacompass.py -P tutorial/thao2000.1.fq,tutorial/thao2000.2.fq -l 150 -o example1_output_referenceguidedassembly -m 1 -t 4
Error:
...
---Build index .
Full Traceback (most recent call last):
File "/home/cseto/Programs/anaconda3/envs/MetaCompass/lib/python3.5/site-packages/snakemake/executors.py", line 784, in run_wrapper
version)
File "/home/cseto/Programs/MetaCompass-1.12/snakemake/metacompass.iter0.paired.py", line 337, in __rule_bowtie2_map
File "/home/cseto/Programs/anaconda3/envs/MetaCompass/lib/python3.5/site-packages/snakemake/shell.py", line 74, in new
raise sp.CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'bowtie2-build -o 3 --threads 4 -q example1_output_referenceguidedassembly/thao2000.0.assembly.out/mc.refseq.filt.fna example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.index 1>> example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.index 2>&1;bowtie2 -a --end-to-end --sensitive --no-unal -p 4 -x example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.index -q -U example1_output_referenceguidedassembly/thao2000.merged.fq -S example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.sam.all > example1_output_referenceguidedassembly/thao2000.0.bowtie2map.log 2>&1; /home/cseto/Programs/MetaCompass-1.12/bin/best_strata.py example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.sam.all example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.sam; rm example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.sam.all' returned non-zero exit status 1
Presumably this begins at
bowtie2-build -o 3 --threads 4 -q example1_output_referenceguidedassembly/thao2000.0.assembly.out/mc.refseq.filt.fna example1_output_referenceguidedassembly/thao2000.0.assembly.out/thao2000.index
Warning: Empty fasta file: 'example1_output_referenceguidedassembly/thao2000.0.assembly.out/mc.refseq.filt.fna'
Warning: All fasta inputs were empty
Error: Encountered internal Bowtie 2 exception (#1)
However, assembly with reference (-r) seems to work fine.
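The "Empty fasta file" warning suggests mc.refseq.filt.fna ended up with no sequences after reference selection, and bowtie2-build then aborts. A quick pre-flight check for whether a FASTA actually contains sequence data (an illustrative helper, not part of MetaCompass) would separate "no references survived filtering" from a genuine bowtie2 failure:

```python
def fasta_has_sequence(path):
    """Return True if the FASTA at `path` has at least one record with non-empty sequence."""
    seen_header = False
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                seen_header = True
            elif seen_header and line:
                return True  # sequence data under a header
    return False  # empty file, or headers with no sequence
```

Running this on mc.refseq.filt.fna before the mapping step would make the root cause obvious, and it is consistent with the -r mode working: there the reference is supplied directly rather than filtered.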
issue in Snakefile/dependencies/timestamps