rna-seq's People

Contributors

cagaser, davycats, ffinfo, hailiangmei, jasperboom, rhpvorderman, tomkuipers1402

rna-seq's Issues

release 1.1.0

Release checklist

  • Check outstanding issues on JIRA and Github
  • Publish documentation (updateDocs.sh) from develop branch (see the sketch after this checklist)
    • Copy docs folder to gh-pages branch
    • Overwrite existing develop folder with docs folder on gh-pages
    • Push changes to gh-pages branch
  • Check that the latest documentation looks fine
  • Change current development version in CHANGELOG.md to stable version.
  • Run the release script release.sh
    • Check all submodules are tagged
    • Merge the develop branch into master
    • Create an annotated tag with the stable version number. Include changes from CHANGELOG.md.
    • Confirm or set stable version to be used for tagging
    • Push tag to remote.
    • Merge master branch back into develop.
    • Add updated version number to develop
  • Publish documentation (updateDocs.sh) from master branch
    • Copy docs folder to gh-pages branch
    • Rename docs to new stable version on gh-pages
    • Set latest version to new version
    • Push changes to gh-pages branch
  • Create a new release from the pushed tag on github
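
For reference, a minimal sketch of the documentation-publishing steps from the checklist above (this is not the actual updateDocs.sh; branch and folder names are taken from the checklist items):

# publish the develop documentation to gh-pages (sketch)
git checkout develop
cp -r docs /tmp/docs-build                        # copy the docs folder out of the working tree
git checkout gh-pages
rm -rf develop && cp -r /tmp/docs-build develop   # overwrite the existing develop folder on gh-pages
git add develop
git commit -m "Update develop documentation"
git push origin gh-pages                          # push changes to the gh-pages branch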

Pytest failing for all workflows

I ran pytest --kwd to test all workflows, but every workflow test failed and only 30 tests passed. Looking specifically at the Rna3PairedEndVariantCalling workflow, I got the following error.

cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor$$anonfun$1$$anon$1: Call input and runtime attributes evaluation failed for FastqcRead1:
Failed to evaluate input '__timeMinutes' (reason 1 of 1): [Attempted 1 time(s)] - NoSuchFileException: /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-sampleJobs/shard-0/SampleWorkflow/e70c60fe-c8c1-4c64-b058-c092b5838d70/call-qc/shard-0/QC/44e3077d-bc41-4537-8498-c1e23d4a29a8/call-FastqcRead1/execution/tests/data/rna3/R1.fq.gz
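
For reference, a minimal sketch of how such a test run is typically invoked (assuming the repository was cloned together with its submodules and the conda environment listed below is active; the test tag is inferred from the failing workflow name):

# sketch of a reproduction run: clone with submodules, then run one tagged workflow test
git clone --recursive https://github.com/biowdl/RNA-seq.git
cd RNA-seq
pytest --kwd --tag Rna3PairedEndVariantCalling    # --kwd keeps the workflow directories for inspection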
Package versions
_libgcc_mutex             0.1                 conda_forge    conda-forge
_openmp_mutex             4.5                       0_gnu    conda-forge
appdirs                   1.4.3                      py_1    conda-forge
argcomplete               1.11.1             pyh9f0ad1d_1    conda-forge
asn1crypto                1.3.0            py37hc8dfbb8_1    conda-forge
attrs                     19.3.0                     py_0    conda-forge
biowdl-input-converter    0.2.1                      py_0    bioconda
brotlipy                  0.7.0           py37h8f50634_1000    conda-forge
bzip2                     1.0.8                h516909a_2    conda-forge
ca-certificates           2020.6.20            hecda079_0    conda-forge
certifi                   2020.6.20        py37hc8dfbb8_0    conda-forge
cffi                      1.14.0           py37hd463f26_0    conda-forge
chardet                   3.0.4           py37hc8dfbb8_1006    conda-forge
coloredlogs               14.0             py37hc8dfbb8_1    conda-forge
cromwell                  48                            0    conda-forge
cryptography              2.9.2            py37hb09aad4_0    conda-forge
docker-py                 4.2.2            py37hc8dfbb8_0    conda-forge
docker-pycreds            0.4.0                      py_0    conda-forge
findutils                 4.6.0             h14c3975_1000    conda-forge
gettext                   0.19.8.1          hc5be6a0_1002    conda-forge
humanfriendly             8.2              py37hc8dfbb8_0    conda-forge
idna                      2.10               pyh9f0ad1d_0    conda-forge
importlib-metadata        1.7.0            py37hc8dfbb8_0    conda-forge
importlib_metadata        1.7.0                         0    conda-forge
jsonschema                3.2.0            py37hc8dfbb8_1    conda-forge
krb5                      1.17.1               hfafb76e_1    conda-forge
lark-parser               0.8.5              pyh9f0ad1d_0    conda-forge
ld_impl_linux-64          2.34                 h53a641e_5    conda-forge
libcurl                   7.71.0               hcdd3856_0    conda-forge
libdeflate                1.6                  h516909a_0    conda-forge
libedit                   3.1.20191231         h46ee950_0    conda-forge
libffi                    3.2.1             he1b5a44_1007    conda-forge
libgcc-ng                 9.2.0                h24d8f2e_2    conda-forge
libgomp                   9.2.0                h24d8f2e_2    conda-forge
libssh2                   1.9.0                hab1572f_2    conda-forge
libstdcxx-ng              9.2.0                hdf63c60_2    conda-forge
miniwdl                   0.7.5                      py_0    conda-forge
more-itertools            8.4.0                      py_0    conda-forge
ncurses                   6.1               hf484d3e_1002    conda-forge
openjdk                   8.0.192           h516909a_1005    conda-forge
openssl                   1.1.1g               h516909a_0    conda-forge
packaging                 20.4               pyh9f0ad1d_0    conda-forge
pip                       20.1.1                     py_1    conda-forge
pluggy                    0.13.1           py37hc8dfbb8_2    conda-forge
py                        1.9.0              pyh9f0ad1d_0    conda-forge
pycparser                 2.20               pyh9f0ad1d_2    conda-forge
pygtail                   0.11.1                     py_0    conda-forge
pyopenssl                 19.1.0                     py_1    conda-forge
pyparsing                 2.4.7              pyh9f0ad1d_0    conda-forge
pyrsistent                0.16.0           py37h8f50634_0    conda-forge
pysam                     0.16.0.1         py37hc501bad_0    bioconda
pysocks                   1.7.1            py37hc8dfbb8_1    conda-forge
pytest                    5.4.3            py37hc8dfbb8_0    conda-forge
pytest-workflow           1.4.0                      py_0    conda-forge
python                    3.7.6           cpython_h8356626_6    conda-forge
python-json-logger        0.1.11                     py_0    conda-forge
python_abi                3.7                     1_cp37m    conda-forge
pyyaml                    5.3.1            py37h8f50634_0    conda-forge
readline                  8.0                  hf8c457e_0    conda-forge
regex                     2020.6.8         py37h8f50634_0    conda-forge
requests                  2.24.0             pyh9f0ad1d_0    conda-forge
ruamel.yaml               0.16.6           py37h8f50634_1    conda-forge
ruamel.yaml.clib          0.2.0            py37h8f50634_1    conda-forge
setuptools                47.3.1           py37hc8dfbb8_0    conda-forge
six                       1.15.0             pyh9f0ad1d_0    conda-forge
sqlite                    3.32.3               hcee41ef_0    conda-forge
tk                        8.6.10               hed695b0_0    conda-forge
urllib3                   1.25.9                     py_0    conda-forge
wcwidth                   0.2.5              pyh9f0ad1d_0    conda-forge
websocket-client          0.57.0           py37hc8dfbb8_1    conda-forge
wheel                     0.34.2                     py_1    conda-forge
xdg                       4.0.1                      py_1    conda-forge
xz                        5.2.5                h516909a_0    conda-forge
yaml                      0.2.5                h516909a_0    conda-forge
zipp                      3.1.0                      py_0    conda-forge
zlib                      1.2.11            h516909a_1006    conda-forge

I am using Ubuntu 18.04.3 LTS (GNU/Linux 4.15.0-72-generic x86_64).

Full error stack trace
[2020-07-03 00:27:33,97] [info] Running with database db.url = jdbc:hsqldb:mem:fb4e23cb-5eb9-494b-ae42-1d9af7d01aab;shutdown=false;hsqldb.tx=mvcc
[2020-07-03 00:27:39,37] [info] Running migration RenameWorkflowOptionsInMetadata with a read batch size of 100000 and a write batch size of 100000
[2020-07-03 00:27:39,38] [info] [RenameWorkflowOptionsInMetadata] 100%
[2020-07-03 00:27:39,44] [info] Running with database db.url = jdbc:hsqldb:mem:d1976698-78aa-47bf-b7ce-e2d98a2e4308;shutdown=false;hsqldb.tx=mvcc
[2020-07-03 00:27:39,74] [info] Slf4jLogger started
[2020-07-03 00:27:39,86] [info] Workflow heartbeat configuration:
{
  "cromwellId" : "cromid-e524f6a",
  "heartbeatInterval" : "2 minutes",
  "ttl" : "10 minutes",
  "failureShutdownDuration" : "5 minutes",
  "writeBatchSize" : 10000,
  "writeThreshold" : 10000
}
[2020-07-03 00:27:39,89] [info] Metadata summary refreshing every 1 second.
[2020-07-03 00:27:39,91] [info] KvWriteActor configured to flush with batch size 200 and process rate 5 seconds.
[2020-07-03 00:27:39,91] [info] WriteMetadataActor configured to flush with batch size 200 and process rate 5 seconds.
[2020-07-03 00:27:39,91] [info] CallCacheWriteActor configured to flush with batch size 100 and process rate 3 seconds.
[2020-07-03 00:27:39,91] [warn] 'docker.hash-lookup.gcr-api-queries-per-100-seconds' is being deprecated, use 'docker.hash-lookup.gcr.throttle' instead (see reference.conf)
[2020-07-03 00:27:39,97] [info] JobExecutionTokenDispenser - Distribution rate: 50 per 1 seconds.
[2020-07-03 00:27:40,09] [info] SingleWorkflowRunnerActor: Version 48
[2020-07-03 00:27:40,09] [info] SingleWorkflowRunnerActor: Submitting workflow
[2020-07-03 00:27:40,13] [info] Unspecified type (Unspecified version) workflow 5c8eec74-dace-4680-a79b-b5ca0d003def submitted
[2020-07-03 00:27:40,15] [info] SingleWorkflowRunnerActor: Workflow submitted 5c8eec74-dace-4680-a79b-b5ca0d003def
[2020-07-03 00:27:40,15] [info] 1 new workflows fetched by cromid-e524f6a: 5c8eec74-dace-4680-a79b-b5ca0d003def
[2020-07-03 00:27:40,16] [info] WorkflowManagerActor Starting workflow 5c8eec74-dace-4680-a79b-b5ca0d003def
[2020-07-03 00:27:40,17] [info] WorkflowManagerActor Successfully started WorkflowActor-5c8eec74-dace-4680-a79b-b5ca0d003def
[2020-07-03 00:27:40,17] [info] Retrieved 1 workflows from the WorkflowStoreActor
[2020-07-03 00:27:40,20] [info] WorkflowStoreHeartbeatWriteActor configured to flush with batch size 10000 and process rate 2 minutes.
[2020-07-03 00:27:40,25] [info] MaterializeWorkflowDescriptorActor [5c8eec74]: Parsing workflow as WDL 1.0
[2020-07-03 00:27:42,50] [info] MaterializeWorkflowDescriptorActor [5c8eec74]: Call-to-Backend assignments: MultiBamExpressionQuantification.mergedStringtieTPMs -> Local, RNAseq.CPAT -> Local, GatkPreprocess.gatherBamFiles -> Local, GatkPreprocess.applyBqsr -> Local, MultiBamExpressionQuantification.mergedStringtieFPKMs -> Local, SampleWorkflow.umiDedup -> Local, QC.FastqcRead2 -> Local, SingleSampleCalling.callY -> Local, SingleSampleCalling.mergeSingleSampleVcf -> Local, CalculateRegions.intersectX -> Local, SampleWorkflow.star -> Local, SampleWorkflow.hisat2 -> Local, CalculateRegions.inverseBed -> Local, MultiBamExpressionQuantification.stringtieAssembly -> Local, MultiBamExpressionQuantification.stringtie -> Local, RNAseq.multiqcTask -> Local, SingleSampleCalling.Stats -> Local, RNAseq.gffreadTask -> Local, QC.FastqcRead1 -> Local, GatkPreprocess.splitNCigarReads -> Local, GatkPreprocess.gatherBqsr -> Local, BamMetrics.targetIntervalsLists -> Local, RNAseq.makeStarIndex -> Local, SingleSampleCalling.callX -> Local, MultiBamExpressionQuantification.htSeqCount -> Local, CalculateRegions.intersectAutosomalRegions -> Local, SingleSampleCalling.mergeSingleSampleGvcf -> Local, BamMetrics.rnaSeqMetrics -> Local, BamMetrics.Flagstat -> Local, SampleWorkflow.markDuplicates -> Local, BamMetrics.targetMetrics -> Local, CalculateRegions.scatterAutosomalRegions -> Local, QC.Cutadapt -> Local, MultiBamExpressionQuantification.mergedHTSeqFragmentsPerGenes -> Local, BamMetrics.ampliconIntervalsLists -> Local, RNAseq.GffCompare -> Local, RNAseq.scatterList -> Local, RNAseq.ConvertDockerTagsFile -> Local, QC.FastqcRead2After -> Local, CalculateRegions.intersectY -> Local, BamMetrics.picardMetrics -> Local, CalculateRegions.mergeBeds -> Local, RNAseq.ConvertSampleConfig -> Local, SingleSampleCalling.callAutosomal -> Local, QC.FastqcRead1After -> Local, MultiBamExpressionQuantification.mergeStringtieGtf -> Local, SampleWorkflow.postUmiDedupMarkDuplicates -> Local, GatkPreprocess.baseRecalibrator -> Local, SampleWorkflow.indexStarBam -> Local
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, cpu, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,63] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [cpu, memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [time_minutes, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:42,64] [warn] Local [5c8eec74]: Key/s [memory, time_minutes] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2020-07-03 00:27:44,82] [info] WorkflowExecutionActor-5c8eec74-dace-4680-a79b-b5ca0d003def [5c8eec74]: Starting RNAseq.ConvertSampleConfig, RNAseq.ConvertDockerTagsFile
[2020-07-03 00:27:44,98] [info] Not triggering log of token queue status. Effective log interval = None
[2020-07-03 00:27:44,99] [info] Assigned new job execution tokens to the following groups: 5c8eec74: 2
[2020-07-03 00:27:46,66] [warn] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: Unrecognized runtime attribute keys: time_minutes, memory
[2020-07-03 00:27:46,66] [warn] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: Unrecognized runtime attribute keys: time_minutes, memory
[2020-07-03 00:27:46,71] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: set -e
mkdir -p "$(dirname ./dockerImages.json)"
python <<CODE
import json
import yaml
with open("/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/inputs/-904235375/dockerImages.yml", "r") as input_yaml:
    content = yaml.load(input_yaml)
with open("./dockerImages.json", "w") as output_json:
    json.dump(content, output_json)
CODE
[2020-07-03 00:27:46,71] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: set -e
mkdir -p "$(dirname ./samples.json)"
biowdl-input-converter \
-o ./samples.json \
--skip-file-check \
 \
 \
/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/inputs/-1211160950/Rna3PairedEnd.yml
[2020-07-03 00:27:46,75] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: executing: # make sure there is no preexisting Docker CID file
rm -f /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/execution/docker_cid
# run as in the original configuration without --rm flag (will remove later)
docker run \
  --cidfile /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/execution/docker_cid \
  -i \
  --user $EUID \
  --entrypoint /bin/bash \
  -v /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig:/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig:delegated \
  quay.io/biocontainers/biowdl-input-converter@sha256:e41ea016fc53a258ad39fac9b4eccc41cceab392cb43169dfef9fc29fb317f49 /cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/execution/script

# get the return code (working even if the container was detached)
rc=$(docker wait `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/execution/docker_cid`)

# remove the container after waiting
docker rm `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertSampleConfig/execution/docker_cid`

# return exit code
exit $rc
[2020-07-03 00:27:46,75] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: executing: # make sure there is no preexisting Docker CID file
rm -f /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/execution/docker_cid
# run as in the original configuration without --rm flag (will remove later)
docker run \
  --cidfile /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/execution/docker_cid \
  -i \
  --user $EUID \
  --entrypoint /bin/bash \
  -v /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile:/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile:delegated \
  quay.io/biocontainers/biowdl-input-converter@sha256:e41ea016fc53a258ad39fac9b4eccc41cceab392cb43169dfef9fc29fb317f49 /cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/execution/script

# get the return code (working even if the container was detached)
rc=$(docker wait `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/execution/docker_cid`)

# remove the container after waiting
docker rm `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-ConvertDockerTagsFile/execution/docker_cid`

# return exit code
exit $rc
[2020-07-03 00:27:49,98] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: job id: 16572
[2020-07-03 00:27:49,98] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: job id: 16577
[2020-07-03 00:27:49,98] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: Status change from - to WaitingForReturnCode
[2020-07-03 00:27:49,98] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: Status change from - to WaitingForReturnCode
[2020-07-03 00:27:55,49] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertSampleConfig:NA:1]: Status change from WaitingForReturnCode to Done
[2020-07-03 00:27:56,10] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.ConvertDockerTagsFile:NA:1]: Status change from WaitingForReturnCode to Done
[2020-07-03 00:27:59,17] [info] WorkflowExecutionActor-5c8eec74-dace-4680-a79b-b5ca0d003def [5c8eec74]: Starting RNAseq.scatterList
[2020-07-03 00:27:59,99] [info] Assigned new job execution tokens to the following groups: 5c8eec74: 1
[2020-07-03 00:28:01,23] [warn] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.scatterList:NA:1]: Unrecognized runtime attribute keys: cpu, time_minutes, memory
[2020-07-03 00:28:01,24] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.scatterList:NA:1]: scatter-regions \
--print-paths \
--scatter-size 1000000000 \
 \
--prefix scatters/scatter- \
/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/inputs/2074281362/reference.fasta.fai
[2020-07-03 00:28:01,25] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.scatterList:NA:1]: executing: # make sure there is no preexisting Docker CID file
rm -f /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/execution/docker_cid
# run as in the original configuration without --rm flag (will remove later)
docker run \
  --cidfile /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/execution/docker_cid \
  -i \
  --user $EUID \
  --entrypoint /bin/bash \
  -v /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList:/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList:delegated \
  quay.io/biocontainers/chunked-scatter@sha256:88ccc279639105d7ea4defe43c06f88b738f69b1156706ba3c335d84a37e9b64 /cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/execution/script

# get the return code (working even if the container was detached)
rc=$(docker wait `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/execution/docker_cid`)

# remove the container after waiting
docker rm `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-scatterList/execution/docker_cid`

# return exit code
exit $rc
[2020-07-03 00:28:04,93] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.scatterList:NA:1]: job id: 17269
[2020-07-03 00:28:04,94] [info] BackgroundConfigAsyncJobExecutionActor [5c8eec74RNAseq.scatterList:NA:1]: Status change from - to Done
[2020-07-03 00:28:05,30] [info] 815e1e42-f503-45ad-9454-7be49d04361a-SubWorkflowActor-SubWorkflow-calculateRegions:-1:1 [815e1e42]: Starting CalculateRegions.scatterAutosomalRegions
[2020-07-03 00:28:05,99] [info] Assigned new job execution tokens to the following groups: 5c8eec74: 1
[2020-07-03 00:28:06,00] [warn] BackgroundConfigAsyncJobExecutionActor [815e1e42CalculateRegions.scatterAutosomalRegions:NA:1]: Unrecognized runtime attribute keys: cpu, time_minutes, memory
[2020-07-03 00:28:06,00] [info] BackgroundConfigAsyncJobExecutionActor [815e1e42CalculateRegions.scatterAutosomalRegions:NA:1]: scatter-regions \
--print-paths \
--scatter-size 1000000000 \
 \
--prefix scatters/scatter- \
/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/inputs/2074281362/reference.fasta.fai
[2020-07-03 00:28:06,05] [info] BackgroundConfigAsyncJobExecutionActor [815e1e42CalculateRegions.scatterAutosomalRegions:NA:1]: executing: # make sure there is no preexisting Docker CID file
rm -f /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/docker_cid
# run as in the original configuration without --rm flag (will remove later)
docker run \
  --cidfile /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/docker_cid \
  -i \
  --user $EUID \
  --entrypoint /bin/bash \
  -v /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions:/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions:delegated \
  quay.io/biocontainers/chunked-scatter@sha256:88ccc279639105d7ea4defe43c06f88b738f69b1156706ba3c335d84a37e9b64 /cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/script

# get the return code (working even if the container was detached)
rc=$(docker wait `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/docker_cid`)

# remove the container after waiting
docker rm `cat /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/docker_cid`

# return exit code
exit $rc
[2020-07-03 00:28:07,38] [info] 44e3077d-bc41-4537-8498-c1e23d4a29a8-SubWorkflowActor-SubWorkflow-qc:0:1 [44e3077d]: Starting QC.FastqcRead1
[2020-07-03 00:28:07,99] [info] Assigned new job execution tokens to the following groups: 5c8eec74: 1
[2020-07-03 00:28:09,93] [info] BackgroundConfigAsyncJobExecutionActor [815e1e42CalculateRegions.scatterAutosomalRegions:NA:1]: job id: 17613
[2020-07-03 00:28:09,93] [info] BackgroundConfigAsyncJobExecutionActor [815e1e42CalculateRegions.scatterAutosomalRegions:NA:1]: Status change from - to Done
[2020-07-03 00:28:11,43] [info] 815e1e42-f503-45ad-9454-7be49d04361a-SubWorkflowActor-SubWorkflow-calculateRegions:-1:1 [815e1e42]: Workflow CalculateRegions complete. Final Outputs:
{
  "CalculateRegions.autosomalRegionScatters": ["/tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-calculateRegions/CalculateRegions/815e1e42-f503-45ad-9454-7be49d04361a/call-scatterAutosomalRegions/execution/scatters/scatter-0.bed"],
  "CalculateRegions.Xregions": null,
  "CalculateRegions.Yregions": null,
  "CalculateRegions.autosomalRegions": null
}
[2020-07-03 00:28:12,47] [info] WorkflowManagerActor Workflow 5c8eec74-dace-4680-a79b-b5ca0d003def failed (during ExecutingWorkflowState): cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor$$anonfun$1$$anon$1: Call input and runtime attributes evaluation failed for FastqcRead1:
Failed to evaluate input '__timeMinutes' (reason 1 of 1): [Attempted 1 time(s)] - NoSuchFileException: /tmp/pytest_workflow_wd6hgac4/Rna3PairedEndVariantCalling/cromwell-executions/RNAseq/5c8eec74-dace-4680-a79b-b5ca0d003def/call-sampleJobs/shard-0/SampleWorkflow/e70c60fe-c8c1-4c64-b058-c092b5838d70/call-qc/shard-0/QC/44e3077d-bc41-4537-8498-c1e23d4a29a8/call-FastqcRead1/execution/tests/data/rna3/R1.fq.gz
	at cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor$$anonfun$1.applyOrElse(JobPreparationActor.scala:76)
	at cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor$$anonfun$1.applyOrElse(JobPreparationActor.scala:69)
	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
	at akka.actor.FSM.processEvent(FSM.scala:707)
	at akka.actor.FSM.processEvent$(FSM.scala:704)
	at cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor.processEvent(JobPreparationActor.scala:45)
	at akka.actor.FSM.akka$actor$FSM$$processMsg(FSM.scala:701)
	at akka.actor.FSM$$anonfun$receive$1.applyOrElse(FSM.scala:695)
	at akka.actor.Actor.aroundReceive(Actor.scala:539)
	at akka.actor.Actor.aroundReceive$(Actor.scala:537)
	at cromwell.engine.workflow.lifecycle.execution.job.preparation.JobPreparationActor.aroundReceive(JobPreparationActor.scala:45)
	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:612)
	at akka.actor.ActorCell.invoke(ActorCell.scala:581)
	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:268)
	at akka.dispatch.Mailbox.run(Mailbox.scala:229)
	at akka.dispatch.Mailbox.exec(Mailbox.scala:241)
	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

[2020-07-03 00:28:12,47] [info] WorkflowManagerActor WorkflowActor-5c8eec74-dace-4680-a79b-b5ca0d003def is in a terminal state: WorkflowFailedState
[2020-07-03 00:28:23,85] [info] SingleWorkflowRunnerActor workflow finished with status 'Failed'.
[2020-07-03 00:28:24,93] [info] Workflow polling stopped
[2020-07-03 00:28:24,96] [info] 0 workflows released by cromid-e524f6a
[2020-07-03 00:28:24,96] [info] Shutting down WorkflowStoreActor - Timeout = 5 seconds
[2020-07-03 00:28:24,97] [info] Shutting down WorkflowLogCopyRouter - Timeout = 5 seconds
[2020-07-03 00:28:24,97] [info] Shutting down JobExecutionTokenDispenser - Timeout = 5 seconds
[2020-07-03 00:28:24,98] [info] Aborting all running workflows.
[2020-07-03 00:28:24,98] [info] JobExecutionTokenDispenser stopped
[2020-07-03 00:28:24,98] [info] WorkflowStoreActor stopped
[2020-07-03 00:28:24,98] [info] WorkflowLogCopyRouter stopped
[2020-07-03 00:28:24,98] [info] Shutting down WorkflowManagerActor - Timeout = 3600 seconds
[2020-07-03 00:28:24,98] [info] WorkflowManagerActor All workflows finished
[2020-07-03 00:28:24,98] [info] WorkflowManagerActor stopped
[2020-07-03 00:28:25,13] [info] Connection pools shut down
[2020-07-03 00:28:25,14] [info] Shutting down SubWorkflowStoreActor - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] Shutting down JobStoreActor - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] Shutting down CallCacheWriteActor - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] SubWorkflowStoreActor stopped
[2020-07-03 00:28:25,14] [info] Shutting down ServiceRegistryActor - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] Shutting down DockerHashActor - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] Shutting down IoProxy - Timeout = 1800 seconds
[2020-07-03 00:28:25,14] [info] CallCacheWriteActor Shutting down: 0 queued messages to process
[2020-07-03 00:28:25,14] [info] JobStoreActor stopped
[2020-07-03 00:28:25,14] [info] CallCacheWriteActor stopped
[2020-07-03 00:28:25,14] [info] IoProxy stopped
[2020-07-03 00:28:25,14] [info] WriteMetadataActor Shutting down: 0 queued messages to process
[2020-07-03 00:28:25,14] [info] KvWriteActor Shutting down: 0 queued messages to process
[2020-07-03 00:28:25,14] [info] DockerHashActor stopped
[2020-07-03 00:28:25,14] [info] ServiceRegistryActor stopped
[2020-07-03 00:28:25,15] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
[2020-07-03 00:28:25,15] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
[2020-07-03 00:28:25,15] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
[2020-07-03 00:28:25,15] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
[2020-07-03 00:28:25,16] [info] Database closed
[2020-07-03 00:28:25,16] [info] Stream materializer shut down
[2020-07-03 00:28:25,16] [info] WDL HTTP import resolver closed

Release v3.0.0

Release checklist

  • Check outstanding issues on JIRA and Github
  • Update all submodules to latest master with: git submodule foreach "git checkout master;git pull; git submodule foreach --recursive 'git fetch'; git submodule update --init --recursive"
  • Check that all submodules are tagged correctly with git submodule (see the sketch after this checklist)
  • Run tests to confirm the version to be released works.
  • Generate inputs overview using wdl-aid:
    wdl-aid --strict -t scripts/docs_template.md.j2 pipeline.wdl > docs/inputs.md
  • Publish documentation (updateDocs.sh) from develop branch
    • Copy docs folder to gh-pages branch
    • Overwrite existing develop folder with docs folder on gh-pages
    • Push changes to gh-pages branch
  • Check that the latest documentation looks fine
  • Change current development version in CHANGELOG.md to stable version.
  • Run the release script release.sh
    • Check all submodules are tagged
    • Merge the develop branch into master
    • Create an annotated tag with the stable version number. Include changes from CHANGELOG.md.
    • Confirm or set stable version to be used for tagging
    • Push tag to remote.
    • Merge master branch back into develop.
    • Add updated version number to develop
  • Publish documentation (updateDocs.sh) from master branch
    • Copy docs folder to gh-pages branch
    • Rename docs to new stable version on gh-pages
    • Set latest version to new version
    • Push changes to gh-pages branch
  • Create a new release from the pushed tag on github
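
A sketch of the submodule-update and inputs-overview steps referenced above (the submodule and wdl-aid invocations are copied from the checklist; RNA-seq.wdl as the entry-point file name is an assumption):

# update every submodule to the latest master (command copied from the checklist)
git submodule foreach "git checkout master;git pull; git submodule foreach --recursive 'git fetch'; git submodule update --init --recursive"
# check that all submodules point at tagged commits
git submodule status
# regenerate the inputs overview (RNA-seq.wdl assumed to be the pipeline entry point)
wdl-aid --strict -t scripts/docs_template.md.j2 RNA-seq.wdl > docs/inputs.md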

Downstream tools for DGEA are not taking advantage of index files due to wrong naming convention

I have been running the pipeline with the default code and test data provided in the repository. However, I found that the BAM index file is produced with a naming convention different from what downstream tools expect (the tools don't explicitly require the index, but they expect to find it in the same folder as the BAM file, named after the BAM file).

Biowdl naming convention

BAM file name: rna3-paired-end.markdup.bam
BAM index file name: rna3-paired-end.markdup.bai

Tools' naming convention (expected)

BAM file name: rna3-paired-end.markdup.bam
BAM index file name: rna3-paired-end.markdup.bam.bai

The second naming convention lets tools actually find and use the index, so they can jump directly to the required read or chromosome instead of processing the file sequentially, which reduces computation time.
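
A small sketch of a possible workaround on the user side, assuming samtools is available:

# option 1: re-index the BAM; samtools writes the index next to it as rna3-paired-end.markdup.bam.bai
samtools index rna3-paired-end.markdup.bam
# option 2: reuse the index the pipeline already produced, under the name downstream tools look for
cp rna3-paired-end.markdup.bai rna3-paired-end.markdup.bam.bai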

PS: I noticed that no documentation mentions this naming convention; it is just a community-agreed convention.

Unable to run v4.0.0 or develop (#d5f7d1f) locally

Hi, I am interested in hosting this workflow on a Cromwell-enabled platform, but I've been seeing errors even when trying it locally, on both the stable and develop branches, using these inputs derived from your internal tests:

{
    "RNAseq.cpatHex": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/CPAT/Human_Hexamer.tsv",
    "RNAseq.dbsnpVCF": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/wgs2.vcf.gz",
    "RNAseq.hisat2Index": [
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.1.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.2.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.3.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.4.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.5.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.6.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.7.ht2",
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/hisat2/reference.8.ht2"
    ],
    "RNAseq.refflatFile": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.refflat",
    "RNAseq.strandedness": "None",
    "RNAseq.dbsnpVCFIndex": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/wgs2.vcf.gz.tbi",
    "RNAseq.cpatLogitModel": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/CPAT/Human_logitModel.RData",
    "RNAseq.referenceFasta": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.fasta",
    "RNAseq.variantCalling": true,
    "RNAseq.lncRNAdatabases": [
        "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.gtf"
    ],
    "RNAseq.lncRNAdetection": true,
    "RNAseq.dockerImagesFile": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/dockerImages.yml",
    "RNAseq.referenceGtfFile": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.gtf",
    "RNAseq.sampleConfigFile": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/samplesheets/Rna3PairedEnd.yml",
    "RNAseq.referenceFastaFai": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.fasta.fai",
    "RNAseq.referenceFastaDict": "https://raw.githubusercontent.com/biowdl/RNA-seq/develop/tests/data/reference/reference.dict"
}
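
As a quick sanity check on these inputs, womtool (which ships alongside Cromwell releases) can validate the workflow together with the inputs file; a sketch, assuming the matching womtool jar:

# validate the workflow and the inputs file before submitting to Cromwell
java -jar womtool-59.jar validate RNA-seq.wdl --inputs PairedEndHisat2.json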

On v4.0.0 I see:

java -jar cromwell-59.jar run -i PairedEndHisat2.json RNA-seq.wdl
...
  File "/usr/local/bin/biowdl-input-converter", line 10, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.8/site-packages/biowdl_input_converter/__init__.py", line 96, in main
    output_json = samplesheet_to_json(
  File "/usr/local/lib/python3.8/site-packages/biowdl_input_converter/__init__.py", line 77, in samplesheet_to_json
    raise NotImplementedError(
NotImplementedError: Unsupported extension: 

On develop I see:

Failed to import 'expression-quantification/multi-bam-quantify.wdl' (reason 1 of 1): Failed to process workflow definition 'MultiBamExpressionQuantification' (reason 1 of 1): Failed to process 'call collectColumns.CollectColumns as mergedStringtieFPKMs' (reason 1 of 1): The call supplied a value 'sumOnDuplicateId' that doesn't exist in the task (or sub-workflow)

Either of these might be easy to resolve, but I'm not sure what direction I should take. Thanks!
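
For the develop error, one thing worth double-checking is that the task submodules match the pipeline revision; a sketch of fetching a specific revision together with its pinned submodules (the commit id is the one from the title):

git clone https://github.com/biowdl/RNA-seq.git
cd RNA-seq
git checkout d5f7d1f                      # the develop commit mentioned in the title
git submodule update --init --recursive   # bring the task submodules to their pinned revisions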

samples.json and dockerImages.json causing problems in Cromwell on GCP

Hi. This is probably more a Cromwell problem than a biowdl problem, but you should be aware that some of the "files in files" schemes in this pipeline seem to trigger problems when trying to "delocalize" them in the cloud. I don't think this pipeline would run on GCP (or Terra) as is.

I'm seeing errors that are a little different from those in #74:
"message": "Failed to evaluate 'dockerImages' (reason 1 of 1): Evaluating read_json(ConvertDockerTagsFile.json) failed: Failed to read_json(\"gs://pipeline-outputs/caper_out/RNAseq/cddd8308-d8af-49a9-a19d-1f9572210606/call-ConvertDockerTagsFile/dockerImages.json\") (reason 1 of 1): java.lang.IllegalArgumentException: Could not build the path \"gs://pipeline-outputs/caper_out/RNAseq/cddd8308-d8af-49a9-a19d-1f9572210606/call-ConvertDockerTagsFile/dockerImages.json\". It may refer to a filesystem not supported by this instance of Cromwell. Supported filesystems are: HTTP, LinuxFileSystem. Failures: \nHTTP: gs://pipeline-outputs/caper_out/RNAseq/cddd8308-d8af-49a9-a19d-1f9572210606/call-ConvertDockerTagsFile/dockerImages.json does not have an http or https scheme (IllegalArgumentException)\nLinuxFileSystem: Cannot build a local path from gs://pipeline-outputs/caper_out/RNAseq/cddd8308-d8af-49a9-a19d-1f9572210606/call-ConvertDockerTagsFile/dockerImages.json (RuntimeException)\n Please refer to the documentation for more information on how to configure filesystems: http://cromwell.readthedocs.io/en/develop/backends/HPC/#filesystems",

https://support.terra.bio/hc/en-us/community/posts/360071465631-write-lines-write-map-write-tsv-write-json-fail-when-run-in-a-workflow-rather-than-in-a-task
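
For what it's worth, the error above says the engine-level filesystems only include HTTP and the local filesystem; a sketch of adding GCS to Cromwell's engine filesystems (assumes a google auth such as application-default is defined elsewhere in the configuration):

# append an engine filesystems block to the Cromwell configuration (sketch)
cat >> cromwell-gcp.conf <<'EOF'
engine {
  filesystems {
    gcs {
      auth = "application-default"
    }
  }
}
EOF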

Release 2.0.0

Release checklist

  • Check outstanding issues on JIRA and Github
  • Generate inputs overview using wdl-aid:
    wdl-aid --strict -t scripts/docs_template.md.j2 pipeline.wdl > docs/inputs.md
  • Publish documentation (updateDocs.sh) from develop branch
    • Copy docs folder to gh-pages branch
    • Overwrite existing develop folder with docs folder on gh-pages
    • Push changes to gh-pages branch
  • Check that the latest documentation looks fine
  • Update all submodules to latest master with: git submodule foreach "git checkout master;git pull; git submodule foreach --recursive 'git fetch'; git submodule update --init --recursive"
  • Check that all submodules are tagged correctly with git submodule
  • Run tests to confirm the version to be released works.
  • Change current development version in CHANGELOG.md to stable version.
  • Run the release script release.sh
    • Check all submodules are tagged
    • Merge the develop branch into master
    • Create an annotated tag with the stable version number. Include changes from CHANGELOG.md.
    • Confirm or set stable version to be used for tagging
    • Push tag to remote.
    • Merge master branch back into develop.
    • Add updated version number to develop
  • Publish documentation (updateDocs.sh) from master branch
    • Copy docs folder to gh-pages branch
    • Rename docs to new stable version on gh-pages
    • Set latest version to new version
    • Push changes to gh-pages branch
  • Create a new release from the pushed tag on github

Release 4.0.0

  • Check outstanding issues on JIRA and Github
  • Update all submodules to latest master with: git submodule foreach "git checkout master;git pull; git submodule foreach --recursive 'git fetch'; git submodule update --init --recursive"
  • Check that all submodules are tagged correctly with git submodule
  • Run tests to confirm the version to be released works.
  • Generate inputs overview using wdl-aid:
    wdl-aid --strict -t scripts/docs_template.md.j2 pipeline.wdl > docs/inputs.md
  • Publish documentation (updateDocs.sh) from develop branch
    • Copy docs folder to gh-pages branch
    • Overwrite existing develop folder with docs folder on gh-pages
    • Push changes to gh-pages branch
  • Check that the latest documentation looks fine
  • Change current development version in CHANGELOG.md to stable version.
  • Run the release script release.sh
    • Check all submodules are tagged
    • Merge the develop branch into master
    • Create an annotated tag with the stable version number. Include changes from CHANGELOG.md.
    • Confirm or set stable version to be used for tagging
    • Push tag to remote.
    • Merge master branch back into develop.
    • Add updated version number to develop
  • Publish documentation (updateDocs.sh) from master branch
    • Copy docs folder to gh-pages branch
    • Rename docs to new stable version on gh-pages
    • Set latest version to new version
    • Push changes to gh-pages branch
  • Create a new release from the pushed tag on github
  • Prepare the repo for packaging by git checkout master && git submodule update --init --recursive (see the sketch below)
    • Package the wdl files with wdl-packager --reproducible -a LICENSE -a dockerImages.yml <WDL_FILE>
    • Add the package(s) to the github release. Also add the original WDL file
      as <pipeline>_<version>.wdl following the same naming as the package.
      This allows using the WDL and the imports zip with Cromwell without
      requiring the user to extract the package.
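
A sketch of the packaging steps referenced above (the wdl-packager flags are copied from the checklist; the pipeline WDL file name and version string are assumptions):

# prepare the repository for packaging
git checkout master && git submodule update --init --recursive
# package the WDL with its imports
wdl-packager --reproducible -a LICENSE -a dockerImages.yml RNA-seq.wdl
# also attach the original WDL under the versioned name used for the package
cp RNA-seq.wdl RNA-seq_v4.0.0.wdl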

outputDir ties running RNA-seq to local file system - support for cloud file systems

I'm trying to get this running on Google Cloud Platform using https://github.com/broadinstitute/wdl-runner.
Unfortunately, outputDir is pervasive throughout the WDL files and seems to be a show-stopper.
I've modified some code to ignore Rna3PairedEnd.yml and that's worked.
But now the issue is with output files and their location.
Now, I'm getting errors like:
Required file output '/cromwell_root/./samples/rna3-paired-end/rna3-paired-end.markdup.bam.md5' does not exist.
Which, on GCP, it wouldn't.
This issue is for an 'enhancement' to support cloud file systems.
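
Until such an enhancement lands, one pattern that avoids depending on outputDir is to let Cromwell copy the final outputs after the run via a workflow options file; a sketch (final_workflow_outputs_dir and use_relative_output_paths are standard Cromwell workflow options; the bucket path is an example):

# workflow options asking Cromwell to copy final outputs to a bucket after the run
cat > options.json <<'EOF'
{
  "final_workflow_outputs_dir": "gs://my-bucket/rna-seq-outputs",
  "use_relative_output_paths": true
}
EOF
java -jar cromwell-59.jar run RNA-seq.wdl -i inputs.json -o options.json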

Shiny interface for WDL pipelines

Hello, is it possible to create a UI using R Shiny for a WDL pipeline, so that the user can set the JSON file parameters (input and output) via the Shiny UI and then trigger the WDL pipeline run at the click of a button? If so, can you share any examples or publications that have implemented this?
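
One common way to wire up such a UI is to run a Cromwell server and have the button simply POST the WDL and the generated inputs JSON to its REST API; a sketch of that call (the server address is an example):

# submit the workflow to a running Cromwell server; a Shiny (or any other) UI can issue this same request
curl -X POST "http://localhost:8000/api/workflows/v1" \
  -F workflowSource=@RNA-seq.wdl \
  -F workflowInputs=@inputs.json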
