
pipeline failed in atac.filter #359

Closed
w111zsj opened this issue Dec 27, 2021 · 8 comments

Comments


w111zsj commented Dec 27, 2021

Describe the bug

Hi team,

My pipeline failed before entering the atac.filter step.

Caper was installed in a conda environment (canu), so I ran the pipeline with canu activated:
caper run /home/zhaosj/software/encode_atacseq/atac-seq-pipeline/atac.wdl -i /home/zhaosj/proj/yeastatac/data/jasondb2015.json --conda canu

The conda environments for this pipeline were installed following your instructions, by running install_conda_env.sh.
Softlinks (ln -s) and hardlinks (ln) both work fine on my filesystem.
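(The link check mentioned above can be reproduced with a quick sketch like the following, using a throwaway temp directory; paths here are illustrative only:)

```shell
# Verify that hard links (ln) and symlinks (ln -s) work on this filesystem.
d=$(mktemp -d)
echo data > "$d/src"
ln "$d/src" "$d/hard"       # hard link
ln -s "$d/src" "$d/soft"    # symlink
cat "$d/hard" "$d/soft"     # both print: data
rm -rf "$d"
```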

Could you please help me fix this problem?

Many thanks,
Jason

OS/Platform

  • OS/Platform: Ubuntu 16.04
  • Conda version: v4.10.1
  • Pipeline version: v2.0.3
  • Caper version: v2.1.2

Caper configuration file

Paste contents of ~/.caper/default.conf.

backend=local

# Hashing strategy for call-caching (3 choices)
# This parameter is for local (local/slurm/sge/pbs/lsf) backend only.
# This is important for call-caching,
# which means re-using outputs from previous/failed workflows.
# Cache will miss if different strategy is used.
# "file" method has been default for all old versions of Caper<1.0.
# "path+modtime" is a new default for Caper>=1.0,
#   file: use md5sum hash (slow).
#   path: use path.
#   path+modtime: use path and modification time.
local-hash-strat=path+modtime

# Metadata DB for call-caching (reusing previous outputs):
# Cromwell supports restarting workflows based on a metadata DB
# DB is in-memory by default
#db=in-memory

# If you use 'caper server' then you can use one unified '--file-db'
# for all submitted workflows. In such case, uncomment the following two lines
# and defined file-db as an absolute path to store metadata of all workflows
#db=file
#file-db=

# If you use 'caper run' and want to use call-caching:
# Make sure to define different 'caper run ... --db file --file-db DB_PATH'
# for each pipeline run.
# But if you want to restart then define the same '--db file --file-db DB_PATH'
# then Caper will collect/re-use previous outputs without running the same task again
# Previous outputs will be simply hard/soft-linked.


# Local directory for localized files and Cromwell's intermediate files
# If not defined, Caper will make .caper_tmp/ on local-out-dir or CWD.
# /tmp is not recommended here since Caper store all localized data files
# on this directory (e.g. input FASTQs defined as URLs in input JSON).
local-loc-dir=

cromwell=/home/zhaosj/.caper/cromwell_jar/cromwell-65.jar
womtool=/home/zhaosj/.caper/womtool_jar/womtool-65.jar

Input JSON file

Paste contents of your input JSON file.

{
    "atac.title" : "JasonDB2015",
    "atac.description" : "This is the wt yeast ATACseq data from JasonDB etal Genome Research 2015.",

    "atac.pipeline_type" : "atac",
    "atac.align_only" : false,
    "atac.true_rep_only" : false,

    "atac.genome_tsv" : "/home/zhaosj/proj/yeastatac/ref/R64.tsv",

    "atac.paired_end" : true,

    "atac.fastqs_rep1_R1" : [ "/home/zhaosj/proj/yeastatac/data/lin2a_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin2b_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin2c_1.fastq.gz" ],
    "atac.fastqs_rep1_R2" : [ "/home/zhaosj/proj/yeastatac/data/lin2a_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin2b_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin2c_2.fastq.gz" ],
    "atac.fastqs_rep2_R1" : [ "/home/zhaosj/proj/yeastatac/data/lin3a_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin3b_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin3c_1.fastq.gz" ],
    "atac.fastqs_rep2_R2" : [ "/home/zhaosj/proj/yeastatac/data/lin3a_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin3b_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin3c_2.fastq.gz" ],
    "atac.fastqs_rep3_R1" : [ "/home/zhaosj/proj/yeastatac/data/lin6a_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin6b_1.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin6c_1.fastq.gz" ],
    "atac.fastqs_rep3_R2" : [ "/home/zhaosj/proj/yeastatac/data/lin6a_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin6b_2.fastq.gz", "/home/zhaosj/proj/yeastatac/data/lin6c_2.fastq.gz" ],

    "atac.auto_detect_adapter" : true,

    "atac.multimapping" : 4
}

Troubleshooting result

If you ran caper run without a Caper server, Caper automatically runs a troubleshooter for failed workflows. Find the troubleshooting result at the bottom of Caper's screen log.

If you ran caper submit against a running Caper server, first find your workflow ID (1st column) with caper list, then run caper debug [WORKFLOW_ID].

Paste troubleshooting result.

2021-12-27 17:14:17,054|caper.cli|INFO| Cromwell stdout: /home/zhaosj/proj/yeastatac/run/cromwell.out
2021-12-27 17:14:17,055|caper.caper_base|INFO| Creating a timestamped temporary directory. /home/zhaosj/proj/yeastatac/run/.caper_tmp/atac/20211227_171417_055145
2021-12-27 17:14:17,055|caper.caper_runner|INFO| Localizing files on work_dir. /home/zhaosj/proj/yeastatac/run/.caper_tmp/atac/20211227_171417_055145
2021-12-27 17:14:17,951|caper.cromwell|INFO| Validating WDL/inputs/imports with Womtool...
2021-12-27 17:14:20,938|caper.cromwell|INFO| Passed Womtool validation.
2021-12-27 17:14:20,939|caper.caper_runner|INFO| launching run: wdl=/home/zhaosj/software/encode_atacseq/atac-seq-pipeline/atac.wdl, inputs=/home/zhaosj/proj/yeastatac/data/jasondb2015.json, backend_conf=/home/zhaosj/proj/yeastatac/run/.caper_tmp/atac/20211227_171417_055145/backend.conf
2021-12-27 17:14:30,313|caper.cromwell_workflow_monitor|INFO| Workflow: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, status=Submitted
2021-12-27 17:14:30,349|caper.cromwell_workflow_monitor|INFO| Workflow: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, status=Running
2021-12-27 17:14:45,202|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.read_genome_tsv:-1, retry=0, status=Started, job_id=14817
2021-12-27 17:14:45,206|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.read_genome_tsv:-1, retry=0, status=WaitingForReturnCode
2021-12-27 17:14:50,104|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.read_genome_tsv:-1, retry=0, status=Done
2021-12-27 17:15:00,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:2, retry=0, status=Started, job_id=16835
2021-12-27 17:15:00,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:1, retry=0, status=Started, job_id=16254
2021-12-27 17:15:00,196|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:1, retry=0, status=WaitingForReturnCode
2021-12-27 17:15:00,196|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:2, retry=0, status=WaitingForReturnCode
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:1, retry=0, status=Started, job_id=18037
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:0, retry=0, status=Started, job_id=18628
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:0, retry=0, status=Started, job_id=17430
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:1, retry=0, status=WaitingForReturnCode
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:0, retry=0, status=WaitingForReturnCode
2021-12-27 17:15:05,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:0, retry=0, status=WaitingForReturnCode
2021-12-27 17:15:10,196|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:2, retry=0, status=Started, job_id=19201
2021-12-27 17:15:10,196|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:2, retry=0, status=WaitingForReturnCode
2021-12-27 17:52:30,905|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:1, retry=0, status=Done
2021-12-27 18:10:32,264|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:0, retry=0, status=Done
2021-12-27 18:27:34,364|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align_mito:2, retry=0, status=Done
2021-12-27 19:00:47,133|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:1, retry=0, status=Done
2021-12-27 19:00:55,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.frac_mito:1, retry=0, status=Started, job_id=33877
2021-12-27 19:00:55,196|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.frac_mito:1, retry=0, status=WaitingForReturnCode
2021-12-27 19:01:00,194|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=0, status=Started, job_id=34433
2021-12-27 19:01:00,195|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=0, status=WaitingForReturnCode
2021-12-27 19:01:01,593|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.frac_mito:1, retry=0, status=Done
2021-12-27 19:05:13,583|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=0, status=Done
2021-12-27 19:05:20,194|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=1, status=Started, job_id=46392
2021-12-27 19:05:20,194|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=1, status=WaitingForReturnCode
2021-12-27 19:09:06,113|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.filter:1, retry=1, status=Done
2021-12-27 19:22:27,534|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:0, retry=0, status=Done
2021-12-27 19:56:09,023|caper.cromwell_workflow_monitor|INFO| Task: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, task=atac.align:2, retry=0, status=Done
2021-12-27 19:56:09,696|caper.cromwell_workflow_monitor|INFO| Workflow: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, status=Failed
2021-12-27 19:56:16,130|caper.cromwell_metadata|INFO| Wrote metadata file. /home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/metadata.json
2021-12-27 19:56:16,130|caper.cromwell|INFO| Workflow failed. Auto-troubleshooting...
2021-12-27 19:56:16,131|caper.nb_subproc_thread|ERROR| Cromwell failed. returncode=1
2021-12-27 19:56:16,131|caper.cli|ERROR| Check stdout in /home/zhaosj/proj/yeastatac/run/cromwell.out
* Started troubleshooting workflow: id=e7c2068b-d8f8-423b-ac61-4fa9428d92f7, status=Failed
* Found failures JSON object.
[
    {
        "message": "Workflow failed",
        "causedBy": [
            {
                "causedBy": [],
                "message": "Job atac.filter:1:2 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details."
            }
        ]
    }
]
* Recursively finding failures in calls (tasks)...

==== NAME=atac.filter, STATUS=RetryableFailure, PARENT=
SHARD_IDX=1, RC=1, JOB_ID=34433
START=2021-12-27T11:00:56.271Z, END=2021-12-27T11:05:15.197Z
STDOUT=/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/stdout
STDERR=/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/stderr
STDERR_CONTENTS=
Traceback (most recent call last):
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 438, in <module>
    main()
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 344, in main
    args.nth, args.mem_gb, args.out_dir)
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 136, in rm_unmapped_lowq_reads_pe
    res_param=get_samtools_res_param('fixmate', nth=nth),
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_lib_common.py", line 359, in run_shell_cmd
    raise Exception(err_str)
Exception: PID=44241, PGID=44241, RC=127, DURATION_SEC=0.0
STDERR=/usr/bin/env: ‘python2’: No such file or directory
STDOUT=

STDERR_BACKGROUND_CONTENTS=
Traceback (most recent call last):
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 438, in <module>
    main()
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 344, in main
    args.nth, args.mem_gb, args.out_dir)
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 136, in rm_unmapped_lowq_reads_pe
    res_param=get_samtools_res_param('fixmate', nth=nth),
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_lib_common.py", line 359, in run_shell_cmd
    raise Exception(err_str)
Exception: PID=44241, PGID=44241, RC=127, DURATION_SEC=0.0
STDERR=/usr/bin/env: ‘python2’: No such file or directory
STDOUT=
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/*.samstats.qc': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/*.dup.qc': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/*.bai': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/execution/*.lib_complexity.qc': No such file or directory



==== NAME=atac.filter, STATUS=Failed, PARENT=
SHARD_IDX=1, RC=1, JOB_ID=46392
START=2021-12-27T11:05:16.268Z, END=2021-12-27T11:09:06.115Z
STDOUT=/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/stdout
STDERR=/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/stderr
STDERR_CONTENTS=
Traceback (most recent call last):
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 438, in <module>
    main()
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 344, in main
    args.nth, args.mem_gb, args.out_dir)
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 136, in rm_unmapped_lowq_reads_pe
    res_param=get_samtools_res_param('fixmate', nth=nth),
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_lib_common.py", line 359, in run_shell_cmd
    raise Exception(err_str)
Exception: PID=56099, PGID=56099, RC=127, DURATION_SEC=0.0
STDERR=/usr/bin/env: ‘python2’: No such file or directory
STDOUT=

STDERR_BACKGROUND_CONTENTS=
Traceback (most recent call last):
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 438, in <module>
    main()
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 344, in main
    args.nth, args.mem_gb, args.out_dir)
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_task_filter.py", line 136, in rm_unmapped_lowq_reads_pe
    res_param=get_samtools_res_param('fixmate', nth=nth),
  File "/home/zhaosj/anaconda3/envs/encode-atac-seq-pipeline/bin/encode_lib_common.py", line 359, in run_shell_cmd
    raise Exception(err_str)
Exception: PID=56099, PGID=56099, RC=127, DURATION_SEC=0.0
STDERR=/usr/bin/env: ‘python2’: No such file or directory
STDOUT=
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/*.samstats.qc': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/*.dup.qc': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/*.bai': No such file or directory
ln: failed to access '/home/zhaosj/proj/yeastatac/run/atac/e7c2068b-d8f8-423b-ac61-4fa9428d92f7/call-filter/shard-1/attempt-2/execution/*.lib_complexity.qc': No such file or directory



leepc12 (Contributor) commented Jan 3, 2022

Did you install pipeline's conda environment (scripts/install_conda_env.sh) and run with caper run ... --conda?

w111zsj (Author) commented Jan 10, 2022

Did you install pipeline's conda environment (scripts/install_conda_env.sh) and run with caper run ... --conda?

Yes, as I mentioned, I installed the pipeline's conda environments by running scripts/install_conda_env.sh and ran with caper run ... --conda.

@aimeGloria
I've run into the same problem. Have you solved it yet? @w111zsj

leepc12 (Contributor) commented Mar 21, 2022

Please try this for debugging:

$ source activate encode-atac-seq-pipeline # or conda activate

(encode-atac-seq-pipeline) $ conda env list
# conda environments:
#
base                     /users/leepc12/miniconda3
encode-atac-seq-pipeline  *  /users/leepc12/miniconda3/envs/encode-atac-seq-pipeline
encode-atac-seq-pipeline-macs2     /users/leepc12/miniconda3/envs/encode-atac-seq-pipeline-macs2
encode-atac-seq-pipeline-python2     /users/leepc12/miniconda3/envs/encode-atac-seq-pipeline-python2
encode-atac-seq-pipeline-spp     /users/leepc12/miniconda3/envs/encode-atac-seq-pipeline-spp

(encode-atac-seq-pipeline) $ which python
/users/leepc12/miniconda3/envs/encode-atac-seq-pipeline/bin/python

(encode-atac-seq-pipeline) $ python --version
Python 3.6.6 :: Anaconda, Inc.

(encode-atac-seq-pipeline) $ which samtools
/users/leepc12/miniconda3/envs/encode-atac-seq-pipeline/bin/samtools

(encode-atac-seq-pipeline) $ samtools --version
samtools 1.9
Using htslib 1.9
Copyright (C) 2018 Genome Research Ltd.
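
For reference, the actual failure in the logs is RC=127 from /usr/bin/env, which means the python2 interpreter itself was not on PATH when atac.filter ran. A portable existence check (a sketch; independent of the pipeline and of any conda environment) is:

```shell
# RC=127 from "/usr/bin/env: 'python2': No such file or directory" means
# python2 was not found on PATH. Check whether it exists at all:
if command -v python2 >/dev/null 2>&1; then
    echo "python2 found: $(command -v python2)"
else
    echo "python2 missing from PATH"
fi
```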

@jonasungerback

I'm person number 3 with this problem. I debugged as you suggested above and everything looks fine to me.

leepc12 (Contributor) commented Mar 23, 2022

This looks like a bug in the pipeline.
Until it's fixed, please manually replace python2 with python3 in the Python script's shebang.

# cd to the git repo
$ cd atac-seq-pipeline 

# edit the first line of src/assign_multimappers.py, replacing python2 with python3
$ vi src/assign_multimappers.py 

# then the python script should look like this
$ head src/assign_multimappers.py
#!/usr/bin/env python3

# piped script to take multimappers and randomly assign
# requires a qname sorted file!!

import sys
import random
import argparse

# transfer the edited script to the conda envs
$ scripts/update_conda_env.sh
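
The vi edit above can also be scripted. A minimal sketch (assumes GNU sed's -i flag; demonstrated here on a temporary copy rather than the real src/assign_multimappers.py):

```shell
# Throwaway file with the problematic shebang, standing in for
# src/assign_multimappers.py in the repo checkout.
tmp=$(mktemp)
printf '#!/usr/bin/env python2\nimport sys\n' > "$tmp"

# Rewrite only the first line, swapping python2 for python3 (GNU sed -i).
sed -i '1s/python2/python3/' "$tmp"

head -1 "$tmp"   # -> #!/usr/bin/env python3
rm -f "$tmp"
```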

Please let me know if this fix works.

@jonasungerback

Worked like a charm for me. Thank you!

leepc12 (Contributor) commented Mar 24, 2022

Closing this, I will make a new release today.

@leepc12 leepc12 closed this as completed Mar 24, 2022