
mosdepth error #192

Closed
pauline-ng opened this issue Mar 18, 2022 · 2 comments

@pauline-ng

pauline-ng commented Mar 18, 2022

Hi Brent,

I'm running smoove with a small reference cohort:

smoove call -x --outdir /home/pauline/cnv_problem_files/1-0041-003/go_debug --name 1-0041-003_recal --fasta /myome/share/references/hs37d5/hs37d5.fa -p 12 --genotype /home/pauline/cnv_problem_files/1-0041-003/smoove_ref_cohort_tmdir_intern/HG01710.bam /home/pauline/cnv_problem_files/1-0041-003/smoove_ref_cohort_tmdir_intern/HG01709.bam /home/pauline/cnv_problem_files/1-0041-003/smoove_ref_cohort_tmdir_intern/HG00733.bam /home/pauline/cnv_problem_files/1-0041-003/smoove_ref_cohort_tmdir_intern/HG01695.bam /home/pauline/cnv_problem_files/1-0041-003/1-0041-003_recal.bam

and getting the error:

[smoove] 2022/03/18 02:18:14 starting with version 0.2.6
[smoove] 2022/03/18 02:18:14 calculating bam stats for 5 bams
[smoove] 2022/03/18 02:18:24 done calculating bam stats
bash: line 5:  3860 Killed                  mosdepth -f /myome/share/references/hs37d5/hs37d5.fa --fast-mode -n --quantize 1001: /home/pauline/smoove-mosdepth-862318781 /home/pauline/cnv_problem_files/1-0041-003/go_debug/HG01709.split.bam
panic: exit status 137


goroutine 81881 [running]:
github.com/brentp/smoove/lumpy.check(...)
        /home/brentp/go/go/src/github.com/brentp/smoove/lumpy/lumpy.go:54
github.com/brentp/smoove/lumpy.remove_sketchy(0xc00023e320, 0x45, 0x3e8, 0x7fff7f3d0baa, 0x28, 0x0, 0x0, 0xc00018d640, 0x4, 0x4, ...)
        /home/brentp/go/go/src/github.com/brentp/smoove/lumpy/depthfilter.go:232 +0x1818
github.com/brentp/smoove/lumpy.remove_sketchy_all.func1(0xc0017c8b40, 0x3e8, 0x7fff7f3d0baa, 0x28, 0x0, 0x0, 0xc00018d640, 0x4, 0x4, 0x1, ...)
        /home/brentp/go/go/src/github.com/brentp/smoove/lumpy/depthfilter.go:399 +0x231
created by github.com/brentp/smoove/lumpy.remove_sketchy_all
        /home/brentp/go/go/src/github.com/brentp/smoove/lumpy/depthfilter.go:397 +0x191

I saw there was an OOM issue reported on your mosdepth repo. Is there a way to limit the memory mosdepth uses inside smoove so this error doesn't occur?

Other information that may be relevant:
One particular BAM (1-0041-003) is causing the problem. I can run 1-0041-003 individually, and together with a few of the other reference samples, but the run fails when it's combined with all of the reference files. 1-0041-003 is the sample file; the others are my reference files and have run successfully with lots of other samples.

Thanks,
Pauline

@brentp (Owner)

brentp commented Mar 18, 2022

Hi, memory shouldn't be an issue unless you have less than 2GB available per sample. But indeed, exit status 137 means the process was killed for running out of memory. How much memory is available? You may need to request more for the job, e.g. 15GB total, and then run it again.
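As a quick pre-flight sanity check (a generic sketch, not part of smoove itself), you can compare the host's free memory against the roughly-2GB-per-sample figure mentioned above:

```shell
# Rough rule of thumb from the comment above: ~2GB of memory per sample.
samples=5          # number of BAMs passed to smoove call
gb_per_sample=2
echo "recommended minimum: $((samples * gb_per_sample))GB"

# On Linux, compare against what is actually available:
free -h || true    # 'free' may be absent on non-Linux systems
```

If `free` reports less available memory than the computed minimum, the 137 kill is the expected outcome.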

You could also try updating to the latest version of smoove, as that will contain other recent changes (though I don't think any are directly related to this).

@pauline-ng (Author)

Hi Brent,

Thanks for the tip. What finally worked for me was to reduce the number of parallel processes for the smoove call (the smoove call -p option).

Note: specifying docker --memory, or docker --memory together with --oom-kill-disable, didn't work for me.
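For anyone hitting the same error, the fix amounts to rerunning the original command with a smaller -p. A hedged sketch (paths shortened to basenames, and -p 4 is an illustrative value, not one confirmed in this thread; the original run used -p 12):

```shell
#!/bin/sh
# Sketch of the workaround: the same smoove call with fewer parallel
# workers (-p). Each worker handles a sample concurrently, so lowering -p
# lowers peak memory use. Paths are placeholders for the originals.
if command -v smoove >/dev/null 2>&1; then
    smoove call -x \
        --outdir go_debug \
        --name 1-0041-003_recal \
        --fasta hs37d5.fa \
        -p 4 \
        --genotype HG01710.bam HG01709.bam HG00733.bam HG01695.bam 1-0041-003_recal.bam
else
    echo "smoove not installed; command shown for illustration only"
fi
```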
