Hello,

I have 290 samples to impute, and I run QUILT in chunks:

```
QUILT.R --outputdir=quilt_output_chunks --chr=chr03 --regionStart=12800001 --regionEnd=13800001 --buffer=200000 --bamlist=bamlist_1.txt --reference_haplotype_file=chr03.hap.gz --reference_legend_file=chr03.legend.gz --nGen=100 --nCores=40
```
When running with multiple cores (40), I encountered this error:

```
Error : sort_index(): detected NaN
In mclapply(1:length(sampleRanges), mc.cores = nCores, function(iCore) { :
  scheduled cores 7, 8 encountered errors in user code, all values of the jobs will be affected
Execution halted
```

I then allocated more memory (200 GB), and the job ran successfully.
However, a similar command failed even with the same fix. The command:

```
QUILT.R --outputdir=quilt_output_chunks --chr=chr03 --regionStart=13600001 --regionEnd=14600001 --buffer=200000 --bamlist=bamlist_1.txt --reference_haplotype_file=chr03.hap.gz --reference_legend_file=chr03.legend.gz --nGen=100 --nCores=40
```

The error is always:

```
Error : sort_index(): detected NaN
```

I also tried setting the core number to 1 for these tasks, but encountered the same error. What could be the issue?
Thank you.
Any feedback is welcome.
I'm slightly confused. In general, QUILT can throw confusing errors when it runs out of memory.
Each sample in QUILT is processed independently. Can you try running fewer samples at a time, using fewer cores? In an extreme case, run small sets of individuals with 1 core each, then merge back together.
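The batching approach above could be sketched as follows. This is a minimal illustration, not an official recipe: the toy bam list, the batch size of 10, the output-directory names, and the final `bcftools merge` step are all assumptions (QUILT's output VCF file names may differ on your system; check the contents of each output directory before merging). The loop prints the commands (dry run); drop the leading `echo` to actually execute them.

```shell
# Toy bam list for illustration only; in practice this is your existing
# bamlist_1.txt with 290 sample bam paths.
printf 'sample%03d.bam\n' $(seq 1 290) > bamlist_1.txt

# 1. Split the bam list into batches of 10 samples each
#    (creates batch_00, batch_01, ... with GNU coreutils split).
split -l 10 -d bamlist_1.txt batch_

# 2. One single-core QUILT run per batch. Printed as a dry run here;
#    remove "echo" to run for real (parameters copied from the failing
#    command in the report above).
for b in batch_*; do
  echo QUILT.R --outputdir=quilt_out_"$b" --chr=chr03 \
    --regionStart=13600001 --regionEnd=14600001 --buffer=200000 \
    --bamlist="$b" \
    --reference_haplotype_file=chr03.hap.gz \
    --reference_legend_file=chr03.legend.gz \
    --nGen=100 --nCores=1
done

# 3. After all batches finish, recombine the per-batch VCFs into one file.
#    Assumption: bcftools merge combines VCFs with disjoint sample sets;
#    adjust the glob to QUILT's actual output file names.
# bcftools merge quilt_out_batch_*/*.vcf.gz -Oz -o merged.chr03.vcf.gz
```

Running small batches this way also bounds peak memory per process, which matters if the `sort_index(): detected NaN` error is memory-related.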