Removed too many insertions, header is now negative!
Posted in Ask the team

New GATK version... trying out ReduceReads again.

Six of the eight exomes I tried were processed by ReduceReads just fine, but two throw the exception "Removed too many insertions, header is now negative!" (at different genomic locations).

I did not find any mention of this error in the GATK forums. Is this a known problem?

Command line: java -Xmx6g -jar GenomeAnalysisTK.jar -R human_g1k_v37.fasta -T ReduceReads -o test.rr.bam -I rr-too-many-insertions.bam

java -version:
java version "1.6.0_27"
Java(TM) SE Runtime Environment (build 1.6.0_27-b07)
Java HotSpot(TM) 64-Bit Server VM (build 20.2-b06, mixed mode)
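
For anyone trying to reproduce this, one way to narrow it down is to restrict the run to an interval with the standard -L engine argument; roughly like this (the interval and input filename here are only placeholders, not the actual failing location):

java -Xmx6g -jar GenomeAnalysisTK.jar -R human_g1k_v37.fasta -T ReduceReads \
    -L 1:1000000-1100000 \
    -o test.rr.bam -I exome.bam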

Run log:

INFO  16:03:26,898 HelpFormatter - -------------------------------------------------------------------------------- 
INFO  16:03:27,382 HelpFormatter - The Genome Analysis Toolkit (GATK) v2.3-0-g9593e74, Compiled 2012/12/17 16:58:19 
INFO  16:03:27,383 HelpFormatter - Copyright (c) 2010 The Broad Institute 
INFO  16:03:27,383 HelpFormatter - For support and documentation go to http://www.broadinstitute.org/gatk 
INFO  16:03:27,388 HelpFormatter - Program Args: -R human_g1k_v37.fasta -T ReduceReads -o test.rr.bam -I rr-too-many-insertions.bam 
INFO  16:03:27,388 HelpFormatter - Date/Time: 2012/12/18 16:03:26 
INFO  16:03:27,388 HelpFormatter - -------------------------------------------------------------------------------- 
INFO  16:03:27,388 HelpFormatter - -------------------------------------------------------------------------------- 
INFO  16:03:27,471 GenomeAnalysisEngine - Strictness is SILENT 
INFO  16:03:27,577 GenomeAnalysisEngine - Downsampling Settings: No downsampling 
INFO  16:03:27,585 SAMDataSource$SAMReaders - Initializing SAMRecords in serial 
INFO  16:03:27,620 SAMDataSource$SAMReaders - Done initializing BAM readers: total time 0.03 
INFO  16:03:27,656 ProgressMeter - [INITIALIZATION COMPLETE; STARTING PROCESSING] 
INFO  16:03:27,657 ProgressMeter -        Location processed.reads  runtime per.1M.reads completed total.runtime remaining 
INFO  16:03:27,714 ReadShardBalancer$1 - Loading BAM index data for next contig 
INFO  16:03:27,717 ReadShardBalancer$1 - Done loading BAM index data for next contig 
INFO  16:03:27,739 ReadShardBalancer$1 - Loading BAM index data for next contig 
INFO  16:03:28,739 GATKRunReport - Uploaded run statistics report to AWS S3 
##### ERROR ------------------------------------------------------------------------------------------
##### ERROR stack trace 
org.broadinstitute.sting.utils.exceptions.ReviewedStingException: Removed too many insertions, header is now negative!
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.HeaderElement.removeInsertionToTheRight(HeaderElement.java:151)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SlidingWindow.updateHeaderCounts(SlidingWindow.java:881)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SlidingWindow.removeFromHeader(SlidingWindow.java:816)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SlidingWindow.compressVariantRegion(SlidingWindow.java:604)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SlidingWindow.closeVariantRegion(SlidingWindow.java:623)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SlidingWindow.closeVariantRegions(SlidingWindow.java:643)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.SingleSampleCompressor.closeVariantRegions(SingleSampleCompressor.java:83)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.MultiSampleCompressor.closeVariantRegionsInAllSamples(MultiSampleCompressor.java:94)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.MultiSampleCompressor.addAlignment(MultiSampleCompressor.java:76)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.ReduceReadsStash.compress(ReduceReadsStash.java:67)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.ReduceReads.reduce(ReduceReads.java:387)
    at org.broadinstitute.sting.gatk.walkers.compression.reducereads.ReduceReads.reduce(ReduceReads.java:87)
    at org.broadinstitute.sting.gatk.traversals.TraverseReadsNano$TraverseReadsReduce.apply(TraverseReadsNano.java:226)
    at org.broadinstitute.sting.gatk.traversals.TraverseReadsNano$TraverseReadsReduce.apply(TraverseReadsNano.java:215)
    at org.broadinstitute.sting.utils.nanoScheduler.NanoScheduler.executeSingleThreaded(NanoScheduler.java:254)
    at org.broadinstitute.sting.utils.nanoScheduler.NanoScheduler.execute(NanoScheduler.java:219)
    at org.broadinstitute.sting.gatk.traversals.TraverseReadsNano.traverse(TraverseReadsNano.java:91)
    at org.broadinstitute.sting.gatk.traversals.TraverseReadsNano.traverse(TraverseReadsNano.java:55)
    at org.broadinstitute.sting.gatk.executive.LinearMicroScheduler.execute(LinearMicroScheduler.java:83)
    at org.broadinstitute.sting.gatk.GenomeAnalysisEngine.execute(GenomeAnalysisEngine.java:281)
    at org.broadinstitute.sting.gatk.CommandLineExecutable.execute(CommandLineExecutable.java:113)
    at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:237)
    at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:147)
    at org.broadinstitute.sting.gatk.CommandLineGATK.main(CommandLineGATK.java:94)
##### ERROR ------------------------------------------------------------------------------------------
##### ERROR A GATK RUNTIME ERROR has occurred (version 2.3-0-g9593e74):
##### ERROR
##### ERROR Please visit the wiki to see if this is a known problem
##### ERROR If not, please post the error, with stack trace, to the GATK forum
##### ERROR Visit our website and forum for extensive documentation and answers to 
##### ERROR commonly asked questions http://www.broadinstitute.org/gatk
##### ERROR
##### ERROR MESSAGE: Removed too many insertions, header is now negative!
##### ERROR ------------------------------------------------------------------------------------------

(There is no progress listed here because this log is from after I bisected the input to find a narrow region where the problem is occurring.)
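
In case it's useful, the bisection can be done with something as simple as repeatedly extracting half of the suspect region into a small BAM and re-running ReduceReads on it, keeping whichever half still crashes. A rough sketch of one iteration (coordinates and filenames are placeholders, not the real failing region):

# extract a candidate region into a small BAM (requires full-exome.bam to be indexed)
samtools view -b full-exome.bam 1:1000000-1100000 > rr-too-many-insertions.bam
samtools index rr-too-many-insertions.bam

# re-run ReduceReads on just that slice
java -Xmx6g -jar GenomeAnalysisTK.jar -R human_g1k_v37.fasta -T ReduceReads \
    -o test.rr.bam -I rr-too-many-insertions.bam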

