Tagged with #lsf
0 documentation articles | 0 announcements | 5 forum discussions




Created 2016-05-20 22:32:20 | Updated | Tags: indelrealigner realignertargetcreator queue performance gatk lsf memory

Comments (2)

Hi,

I'm using a QScript to run the GATK best practices on LSF. I'm trying to run the data processing steps now. RealignerTargetCreator and IndelRealigner are using so much memory that my jobs are being killed by LSF. Usually they don't even make it through RealignerTargetCreator, but the ones that do make it have a memory explosion in IndelRealigner and get killed at that step.

I'm testing the pipeline with a very small dataset: 3 bam files of about 10 million reads each. Even when I downsample them to 1 million reads each, the memory still explodes. I've tried requesting up to 32g of memory. Higher memory requests allow the jobs to run longer, but they eventually outgrow even 32g.

A few notes:

  • I'm using GATK/Queue 3.5-0.
  • The "-memLimit" option is working correctly: my jobs are submitted with the requested memory limit, and LSF reports that the actual memory usage really is extremely high.
  • I've validated my bam files with the Picard validation tool.
  • I'm using 1000G_phase1.indels.hg19.vcf and dbsnp_137.hg19.vcf from the GATK resource bundle downloaded in early 2013.

Do you have any idea what could be happening?
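
For reference, here is a minimal sketch of how a per-job heap cap could also be pinned inside the QScript itself rather than only via "-memLimit". This is just an illustration, assuming Queue's standard memoryLimit field on CommandLineFunction (value in GB); it is not part of my current script.

  // Illustrative variant of the CommonArguments trait from the script below
  trait CommonArguments extends CommandLineGATK {
    this.reference_sequence = referenceFile
    // Cap each GATK job's JVM heap at 4 GB so that the java process
    // stays inside the memory requested from LSF (value is illustrative).
    this.memoryLimit = Some(4.0)
  }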

Here is an example of the output from LSF:

Sender: LSF System <hpcadmin@c27>
Subject: Job 57251: <RealignerTargetCreator: B11.sorted.rg.downsample.interval_list> Exited

Job <RealignerTargetCreator: B11.sorted.rg.downsample.interval_list> was submitted from host <c15> by user <russellp> in cluster <compbio_cluster1>.
Job was executed on host(s) <c27>, in queue <shared>, as user <russellp> in cluster <compbio_cluster1>.
</home/russellp> was used as the home directory.
</home/russellp/Projects/IPF_resequencing/GATK_testing> was used as the working directory.
Started at Fri May 20 16:10:00 2016
Results reported at Fri May 20 16:11:03 2016

Your job looked like:

------------------------------------------------------------
# LSBATCH: User input
sh /home/russellp/Projects/IPF_resequencing/GATK_testing/.queue/tmp/.exec5219091543152698331
------------------------------------------------------------

TERM_MEMLIMIT: job killed after reaching LSF memory usage limit.
Exited with exit code 130.

Resource usage summary:

    CPU time   :     79.19 sec.
    Max Memory :      7951 MB
    Max Swap   :     34733 MB

    Max Processes  :         4
    Max Threads    :        29

Here is my QScript:

package qscripts

import org.broadinstitute.gatk.queue.QScript
//import org.broadinstitute.gatk.queue.extensions.picard.MarkDuplicates
import org.broadinstitute.gatk.queue.extensions.gatk._

/**
  * Created by prussell on 3/3/16.
  * Implements GATK best practices
  */
class GatkBestPractices extends QScript {

  /**
    * *********************************************************************
    *                               INPUTS
    * *********************************************************************
    */

  /**
    * Reference genome fasta
    */
  @Input(doc="Reference genome for the bam files", shortName="R", fullName="REF_FASTA", required=true)
  var referenceFile: File = null

  /**
    * Bam files
    */
  @Input(doc="One or more bam files", shortName="I", fullName="INPUT_BAM", required=true)
  var bamFiles: List[File] = Nil

  /**
    * VCF files
    */
  @Input(doc="VCF file(s) with known indels", fullName="KNOWN_INDELS", required=true)
  var knownIndels: List[File] = Nil
  @Input(doc="Database of known variants e.g. dbSNP", fullName="KNOWN_VARIANTS", required=true)
  var knownPolymorphicSites: List[File] = Nil

  /**
    * Output directory
    */
  @Input(doc="Output directory", shortName="od", fullName="OUT_DIR", required=true)
  var outputDirectory : File = null

  /**
    * Output file prefix not including directory
    */
  @Input(doc="Output prefix not including directory", shortName="op", fullName="OUT_PREFIX", required=true)
  var outputPrefix : File = null

  /**
    * Common arguments
    */
  trait CommonArguments extends CommandLineGATK {
    this.reference_sequence = referenceFile
    // TODO other common arguments?
  }

  def script() = {    

    /**
      * Container for the processed bam files
      */
    var processedFiles = Seq.empty[File]

    /**
      * Data processing
      */
    for(bam <- bamFiles) {

      /**
        * *********************************************************************
        * LOCAL REALIGNMENT AROUND INDELS
        * https://www.broadinstitute.org/gatk/guide/article?id=38
        * https://www.broadinstitute.org/gatk/guide/article?id=2800
        * *********************************************************************
        */

      /**
        * Local realignment around indels
        * Step 1 of 2: RealignerTargetCreator: Define intervals to target for local realignment
        * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_indels_RealignerTargetCreator.php
        */
      val realignerTargetCreator = new RealignerTargetCreator with CommonArguments
      realignerTargetCreator.input_file +:= bam
      realignerTargetCreator.known = knownIndels
      realignerTargetCreator.out = swapExt(bam, "bam", "interval_list")
      realignerTargetCreator.maxIntervalSize = int2intOption(500) // Default 500
      realignerTargetCreator.minReadsAtLocus = int2intOption(4) // Default 4
      realignerTargetCreator.mismatchFraction = double2doubleOption(0.0) // Default 0.0
      realignerTargetCreator.windowSize = int2intOption(10) // Default 10
      add(realignerTargetCreator)

      /**
        * Local realignment around indels
        * Step 2 of 2: IndelRealigner: Perform local realignment of reads around indels
        * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_indels_IndelRealigner.php
        */
      val indelRealigner = new IndelRealigner with CommonArguments
      indelRealigner.targetIntervals = realignerTargetCreator.out
      indelRealigner.input_file +:= bam
      indelRealigner.knownAlleles = knownIndels
      indelRealigner.out = swapExt(bam, "bam", "realign.bam")
      indelRealigner.consensusDeterminationModel = null
      indelRealigner.LODThresholdForCleaning = double2doubleOption(5.0) // Default 5.0
      indelRealigner.nWayOut = null
      indelRealigner.entropyThreshold = double2doubleOption(0.15) // Default 0.15
      indelRealigner.maxConsensuses = int2intOption(30) // Default 30
      indelRealigner.maxIsizeForMovement = int2intOption(3000) // Default 3000
      indelRealigner.maxPositionalMoveAllowed = int2intOption(200) // Default 200
      indelRealigner.maxReadsForConsensuses = int2intOption(120) // Default 120
      indelRealigner.maxReadsForRealignment = int2intOption(20000) // Default 20000
      indelRealigner.maxReadsInMemory = int2intOption(150000) // Default 150000
      indelRealigner.noOriginalAlignmentTags = false
      add(indelRealigner)

      /**
        * *********************************************************************
        * BASE QUALITY SCORE RECALIBRATION
        * https://www.broadinstitute.org/gatk/guide/article?id=44
        * https://www.broadinstitute.org/gatk/guide/article?id=2801
        * *********************************************************************
        */

      /**
        * Base quality score recalibration
        * Step 1 of 3: BaseRecalibrator: Generate base recalibration table to compensate for systematic errors in basecalling confidences
        * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_bqsr_BaseRecalibrator.php
        */

      // Generate the first pass recalibration table file
      val baseRecalibratorBefore = new BaseRecalibrator with CommonArguments
      baseRecalibratorBefore.input_file +:= indelRealigner.out
      baseRecalibratorBefore.out = swapExt(indelRealigner.out, "realign.bam", "base_recalibrator_first_pass.out")
      baseRecalibratorBefore.knownSites = knownPolymorphicSites
      baseRecalibratorBefore.indels_context_size = int2intOption(3) // Default 3
      baseRecalibratorBefore.maximum_cycle_value = int2intOption(500) // Default 500
      baseRecalibratorBefore.mismatches_context_size = int2intOption(2) // Default 2
      baseRecalibratorBefore.solid_nocall_strategy = null
      baseRecalibratorBefore.solid_recal_mode = null
      baseRecalibratorBefore.list = false
      baseRecalibratorBefore.lowMemoryMode = false
      baseRecalibratorBefore.no_standard_covs = false
      baseRecalibratorBefore.sort_by_all_columns = false
      baseRecalibratorBefore.binary_tag_name = null
      baseRecalibratorBefore.bqsrBAQGapOpenPenalty = double2doubleOption(40.0) // Default 40.0
      baseRecalibratorBefore.deletions_default_quality = int2byteOption(45) // Default 45
      baseRecalibratorBefore.insertions_default_quality = int2byteOption(45) // Default 45
      baseRecalibratorBefore.low_quality_tail = int2byteOption(2) // Default 2
      baseRecalibratorBefore.mismatches_default_quality = int2byteOption(-1) // Default -1
      baseRecalibratorBefore.quantizing_levels = int2intOption(16) // Default 16
      baseRecalibratorBefore.run_without_dbsnp_potentially_ruining_quality = false
      add(baseRecalibratorBefore)

      // Generate the second pass recalibration table file
      val baseRecalibratorAfter = new BaseRecalibrator with CommonArguments
      baseRecalibratorAfter.BQSR = baseRecalibratorBefore.out
      baseRecalibratorAfter.input_file +:= indelRealigner.out
      baseRecalibratorAfter.out = swapExt(indelRealigner.out, "realign.bam", "base_recalibrator_second_pass.out")
      baseRecalibratorAfter.knownSites = knownPolymorphicSites
      baseRecalibratorAfter.indels_context_size = int2intOption(3) // Default 3
      baseRecalibratorAfter.maximum_cycle_value = int2intOption(500) // Default 500
      baseRecalibratorAfter.mismatches_context_size = int2intOption(2) // Default 2
      baseRecalibratorAfter.solid_nocall_strategy = null
      baseRecalibratorAfter.solid_recal_mode = null
      baseRecalibratorAfter.list = false
      baseRecalibratorAfter.lowMemoryMode = false
      baseRecalibratorAfter.no_standard_covs = false
      baseRecalibratorAfter.sort_by_all_columns = false
      baseRecalibratorAfter.binary_tag_name = null
      baseRecalibratorAfter.bqsrBAQGapOpenPenalty = double2doubleOption(40.0) // Default 40.0
      baseRecalibratorAfter.deletions_default_quality = int2byteOption(45) // Default 45
      baseRecalibratorAfter.insertions_default_quality = int2byteOption(45) // Default 45
      baseRecalibratorAfter.low_quality_tail = int2byteOption(2) // Default 2
      baseRecalibratorAfter.mismatches_default_quality = int2byteOption(-1) // Default -1
      baseRecalibratorAfter.quantizing_levels = int2intOption(16) // Default 16
      baseRecalibratorAfter.run_without_dbsnp_potentially_ruining_quality = false
      add(baseRecalibratorAfter)

      /**
        * Base quality score recalibration
        * Step 2 of 3: AnalyzeCovariates: Create plots to visualize base recalibration results
        * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_bqsr_AnalyzeCovariates.php
        */
      val analyzeCovariates = new AnalyzeCovariates with CommonArguments
      analyzeCovariates.beforeReportFile = baseRecalibratorBefore.out
      analyzeCovariates.afterReportFile = baseRecalibratorAfter.out
      analyzeCovariates.plotsReportFile = new File(outputDirectory.getAbsolutePath + "/" + outputPrefix + "_BQSR.pdf")
      analyzeCovariates.intermediateCsvFile = new File(outputDirectory.getAbsolutePath + "/" + outputPrefix + "_BQSR.csv")
      analyzeCovariates.ignoreLMT = false
      add(analyzeCovariates)

      /**
        * Base quality score recalibration
        * Step 3 of 3: PrintReads: Write out sequence read data (for filtering, merging, subsetting etc)
        * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_readutils_PrintReads.php
        */
      val printReads = new PrintReads with CommonArguments
      printReads.input_file +:= bam
      printReads.BQSR = baseRecalibratorAfter.out
      printReads.out = swapExt(bam, "bam", "recalibrated.bam")
      add(printReads) // without this call the PrintReads job is never added to the graph and no recalibrated bam is written

      processedFiles +:= printReads.out

    }

    /**
      * *********************************************************************
      *                        VARIANT DISCOVERY
      * https://www.broadinstitute.org/gatk/guide/bp_step.php?p=2
      * *********************************************************************
      */

    /**
      * Variant discovery
      * Step 1 of 6: HaplotypeCaller: Call germline SNPs and indels via local re-assembly of haplotypes
      * https://www.broadinstitute.org/gatk/guide/article?id=2803
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_haplotypecaller_HaplotypeCaller.php
      * https://www.broadinstitute.org/gatk/guide/article?id=3893
      */
    // TODO

    /**
      * Variant discovery
      * Step 2 of 6: CombineGVCFs: Combine per-sample gVCF files produced by HaplotypeCaller into a multi-sample gVCF file
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_CombineGVCFs.php
      */
    // TODO

    /**
      * Variant discovery
      * Step 3 of 6: GenotypeGVCFs: Perform joint genotyping on gVCF files produced by HaplotypeCaller
      * https://www.broadinstitute.org/gatk/guide/article?id=3893
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_GenotypeGVCFs.php
      */
    // TODO

    /**
      * Variant discovery
      * Step 4 of 6: VariantFiltration: Filter variant calls based on INFO and FORMAT annotations
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_filters_VariantFiltration.php
      */
    // TODO

    /**
      * Variant discovery
      * Step 5 of 6: VariantRecalibrator: Build a recalibration model to score variant quality for filtering purposes
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantrecalibration_VariantRecalibrator.php
      * https://www.broadinstitute.org/gatk/guide/article?id=39
      * https://www.broadinstitute.org/gatk/guide/article?id=2805
      *
      */
    // TODO

    /**
      * Variant discovery
      * Step 6 of 6: ApplyRecalibration: Apply a score cutoff to filter variants based on a recalibration table
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantrecalibration_ApplyRecalibration.php
      * https://www.broadinstitute.org/gatk/guide/article?id=2806
      */
    // TODO

    /**
      * *********************************************************************
      *                        CALLSET REFINEMENT
      * https://www.broadinstitute.org/gatk/guide/bp_step.php?p=3
      * *********************************************************************
      */

    /**
      * Callset refinement
      * Step 1 of 8: CalculateGenotypePosteriors: Calculate genotype posterior likelihoods given panel data
      * https://www.broadinstitute.org/gatk/guide/article?id=4727
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_CalculateGenotypePosteriors.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 2 of 8: VariantFiltration: Filter variant calls based on INFO and FORMAT annotations
      * https://www.broadinstitute.org/gatk/guide/article?id=4727
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_filters_VariantFiltration.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 3 of 8: VariantAnnotator: Annotate variant calls with context information
      * https://www.broadinstitute.org/gatk/guide/article?id=4727
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_annotator_VariantAnnotator.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 4 of 8: SelectVariants: Select a subset of variants from a larger callset
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_SelectVariants.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 5 of 8: CombineVariants: Combine variant records from different sources
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_CombineVariants.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 6 of 8: VariantEval: General-purpose tool for variant evaluation (% in dbSNP, genotype concordance, Ti/Tv ratios, and a lot more)
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_varianteval_VariantEval.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 7 of 8: VariantsToTable: Extract specific fields from a VCF file to a tab-delimited table
      * https://www.broadinstitute.org/gatk/guide/tooldocs/org_broadinstitute_gatk_tools_walkers_variantutils_VariantsToTable.php
      */
    // TODO

    /**
      * Callset refinement
      * Step 8 of 8: GenotypeConcordance (Picard version): Genotype concordance
      * https://broadinstitute.github.io/picard/command-line-overview.html#GenotypeConcordance
      */
    // TODO

  }

}

Created 2014-07-25 15:45:20 | Updated | Tags: lsf

Comments (5)

Happy Friday!

I am running GATK on a shared LSF cluster but have unintentionally run afoul of the admins due to resource use. I am running the GATK commands with 'java -Xmx4g -jar $GATK' and am not using -nt or -nct, but CPU usage is exceeding the requested resources (1 vCore and 6 GB of memory). I am now running the same commands on our lab server and do indeed see that CPU usage reaches as high as 800% at times. Is there a way to limit the CPU resources claimed by GATK tools, or alternatively a way to anticipate their actual needs so that I can request an appropriate number of CPUs?
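
For what it's worth, my working assumption is that much of the extra CPU comes from the JVM rather than from the GATK walker itself: HotSpot starts a number of parallel garbage-collection threads proportional to the cores it sees on the host. A sketch of the invocation I am considering, using the standard -XX:ParallelGCThreads flag (the value of 1 is just illustrative, and the tool arguments are unchanged):

java -Xmx4g -XX:ParallelGCThreads=1 -jar $GATK [tool arguments as before]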

Much thanks, Erik


Created 2013-08-27 18:48:00 | Updated | Tags: queue lsf

Comments (9)

This is not a question, per se - I suppose it's more of an observation.

We recently upgraded LSF on one of our clusters to v9.0.1, and quickly discovered that Queue can't submit jobs. The reaction was rather violent - the entire JVM crashed, and the stack trace showed it dying in lsb_submit(). We downgraded LSF to v8.3.0, and everything is working fine (so far).

I know Queue is compiled against the LSF v7.0.6 API; it would appear that this API is not binary-compatible with LSF 9.x.
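
For anyone checking their own cluster before (or after) an upgrade, the LSF release that the client libraries report can be confirmed with the standard lsid command, which prints the LSF version, cluster name, and master host:

lsid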

Hope this helps others in the future...


Created 2013-01-15 19:38:43 | Updated 2013-01-15 20:02:25 | Tags: queue lsf

Comments (8)

I had no problem running GATK two weeks ago. But today, when I ran the following GATK command, I got an error message. It seems it cannot load the library "liblsf.so". Please see below. Has there been any recent change to the GATK libraries?

2:15pm qyu@vbronze /bit/data01/projects/GC_coverage $ java -Xmx4g -Djava.io.tmpdir=/broad/hptmp/vdauwera -jar /humgen/gsa-scr1/vdauwera/gatk/walker/dist/Queue.jar -S /bit/data01/projects/GC_coverage/src/IntervalCovGG.scala -i /humgen/gsa-hpprojects/GATK/data/genes_of_interest.exon_targets.interval_list -b /bit/data01/projects/GC_coverage/testGCcoverage/data/DEV-2129.list -o /bit/data01/projects/GC_coverage/testGCcoverage/DEV-2129.gatkreport -m 16 -sc 100 -bsub -jobQueue priority -startFromScratch -run

The error shows:

ERROR 14:13:06,238 QGraph - Uncaught error running jobs.
java.lang.UnsatisfiedLinkError: Unable to load library 'lsf': liblsf.so: cannot open shared object file: No such file or directory
        at com.sun.jna.NativeLibrary.loadLibrary(NativeLibrary.java:163)
        at com.sun.jna.NativeLibrary.getInstance(NativeLibrary.java:236)
        at com.sun.jna.NativeLibrary.getInstance(NativeLibrary.java:199)
        at org.broadinstitute.sting.jna.lsf.v7_0_6.LibBat.<clinit>(LibBat.java:90)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner$.<init>(Lsf706JobRunner.scala:233)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner$.<clinit>(Lsf706JobRunner.scala)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner.<init>(Lsf706JobRunner.scala:47)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobManager.create(Lsf706JobManager.scala:35)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobManager.create(Lsf706JobManager.scala:33)
        at org.broadinstitute.sting.queue.engine.QGraph.newRunner(QGraph.scala:632)
        at org.broadinstitute.sting.queue.engine.QGraph.runJobs(QGraph.scala:408)
        at org.broadinstitute.sting.queue.engine.QGraph.run(QGraph.scala:131)
        at org.broadinstitute.sting.queue.QCommandLine.execute(QCommandLine.scala:127)
        at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:236)
        at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:146)
        at org.broadinstitute.sting.queue.QCommandLine$.main(QCommandLine.scala:62)
        at org.broadinstitute.sting.queue.QCommandLine.main(QCommandLine.scala)
Exception in thread "main" java.lang.UnsatisfiedLinkError: Unable to load library 'lsf': liblsf.so: cannot open shared object file: No such file or directory
        at com.sun.jna.NativeLibrary.loadLibrary(NativeLibrary.java:163)
        at com.sun.jna.NativeLibrary.getInstance(NativeLibrary.java:236)
        at com.sun.jna.NativeLibrary.getInstance(NativeLibrary.java:199)
        at org.broadinstitute.sting.jna.lsf.v7_0_6.LibBat.<clinit>(LibBat.java:9
.........
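
In case it is relevant, my understanding is that JNA resolves native libraries through the normal dynamic-loader search path, so the LSF client library has to be visible to the JVM at runtime. A sketch of what I am checking (the LSF installation path below is just an example, not my actual path):

export LD_LIBRARY_PATH=/path/to/lsf/lib:$LD_LIBRARY_PATH
# or, equivalently, point JNA at the directory explicitly:
java -Djna.library.path=/path/to/lsf/lib -Xmx4g -jar Queue.jar ...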

Thanks, Qing


Created 2012-12-13 18:44:03 | Updated | Tags: queue lsf

Comments (7)

Hi,

I am testing Queue scripts with a newly installed LSF v8.3. The test script is:

java -Djava.io.tmpdir=/tmp -jar jar2216/Queue.jar -S Queue-2.2-16-g9f648cb/resources/ExampleCountReads.scala -R Queue-2.2-16-g9f648cb/resources/exampleFASTA.fasta -I Queue-2.2-16-g9f648cb/resources/exampleBAM.bam --bsub -run

where I get an error message as follows:

'java' '-Xmx1024m' '-XX:+UseParallelOldGC' '-XX:ParallelGCThreads=4' '-XX:GCTimeLimit=50' '-XX:GCHeapFreeLimit=10' '-Djava.io.tmpdir=/data/cmb/wxing/gatk/.queue/tmp' '-cp' '/data/cmb/wxing/gatk/jar2216/Queue.jar' 'org.broadinstitute.sting.gatk.CommandLineGATK' '-T' 'CountReads' '-I' '/data/cmb/wxing/gatk/Queue-2.2-16-g9f648cb/resources/exampleBAM.bam' '-R' '/data/cmb/wxing/gatk/Queue-2.2-16-g9f648cb/resources/exampleFASTA.fasta'
java.lang.UnsatisfiedLinkError: Error looking up function 'ls_getLicenseUsage': /usr/local/lsf/8.3/linux2.6-glibc2.3-x86_64/lib/liblsf.so: undefined symbol: ls_getLicenseUsage
        at com.sun.jna.Function.<init>(Function.java:179)
        at com.sun.jna.NativeLibrary.getFunction(NativeLibrary.java:344)
        at com.sun.jna.NativeLibrary.getFunction(NativeLibrary.java:324)
        at com.sun.jna.Native.register(Native.java:1341)
        at com.sun.jna.Native.register(Native.java:1018)
        at org.broadinstitute.sting.jna.lsf.v7_0_6.LibLsf.<clinit>(LibLsf.java:86)
        at java.lang.J9VMInternals.initializeImpl(Native Method)
        at java.lang.J9VMInternals.initialize(J9VMInternals.java:200)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner$.unitDivisor(Lsf706JobRunner.scala:401)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner$.org$broadinstitute$sting$queue$engine$lsf$Lsf706JobRunner$$convertUnits(Lsf706JobRunner.scala:416)
        at org.broadinstitute.sting.queue.engine.lsf.Lsf706JobRunner.start(Lsf706JobRunner.scala:98)
        at org.broadinstitute.sting.queue.engine.FunctionEdge.start(FunctionEdge.scala:83)
        at org.broadinstitute.sting.queue.engine.QGraph.runJobs(QGraph.scala:432)
        at org.broadinstitute.sting.queue.engine.QGraph.run(QGraph.scala:154)
        at org.broadinstitute.sting.queue.QCommandLine.execute(QCommandLine.scala:145)
        at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:237)
        at org.broadinstitute.sting.commandline.CommandLineProgram.start(CommandLineProgram.java:147)
        at org.broadinstitute.sting.queue.QCommandLine$.main(QCommandLine.scala:62)
        at org.broadinstitute.sting.queue.QCommandLine.main(QCommandLine.scala)

Any clues about the error "java.lang.UnsatisfiedLinkError: Error looking up function 'ls_getLicenseUsage': /usr/local/lsf/8.3/linux2.6-glibc2.3-x86_64/lib/liblsf.so"? Has anyone had similar problems?

Does anyone think it could be our LSF version (v8.3), since the code seems to be based on version 7.0.6?
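
One thing I plan to check is whether this LSF build exports the symbol at all, e.g. by listing the dynamic symbols of the library (assuming GNU binutils is available on the node):

nm -D /usr/local/lsf/8.3/linux2.6-glibc2.3-x86_64/lib/liblsf.so | grep ls_getLicenseUsage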

Many thanks, Wei