org.opensha.sha.calc.hazardMap.dagGen
Class HazardDataSetDAGCreator

java.lang.Object
  extended by org.opensha.sha.calc.hazardMap.dagGen.HazardDataSetDAGCreator
Direct Known Subclasses:
HazusDataSetDAGCreator

public class HazardDataSetDAGCreator
extends Object

This class generates a simple Condor DAG for a given ERF, IMR hash map(s), and list of sites. The DAG is meant to be run on a shared filesystem, where the output directory used for DAG generation is also visible on the compute nodes/slots. It could be extended in the future to use Globus and GridFTP to get around this limitation.
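A minimal driver using the convenience constructor and writeDAG (both documented below) might look like the following sketch. The paths are hypothetical, and the CalculationInputsXMLFile is assumed to have been parsed elsewhere; only the calls shown on this page are used.

```java
// Sketch only: `inputs` is assumed to have been parsed from an XML file
// elsewhere; the executable and jar paths are hypothetical.
CalculationInputsXMLFile inputs = /* parsed elsewhere */ null;

HazardDataSetDAGCreator dagCreator = new HazardDataSetDAGCreator(
        inputs,
        "/usr/bin/java",          // path to the java executable
        "/path/to/opensha.jar");  // path to the jar used for calculation

// Write the DAG to an output directory on the shared filesystem,
// splitting the calculation into jobs of 100 sites each, without
// submitting it automatically.
dagCreator.writeDAG(new File("/tmp/hazardDAG"), 100, false);
```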

Author:
kevin

Field Summary
protected  CurveResultsArchiver archiver
           
protected  CalculationSettings calcSettings
           
static int DAGMAN_MAX_IDLE
           
static int DAGMAN_MAX_POST
           
static int DAGMAN_MAX_PRE
           
protected  ERF erf
           
static String ERF_SERIALIZED_FILE_NAME
           
protected  List<Map<TectonicRegionType,ScalarIMR>> imrMaps
           
protected  String jarFile
           
protected  String javaExec
           
protected  List<Site> sites
           
protected  SubmitScript.Universe universe
           
 
Constructor Summary
HazardDataSetDAGCreator(CalculationInputsXMLFile inputs, String javaExec, String jarFile)
          Convenience constructor for when the calculation inputs have already been loaded from an XML file.
HazardDataSetDAGCreator(ERF erf, List<Map<TectonicRegionType,ScalarIMR>> imrMaps, List<Parameter<Double>> imts, List<Site> sites, CalculationSettings calcSettings, CurveResultsArchiver archiver, String javaExec, String jarFile)
          Main constructor with objects/info necessary for hazard data set calculation.
 
Method Summary
static void createSubmitDAGScript(String odir, boolean run)
          Create a DAG submit script with common tuning parameters
protected  DAG getPostDAG(File outputDir)
          Can be overridden to add jobs at the end of the workflow
protected  DAG getPreDAG(File outputDir)
          Can be overridden to add jobs at the start of the workflow
 String getRequirements()
           
 SubmitScript.Universe getUniverse()
           
static void main(String[] args)
           
 void setRequirements(String requirements)
           
 void setUniverse(SubmitScript.Universe universe)
           
static void usage()
           
 void writeDAG(File outputDir, int sitesPerJob, boolean run)
          Writes the DAG to the specified output directory.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

ERF_SERIALIZED_FILE_NAME

public static final String ERF_SERIALIZED_FILE_NAME
See Also:
Constant Field Values

erf

protected ERF erf

imrMaps

protected List<Map<TectonicRegionType,ScalarIMR>> imrMaps

sites

protected List<Site> sites

calcSettings

protected CalculationSettings calcSettings

archiver

protected CurveResultsArchiver archiver

javaExec

protected String javaExec

jarFile

protected String jarFile

universe

protected SubmitScript.Universe universe

DAGMAN_MAX_IDLE

public static int DAGMAN_MAX_IDLE

DAGMAN_MAX_PRE

public static int DAGMAN_MAX_PRE

DAGMAN_MAX_POST

public static int DAGMAN_MAX_POST
Constructor Detail

HazardDataSetDAGCreator

public HazardDataSetDAGCreator(CalculationInputsXMLFile inputs,
                               String javaExec,
                               String jarFile)
Convenience constructor for when the calculation inputs have already been loaded from an XML file.

Parameters:
inputs - the calculation inputs loaded from the XML file
javaExec - the path to the java executable
jarFile - the path to the jar file used for calculation

HazardDataSetDAGCreator

public HazardDataSetDAGCreator(ERF erf,
                               List<Map<TectonicRegionType,ScalarIMR>> imrMaps,
                               List<Parameter<Double>> imts,
                               List<Site> sites,
                               CalculationSettings calcSettings,
                               CurveResultsArchiver archiver,
                               String javaExec,
                               String jarFile)
Main constructor with objects/info necessary for hazard data set calculation.

Parameters:
erf - the ERF
imrMaps - a list of IMR/TectonicRegion hash maps
imts - a list of IMTs, one per imrMap (or null to use the IMT from each IMR)
sites - the list of sites to be calculated; all site parameters should already be set
calcSettings - simple calculation settings (such as X values and cutoff distance)
archiver - the archiver used to store curves once calculated
javaExec - the path to the java executable
jarFile - the path to the jar file used for calculation
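Assuming the ERF, IMR maps, sites, settings, and archiver have already been configured, invoking this constructor might look like the sketch below; only the argument order from the signature above is shown, and the paths are hypothetical.

```java
// Sketch: all calculation objects are assumed to be configured already.
HazardDataSetDAGCreator dagCreator = new HazardDataSetDAGCreator(
        erf,          // the ERF
        imrMaps,      // List<Map<TectonicRegionType,ScalarIMR>>
        null,         // imts: null means use the IMT already set in each IMR
        sites,        // List<Site>, with all site parameters set
        calcSettings, // X values, cutoff distance, etc.
        archiver,     // where calculated curves are stored
        "/usr/bin/java",          // hypothetical java path
        "/path/to/opensha.jar");  // hypothetical jar path
```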
Method Detail

getPreDAG

protected DAG getPreDAG(File outputDir)
                 throws IOException
Can be overridden to add jobs at the start of the workflow

Returns:
Throws:
IOException

getPostDAG

protected DAG getPostDAG(File outputDir)
                  throws IOException
Can be overridden to add jobs at the end of the workflow

Returns:
Throws:
IOException
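A subclass can hook extra jobs into the workflow by overriding these two methods; a skeleton might look like this. The DAG construction itself is omitted since that class's API is not shown on this page, so the override simply delegates to the superclass.

```java
// Sketch of a subclass that adds post-processing jobs to the workflow.
public class MyDataSetDAGCreator extends HazardDataSetDAGCreator {

    public MyDataSetDAGCreator(CalculationInputsXMLFile inputs,
                               String javaExec, String jarFile) {
        super(inputs, javaExec, jarFile);
    }

    @Override
    protected DAG getPostDAG(File outputDir) throws IOException {
        // Build and return a DAG containing jobs to run after the curve
        // calculations, e.g. one that aggregates results from outputDir.
        // (DAG construction omitted; delegate to the default for now.)
        return super.getPostDAG(outputDir);
    }
}
```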

writeDAG

public void writeDAG(File outputDir,
                     int sitesPerJob,
                     boolean run)
              throws IOException
Writes the DAG to the specified output directory. The calculation is split into many small tasks, each covering at most sitesPerJob sites. If run is true, the DAG is also submitted automatically.

Parameters:
outputDir -
sitesPerJob -
run -
Throws:
IOException

getUniverse

public SubmitScript.Universe getUniverse()

setUniverse

public void setUniverse(SubmitScript.Universe universe)

createSubmitDAGScript

public static void createSubmitDAGScript(String odir,
                                         boolean run)
                                  throws IOException
Create a DAG submit script with common tuning parameters

Parameters:
odir -
run -
Throws:
IOException
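Since this is a static helper, it can also be called directly after a DAG has been written; presumably the "common tuning parameters" come from the DAGMAN_MAX_* fields above. The directory path here is hypothetical.

```java
// Create the submit script (with common DAGMan tuning parameters) in the
// DAG output directory, and run it immediately.
HazardDataSetDAGCreator.createSubmitDAGScript("/tmp/hazardDAG", true);
```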

setRequirements

public void setRequirements(String requirements)

getRequirements

public String getRequirements()

usage

public static void usage()

main

public static void main(String[] args)