PyFoam.Infrastructure.ClusterJob module

Encapsulates everything necessary for a cluster job: setting up, running, restarting

class PyFoam.Infrastructure.ClusterJob.ClusterJob(basename, arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, multiRegion=False, parameters={}, isDecomposed=False)[source]

Bases: object

All Cluster-jobs are to be derived from this base-class

The actual jobs are implemented by overriding methods

There are a number of variables in this class that are used to ‘communicate’ information between the various stages

__init__(basename, arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, multiRegion=False, parameters={}, isDecomposed=False)[source]

Initializes the job.

Parameters:
    basename – base name of the job
    arrayJob – this job is a parameter variation; the tasks are identified by their task-id
    hardRestart – treat the job as restarted
    autoParallel – parallelization is handled by the base class
    doAutoReconstruct – automatically reconstruct the case if autoParallel is set; if the value is None it is looked up from the configuration
    foamVersion – the OpenFOAM version that is to be used
    compileOption – forces a compile option (usually ‘Opt’ or ‘Debug’)
    useFoamMPI – use the OpenMPI supplied with OpenFOAM
    multiRegion – this job consists of multiple regions
    parameters – dictionary with parameters that are passed to the Runner
    isDecomposed – assume that the job is already decomposed
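The override pattern that concrete jobs follow can be sketched as below. This is a hedged, self-contained illustration: the `ClusterJob` here is a minimal stand-in that only mimics the constructor and the `setup`/`run`/`foamRun` hooks, so the snippet runs without PyFoam or a queueing system. With PyFoam installed you would instead import `ClusterJob` from `PyFoam.Infrastructure.ClusterJob` and let `doIt()` drive the stages. The subclass name and solver choice are illustrative assumptions.

```python
# Minimal stand-in for PyFoam.Infrastructure.ClusterJob.ClusterJob so this
# sketch is self-contained; the real base class additionally handles
# decomposition, checkpointing, restarting, etc.
class ClusterJob:
    def __init__(self, basename, parameters={}, **kwargs):
        self.basename = basename
        self.parameters = dict(parameters)

    def setup(self, parameters):
        pass  # overridden by concrete jobs

    def run(self, parameters):
        pass  # overridden by concrete jobs

    def foamRun(self, application, **kwargs):
        # the real method launches an OpenFOAM application on the case;
        # here we only record which application was requested
        self.lastRun = application


class MyDamBreakJob(ClusterJob):  # hypothetical concrete job
    def setup(self, parameters):
        # typical tasks: grid conversion/creation, field initialization
        self.prepared = True

    def run(self, parameters):
        # usually delegates to foamRun with the solver name
        self.foamRun("interFoam", steady=False)


job = MyDamBreakJob("damBreak", parameters={"endTime": 1.0})
job.setup(job.parameters)   # with the real class, doIt() calls these in order
job.run(job.parameters)
```

With the real base class the subclass never calls `setup`/`run` itself; `doIt()` sequences setup, decomposition, the run, and reconstruction.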

additionalParameters()[source]

Additional parameters.

Returns:    a dictionary with parameters for this task

additionalReconstruct(parameters)[source]

Additional reconstruction of parallel runs (things that OpenFOAM's reconstructPar does not do).

Parameters:    parameters – a dictionary with parameters

autoDecompose()[source]

Automatically decomposes the grid with a METIS algorithm

autoReconstruct()[source]

Default reconstruction of a parallel run

casedir()[source]

Returns the actual directory of the case.

To be overridden if appropriate.

casename()[source]

Returns just the name of the case

checkpointFile()[source]

The file that makes the job write a checkpoint

cleanup(parameters)[source]

Clean up after a job.

Parameters:    parameters – a dictionary with parameters

doIt()[source]

The central logic: sets up the job, runs it, etc.

execute(cmd)[source]

Execute a shell command in the case directory. No checking is done.

Parameters:    cmd – the command as a string

foamRun(application, args=[], foamArgs=[], steady=False, multiRegion=True, progress=False, compress=False, noLog=False)[source]

Runs a Foam utility on the case. If it is a parallel job and the grid has already been decomposed (and not yet reconstructed), it is run in parallel.

Parameters:
    application – the Foam application that is to be run
    args – a list with additional arguments for the Runner object
    foamArgs – a list with the additional arguments for the Foam application
    steady – use the steady-runner
    multiRegion – run this on multiple regions (if None: no opinion on this)
    progress – only output the time and nothing else
    compress – compress the log-file
    noLog – do not generate a logfile

fullJobId()[source]

Return a string with the full job-ID

jobFile()[source]

The file with the job information

message(*txt)[source]
postDecomposeSetup(parameters)[source]

Additional setup, to be executed when the grid is already decomposed

Usually for tasks that can be done on a decomposed grid

Parameters:parameters – a dictionary with parameters
preReconstructCleanup(parameters)[source]

Additional cleanup, to be executed when the grid is still decomposed

Usually for tasks that can be done on a decomposed grid

Parameters:parameters – a dictionary with parameters
run(parameters)[source]

Run the actual job, usually the solver.

Parameters:    parameters – a dictionary with parameters

setState(txt)[source]
setup(parameters)[source]

Set up the job. Called in the beginning if the job has not been restarted

Usual tasks include grid conversion/setup, mesh decomposition etc

Parameters:parameters – a dictionary with parameters
stopFile()[source]

The file that makes the job write a checkpoint and end

stopJob()[source]
taskParameters(id)[source]

Parameters for a specific task.

Parameters:    id – the id of the task
Returns:    a dictionary with parameters for this task
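For array jobs, overriding taskParameters is the way each task receives its own parameter set. A self-contained sketch under stated assumptions: the base class here is a stand-in (the real one obtains the task id from the queueing system's array-task environment), the sweep values are invented, and 1-based task ids are assumed, as is usual for Grid Engine array jobs.

```python
# Stand-in base class; the real ClusterJob supplies the task id from the
# cluster's array-task mechanism.
class ClusterJob:
    def taskParameters(self, id):
        return {}


class VelocitySweepJob(ClusterJob):  # hypothetical parameter variation
    def taskParameters(self, id):
        # map the task id onto one inlet velocity per task
        velocities = [0.5, 1.0, 2.0, 4.0]
        return {"inletVelocity": velocities[id - 1]}  # assuming 1-based ids


job = VelocitySweepJob()
params = job.taskParameters(2)
```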

templateFile(fileName)[source]

Looks for a template file and evaluates the template using the usual parameters.

Parameters:    fileName – the name of the file that will be constructed. The template file has the same name plus the extension ‘.template’.

writeCheckpoint()[source]

class PyFoam.Infrastructure.ClusterJob.PrepareCaseJob(basename, solver, parameterfile, arguments, parameters={}, noMeshCreate=False, **kwargs)[source]

Bases: PyFoam.Infrastructure.ClusterJob.SolverJob

Assumes that the case is prepared to be set up with =pyFoamPrepareCase.py= and automatically sets it up with this. Needs one parameterfile to be specified and then a list of name/value-pairs

__init__(basename, solver, parameterfile, arguments, parameters={}, noMeshCreate=False, **kwargs)[source]
Parameters:
    template – name of the template case; it is assumed that it resides in the same directory as the actual case
    cloneParameters – a list with additional parameters for the CloneCase object that copies the template
    solverProgress – only writes the current time of the solver
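Construction of a PrepareCaseJob might look like the sketch below. The class here is a stand-in mirroring the documented signature so the example runs without PyFoam; the case, solver, and file names are hypothetical, and the encoding of the name/value pairs as a flat alternating list is an assumption, not confirmed by the docstring above.

```python
class PrepareCaseJob:  # stand-in mirroring the documented signature
    def __init__(self, basename, solver, parameterfile, arguments, **kwargs):
        self.basename = basename
        self.solver = solver
        self.parameterfile = parameterfile
        # assumption: arguments is a flat list of name/value pairs; pair it up
        self.arguments = dict(zip(arguments[0::2], arguments[1::2]))


job = PrepareCaseJob("cavityCase", "icoFoam",
                     "default.parameters",               # hypothetical file
                     ["inletVelocity", "1.5", "nCells", "100"])
```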

setup(parameters)[source]

Set up the job. Called in the beginning if the job has not been restarted

Usual tasks include grid conversion/setup, mesh decomposition etc

Parameters:parameters – a dictionary with parameters
class PyFoam.Infrastructure.ClusterJob.SolverJob(basename, solver, template=None, cloneParameters=[], arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, steady=False, multiRegion=False, parameters={}, progress=False, solverArgs=[], solverProgress=False, solverNoLog=False, solverLogCompress=False, isDecomposed=False)[source]

Bases: PyFoam.Infrastructure.ClusterJob.ClusterJob

A Cluster-Job that executes a solver. It implements the run-function. If a template-case is specified, the case is copied

__init__(basename, solver, template=None, cloneParameters=[], arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, steady=False, multiRegion=False, parameters={}, progress=False, solverArgs=[], solverProgress=False, solverNoLog=False, solverLogCompress=False, isDecomposed=False)[source]
Parameters:
    template – name of the template case; it is assumed that it resides in the same directory as the actual case
    cloneParameters – a list with additional parameters for the CloneCase object that copies the template
    solverProgress – only writes the current time of the solver
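Typical construction of a SolverJob might look like the following minimal sketch. The class below is a stand-in reproducing part of the documented signature so the example is runnable without PyFoam; with PyFoam installed you would import SolverJob from PyFoam.Infrastructure.ClusterJob and call job.doIt() to set the case up and run the solver. The case and template names are hypothetical.

```python
class SolverJob:  # stand-in for part of the documented signature
    def __init__(self, basename, solver, template=None, cloneParameters=[],
                 autoParallel=True, steady=False, parameters={}, **kwargs):
        self.basename = basename
        self.solver = solver
        self.template = template          # template case cloned before the run
        self.parameters = dict(parameters)


job = SolverJob("damBreakRun", "interFoam",
                template="damBreakTemplate",    # hypothetical template case
                parameters={"endTime": 0.5})
# with the real class: job.doIt()  -> setup, decompose, solve, reconstruct
```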

run(parameters)[source]

Run the actual job, usually the solver.

Parameters:    parameters – a dictionary with parameters

class PyFoam.Infrastructure.ClusterJob.VariationCaseJob(basename, parameterfile, variationfile, template=None, **kwargs)[source]

Bases: PyFoam.Infrastructure.ClusterJob.SolverJob

Assumes that the case is prepared to be set up with =pyFoamRunParameterVariation.py= and automatically sets it up with this. Needs one parameterfile and a variation-file

__init__(basename, parameterfile, variationfile, template=None, **kwargs)[source]
Parameters:
    template – name of the template case; it is assumed that it resides in the same directory as the actual case
    cloneParameters – a list with additional parameters for the CloneCase object that copies the template
    solverProgress – only writes the current time of the solver

setup(parameters)[source]

Set up the job. Called in the beginning if the job has not been restarted

Usual tasks include grid conversion/setup, mesh decomposition etc

Parameters:parameters – a dictionary with parameters
taskParameters(id)[source]

Parameters for a specific task.

Parameters:    id – the id of the task
Returns:    a dictionary with parameters for this task

PyFoam.Infrastructure.ClusterJob.checkForMessageFromAbove(job)[source]