PyFoam.Infrastructure.ClusterJob module
Encapsulates everything needed for a cluster job: setting up, running, and restarting.
class PyFoam.Infrastructure.ClusterJob.ClusterJob(basename, arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, multiRegion=False, parameters={}, isDecomposed=False)

    Bases: object

    All cluster jobs are to be derived from this base class. The actual jobs
    are implemented by overriding methods. A number of variables in this class
    are used to 'communicate' information between the various stages.
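    The intended pattern, sketched with a hypothetical subclass (the class name and the OpenFOAM applications chosen here are illustrative, not part of the API):

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class ChannelFlowJob(ClusterJob):
            """Hypothetical job: build the mesh, then run a steady solver."""

            def setup(self, parameters):
                # only called at the start of a fresh (non-restarted) job
                self.foamRun("blockMesh")

            def run(self, parameters):
                # the actual work, usually the solver
                self.foamRun("simpleFoam", steady=True)

    Decomposition and reconstruction of parallel cases are handled by the base class when autoParallel is set, so the overrides normally only deal with the case itself.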
__init__(basename, arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, multiRegion=False, parameters={}, isDecomposed=False)

    Initializes the job.

    Parameters:
        basename – base name of the job
        arrayJob – this job is a parameter variation; the tasks are identified by their task-id
        hardRestart – treat the job as restarted
        autoParallel – parallelization is handled by the base class
        doAutoReconstruct – automatically reconstruct the case if autoParallel is set; if None the value is looked up from the configuration
        foamVersion – the OpenFOAM version that is to be used
        compileOption – forces the compile option (usually 'Opt' or 'Debug')
        useFoamMPI – use the OpenMPI supplied with OpenFOAM
        multiRegion – this job consists of multiple regions
        parameters – dictionary with parameters that are passed to the Runner
        isDecomposed – assume that the job is already decomposed
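    A sketch of constructing such a job in a cluster script, using the hypothetical ChannelFlowJob subclass from above; the case name, version string and parameter values are placeholders:

        job = ChannelFlowJob("channelCase",
                             foamVersion="2.3.1",         # placeholder version string
                             autoParallel=True,           # base class handles decompose/reconstruct
                             multiRegion=False,
                             parameters={"myOption": 1})  # placeholder; handed on to the Runner

    The class also provides a doIt() method (not documented in this excerpt) that cluster scripts typically call to start the whole job.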
additionalParameters()

    Additional parameters.

    Returns:
        a dictionary with parameters for this task
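    A minimal override sketch; the returned key and value are invented placeholders:

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class ChannelFlowJob(ClusterJob):
            def additionalParameters(self):
                # extra parameters for this task (placeholder content)
                return {"turbulenceModel": "kOmegaSST"}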
additionalReconstruct(parameters)

    Additional reconstruction of parallel runs (things that the OpenFOAM reconstructPar utility does not do).

    Parameters:
        parameters – a dictionary with parameters
execute(cmd)

    Executes a shell command in the case directory. No checking is done.

    Parameters:
        cmd – the command as a string
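    A sketch of calling it from another hook; the command itself is only an example, and note that its exit status is not checked:

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class ChannelFlowJob(ClusterJob):
            def preReconstructCleanup(self, parameters):
                # arbitrary shell command, executed in the case directory (placeholder)
                self.execute("rm -rf processor*/uniform")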
foamRun(application, args=[], foamArgs=[], steady=False, multiRegion=True, progress=False, compress=False, noLog=False)

    Runs an OpenFOAM application on the case. If it is a parallel job and the grid has already been decomposed (and not yet reconstructed), the application is run in parallel.

    Parameters:
        application – the OpenFOAM application that is to be run
        args – a list with additional arguments for the Runner-object
        foamArgs – a list with additional arguments for the OpenFOAM application
        steady – use the steady-runner
        multiRegion – run this on multiple regions (if None: no opinion on this)
        progress – only output the time and nothing else
        compress – compress the log-file
        noLog – do not generate a logfile
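    A sketch of how the arguments map to calls, inside the run hook of a hypothetical subclass; the applications and their flags are examples only:

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class ChannelFlowJob(ClusterJob):
            def run(self, parameters):
                # foamArgs are passed to the application itself; noLog suppresses the logfile
                self.foamRun("checkMesh", foamArgs=["-constant"], noLog=True)
                # progress=True restricts output to the current time, compress=True compresses the log-file
                self.foamRun("pisoFoam", progress=True, compress=True)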
postDecomposeSetup(parameters)

    Additional setup, to be executed when the grid is already decomposed.

    Usually for tasks that can be done on a decomposed grid.

    Parameters:
        parameters – a dictionary with parameters
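    For example, initializing fields while the case is still decomposed (the use of setFields here is illustrative; per the foamRun documentation above, it is run in parallel on the decomposed case):

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class ChannelFlowJob(ClusterJob):
            def postDecomposeSetup(self, parameters):
                # runs on the already decomposed case (and therefore in parallel)
                self.foamRun("setFields")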
preReconstructCleanup(parameters)

    Additional cleanup, to be executed while the grid is still decomposed.

    Usually for tasks that can be done on a decomposed grid.

    Parameters:
        parameters – a dictionary with parameters
run(parameters)

    Runs the actual job, usually the solver.

    Parameters:
        parameters – a dictionary with parameters
setup(parameters)

    Sets up the job. Called at the beginning if the job has not been restarted.

    Usual tasks include grid conversion/setup, mesh decomposition, etc.

    Parameters:
        parameters – a dictionary with parameters
taskParameters(id)

    Parameters for a specific task.

    Parameters:
        id – the id of the task
    Returns:
        a dictionary with parameters for this task
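    For an array job, each task gets its own parameter set; a sketch in which the parameter name and values are placeholders:

        from PyFoam.Infrastructure.ClusterJob import ClusterJob

        class InletSweepJob(ClusterJob):
            def taskParameters(self, id):
                # map the task-id of an array job to a parameter set (placeholder values)
                return {"inletVelocity": 0.5 * id}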
class PyFoam.Infrastructure.ClusterJob.PrepareCaseJob(basename, solver, parameterfile, arguments, parameters={}, noMeshCreate=False, **kwargs)

    Bases: PyFoam.Infrastructure.ClusterJob.SolverJob

    Assumes that the case is prepared to be set up with pyFoamPrepareCase.py and automatically sets it up with this. Needs one parameter file to be specified and then a list of name/value pairs.
__init__(basename, solver, parameterfile, arguments, parameters={}, noMeshCreate=False, **kwargs)
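    A construction sketch, assuming a case already laid out for pyFoamPrepareCase.py; the file names and the name/value pair are placeholders, and the exact format expected for the arguments list is an assumption that should be checked against the implementation:

        from PyFoam.Infrastructure.ClusterJob import PrepareCaseJob

        job = PrepareCaseJob("pitzDaily",             # base name of the case
                             "simpleFoam",            # solver handed on to SolverJob
                             "default.parameters",    # parameter file for pyFoamPrepareCase.py
                             ["inletVelocity", "10"]) # name/value pairs (format is an assumption)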
class PyFoam.Infrastructure.ClusterJob.SolverJob(basename, solver, template=None, cloneParameters=[], arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, steady=False, multiRegion=False, parameters={}, progress=False, solverArgs=[], solverProgress=False, solverNoLog=False, solverLogCompress=False, isDecomposed=False)

    Bases: PyFoam.Infrastructure.ClusterJob.ClusterJob

    A cluster job that executes a solver. It implements the run method. If a template case is specified, the case is copied from it.
__init__(basename, solver, template=None, cloneParameters=[], arrayJob=False, hardRestart=False, autoParallel=True, doAutoReconstruct=None, foamVersion=None, compileOption=None, useFoamMPI=False, steady=False, multiRegion=False, parameters={}, progress=False, solverArgs=[], solverProgress=False, solverNoLog=False, solverLogCompress=False, isDecomposed=False)

    Parameters:
        template – name of the template case; it is assumed to reside in the same directory as the actual case
        cloneParameters – a list with additional parameters for the CloneCase-object that copies the template
        solverProgress – only write the current time of the solver
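    A construction sketch; the case, solver and template names are placeholders:

        from PyFoam.Infrastructure.ClusterJob import SolverJob

        job = SolverJob("run01",                      # the actual case
                        "interFoam",                  # solver executed by the run method
                        template="damBreak.template", # the case is copied from this template
                        solverProgress=True)          # only write the current solver time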
class PyFoam.Infrastructure.ClusterJob.VariationCaseJob(basename, parameterfile, variationfile, template=None, **kwargs)

    Bases: PyFoam.Infrastructure.ClusterJob.SolverJob

    Assumes that the case is prepared to be set up with pyFoamRunParameterVariation.py and automatically sets it up with this. Needs one parameter file and a variation file.
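    A construction sketch, assuming a case prepared for pyFoamRunParameterVariation.py; the file names are placeholders:

        from PyFoam.Infrastructure.ClusterJob import VariationCaseJob

        job = VariationCaseJob("cavity",               # base name of the job
                               "default.parameters",   # parameter file
                               "flowRate.variation")   # variation file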