pyepic.applications package

Submodules

pyepic.applications.base module

class pyepic.applications.base.Config

Bases: object

The Job Configuration

Variables:
  • overwrite_existing (bool) – Should data created on the remote cluster overwrite older data that already exists in the EPIC data store

  • upload (list) – Which job states should trigger a data upload

  • data_sync_interval (int) – How frequently data should be uploaded while the job is still running. Set to 0 to disable periodic uploads.

  • project_id (int, optional) – ID of the EPIC project to run this job in

get_configuration()

Get a JobConfiguration for this job

Returns:

Job Configuration

Return type:

epiccore.models.JobConfiguration
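
A Config is not usually constructed directly; it is available as the config attribute of a Job or JobArray. A minimal sketch of adjusting one (the application version code, data path, and project ID below are hypothetical placeholders):

    from pyepic.applications.base import Job

    # Hypothetical application version code and epic url
    job = Job("app_version_code", "my_job", "epic://my_data/job1/")

    job.config.overwrite_existing = True  # replace older results in the EPIC data store
    job.config.data_sync_interval = 0     # disable periodic upload while running
    job.config.project_id = 42            # hypothetical EPIC project ID

    configuration = job.config.get_configuration()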

class pyepic.applications.base.Distribution(value)

Bases: Enum

How should the partitions, processes, or tasks be distributed on the remote cluster: one per CORE, SOCKET, NODE, or DEVICE

CORE = 'core'
DEVICE = 'device'
NODE = 'node'
SOCKET = 'socket'

class pyepic.applications.base.Job(application_version, job_name, path)

Bases: object

An EPIC Job Definition

Parameters:
  • application_version (str) – The code of the BatchApplicationVersion that this job will use

  • job_name (str) – A user friendly name for the job

  • path (str) – The path to the root of the job data, formed as an epic url (e.g. “epic://path_to/data”)

Variables:
  • job_name (str) – A user friendly name for the job

  • path (str) – The path to the root of the job data, formed as an epic url (e.g. “epic://path_to/data”)

  • config (Config) – The Job configuration options object

  • steps (list) – The job steps that make up this job

add_step(job_step)

Add a new step to this job

Parameters:

job_step (JobStep) – The step to append to this job

get_job_create_spec(queue_code)

Get a JobArraySpec for this job. The JobArraySpec can be used to submit the job to EPIC via the client.

Parameters:

queue_code (str) – The code of the EPIC batch queue to submit to

Returns:

Job Array Specification

Return type:

epiccore.models.JobArraySpec

get_job_spec()

Get a JobSpec for this job

Returns:

Job Specification

Return type:

epiccore.models.JobSpec
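
Putting the pieces together, a sketch of defining a job, adding a step, and generating a spec for submission (the version code, path, and queue code are placeholders):

    from pyepic.applications.base import Job, JobStep

    job = Job("app_version_code", "my_job", "epic://my_data/job1/")

    step = JobStep()
    step.partitions = 24
    step.runtime = 2  # hours
    job.add_step(step)

    # Spec that can be submitted to EPIC via the client
    job_spec = job.get_job_create_spec("queue_code")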

class pyepic.applications.base.JobArray(array_name, array_root_folder)

Bases: object

A helper class for submitting an array of jobs to EPIC.

Parameters:
  • array_name (str) – The name to give the array in EPIC

  • array_root_folder (str) – The epic data path to the root of the array data folder, formed as an epic url (e.g. “epic://path_to/data”). Any data in a folder called “common” within this folder will be shared between all jobs in the array.

Variables:
  • config (Config) – The Job configuration options object, common to all jobs in the array

  • jobs (list) – The jobs that make up this array

add_job(job)

Add a job to this array

Parameters:

job (Job) – The job to add to this array

get_job_create_spec(queue_code)

Get a JobArraySpec for this array. The JobArraySpec can be used to submit the array to EPIC via the client.

Parameters:

queue_code (str) – The code of the EPIC batch queue to submit to

Returns:

Job Array Specification

Return type:

epiccore.models.JobArraySpec
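
A sketch of assembling an array from the OpenFOAM helper documented below (the version code, paths, and queue code are placeholders):

    from pyepic.applications.base import JobArray
    from pyepic.applications.openfoam import OpenFoamJob

    # Data in "epic://my_data/array_root/common/" is shared by all jobs in the array
    job_array = JobArray("my_array", "epic://my_data/array_root/")

    for case in ["run_a", "run_b"]:
        foam_job = OpenFoamJob("foam_version_code", case, f"epic://my_data/array_root/{case}/")
        foam_job.solver.partitions = 24
        job_array.add_job(foam_job)

    # Spec that can be submitted to EPIC via the client
    array_spec = job_array.get_job_create_spec("queue_code")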

class pyepic.applications.base.JobStep(execute_step=True)

Bases: object

A Step within an EPIC Job

Parameters:

execute_step (bool) – Enable this step as part of this job

Variables:
  • step_name (str) – Name of the step; this is application specific

  • execute (bool) – Should this step execute when the job is submitted

  • partitions (int) – How many partitions/processes make up this step

  • task_distribution (Distribution) – How are the partitions distributed to the hardware

  • runtime (int) – The maximum runtime of this step in hours

  • run_if_previous_step_fails (bool) – Should this step execute if the previous step fails

  • hyperthreading (bool) – Does this step count hyperthreaded cores as individual CPUs?

get_task_spec()

Get a JobTaskSpec for this job step

Returns:

Job Task Specification

Return type:

epiccore.models.JobTaskSpec
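
A sketch of configuring a step; the application-specific steps below subclass JobStep, so the same attributes apply to them:

    from pyepic.applications.base import Distribution, JobStep

    step = JobStep(execute_step=True)
    step.partitions = 48
    step.task_distribution = Distribution.CORE  # one process per core
    step.runtime = 6                            # maximum runtime in hours
    step.hyperthreading = False
    step.run_if_previous_step_fails = False

    task_spec = step.get_task_spec()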

class pyepic.applications.base.Upload(value)

Bases: Enum

Should excluded files be uploaded? Yes, No, or only when the job finishes in an error state.

NO = 'no'
ON_ERROR = 'error'
YES = 'yes'

pyepic.applications.openfoam module

class pyepic.applications.openfoam.BlockMeshStep(*args, **kwargs)

Bases: JobStep

The blockMesh step of an OpenFOAM job

class pyepic.applications.openfoam.DecomposeParStep(*args, **kwargs)

Bases: JobStep

The decomposePar step of an OpenFOAM job

class pyepic.applications.openfoam.OpenFoamJob(foam_version, job_name, data_path)

Bases: Job

A helper class for submitting an OpenFOAM job to EPIC.

Parameters:
  • foam_version (str) – The code of the BatchApplicationVersion of OpenFOAM to use

  • job_name (str) – The name to give the job in EPIC

  • data_path (str) – The epic data path to the OpenFOAM case directory

Variables:
  • blockMesh (BlockMeshStep) – blockMesh JobStep object

  • decomposePar (DecomposeParStep) – decomposePar JobStep object

  • solver (SolverStep) – initial solver JobStep object

  • reconstructPar (ReconstructParStep) – reconstructPar JobStep object

  • clear_partitions (bool) – Delete any existing processor directories before running the job

  • sync_processor_directories (base.Upload) – Upload the processor directories after job completion, defaults to Upload.NO

get_applications_options()

Get application configuration options for submission to EPIC

Returns:

Dictionary of the job configuration

Return type:

dict

get_job_create_spec(queue_code)

Get a JobSpec for this job

Parameters:

queue_code (str) – The code of the EPIC batch queue to submit to

Returns:

Job Specification

Return type:

epiccore.models.JobSpec
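
A minimal sketch of an OpenFOAM submission (the version code, data path, and queue code are placeholders):

    from pyepic.applications.openfoam import OpenFoamJob

    foam_job = OpenFoamJob("foam_version_code", "my_foam_case", "epic://my_data/foam_case/")

    foam_job.blockMesh.execute = False    # mesh already present in the case data
    foam_job.decomposePar.execute = True  # decompose for a parallel run
    foam_job.solver.partitions = 24
    foam_job.solver.runtime = 12          # hours
    foam_job.reconstructPar.execute = True

    job_spec = foam_job.get_job_create_spec("queue_code")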

class pyepic.applications.openfoam.Reconstruct(value)

Bases: Enum

Which time steps reconstructPar should reconstruct: all, the latest, or a specific time.

ALL = 'all'
LATEST = 'latest'
TIME = 'time'

class pyepic.applications.openfoam.ReconstructParStep(*args, **kwargs)

Bases: JobStep

ReconstructPar step of OpenFOAM

Variables:
  • run_if_previous_step_fails (bool) – Run if solver fails, defaults to True

  • reconstruct_option (Reconstruct) – Which time step to reconstruct. Defaults to ALL

  • reconstruct_time (int) – If reconstruct_option == TIME, the timestep to reconstruct.
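
Continuing the foam_job sketch above, reconstructing a single (hypothetical) timestep would look like:

    from pyepic.applications.openfoam import Reconstruct

    foam_job.reconstructPar.reconstruct_option = Reconstruct.TIME
    foam_job.reconstructPar.reconstruct_time = 500  # hypothetical timestep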

class pyepic.applications.openfoam.SolverStep(*args, **kwargs)

Bases: JobStep

Solver step of OpenFOAM

Variables:
  • run_if_previous_step_fails (bool) – Run if previous step fails, defaults to False

  • stopAt (StopAt) – When to stop the solver. Defaults to END_TIME

  • startFrom (StartFrom) – Which timestep to start the solver from. Defaults to LATEST

  • endTime (int) – If stopAt == END_TIME, the timestep at which to stop the solver.

  • startTime (int) – If startFrom == START, the timestep from which to start the solver.
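
Again continuing the foam_job sketch, controlling where the solver starts and stops (the timestep values are placeholders):

    from pyepic.applications.openfoam import StartFrom, StopAt

    foam_job.solver.startFrom = StartFrom.START
    foam_job.solver.startTime = 0
    foam_job.solver.stopAt = StopAt.END_TIME
    foam_job.solver.endTime = 1000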

class pyepic.applications.openfoam.StartFrom(value)

Bases: Enum

Which time step the solver should start from: the first, the latest, or a specified start time.

FIRST = 'firstTime'
LATEST = 'latestTime'
START = 'startTime'

class pyepic.applications.openfoam.StopAt(value)

Bases: Enum

When the solver should stop: at the end time, at the next scheduled write, or immediately with or without writing.

END_TIME = 'endTime'
NEXT_WRITE = 'nextWrite'
NO_WRITE_NOW = 'noWriteNow'
WRITE_NOW = 'writeNow'

pyepic.applications.zcfd module

class pyepic.applications.zcfd.ZCFDJob(zcfd_version, job_name, data_path, case_name, problem_name, override_file=None, cycles=100, restart=False, partitions=1)

Bases: Job

A helper class for submitting a zCFD job to EPIC.

Parameters:
  • zcfd_version (str) – The code of the BatchApplicationVersion of zCFD to use

  • job_name (str) – The name to give the job in EPIC

  • data_path (str) – The epic data path to the zCFD case directory

  • case_name (str) – The name of the python control file for the case

  • problem_name (str) – The name of the hdf5 mesh file

  • override_file (str, optional) – The name of the zcfd override file for overset meshes. Defaults to None.

  • cycles (int, optional) – How many cycles to run for. Default 100.

  • restart (bool, optional) – Is the case a restart from a previous solution. Default False.

  • partitions (int, optional) – How many parallel partitions should the case use. Default 1.

Variables:

zcfd (ZCFDStep) – zCFD JobStep object

get_applications_options()

Get application configuration options for submission to EPIC

Returns:

Dictionary of the job configuration

Return type:

dict

get_job_create_spec(queue_code)

Get a JobSpec for this job

Parameters:

queue_code (str) – The code of the EPIC batch queue to submit to

Returns:

Job Specification

Return type:

epiccore.models.JobSpec
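
A minimal sketch of a zCFD submission (the version code, paths, file names, and queue code are placeholders):

    from pyepic.applications.zcfd import ZCFDJob

    zcfd_job = ZCFDJob(
        "zcfd_version_code",
        "my_zcfd_job",
        "epic://my_data/zcfd_case/",
        "fv.py",    # python control file for the case
        "mesh.h5",  # hdf5 mesh file
        cycles=1000,
        restart=False,
        partitions=24,
    )
    zcfd_job.zcfd.runtime = 10  # hours

    job_spec = zcfd_job.get_job_create_spec("queue_code")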

class pyepic.applications.zcfd.ZCFDStep(case_name, problem_name, override_file, cycles, restart=False, partitions=1, execute_step=True)

Bases: JobStep

zCFD Solver

Variables:
  • case_name (str) – The name of the python control file for the case

  • problem_name (str) – The name of the hdf5 mesh file

  • override_file (str) – The name of the zcfd override file for overset meshes

  • cycles (int) – How many cycles to run for

  • restart (bool) – Is the case a restart from a previous solution

  • partitions (int) – How many parallel partitions should the case use

pyepic.applications.msc_nastran module

class pyepic.applications.msc_nastran.NastranJob(nastran_version, job_name, data_path, dat_file, nastran_licence_server, partitions=1)

Bases: Job

A helper class for submitting a Nastran job to EPIC.

Parameters:
  • nastran_version (str) – The code of the BatchApplicationVersion of Nastran to use

  • job_name (str) – The name to give the job in EPIC

  • data_path (str) – The epic data path to the Nastran case directory

  • dat_file (str) – The name of the nastran data file

  • nastran_licence_server (str) – The licence server and port for Nastran

Variables:

nastran (NastranStep) – Nastran solver JobStep object

get_applications_options()

Get application configuration options for submission to EPIC

Returns:

Dictionary of the job configuration

Return type:

dict

get_job_create_spec(queue_code)

Get a JobSpec for this job

Parameters:

queue_code (str) – The code of the EPIC batch queue to submit to

Returns:

Job Specification

Return type:

epiccore.models.JobSpec
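
A minimal sketch of a Nastran submission (the version code, path, data file, licence server, and queue code are placeholders):

    from pyepic.applications.msc_nastran import NastranJob

    nastran_job = NastranJob(
        "nastran_version_code",
        "my_nastran_job",
        "epic://my_data/nastran_case/",
        "model.dat",                 # nastran data file
        "1234@licence.example.com",  # hypothetical licence server and port
    )
    nastran_job.nastran.partitions = 1
    nastran_job.nastran.runtime = 4  # hours

    job_spec = nastran_job.get_job_create_spec("queue_code")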

class pyepic.applications.msc_nastran.NastranStep(dat_file, nastran_licence_server, cycles, restart=False, partitions=1, execute_step=True)

Bases: JobStep

Nastran Solver

Variables:
  • dat_file (str) – The name of the nastran data file

  • nastran_licence_server (str) – The licence server and port for Nastran

  • partitions (int) – How many parallel partitions should the case use

Module contents