Version: v3.3

Jobs

Jobs in Amorphic CICD execute Python and PySpark code to work with datasets and perform ETL transformations.

Job Artifacts

  • ETL Jobs require a script file, which should be specified under the Artifacts key in the resource definition.
  • The path to the script file should be relative to the root resources directory.
Security Checks

The Python script is subject to SCA (Software Composition Analysis) and SAST (Static Application Security Testing) before deployment, ensuring compliance and security standards are met.

Below is a sample resource definition file for an ETL Job:

{
  "rPythonJob": {
    "Type": "Job",
    "Artifacts": {
      "Script": "resources/jobs/test/job_script.py"
    },
    "Properties": {
      "JobName": "cicd_test",
      "Description": "CICD test for job",
      "ETLJobType": "pythonshell",
      "NetworkConfiguration": "general-public-network",
      "JobBookmarkOption": "disable",
      "Keywords": [
        "Owner: asysuser"
      ],
      "MaxCapacity": 0.0625,
      "ParameterAccess": [
        {
          "!DependsOn": "rS3TierParam.ParameterKey"
        }
      ],
      "SharedLibraries": [],
      "DomainAccess": {
        "Owner": [
          {
            "DomainName": {
              "!DependsOn": "rCICDDomain.DomainName"
            }
          }
        ],
        "ReadOnly": []
      },
      "DatasetAccess": {
        "Owner": [],
        "ReadOnly": []
      },
      "IsDataLineageEnabled": "no",
      "IsAutoScalingEnabled": false
    }
  }
}
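A resource definition like the one above can be sanity-checked before deployment. The sketch below assumes a few core properties based on the sample; the exact schema Amorphic enforces may include more keys:

```python
# Properties a Job resource is expected to carry, inferred from the
# sample definition above; the real schema may require additional keys.
REQUIRED_PROPERTIES = {"JobName", "ETLJobType", "NetworkConfiguration"}

def validate_job_resource(definition):
    """Lightly validate a parsed resource-definition dict: every resource
    of Type "Job" must declare Artifacts.Script and the core properties."""
    errors = []
    for name, resource in definition.items():
        if resource.get("Type") != "Job":
            continue
        if not resource.get("Artifacts", {}).get("Script"):
            errors.append(f"{name}: missing Artifacts.Script")
        missing = REQUIRED_PROPERTIES - set(resource.get("Properties", {}))
        if missing:
            errors.append(f"{name}: missing properties {sorted(missing)}")
    return errors
```

Run against the sample definition, this returns an empty list; a definition with no `Artifacts.Script` or with core properties omitted yields one error message per problem.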