This file follows semantic versioning 2.0.0. Given a version number MAJOR.MINOR.PATCH, increment the:
- MAJOR version when you make incompatible API changes,
- MINOR version when you add functionality in a backwards compatible manner, and
- PATCH version when you make backwards compatible bug fixes.
As a heuristic:
- if you fix a bug, increment the PATCH version
- if you add a feature (add keyword arguments with default values, add a new object, a new mechanism for parameter setup that is backwards compatible, etc.), increment the MINOR version
- if you introduce a breaking change (removing arguments, removing objects, restructuring code such that it affects imports, etc.), increment the MAJOR version
The general format is:
# VERSION - DATE (dd/mm/yyyy)
### Added
- A to B
### Changed
- B to C
### Removed
- C from D
- support for global hooks on task and workflow
- support for unset parameters so Hera can be used in a GitOps style
- better Workflow typing for inheritance
- exit DAGs on workflows and tasks
- mapping Python source specs via dictionaries on inputs
- all accessible task properties
- sidecars
- suspend
- artifact compression
- raw artifact
- PyYAML kwargs on `to_yaml`
- supplied option on value from and parameter feature parity
- workflow link from service when using the global host
- `expr` module for constructing Argo expressions
- `to_yaml`, `to_dict`, and `to_json` on workflows
- optional PyYAML dependency (`hera[yaml]`)
- global default service account
- global task image
- global SSL verification flag
- `lint` API on workflows
- `value_from_input` on `Env`
- `Env` sanitization so that the name satisfies RFC1123 requirements
- DAG template naming to fix the issue of DAGs not being present on workflow templates
- version constraints on dependencies from `^` to `>=`
- auto-setting of a DAG on the workflow so users do not necessarily need to supply one, so workflow additions are executed against the set default DAG
- a fix for positional args in sized volumes vs unsized volumes
- the correct field specification for K8S resources
- archiving specifications
- prometheus metrics
- generated names on workflows
- backoff custom object specification
- Git artifact specification to set the correct fields based on input
- support for `withSequence` on `Task`
- `generate_name` on workflow
- active deadline seconds/timeouts on tasks and workflows
- `EnvSpec` naming to `Env`, including inheriting classes
- pyproject Python limit from 3.11 to 4
- error messages to be more descriptive
- workflow template update API
- more examples
- Hera type returns rather than `argo_workflow` SDK return types
- `get_parameters_as` rather than `outputs`
- list dependencies structuring
- reorder of args to use default `Equals` for workflow and task result comparisons
- workflow of workflows support, and general K8S resource provisioning, via resource templates
- nested, parallel DAGs
- pod patch spec
- support for arbitrary scripts, rather than only Python functions and containers
- K8S-aligned resource specs
- volumes on tasks
- `IO` between tasks
- `Task` build on submission
- `IO` on parameters
- `func` and `func_params` from `Task` in favor of `source` and `params`
- `Resource` volumes, moved to `Task`
- `CronWorkflow` and `WorkflowTemplate` services
- `CronWorkflow` and `WorkflowTemplate` independent implementations
- `Task` build on definition
- tolerations can now be set via cron workflow and workflow spec
- tolerations, node selectors and affinity should be set in the internal workflow spec
- support for Git artifact authentication credentials
- tolerations can now be set via workflow
- version via `hera.__version__`
- volume specifications on workflows
- float type handling for `max_cpu` and `min_cpu` properties in `Resources` class
- kwarg value setting as a parameter
- Remove python <3.11 constraint and unpin transitive dependencies
- `pytz` version from `^2021.3` to `>=2021.3`
- workflow template parameters
- privileged option to the security context
- `Config.host` and `Config.verify_ssl` are now public; `WorkflowService.get_workflow_link` now references `Config.host` to properly pull the host when using `set_global_host`.
- `get_input_spec` from `InputArtifact` so that it relies on the inherited one that does not add the `from` field that is not allowed in the Argo submission for input artifacts
- support for `subPath` in volume mounts
- context management to workflow types. This supports the `with` clause and adds all tasks to a workflow automatically (a minimal sketch of this style appears at the end of this file)
- task exit hook
- HTTP artifact
- global workflow parameters
- input parameter caching through memoization
- bucket field to GCS/S3 artifact, which was missing
- assertion that `input_from` cannot be used with artifacts, which supports artifact input on fanned out tasks now
- support for multiple inputs from a fanned out task via `MultiInput`
- `set_global_token` to take in a union of a string token or a callable
- node selectors on all workflow types
- memoization
- global parameter access on tasks
- input/output parameters in addition to artifacts
- affinity, anti-affinity, node affinity
- client host/token global injection
- parallelism specification on spec templates
- fix for `Task` raising an error when `input_from` is used without `func`
- `AnySucceeded`/`AllFailed` support on tasks
- `OnExit` condition on workflows with tasks conditioned on `WorkflowStatus`
- plain string support when using `InputFrom`
- `argo-workflows==6.3.5` dependency
- JSON import from tasks that do not need it
- references to `IoArgoprojWorkflowV1alpha1WorkflowTemplateSpec` and instead use `IoArgoprojWorkflowV1alpha1WorkflowSpec`
- cluster scope in template reference
- host alias support
- volume claim garbage collection setting on workflows
- git artifact
- structure of the environment variables. Now contributors can implement the generic `Variable` rather than augment or duplicate `VariableAsEnv`
- retry policy support
- implementation of the cron workflow update command so that it includes the version and the assigned cron workflow UUID during the update operation
- update method to CronWorkflowService
- ability to track workflow status
- shared implementation of `add_head`, `add_tail`, `add_task`, and `add_tasks`
- task workflow template reference
- UUID suffixes
- prevent `json.loads` errors when a task has input from other tasks and the JSON-dumped string contains a single quote
- add support for `TTLStrategy`
- fix wrong dependencies when calling the `on_success()`, `on_failure()`, and `on_error()` functions of `Task`
- add support for exposing field reference via env vars in Tasks
- add support for specifying annotations on Workflows, CronWorkflows and Tasks
- retry limit
- image pull secrets option on workflows
- ability to set a task dependency and execution based on the success, failure, or error of another task (similar to `when`)
- support for the `env_from` option
- Set image_pull_policy on a Task level
- WorkflowTemplate Service and WorkflowTemplate implementation
- Option to create a Workflow with a WorkflowTemplate
- ability to set the `access_modes` for dynamically provisioned volumes
- security context on cron workflows
- image pull secrets specification on workflows and cron workflows
- inconsistency between `create` and `submit` in Hera and Argo. Now users are provided with a `create` command and will receive a `DeprecationWarning` when `submit` is invoked
- Add support for sharing the IP of one Task to another Task, via env variables
- support for setting args instead of command
- Added `volume_mounts` definition back to the script template in `Task.get_script_def`.
- Added `TaskSecurityContext` to allow setting security settings on the task container.
- Added `WorkflowSecurityContext` to allow setting security settings on all of the containers in the workflow.
- support for custom resources on `Resource` definitions
- support for multiple volumes (volumes, config maps, secrets, existing volumes)
- add support for bucket resource inputs with key only
- wait time for Test PyPI in CICD from 30 to 60 seconds
- add image pull policy on tasks
- add ability to mount config maps as volume in a task
- a `sleep` step to the new Hera version installation from Test PyPI to wait for PyPI indexing
- GitHub test index installation for CICD
- the underlying SDK from argo-workflows v5 to argo-workflows v6.3rc2
- don't require func to be specified when creating a task, running the task as only a container with commands
- fix where a subclass of a Task could not have the parent type as dependency
- the type for the `value` field of `EnvSpec` from `Optional[Union[BaseModel, Any]]` to only `Optional[Any]`, as dictionary values were not serialized/set properly as a consequence of Pydantic validation
- add default name/namespace handling to CronWorkflow create/suspend/resume methods
- `EnvSpec` to return the `value` if `value` is of type string
- add support for exposing config map keys via env vars in Tasks
- add support for attaching a secret volume to Workflows and CronWorkflows
- add support for specifying labels on Workflows, CronWorkflows and Tasks
- add support for the timezone attribute of CronWorkflow and validate the specified timezone
- introduce `pytz` dependency for timezone validation
- The `daemon` keyword to the Task. `daemon` will allow a workflow to proceed to the next task, so long as the container reaches readiness.
- Make value in Tolerations optional, as per Kubernetes requirements
- `setup.py` packages field to include hera exclusively, post-removal of the underlying `v1` directory. With the removal of the underlying versioned subpackage (`v1`) in 1.0.0, the `setup.py` file no longer installed the necessary modules, as the wheel only included references for whatever subpackages were in `hera.*` but not `hera` itself (as a module)
- `v1` submodule of Hera to avoid internal versioning and external/package versioning
- location of all files from `v1` up one folder to `hera`. Now everything will take the import form of `from hera.module import Object` rather than `from hera.v1.module import Object`
- interface of services to take a full host rather than a single domain from which the final host is computed. This offers users more freedom, for example to select their own host scheme. A flag for SSL verification was also introduced
- all volume types (existing, empty dir, and regular volume) are now packaged in the volumes module rather than separated
- an `overwrite_maxs` to `Resource` to allow users to set whether max resources should be set to min values when they are not specified
- underlying SDK of Hera, which moved from `argo-workflows` to the Argo Workflows repository (unpublished on PyPI) Python SDK. This was originally released in #38 but the publication process to PyPI failed. A fix was attempted in #43 but that published a broken version because the `dependency_links` of `setup.py` did not actually install the necessary dependency. As a consequence, the release was quickly deleted from PyPI because it was broken. The best course of action was to wait for the official release of the new SDK under `argo-workflows==6.0.0`, in collaboration with the maintainers of https://github.com/argoproj/argo-workflows
- input/output artifact specifications
- fix returned value of validator method in `EnvSpec` class
- added `when` support to the workflows API.
- ability to specify a service account name to run the workflow as. This is currently set on the workflow level only, which makes all the pods of tasks in a workflow use the same service account.
- the publication step of Hera. The `python` command will now build an `sdist` and a `wheel` for the package
- relocked the project to include `wheel` as a development dependency
- added initial support for cron workflows
- initial release of Hera
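
For reference, the "context management to workflow types" entry above describes the `with` clause that adds tasks to a workflow automatically. Below is a minimal sketch of that style, assuming a 4.x-era Hera API with top-level `Workflow`, `Task`, `set_global_host`, and `set_global_token`; the host, token, and `hello`/`world` functions are hypothetical, and signatures varied across the releases listed here, so treat it as an illustration rather than exact code.

```python
# A minimal sketch, assuming a 4.x-era Hera API (hypothetical host/token values).
from hera import Task, Workflow, set_global_host, set_global_token

set_global_host("https://my-argo-server.example.com")  # hypothetical Argo server
set_global_token("my-token")  # a plain token; a callable returning a token is also accepted


def hello():
    print("hello")


def world():
    print("world")


# Entering the `with` block makes the workflow the active context, so tasks
# created inside it are added to the workflow automatically.
with Workflow("hello-hera") as w:
    a = Task("a", hello)
    b = Task("b", world)
    a >> b  # run `b` only after `a` completes

w.create()  # submit the workflow to the configured Argo server
```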