
Add 'main' param to template_fields in DataprocSubmitPySparkJobOperator (#9154)
kaxil authored Jun 5, 2020
1 parent 32ef0cd commit 9bcdada
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions airflow/providers/google/cloud/operators/dataproc.py
@@ -1242,7 +1242,7 @@ class DataprocSubmitPySparkJobOperator(DataprocJobBaseOperator):
     Start a PySpark Job on a Cloud DataProc cluster.

     :param main: [Required] The Hadoop Compatible Filesystem (HCFS) URI of the main
-        Python file to use as the driver. Must be a .py file.
+        Python file to use as the driver. Must be a .py file. (templated)
     :type main: str
     :param arguments: Arguments for the job. (templated)
     :type arguments: list
@@ -1256,7 +1256,7 @@ class DataprocSubmitPySparkJobOperator(DataprocJobBaseOperator):
     :type pyfiles: list
     """

-    template_fields = ['arguments', 'job_name', 'cluster_name',
+    template_fields = ['main', 'arguments', 'job_name', 'cluster_name',
                        'region', 'dataproc_jars', 'dataproc_properties']
     ui_color = '#0273d4'
     job_type = 'pyspark_job'
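
With 'main' added to template_fields, the driver file URI is now rendered with Jinja at run time, just like arguments, job_name, and cluster_name. The following is a minimal usage sketch (not part of this commit); the DAG id, bucket, cluster, and region values are hypothetical placeholders.

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitPySparkJobOperator
    from airflow.utils.dates import days_ago

    with DAG(
        dag_id="example_templated_pyspark",  # hypothetical DAG id
        start_date=days_ago(1),
        schedule_interval=None,
    ) as dag:
        submit_pyspark = DataprocSubmitPySparkJobOperator(
            task_id="submit_pyspark",
            # Rendered per run now that 'main' is a templated field,
            # e.g. gs://example-bucket/jobs/etl_20200605.py for a 2020-06-05 run.
            main="gs://example-bucket/jobs/etl_{{ ds_nodash }}.py",
            cluster_name="example-cluster",
            region="us-central1",
        )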
