
Add new community provider: Flyte #22643

Closed
wants to merge 5 commits into from

Conversation

samhita-alla

Signed-off-by: Samhita Alla [email protected]

This PR adds a new community provider to allow Airflow users to interact with Flyte from within Airflow.

A bit about Flyte: Flyte is an open-source, container-native, structured programming and distributed processing platform that enables highly concurrent, scalable, and maintainable workflows for machine learning and data processing pipelines.

Since a significant share of users building pipelines already use Airflow, a provider that bridges the gap between Airflow and Flyte would be really helpful: Airflow users could retain their existing pipelines and use Flyte from within their Airflow DAGs to run, say, machine learning jobs.

We've had this operator in the back of our minds for a long time; here's the issue.

Code Interface

The provider defines a hook, an operator, and a sensor. When the user instantiates the AirflowFlyteOperator, it creates a FlyteRemote (Flyte's Python API) object, triggers the execution in the Flyte environment, waits for the execution to complete, and finally, returns the execution name. This implementation is carried out using the methods defined in the hook. The executions are triggered synchronously by default. One can set asynchronous to True if the execution needs to be polled asynchronously; the polling would be handled by the sensor.
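For context, here is a minimal sketch of what a DAG using the provider might look like. The operator name AirflowFlyteOperator and the asynchronous flag come from the description above; the import paths, the sensor name (AirflowFlyteSensor), the connection id, and the launch-plan parameters are assumptions for illustration, not the provider's confirmed API.

# A minimal usage sketch; import paths and parameter names are assumed, not confirmed.
from datetime import datetime

from airflow import DAG
from airflow.providers.flyte.operators.flyte import AirflowFlyteOperator  # assumed path
from airflow.providers.flyte.sensors.flyte import AirflowFlyteSensor      # assumed name/path

with DAG(
    dag_id="example_flyte",
    start_date=datetime(2022, 3, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # Synchronous by default: triggers the Flyte execution, waits for it to complete,
    # and returns the execution name.
    train_model = AirflowFlyteOperator(
        task_id="train_model",
        flyte_conn_id="flyte_default",        # assumed connection id
        project="flytesnacks",                # assumed Flyte project
        domain="development",                 # assumed Flyte domain
        launchplan_name="ml.train_workflow",  # assumed launch plan
    )

    # Asynchronous variant: trigger the execution and let the sensor poll for completion.
    trigger_async = AirflowFlyteOperator(
        task_id="trigger_async",
        flyte_conn_id="flyte_default",
        project="flytesnacks",
        domain="development",
        launchplan_name="ml.train_workflow",
        asynchronous=True,
    )

    wait_for_async = AirflowFlyteSensor(
        task_id="wait_for_async",
        execution_name=trigger_async.output,  # execution name returned by the operator via XCom
        project="flytesnacks",
        domain="development",
        flyte_conn_id="flyte_default",
    )

    train_model >> trigger_async >> wait_for_async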

Tests

Unit tests have been added and pass. I also spun up an Airflow instance to validate the code in real time.

TODO

  • Update the flytekit version from 0.32.0b0 to 0.32.0 after the latter release is out.

^ Add meaningful description above

Read the Pull Request Guidelines for more information.
In case of fundamental code change, Airflow Improvement Proposal (AIP) is needed.
In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
In case of backwards incompatible changes please leave a note in UPDATING.md.

@boring-cyborg

boring-cyborg bot commented Mar 31, 2022

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst).
Here are some useful points:

  • Pay attention to the quality of your code (flake8, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide. Consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: [email protected]
    Slack: https://s.apache.org/airflow-slack

Comment on lines +104 to +109
task_id = re.sub(r"[\W_]+", "", context["task"].task_id)[:5]
self.execution_name = task_id + re.sub(
    r"[\W_]+",
    "",
    context["dag_run"].run_id.split("__")[-1].lower(),
)[: (20 - len(task_id))]
Author

@samhita-alla samhita-alla Mar 31, 2022


I'm generating a deterministic, unique name for the Flyte execution by combining the task_id and run_id. I use both because task_id alone wouldn't be unique when a task runs more than once, and run_id alone wouldn't be unique within the same DAG run because it's the same for all tasks; hence this logic.

The execution name cannot exceed 20 characters (a restriction imposed by Flyte), so I'm trimming the two strings. Please let me know if there's a better way to create unique execution names, even when a task is repeated multiple times within the same DAG (with different task_ids, of course) or run multiple times.
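For reference, here's a small standalone sketch of the same logic; the helper name and the sample run_id are illustrative only, not part of the PR.

import re


def build_execution_name(task_id: str, run_id: str) -> str:
    """Combine a sanitized task_id prefix with a sanitized run_id suffix,
    keeping the result within Flyte's 20-character limit."""
    prefix = re.sub(r"[\W_]+", "", task_id)[:5]
    suffix = re.sub(r"[\W_]+", "", run_id.split("__")[-1].lower())
    return prefix + suffix[: 20 - len(prefix)]


# Example: a scheduled run_id such as "scheduled__2022-03-31T00:00:00+00:00"
print(build_execution_name("train_model", "scheduled__2022-03-31T00:00:00+00:00"))
# -> "train20220331t000000" (5-char task prefix + 15 chars of the sanitized timestamp)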
