diff --git a/docs/apache-airflow/core-concepts/index.rst b/docs/apache-airflow/core-concepts/index.rst
index dab97ea3ff8ad..47dff5ef73f06 100644
--- a/docs/apache-airflow/core-concepts/index.rst
+++ b/docs/apache-airflow/core-concepts/index.rst
@@ -18,7 +18,7 @@
 Core Concepts
 =============================
 
-Here you can find detailed documentation about each one of the core concepts of Apache Airflow™ and how to use them, as well as a high-level :doc:`architectural overview `.
+Here you can find detailed documentation about each one of the core concepts of Apache Airflow® and how to use them, as well as a high-level :doc:`architectural overview `.
 
 **Architecture**
diff --git a/docs/apache-airflow/index.rst b/docs/apache-airflow/index.rst
index 3897be717866e..258c8a126cc1d 100644
--- a/docs/apache-airflow/index.rst
+++ b/docs/apache-airflow/index.rst
@@ -18,7 +18,7 @@
 What is Airflow™?
 =========================================
 
-`Apache Airflow™ `_ is an open-source platform for developing, scheduling,
+`Apache Airflow® `_ is an open-source platform for developing, scheduling,
 and monitoring batch-oriented workflows. Airflow's extensible Python framework enables you to build workflows
 connecting with virtually any technology. A web interface helps manage the state of your workflows. Airflow is
 deployable in many ways, varying from a single process on your laptop to a distributed setup to support even
diff --git a/docs/apache-airflow/installation/setting-up-the-database.rst b/docs/apache-airflow/installation/setting-up-the-database.rst
index 6c595563ca293..f72caca99bb11 100644
--- a/docs/apache-airflow/installation/setting-up-the-database.rst
+++ b/docs/apache-airflow/installation/setting-up-the-database.rst
@@ -18,7 +18,7 @@
 Setting up the database
 -----------------------
 
-Apache Airflow™ requires a database. If you're just experimenting and learning Airflow, you can stick with the
+Apache Airflow® requires a database. If you're just experimenting and learning Airflow, you can stick with the
 default SQLite option. If you don't want to use SQLite, then take a look at
 :doc:`/howto/set-up-database` to setup a different database.
diff --git a/docs/apache-airflow/installation/supported-versions.rst b/docs/apache-airflow/installation/supported-versions.rst
index 4e160f434d451..66ed14254ca30 100644
--- a/docs/apache-airflow/installation/supported-versions.rst
+++ b/docs/apache-airflow/installation/supported-versions.rst
@@ -21,7 +21,7 @@ Supported versions
 Version Life Cycle
 ``````````````````
 
-Apache Airflow™ version life cycle:
+Apache Airflow® version life cycle:
 
 .. This table is automatically updated by pre-commit scripts/ci/pre_commit/supported_versions.py
 .. Beginning of auto-generated table
diff --git a/docs/apache-airflow/installation/upgrading.rst b/docs/apache-airflow/installation/upgrading.rst
index 67f18534e912a..38f17fdaa2983 100644
--- a/docs/apache-airflow/installation/upgrading.rst
+++ b/docs/apache-airflow/installation/upgrading.rst
@@ -57,7 +57,7 @@ when you choose to upgrade airflow via their UI.
 
 How to upgrade
 ==============
 
-Reinstall Apache Airflow™, specifying the desired new version.
+Reinstall Apache Airflow®, specifying the desired new version.
 To upgrade a bootstrapped local instance, you can set the ``AIRFLOW_VERSION`` environment variable to the intended version prior to rerunning the installation command.
 Upgrade incrementally by patch version: e.g.,
diff --git a/docs/apache-airflow/security/index.rst b/docs/apache-airflow/security/index.rst
index 7d186a09fbb14..b9f79e2ee7063 100644
--- a/docs/apache-airflow/security/index.rst
+++ b/docs/apache-airflow/security/index.rst
@@ -21,7 +21,7 @@ Security
 This section of the documentation covers security-related topics.
 
 Make sure to get familiar with the :doc:`Airflow Security Model ` if you want to understand
-the different user types of Apache Airflow™, what they have access to, and the role Deployment Managers have in deploying Airflow in a secure way.
+the different user types of Apache Airflow®, what they have access to, and the role Deployment Managers have in deploying Airflow in a secure way.
 
 Also, if you want to understand how Airflow releases security patches and what to expect from them, head over to
 :doc:`Releasing security patches `.
diff --git a/docs/apache-airflow/security/releasing_security_patches.rst b/docs/apache-airflow/security/releasing_security_patches.rst
index f98e464dfe542..2fe60eceee50b 100644
--- a/docs/apache-airflow/security/releasing_security_patches.rst
+++ b/docs/apache-airflow/security/releasing_security_patches.rst
@@ -18,7 +18,7 @@
 Releasing security patches
 ==========================
 
-Apache Airflow™ uses a consistent and predictable approach for releasing security patches - both for
+Apache Airflow® uses a consistent and predictable approach for releasing security patches - both for
 the Apache Airflow package and Apache Airflow providers (security patches in providers are treated
 separately from security patches in Airflow core package).
diff --git a/tests/providers/slack/transfers/test_sql_to_slack_webhook.py b/tests/providers/slack/transfers/test_sql_to_slack_webhook.py
index 512e3175b7af6..3c71fab26a617 100644
--- a/tests/providers/slack/transfers/test_sql_to_slack_webhook.py
+++ b/tests/providers/slack/transfers/test_sql_to_slack_webhook.py
@@ -285,7 +285,7 @@ def test_partial_deprecated_slack_conn_id(self, slack_conn_id, slack_webhook_con
                 slack_conn_id=slack_conn_id,
                 slack_webhook_conn_id=slack_webhook_conn_id,
                 sql_conn_id="fake-sql-conn-id",
-                slack_message="",
+                slack_message="",
             ).expand(sql=["SELECT 1", "SELECT 2"])
 
         dr = dag_maker.create_dagrun()
@@ -304,7 +304,7 @@ def test_partial_ambiguous_slack_connections(self, dag_maker, session):
                 slack_conn_id="slack_conn_id",
                 slack_webhook_conn_id="slack_webhook_conn_id",
                 sql_conn_id="fake-sql-conn-id",
-                slack_message="",
+                slack_message="",
             ).expand(sql=["SELECT 1", "SELECT 2"])
 
         dr = dag_maker.create_dagrun(session=session)
diff --git a/tests/system/providers/slack/example_slack.py b/tests/system/providers/slack/example_slack.py
index bbb47374b37d4..c20cf8afa8150 100644
--- a/tests/system/providers/slack/example_slack.py
+++ b/tests/system/providers/slack/example_slack.py
@@ -41,7 +41,7 @@
         task_id="slack_post_text",
         channel=SLACK_CHANNEL,
         text=(
-            "Apache Airflow™ is an open-source platform for developing, "
+            "Apache Airflow® is an open-source platform for developing, "
             "scheduling, and monitoring batch-oriented workflows."
         ),
     )
@@ -57,7 +57,7 @@
                 "text": {
                     "type": "mrkdwn",
                     "text": (
-                        "** "
+                        "** "
                         "is an open-source platform for developing, scheduling, "
                         "and monitoring batch-oriented workflows."
                     ),
diff --git a/tests/system/providers/slack/example_slack_webhook.py b/tests/system/providers/slack/example_slack_webhook.py
index 63f166253f009..98905d1b9ee67 100644
--- a/tests/system/providers/slack/example_slack_webhook.py
+++ b/tests/system/providers/slack/example_slack_webhook.py
@@ -39,7 +39,7 @@
         task_id="slack_webhook_send_text",
         slack_webhook_conn_id=SLACK_WEBHOOK_CONN_ID,
         message=(
-            "Apache Airflow™ is an open-source platform for developing, "
+            "Apache Airflow® is an open-source platform for developing, "
             "scheduling, and monitoring batch-oriented workflows."
         ),
     )
@@ -55,7 +55,7 @@
                 "text": {
                     "type": "mrkdwn",
                     "text": (
-                        "** "
+                        "** "
                         "is an open-source platform for developing, scheduling, "
                         "and monitoring batch-oriented workflows."
                     ),
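
The hunks above only swap the trademark symbol in message text, but for reviewers unfamiliar with the files being touched, the sketch below shows roughly how a task like the "slack_post_text" one in tests/system/providers/slack/example_slack.py is wired into a DAG, matching the "workflows as Python code" description changed in docs/apache-airflow/index.rst. It is illustrative only and not part of this patch: the dag_id, schedule, start_date, and channel value are assumptions, while the operator, its import path, and the task_id/channel/text parameters come from the diff itself.

# Illustrative sketch only -- not part of this patch.
# dag_id, schedule, start_date, and channel below are assumed values.
from __future__ import annotations

import datetime

from airflow.models.dag import DAG
from airflow.providers.slack.operators.slack import SlackAPIPostOperator

with DAG(
    dag_id="example_slack_post",  # assumed id for demonstration
    start_date=datetime.datetime(2024, 1, 1),
    schedule=None,  # trigger manually
    catchup=False,
):
    # Same pattern as the "slack_post_text" task in example_slack.py,
    # using the corrected registered-trademark symbol in the message text.
    SlackAPIPostOperator(
        task_id="slack_post_text",
        channel="#random",  # assumed channel; the system test uses SLACK_CHANNEL
        text=(
            "Apache Airflow® is an open-source platform for developing, "
            "scheduling, and monitoring batch-oriented workflows."
        ),
    )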