Merge pull request apache#3 from ayush-san/zstaging
ayush-san authored Jan 16, 2020
2 parents 78c11d8 + 14f8887 commit 9ab4503
Showing 4 changed files with 14 additions and 4 deletions.
2 changes: 1 addition & 1 deletion airflow/config_templates/airflow-prod.cfg
@@ -286,7 +286,7 @@ error_logfile = -
# This is only applicable for the flask-admin based web UI (non FAB-based).
# In the FAB-based web UI with RBAC feature,
# access to configuration is controlled by role permissions.
-expose_config = True
+expose_config = False

# Set to true to turn on authentication:
# https://airflow.apache.org/security.html#web-authentication
2 changes: 1 addition & 1 deletion airflow/config_templates/airflow-staging.cfg
@@ -286,7 +286,7 @@ error_logfile = -
# This is only applicable for the flask-admin based web UI (non FAB-based).
# In the FAB-based web UI with RBAC feature,
# access to configuration is controlled by role permissions.
-expose_config = False
+expose_config = True

# Set to true to turn on authentication:
# https://airflow.apache.org/security.html#web-authentication
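Taken together, the two config hunks move `expose_config` in opposite directions: the production template now hides the running configuration from the web UI, while the staging template exposes it. A minimal sketch of the resulting staging fragment, assuming the stock `[webserver]` section layout of the Airflow config template:

```ini
[webserver]
# Staging only: allow the web UI to display the airflow.cfg contents.
# The production template (airflow-prod.cfg) sets this to False.
expose_config = True
```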
9 changes: 9 additions & 0 deletions scripts/docker/entrypoint.sh
@@ -20,6 +20,15 @@ set -e

: "${AIRFLOW__CORE__FERNET_KEY:=${FERNET_KEY:=$(python -c "from cryptography.fernet import Fernet; FERNET_KEY = Fernet.generate_key().decode(); print(FERNET_KEY)")}}"

+export AIRFLOW__SCHEDULER__STATSD_HOST
+
+if test "$AWS_EXECUTION_ENV" = "AWS_ECS_EC2"
+then
+    instance_ip=$(curl --silent http://169.254.169.254/1.0/meta-data/local-ipv4)
+    AIRFLOW__SCHEDULER__STATSD_HOST=$instance_ip
+fi
+echo statsd host is "$AIRFLOW__SCHEDULER__STATSD_HOST"
+
echo Starting Apache Airflow with command:
echo airflow "$@"

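The added entrypoint lines point the scheduler's StatsD client at the host EC2 instance when the container runs under ECS on EC2, and otherwise leave whatever was already configured. A hedged sketch of the same decision in Python (the `AWS_ECS_EC2` value and the metadata path come from the diff; `fetch_local_ipv4` is a stand-in for the `curl` call):

```python
def resolve_statsd_host(aws_execution_env, fetch_local_ipv4, default="localhost"):
    """Return the StatsD host: the instance's local IPv4 when running on
    ECS-on-EC2, otherwise the pre-configured default."""
    if aws_execution_env == "AWS_ECS_EC2":
        # In the entrypoint this is: curl http://169.254.169.254/1.0/meta-data/local-ipv4
        return fetch_local_ipv4()
    return default

print(resolve_statsd_host("AWS_ECS_EC2", lambda: "10.0.0.12"))     # 10.0.0.12
print(resolve_statsd_host("AWS_ECS_FARGATE", lambda: "10.0.0.12")) # localhost
```

The export before the `if` matters in the shell version: it makes the (possibly rewritten) variable visible to the Airflow process started afterwards.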
5 changes: 3 additions & 2 deletions setup.py
@@ -262,7 +262,8 @@ def write_version(filename=os.path.join(*["airflow", "git_version"])):
'yarn-api-client~=0.3.6',
'sasl~=0.2.1',
'overrides',
-'sqlparser']
+'sqlparser',
+'airflow-prometheus-exporter']

all_dbs = postgres + mysql + hive + mssql + hdfs + vertica + cloudant + druid + pinot \
+ cassandra + mongo
@@ -307,7 +308,7 @@ def write_version(filename=os.path.join(*["airflow", "git_version"])):
datadog + zendesk + jdbc + ldap + kerberos + password + webhdfs + jenkins +
druid + pinot + segment + snowflake + elasticsearch + azure_data_lake + azure_cosmos +
atlas + azure_container_instances + cgroups + virtualenv + flask_oauth + atlas + emr +
-async_packages + zomato_custom)
+async_packages + zomato_custom + statsd)

# Snakebite & Google Cloud Dataflow are not Python 3 compatible :'(
if PY3:
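The second setup.py hunk appends the `statsd` group to an aggregate dependency list. A minimal sketch of the concatenation pattern setup.py uses, with illustrative stand-in package lists (the real groups and version pins are defined elsewhere in the file):

```python
# Stand-ins for dependency groups defined earlier in setup.py; the
# contents and pins here are illustrative, not the actual ones.
statsd = ['statsd>=3.0.1']
zomato_custom = ['airflow-prometheus-exporter']
async_packages = ['greenlet>=0.4.9']

# Aggregate extras are built by plain list concatenation, as in the diff.
devel_all = async_packages + zomato_custom + statsd
print(devel_all)
```

Order is preserved by `+` on lists, so appending `+ statsd` at the end of the expression simply extends the aggregate without disturbing the existing groups.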
