# Reference Implementation for Stackdriver Metric Export

This implementation demonstrates how to use the `projects.timeSeries.list` API, Pub/Sub, and BigQuery to store aggregated Cloud Monitoring metrics in BigQuery for long-term analysis.

## Deployment Instructions

1. Clone the source repo
git clone https://github.com/GoogleCloudPlatform/stackdriver-metrics-export
cd stackdriver-metrics-export
2. Enable the APIs
gcloud services enable compute.googleapis.com \
    cloudscheduler.googleapis.com \
    cloudfunctions.googleapis.com \
    cloudresourcemanager.googleapis.com
3. Set your PROJECT_ID variable by replacing [YOUR_PROJECT_ID] with your Google Cloud project ID
export PROJECT_ID=[YOUR_PROJECT_ID]
4. Create the BigQuery dataset and tables. Create a dataset, then create the tables using the schema JSON files
bq mk metric_export
bq mk --table --time_partitioning_type=DAY metric_export.sd_metrics_export_fin ./bigquery_schemas/bigquery_schema.json
bq mk --table --time_partitioning_type=DAY metric_export.sd_metrics_stats ./bigquery_schemas/bigquery_schema_stats_table.json
bq mk --table metric_export.sd_metrics_params  ./bigquery_schemas/bigquery_schema_params_table.json
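The `bq mk --table` commands read BigQuery JSON schema files from `./bigquery_schemas/`. For reference, a minimal illustration of the schema file format (hypothetical fields, not the repo's actual schema):

```json
[
  {"name": "batch_id", "type": "STRING", "mode": "NULLABLE"},
  {"name": "metric_type", "type": "STRING", "mode": "NULLABLE"},
  {"name": "point_value", "type": "FLOAT", "mode": "NULLABLE"}
]
```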
5. Replace the tokens in the config files. Generate new tokens and substitute them into each of the config files. Use the same token in the Cloud Scheduler job.
TOKEN=$(python -c "import uuid; print(uuid.uuid4())")
LIST_PROJECTS_TOKEN=$(python -c "import uuid; print(uuid.uuid4())")
sed -i "s/16b2ecfb-7734-48b9-817d-4ac8bd623c87/$TOKEN/g" list_metrics/config.py
sed -i "s/16b2ecfb-7734-48b9-817d-4ac8bd623c87/$TOKEN/g" get_timeseries/config.py
sed -i "s/16b2ecfb-7734-48b9-817d-4ac8bd623c87/$TOKEN/g" write_metrics/config.py
sed -i "s/16b2ecfb-7734-48b9-817d-4ac8bd623c87/$TOKEN/g" list_projects/config.json
sed -i "s/99a9ffa8797a629783cb4aa762639e92b098bac5/$LIST_PROJECTS_TOKEN/g" list_projects/config.json
sed -i "s/YOUR_PROJECT_ID/$PROJECT_ID/g" list_projects/config.json
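The tokens exist so that only callers who know the shared secret can trigger the services. A sketch of the generate-and-compare flow (hypothetical helper names; the deployed services read their token from `config.py`):

```python
import uuid

def generate_token() -> str:
    """Generate a random shared token, as the python -c one-liners above do."""
    return str(uuid.uuid4())

def is_authorized(request_token: str, configured_token: str) -> bool:
    """A push handler would compare the incoming token against its config."""
    return request_token == configured_token

token = generate_token()
print(is_authorized(token, token))    # True: matching tokens are accepted
print(is_authorized("wrong", token))  # False: anything else is rejected
```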
6. Deploy the App Engine apps. If you don't already have an App Engine app in your project, run `gcloud app create` and remove the line `service: list-metrics` from `app.yaml`.

Note: The default service account for App Engine has the project Editor permission. If you don't use the default service account, you need to grant the App Engine service account sufficient permissions for Cloud Monitoring, Pub/Sub, Cloud Storage, and BigQuery.

cd list_metrics
pip install -t lib -r requirements.txt
echo "y" | gcloud app deploy

Copy the URL from the Deployed service output and assign it to the LIST_METRICS_URL variable. The following is an example; replace PROJECT_ID and REGION_ID with the real values.

export LIST_METRICS_URL=https://list-metrics-dot-PROJECT_ID.REGION_ID.r.appspot.com

Do the same for the other services:

cd ../get_timeseries
pip install -t lib -r requirements.txt
echo "y" | gcloud app deploy

Copy the URL from the Deployed service output and assign it to the GET_TIMESERIES_URL variable. The following is an example; replace PROJECT_ID and REGION_ID with the real values.

export GET_TIMESERIES_URL=https://get-timeseries-dot-PROJECT_ID.REGION_ID.r.appspot.com
cd ../write_metrics
pip install -t lib -r requirements.txt
echo "y" | gcloud app deploy

Copy the URL from the Deployed service output and assign it to the WRITE_METRICS_URL variable. The following is an example; replace PROJECT_ID and REGION_ID with the real values.

export WRITE_METRICS_URL=https://write-metrics-dot-PROJECT_ID.REGION_ID.r.appspot.com
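All three service URLs follow App Engine's `https://SERVICE-dot-PROJECT_ID.REGION_ID.r.appspot.com` routing pattern. A small hypothetical helper that shows how the pieces combine (the authoritative values always come from the `gcloud app deploy` output):

```python
def service_url(service: str, project_id: str, region_id: str) -> str:
    """Build an App Engine service URL from its routing components."""
    return f"https://{service}-dot-{project_id}.{region_id}.r.appspot.com"

print(service_url("list-metrics", "my-project", "uc"))
# -> https://list-metrics-dot-my-project.uc.r.appspot.com
```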
7. Create the Pub/Sub topics and subscriptions

Using the LIST_METRICS_URL, GET_TIMESERIES_URL, and WRITE_METRICS_URL values set above, create the Pub/Sub topics and push subscriptions

gcloud pubsub topics create metrics_export_start
gcloud pubsub subscriptions create metrics_export_start_sub --topic metrics_export_start --ack-deadline=60 --message-retention-duration=10m --push-endpoint="$LIST_METRICS_URL/push-handlers/receive_messages"

gcloud pubsub topics create metrics_list
gcloud pubsub subscriptions create metrics_list_sub --topic metrics_list --ack-deadline=60 --message-retention-duration=30m --push-endpoint="$GET_TIMESERIES_URL/push-handlers/receive_messages"

gcloud pubsub topics create write_metrics
gcloud pubsub subscriptions create write_metrics_sub --topic write_metrics --ack-deadline=60 --message-retention-duration=30m  --push-endpoint="$WRITE_METRICS_URL/push-handlers/receive_messages"
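Each push subscription delivers messages to its service as an HTTP POST whose JSON body wraps a base64-encoded payload. A sketch of what a `/push-handlers/receive_messages` endpoint decodes, using a hypothetical example message:

```python
import base64
import json

# A push delivery body in the shape Pub/Sub sends it (data is base64-encoded).
push_body = {
    "message": {
        "data": base64.b64encode(
            json.dumps({"token": "16b2ecfb-7734-48b9-817d-4ac8bd623c87"}).encode()
        ).decode(),
        "messageId": "1234567890",
    },
    "subscription": "projects/my-project/subscriptions/metrics_export_start_sub",
}

# On receipt the handler decodes the data field, parses it, and checks the token.
payload = json.loads(base64.b64decode(push_body["message"]["data"]))
print(payload["token"])
```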
8. Create a service account for the list_projects function
gcloud iam service-accounts create gce-list-projects \
--description "Used for the function that lists the projects for the GCE Footprint Cloud Function"
export LIST_PROJECTS_SERVICE_ACCOUNT=gce-list-projects@$PROJECT_ID.iam.gserviceaccount.com 

9. Assign IAM permissions to the service account

gcloud projects add-iam-policy-binding  $PROJECT_ID --member="serviceAccount:$LIST_PROJECTS_SERVICE_ACCOUNT" --role="roles/compute.viewer"
gcloud projects add-iam-policy-binding  $PROJECT_ID --member="serviceAccount:$LIST_PROJECTS_SERVICE_ACCOUNT" --role="roles/pubsub.publisher"
10. Deploy the list_projects function
cd ../list_projects 
gcloud functions deploy list_projects \
--trigger-topic metric_export_get_project_start \
--runtime nodejs18 \
--entry-point list_projects \
--service-account=$LIST_PROJECTS_SERVICE_ACCOUNT
11. Deploy the Cloud Scheduler job
gcloud scheduler jobs create pubsub metric_export \
--schedule "*/5 * * * *" \
--topic metric_export_get_project_start \
--message-body "{\"token\":\"$LIST_PROJECTS_TOKEN\"}"

## Run the tests

  1. Run the job from the scheduler
gcloud scheduler jobs run metric_export
2. Test the app by sending a Pub/Sub message to the metrics_export_start topic
gcloud pubsub topics publish metrics_export_start --message "{\"token\": \"$TOKEN\"}" 

You can send in all of the parameters using the following command

gcloud pubsub topics publish metrics_export_start --message "{\"token\": \"$TOKEN\", \"start_time\": \"2019-03-13T17:30:00.000000Z\", \"end_time\": \"2019-03-13T17:40:00.000000Z\", \"aggregation_alignment_period\": \"3600s\"}"
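Because the message body is JSON embedded in a shell string, it is easy to mis-quote. Building and validating the payload in Python first is a simple sanity check (the token and timestamps below are placeholder values):

```python
import json

# The full set of parameters accepted by the export start message, per the
# command above; "token" is required, the rest are optional.
message = {
    "token": "16b2ecfb-7734-48b9-817d-4ac8bd623c87",
    "start_time": "2019-03-13T17:30:00.000000Z",
    "end_time": "2019-03-13T17:40:00.000000Z",
    "aggregation_alignment_period": "3600s",
}

body = json.dumps(message)
print(body)                          # paste this into --message "..."
assert json.loads(body) == message   # round-trips as valid JSON
```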
3. Verify that the app is working by running the end-to-end test

Configure your project_id and look up the batch_id in the config.py file.

cd test
export PROJECT_ID=$(gcloud config get-value project)
export TIMESTAMP=$(date -d "-2 hour" +%Y-%m-%dT%H:%M:00Z)
export BATCH_ID=$(gcloud logging read "resource.type=\"gae_app\" AND resource.labels.module_id=\"list-metrics\" AND logName=\"projects/$PROJECT_ID/logs/appengine.googleapis.com%2Frequest_log\" AND protoPayload.line.logMessage:\"batch_id:\" AND timestamp >= \"$TIMESTAMP\"" --limit 1 --format json | grep "batch_id:" | awk '{ print substr($3,1,32); }')
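The grep/awk pipeline above pulls a 32-character batch ID out of a request-log line. The same extraction as a hypothetical Python snippet, assuming the log message embeds `batch_id: <id>` as the pipeline implies:

```python
import re

# Example log fragment in the shape the grep/awk pipeline above expects.
log_message = 'logMessage: "batch_id: R8BK5S99QU4ZZOGCR1UDPWVH6LPKI5QU"'

# Capture the 32-character alphanumeric ID that follows "batch_id:".
match = re.search(r"batch_id:\s*([A-Z0-9]{32})", log_message)
if match:
    print(match.group(1))
# -> R8BK5S99QU4ZZOGCR1UDPWVH6LPKI5QU
```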
sed -i "s/YOUR_PROJECT_ID/$PROJECT_ID/g" config.py
sed -i "s/R8BK5S99QU4ZZOGCR1UDPWVH6LPKI5QU/$BATCH_ID/g" config.py

python end_to_end_test_run.py