Commit
* script skeleton
* add API call to source_definitions to fetch E2E Test Source definition ID
* createSource implementation
* add destination creation logic implementation
* get definition IDs, catalogId, and implement connection creation
* add cleanup script and write created ids to a file that can be cleaned up
* make cloud header a command-line argument, other cleanup
* script comments fix
* remove kube references and fix indentation
* temp commit - don't push
* remove discover catalog function
* more cleanups
* more cleanups
* cleanup help text
* exit codes and show how many connections left
* add README

Co-authored-by: Xiaohan Song <[email protected]>
1 parent a4474ce · commit 7cedfa4
Showing 8 changed files with 613 additions and 0 deletions.
@@ -0,0 +1 @@
cleanup/
@@ -0,0 +1,103 @@
# Load Testing Airbyte

## Overview
To perform a stress test of an Airbyte deployment, use the `load_test_airbyte.sh` shell script to quickly and easily create many connections.
The script creates a new E2E Test Source, a new E2E Test Destination, and a configurable number of connections in the indicated workspace.

## Instructions
From your top-level `/airbyte` directory, run the following to perform a load test:

```
./tools/bin/load_test/load_test_airbyte.sh -W <workspace id> -C <num_connections>
```

By default, the script assumes that the Airbyte instance's server is accessible at `localhost:8001`. This is the default server location when
deploying Airbyte with `docker-compose up`.

Additionally, the E2E Test Source created by the script will take 10 minutes to complete a sync by default.

These defaults can be overridden with flags. All available flags are described as follows:

```
-h
  Display help
-W <workspace id>
  Specify the workspace ID where new connectors and connections should be created.
  Required.
-H <hostname>
  Specify the Airbyte API server hostname that the script should call to create new connectors and connections.
  Defaults to 'localhost'.
-P <port>
  Specify the port for the Airbyte server.
  Defaults to '8001'.
-X <header>
  Specify the X-Endpoint-API-UserInfo header value for API authentication.
  For Google Cloud Endpoint authentication only.
-C <count>
  Specify the number of connections that should be created by the script.
  Defaults to '1'.
-T <minutes>
  Specify the time in minutes that each connection should sync for.
  Defaults to '10'.
```
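
For example, a run that creates 50 connections, each syncing for 5 minutes, against a server on a non-default host and port might look like this (the workspace ID and hostname below are placeholders):

```
./tools/bin/load_test/load_test_airbyte.sh \
  -W 00000000-0000-0000-0000-000000000000 \
  -H airbyte.internal.example.com \
  -P 8080 \
  -C 50 \
  -T 5
```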

### Load Testing on Kubernetes

To load test a deployment of Airbyte running on Kubernetes, you will need to set up port-forwarding to the `airbyte-server` deployment.
This can be accomplished with the following command:

```
kubectl port-forward deployment/airbyte-server -n ab 8001:8001
```

This will make the Airbyte server available at `localhost:8001`.
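
Before kicking off a load test, you can confirm that the port-forward works by querying the server (the path below assumes the standard Airbyte Configuration API health endpoint):

```
curl -s http://localhost:8001/api/v1/health
```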

### Authentication

If your deployment of Airbyte uses Google Cloud Endpoints for authentication, you can use the `-X` option to pass
an `X-Endpoint-API-UserInfo` header value.
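
The way this header value is produced depends on how your Cloud Endpoints deployment is configured. Purely as an illustrative sketch (not part of this commit), Cloud Endpoints typically forwards this header as base64-encoded JSON describing the authenticated user, so a run might look like:

```
# Hypothetical claims payload; the exact fields depend on your Endpoints configuration.
USER_INFO=$(echo -n '{"user_id":"load-tester","email":"load-tester@example.com"}' | base64)

./tools/bin/load_test/load_test_airbyte.sh -W <workspace id> -C 10 -X "$USER_INFO"
```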

## Cleanup
The `load_test_airbyte.sh` script writes created IDs to files in the script's `/cleanup` directory. To delete resources that were created by the load
test script, you can run `cleanup_load_test.sh`, which reads IDs from the `/cleanup` directory and calls the Airbyte API to delete them.

### Cleanup Instructions
To run the cleanup script, from the top-level `airbyte` directory, run the following:

```
./tools/bin/load_test/cleanup_load_test.sh -W <workspace_id>
```

All available cleanup script flags are described as follows:

```
-h
  Display help
-W <workspace id>
  Specify the workspace ID from where connectors and connections should be deleted.
  Required.
-H <hostname>
  Specify the Airbyte API server hostname that the script should call to delete connectors and connections.
  Defaults to 'localhost'.
-P <port>
  Specify the port for the Airbyte server.
  Defaults to '8001'.
-X <header>
  Specify the X-Endpoint-API-UserInfo header value for API authentication.
  For Google Cloud Endpoint authentication only.
```
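
For instance, a hypothetical cleanup run against the same Cloud Endpoints deployment used in the authentication example above would reuse the header value:

```
./tools/bin/load_test/cleanup_load_test.sh -W <workspace_id> -X "$USER_INFO"
```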
@@ -0,0 +1,152 @@
#!/usr/bin/env bash
set -o errexit
set -o nounset

<<comment
This script cleans up an earlier load test. It reads from cleanup files that the load test script writes to
in order to determine which IDs to delete.
comment

cd "$(dirname "$0")"
source load_test_utils.sh

function showhelp {
  echo -e """Usage $(dirname $0)/cleanup_load_test [OPTIONS]
  cleanup_load_test deletes resources that were created from an earlier load test.
  Available OPTIONs:
  ${CLEAR}-h
    ${GREEN}Display help
  ${CLEAR}-W <workspace id>
    ${GREEN}Specify the workspace ID from where connectors and connections should be deleted.
    Required.
  ${CLEAR}-H <hostname>
    ${GREEN}Specify the Airbyte API server hostname that the script should call to delete connectors and connections.
    Defaults to 'localhost'.
  ${CLEAR}-P <port>
    ${GREEN}Specify the port for the Airbyte server.
    Defaults to '8001'.
  ${CLEAR}-X <header>
    ${GREEN}Specify the X-Endpoint-API-UserInfo header value for API authentication.
    For Google Cloud Endpoint authentication only.
  """ && exit 1
}

workspace_id=""
hostname=localhost
api_port=8001
x_endpoint_header=""

while getopts "hW:H:P:X:kN:" options ; do
  case "${options}" in
    h)
      showhelp
      ;;
    W)
      workspace_id="${OPTARG}"
      ;;
    H)
      hostname="${OPTARG}"
      ;;
    P)
      api_port="${OPTARG}"
      ;;
    X)
      x_endpoint_header="${OPTARG}"
      ;;
    *)
      showhelp
      ;;
  esac
done

function setup {
  if test -z "$workspace_id"; then
    echo "error: must set a workspace id with -W"
    exit 1
  fi

  echo "set workspace_id to ${workspace_id}"
  echo "set hostname to ${hostname}"
  echo "set api_port to ${api_port}"

  setCleanupFilesForWorkspace $workspace_id
}

function deleteConnections {
  while test -s $CONNECTION_CLEANUP_FILE
  do
    connectionId=$(readFirstLineFromFile $CONNECTION_CLEANUP_FILE)
    callApi "connections/delete" "{\"connectionId\":\"$connectionId\"}"
    echo "deleted connection with ID $connectionId"

    # deletion succeeded, so remove the ID from the cleanup file
    removeFirstLineFromFile $CONNECTION_CLEANUP_FILE
  done

  if ! test -s $CONNECTION_CLEANUP_FILE
  then
    rm $CONNECTION_CLEANUP_FILE
    echo "removed cleanup file $CONNECTION_CLEANUP_FILE"
  fi
}

function deleteSources {
  while test -s $SOURCE_CLEANUP_FILE
  do
    sourceId=$(readFirstLineFromFile $SOURCE_CLEANUP_FILE)
    callApi "sources/delete" "{\"sourceId\":\"$sourceId\"}"
    echo "deleted source with ID $sourceId"

    # deletion succeeded, so remove the ID from the cleanup file
    removeFirstLineFromFile $SOURCE_CLEANUP_FILE
  done

  if ! test -s $SOURCE_CLEANUP_FILE
  then
    rm $SOURCE_CLEANUP_FILE
    echo "removed cleanup file $SOURCE_CLEANUP_FILE"
  fi
}

function deleteDestinations {
  while test -s $DESTINATION_CLEANUP_FILE
  do
    destinationId=$(readFirstLineFromFile $DESTINATION_CLEANUP_FILE)
    callApi "destinations/delete" "{\"destinationId\":\"$destinationId\"}"
    echo "deleted destination with ID $destinationId"

    # deletion succeeded, so remove the ID from the cleanup file
    removeFirstLineFromFile $DESTINATION_CLEANUP_FILE
  done

  if ! test -s $DESTINATION_CLEANUP_FILE
  then
    rm $DESTINATION_CLEANUP_FILE
    echo "removed cleanup file $DESTINATION_CLEANUP_FILE"
  fi
}

############
##  MAIN  ##
############

if [[ $# -eq 0 ]] ; then
  showhelp
  exit 0
fi

setup

deleteConnections

deleteSources

deleteDestinations

echo "Finished!"
@@ -0,0 +1,47 @@
{
  "sourceId": "replace_source_id",
  "destinationId": "replace_destination_id",
  "syncCatalog": {
    "streams": [
      {
        "config": {
          "syncMode": "full_refresh",
          "cursorField": [],
          "destinationSyncMode": "overwrite",
          "primaryKey": [],
          "aliasName": "data_stream",
          "selected": true
        },
        "stream": {
          "name": "data_stream",
          "jsonSchema": {
            "type": "object",
            "properties": {
              "column1": {
                "type": "string"
              }
            }
          },
          "supportedSyncModes": [
            "full_refresh"
          ],
          "defaultCursorField": [],
          "sourceDefinedPrimaryKey": []
        }
      }
    ]
  },
  "prefix": "",
  "namespaceDefinition": "source",
  "namespaceFormat": "${SOURCE_NAMESPACE}",
  "scheduleType": "basic",
  "scheduleData": {
    "basicSchedule": {
      "units": 24,
      "timeUnit": "hours"
    }
  },
  "name": "replace_connection_name",
  "operations": [],
  "status": "active"
}
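
The `replace_*` fields in this request body are evidently placeholders that the load test script fills in at runtime. `load_test_airbyte.sh` itself is not shown in this excerpt, but a hedged sketch of how such a template could be rendered and submitted to the connection-creation endpoint (the file name `connection_create_request.json` and reuse of the `callApi` helper are assumptions) might be:

```
# Hypothetical rendering step; the actual script may substitute values differently.
body=$(sed \
  -e "s/replace_source_id/${source_id}/g" \
  -e "s/replace_destination_id/${destination_id}/g" \
  -e "s/replace_connection_name/load_test_connection_1/g" \
  connection_create_request.json)

# Create the connection via the Airbyte Configuration API.
callApi "connections/create" "$body"
```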
@@ -0,0 +1,8 @@
{
  "name": "End-to-End Testing (/dev/null)",
  "destinationDefinitionId": "replace_destination_definition_id",
  "workspaceId": "replace_workspace_id",
  "connectionConfiguration": {
    "type": "SILENT"
  }
}
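
Similarly, `replace_destination_definition_id` has to be resolved before this payload can be posted to `destinations/create`. The commit message mentions fetching the E2E Test Source definition ID from `source_definitions`; a hedged sketch of the analogous lookup for the destination definition (assuming the standard `destination_definitions/list` endpoint and `jq` being available) might be:

```
# Assumed lookup; the actual script's approach is not shown in this excerpt.
destination_definition_id=$(callApi "destination_definitions/list" "{}" \
  | jq -r '.destinationDefinitions[]
           | select(.name | contains("End-to-End Testing"))
           | .destinationDefinitionId')

echo "E2E Test destination definition ID: ${destination_definition_id}"
```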