🎉 New Destination: Google Cloud Storage #4329
Conversation
Thank you @MaxwellJK, this is huge! I will review it tonight.
Hmm, halfway through the review I noticed that this PR currently does not work for GCS. All the writers are related to S3. Maybe you have not pushed the latest changes?
Much of the code should be shared between the S3 and GCS destinations. I think it's fine to have some duplicated code for now. We can DRY them up when we add the Azure destination. However, I think code related to the JsonToAvroSchemaConverter, at least, should be shared.
@sherifnada, where do you think we should put the shared code for S3 and GCS (and future Azure)? What about a module in airbyte-integrations/connectors/destination-object-storage? Or one module per format (if necessary), like destination-format-csv and destination-format-parquet.
import java.util.function.Consumer;

public class GCSConsumer extends FailureTrackingAirbyteMessageConsumer {
Would you mind changing the all-capital GCS in the class names to Gcs? The argument is that in this way, the acronym is more readable. This is especially the case when there are multiple acronyms, like the GCSCsv- classes. I think GcsCsvWriter is clearer than GCSCsvWriter. It is also more consistent in that both GCS and CSV are acronyms, so they should have the same format (and it probably should not be GCSCSVWriter).
This is also the naming convention recommended by Effective Java:
A strong argument can be made in favor of capitalizing only the first letter: even if multiple acronyms occur back-to-back, you can still tell where one word starts and the next word ends. Which class name would you rather see, HTTPURL or HttpUrl?
(Third edition, pages 289-290.)
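To make the suggestion concrete, here is a minimal sketch contrasting the two styles (empty class bodies, for illustration only):

```java
// Back-to-back all-capital acronyms hide the word boundaries:
class GCSCSVWriter {}

// Capitalizing only the first letter keeps each word visible:
class GcsCsvWriter {}
```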
not a problem :)
protected void startTracked() throws Exception {


Nit. Too many empty lines here.
fixed
"default": "", | ||
"description": "The region of the GCS bucket.", | ||
"enum": [ | ||
"-- North America --", |
Nice!
I'm glad you like it :)
GCSWriterFactory formatterFactory = new ProductionWriterFactory();
return new GCSConsumer(GCSDestinationConfig.getGCSDestinationConfig(config), configuredCatalog, formatterFactory, outputRecordCollector);
}
}
Nit. Missing newline.
you mean between line 72 and line 73? if so, fixed
Oh, it's not between line 72 and 73. It's at the end of this file. GitHub shows the missing-newline symbol there as a reminder:
This is a nitpick. The argument is here.
Interesting read! Thanks for it :)
fixed now btw
config.get("gcs_bucket_region").asText(), | ||
config.get("access_key_id").asText(), | ||
config.get("secret_access_key").asText(), | ||
config.get("project_id").asText(), |
project_id is an optional field according to spec.json. So this line may throw an NPE if project_id is not specified.
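For illustration, a minimal, hedged sketch of a null-safe way to read such an optional field with Jackson (the helper name is made up for this example; it is not code from the PR):

```java
import com.fasterxml.jackson.databind.JsonNode;

public class OptionalConfigFields {

  // Returns the field's text if present, otherwise null, so an absent optional
  // field such as project_id cannot trigger a NullPointerException downstream.
  static String getTextOrNull(JsonNode config, String fieldName) {
    JsonNode node = config.get(fieldName);
    return node == null || node.isNull() ? null : node.asText();
  }
}
```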
I think I ended up not using project_id at all, so I will just remove it completely from here and spec.json. Need to double check though.
import java.util.UUID;

/**
 * This class takes case of the generation of the CSV data sheet, including the header row and the
Nit. There is a typo in my code:
- * This class takes case of the generation of the CSV data sheet, including the header row and the
+ * This class takes care of the generation of the CSV data sheet, including the header row and the
fixed
public class CsvSheetGenerators {

}
This class can be deleted completely. Its function has been replaced by the CsvSheetGenerator.Factory.
thanks, didn't notice it
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class GCSCsvWriter extends BaseGCSWriter implements GCSWriter {
This class is an exact copy of the S3CsvWriter. It does not work for GCS. The client passed into the constructor is for S3, and I think StreamTransferManager only works for S3 as well.
As per Google documentation, the client passed into the constructor can be an S3 client as long as the endpoint is GCS (other than HMAC keys, of course). Let me test it again (I only tried once or twice) and confirm.
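For reference, a minimal sketch of that interoperability setup with the AWS SDK for Java (the endpoint, region, and HMAC values here are placeholders, not configuration from this PR):

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class GcsInteropClientSketch {

  // Builds an S3 client that talks to the GCS XML (S3-interoperable) API
  // using GCS HMAC keys instead of AWS credentials.
  static AmazonS3 buildClient(String hmacAccessKey, String hmacSecret, String region) {
    return AmazonS3ClientBuilder.standard()
        .withEndpointConfiguration(
            new AwsClientBuilder.EndpointConfiguration("https://storage.googleapis.com", region))
        .withCredentials(
            new AWSStaticCredentialsProvider(new BasicAWSCredentials(hmacAccessKey, hmacSecret)))
        .build();
  }
}
```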
Wow, this is very handy. Sorry that I did not know this.
Not a problem, I found out 2 weeks ago and was amazed! :)
integrationTestJavaImplementation project(':airbyte-integrations:bases:standard-destination-test')
integrationTestJavaImplementation project(':airbyte-integrations:connectors:destination-gcs')
}
There are some S3 and AWS dependencies here. They are probably not necessary.
Some are still needed, as per Google documentation.
// hadoopConfig.set("fs.gs.auth.service.account.private.key", config.getSecretAccessKey()); | ||
// hadoopConfig.set("fs.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem"); | ||
// hadoopConfig.set("fs.AbstractFileSystem.gs.impl", "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS"); | ||
// hadoopConfig.set("google.cloud.auth.service.account.enable", "true"); |
All the configs associated with GCS are commented out. Currently this writer only works for S3. Maybe you forgot to push the latest changes?
Hey @tuliren, my bad, these are different tests I was doing to check how Airbyte was connecting to GCS: all of those are useful in the case of a service account (username, password and auth key).
I can test it again just to make sure I didn't mess up the files while creating the PR, but it does work with GCS.
After my test, I'll remove what is not needed.
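For completeness, a hedged sketch of the HMAC-based alternative to the service-account properties commented out above, using standard Hadoop S3A settings (the values are placeholders; this is not the PR's code):

```java
import org.apache.hadoop.conf.Configuration;

public class GcsHadoopConfigSketch {

  // Points Hadoop's S3A filesystem at the GCS S3-interoperable endpoint with
  // HMAC keys, as an alternative to the GCS connector + service account route.
  static Configuration forHmacKeys(String accessKey, String secretKey) {
    Configuration hadoopConfig = new Configuration();
    hadoopConfig.set("fs.s3a.access.key", accessKey);
    hadoopConfig.set("fs.s3a.secret.key", secretKey);
    hadoopConfig.set("fs.s3a.endpoint", "storage.googleapis.com");
    hadoopConfig.set("fs.s3a.path.style.access", "true");
    return hadoopConfig;
  }
}
```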
@MaxwellJK, since you have tested it locally, the acceptance test probably also works as is. Would you mind including that in the PR as well (e.g., copying the S3 CSV and Parquet destination acceptance tests)? It is fine if you prefer to skip that. We can do it ourselves later.
There are two main issues:
I will prepare a PR that fixes these issues, merge this PR, and then immediately merge my fixes.
* Adding Google Cloud Storage as destination
* Removed few comments and amended the version
* Added documentation in docs/integrations/destinations/gcs.md
* Amended gcs.md with the right pull id
* Implemented all the fixes requested by tuliren as per #4329
* Renaming all the files
* Branch alligned to S3 0.1.7 (with Avro and Jsonl). Removed redundant file by making S3 a dependency for GCS
* Removed some additional duplicates between GCS and S3
* Revert changes in the root files
* Revert jdbc files
* Fix package names
* Refactor gcs config
* Format code
* Fix gcs connection
* Format code
* Add acceptance tests
* Fix parquet acceptance test
* Add ci credentials
* Register the connector and update documentations
* Fix typo
* Format code
* Add unit test
* Add comments
* Update readme

Co-authored-by: Sherif A. Nada <[email protected]>
Co-authored-by: Marco Fontana <[email protected]>
Co-authored-by: [email protected] <[email protected]>
Co-authored-by: Marco Fontana <[email protected]>
Co-authored-by: Sherif A. Nada <[email protected]>
The changes have been included in #4784 and merged to master.
@MaxwellJK, previously, because your commits were merged together within my PR, GitHub did not include you as a contributor to this project. I am really sorry about that. To fix it, I reopened and merged this PR (and immediately reverted the changes, because they are deprecated now). You are now in the contributor list: https://github.com/airbytehq/airbyte/graphs/contributors. Thank you again for working on this.
What
This pull request will add Google Cloud Storage as a new destination for Airbyte.
How
The connector is mostly derived from S3 0.1.6, including the CSV and Parquet formats.
A few changes have been applied to destination-jdbc, as one of the methods was missing (GcsStreamCopier.java -> attemptGcsWriteAndDelete; see the sketch after the reading order below). In addition, a new folder called destination-gcs has been created with all the libraries needed.
Recommended reading order
1. destination-gcs/*
2. destination-jdbc/*
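For context on the attemptGcsWriteAndDelete mention above, a rough, illustrative sketch of what such a write-and-delete permission check can look like (the method name comes from the PR description; the body below is an assumption, not the actual implementation):

```java
import com.amazonaws.services.s3.AmazonS3;

public class GcsWriteCheckSketch {

  // Writes a small test object and immediately deletes it, so missing write or
  // delete permissions surface during the connection check instead of mid-sync.
  static void attemptGcsWriteAndDelete(AmazonS3 client, String bucketName) {
    String testKey = "_airbyte_connection_test_" + System.currentTimeMillis();
    client.putObject(bucketName, testKey, "check-content");
    client.deleteObject(bucketName, testKey);
  }
}
```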
Pre-merge Checklist
Expand the checklist which is relevant for this PR.
Connector checklist
- [ ] Issue acceptance criteria met
- [ ] PR name follows [PR naming conventions](https://docs.airbyte.io/contributing-to-airbyte/updating-documentation#issues-and-pull-requests)
- [ ] Secrets are annotated with `airbyte_secret` in output spec
- [ ] Unit & integration tests added as appropriate (and are passing)
  * Community members: please provide proof of this succeeding locally e.g: screenshot or copy-paste acceptance test output. To run acceptance tests for a Python connector, follow instructions in the README. For java connectors run `./gradlew :airbyte-integrations:connectors::integrationTest`.
- [ ] `/test connector=connectors/` command as documented [here](https://docs.airbyte.io/contributing-to-airbyte/building-new-connector#updating-an-existing-connector) is passing.
  * Community members can skip this, Airbyters will run this for you.
- [ ] Code reviews completed
- [ ] Credentials added to Github CI if needed and not already present. [instructions for injecting secrets into CI](https://docs.airbyte.io/contributing-to-airbyte/building-new-connector#using-credentials-in-ci).
- [ ] Documentation updated
  - [ ] `README.md`
  - [ ] `docs/SUMMARY.md` if it's a new connector
  - [ ] Reference docs in the `docs/integrations/` directory.
  - [ ] Changelog in the appropriate page in `docs/integrations/...`. See changelog [example](https://docs.airbyte.io/integrations/sources/stripe#changelog)
  - [ ] Build status added to [build page](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/builds.md)
- [ ] Build is successful
- [ ] Connector version bumped like described [here](https://docs.airbyte.io/contributing-to-airbyte/building-new-connector#updating-a-connector)
- [ ] New Connector version released on Dockerhub by running the `/publish` command described [here](https://docs.airbyte.io/contributing-to-airbyte/building-new-connector#updating-a-connector)
- [ ] No major blockers
- [ ] PR merged into master branch
- [ ] Follow up tickets have been created
- [ ] Associated tickets have been closed & stakeholders notified
Connector Generator checklist
- [ ] Issue acceptance criteria met
- [ ] PR name follows [PR naming conventions](https://docs.airbyte.io/contributing-to-airbyte/updating-documentation#issues-and-pull-requests)
- [ ] If adding a new generator, add it to the [list of scaffold modules being tested](https://github.com/airbytehq/airbyte/blob/master/airbyte-integrations/connector-templates/generator/build.gradle#L41)
- [ ] The generator test modules (all connectors with `-scaffold` in their name) have been updated with the latest scaffold by running `./gradlew :airbyte-integrations:connector-templates:generator:testScaffoldTemplates` then checking in your changes
- [ ] Documentation which references the generator is updated as needed.