
Update Databricks naming #13722

Merged 10 commits on Jun 14, 2022
@@ -70,10 +70,10 @@
 dockerImageTag: 0.1.6
 documentationUrl: https://docs.airbyte.io/integrations/destinations/clickhouse
 releaseStage: alpha
-- name: Databricks Delta Lake
+- name: Databricks Lakehouse
 destinationDefinitionId: 072d5540-f236-4294-ba7c-ade8fd918496
 dockerRepository: airbyte/destination-databricks
-dockerImageTag: 0.2.1
+dockerImageTag: 0.2.2
 documentationUrl: https://docs.airbyte.io/integrations/destinations/databricks
 icon: databricks.svg
 releaseStage: alpha
@@ -996,12 +996,12 @@
 - "overwrite"
 - "append"
 - "append_dedup"
-- dockerImage: "airbyte/destination-databricks:0.2.1"
+- dockerImage: "airbyte/destination-databricks:0.2.2"
 spec:
 documentationUrl: "https://docs.airbyte.io/integrations/destinations/databricks"
 connectionSpecification:
 $schema: "http://json-schema.org/draft-07/schema#"
-title: "Databricks Delta Lake Destination Spec"
+title: "Databricks Lakehouse Destination Spec"
 type: "object"
 required:
 - "accept_terms"
@@ -1,6 +1,6 @@
-# Databricks Delta Lake Destination Connector Bootstrap
+# Databricks Lakehouse Destination Connector Bootstrap

-The Databricks Delta Lake Connector enables a developer to sync data into a Databricks cluster. It does so in two steps:
+This destination syncs data to Delta Lake on Databricks Lakehouse. It does so in two steps:

 1. Persist source data in S3 staging files in the Parquet format.
 2. Create delta table based on the Parquet staging files.
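The two-step flow in the bootstrap text above can be sketched as follows. This is an illustrative sketch only: the S3 path layout, table naming, and SQL text are assumptions for demonstration, not the connector's actual (Java) implementation.

```python
# Illustrative sketch of the two-step sync: stage Parquet on S3, then
# load it into a Delta table. Paths, names, and SQL are assumptions.

def staging_path(bucket: str, schema: str, stream: str) -> str:
    """Step 1: S3 prefix where Parquet staging files would be persisted."""
    return f"s3://{bucket}/{schema}/{stream}/"

def load_delta_sql(schema: str, stream: str, location: str) -> str:
    """Step 2: Databricks SQL that creates the Delta table (if absent)
    and copies the staged Parquet files into it."""
    return (
        f"CREATE TABLE IF NOT EXISTS {schema}.{stream} USING DELTA;\n"
        f"COPY INTO {schema}.{stream} FROM '{location}' "
        f"FILEFORMAT = PARQUET;"
    )

location = staging_path("airbyte-staging", "public", "users")
print(load_delta_sql("public", "users", location))
```

The staging step keeps the raw data replayable: if the load fails, the Parquet files remain on S3 and the `COPY INTO` can be retried without re-reading the source.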
@@ -16,5 +16,5 @@ ENV APPLICATION destination-databricks

 COPY --from=build /airbyte /airbyte

-LABEL io.airbyte.version=0.2.1
+LABEL io.airbyte.version=0.2.2
 LABEL io.airbyte.name=airbyte/destination-databricks
@@ -1,4 +1,4 @@
-# Destination Databricks Delta Lake
+# Destination Databricks Lakehouse

 This is the repository for the Databricks destination connector in Java.
 For information about how to use this connector within Airbyte, see [the User Documentation](https://docs.airbyte.io/integrations/destinations/databricks).
@@ -6,7 +6,7 @@
 "supported_destination_sync_modes": ["overwrite", "append"],
 "connectionSpecification": {
 "$schema": "http://json-schema.org/draft-07/schema#",
-"title": "Databricks Delta Lake Destination Spec",
+"title": "Databricks Lakehouse Destination Spec",
 "type": "object",
 "required": [
 "accept_terms",
docs/integrations/destinations/databricks.md (5 changes: 3 additions & 2 deletions)
@@ -1,8 +1,8 @@
-# Databricks Delta Lake
+# Databricks Lakehouse

 ## Overview

-This destination syncs data to Databricks Delta Lake. Each stream is written to its own [delta-table](https://delta.io/).
+This destination syncs data to Delta Lake on Databricks Lakehouse. Each stream is written to its own [delta-table](https://delta.io/).

 This connector requires a JDBC driver to connect to the Databricks cluster. By using the driver and the connector, you must agree to the [JDBC ODBC driver license](https://databricks.com/jdbc-odbc-driver-license). This means that you can only use this connector to connect third party applications to Apache Spark SQL within a Databricks offering using the ODBC and/or JDBC protocols.
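For illustration, a Databricks JDBC URL of the general form the driver accepts can be assembled as below. The host, HTTP path, and token values are placeholders, and the exact parameter set is an assumption based on the driver's documented URL format, not taken from the connector's code.

```python
# Hypothetical helper assembling a Databricks JDBC connection URL.
# Host, httpPath, and the personal access token are placeholders.

def databricks_jdbc_url(host: str, http_path: str, token: str) -> str:
    params = {
        "transportMode": "http",
        "ssl": "1",
        "httpPath": http_path,
        "AuthMech": "3",   # user/password auth: user "token", PAT as password
        "UID": "token",
        "PWD": token,
    }
    joined = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:databricks://{host}:443/default;{joined}"

url = databricks_jdbc_url(
    "abc-12345.cloud.databricks.com",
    "sql/protocolv1/o/0/0000-000000-abcd123",
    "dapiEXAMPLETOKEN",
)
print(url)
```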

@@ -104,6 +104,7 @@ Under the hood, an Airbyte data stream in Json schema is first converted to an A

 | Version | Date | Pull Request | Subject |
 | :--- | :--- | :--- | :--- |
+| 0.2.2 | 2022-06-13 | [\#13722](https://github.com/airbytehq/airbyte/pull/13722) | Rename to "Databricks Lakehouse". |
 | 0.2.1 | 2022-06-08 | [\#13630](https://github.com/airbytehq/airbyte/pull/13630) | Rename to "Databricks Delta Lake" and add field orders in the spec. |
 | 0.2.0 | 2022-05-15 | [\#12861](https://github.com/airbytehq/airbyte/pull/12861) | Use new public Databricks JDBC driver, and open source the connector. |
 | 0.1.5 | 2022-05-04 | [\#12578](https://github.com/airbytehq/airbyte/pull/12578) | In JSON to Avro conversion, log JSON field values that do not follow Avro schema for debugging. |