destination-s3: fix doc formatting
stephane-airbyte committed Oct 23, 2024
1 parent 367e2f0 commit 6ae1bd2
1 changed file: docs/integrations/destinations/s3.md (38 additions, 30 deletions)
@@ -214,36 +214,44 @@ Use an existing or create new
destination**.
3. On the destination setup page, select **S3** from the Destination type dropdown and enter a name
for this connector.
Removed:

4. Configure fields: _ **Access Key Id** _ See
[this](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys)
on how to generate an access key. _ See
[this](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html)
on how to create a instanceprofile. _ We recommend creating an Airbyte-specific user. This user
will require
[read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html)
to objects in the staging bucket. \_ If the Access Key and Secret Access Key are not provided, the
authentication will rely either on the Role ARN using STS Assume Role or on the instanceprofile.
5. _ **Secret Access Key** _ Corresponding key to
the above key id. _ Make sure your S3 bucket is accessible from the machine running Airbyte. _
This depends on your networking setup. _ You can check AWS S3 documentation with a tutorial on
how to properly configure your S3's access
[here](https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-control-overview.html). _ If
you use instance profile authentication, make sure the role has permission to read/write on the
bucket. _ The easiest way to verify if Airbyte is able to connect to your S3 bucket is via the
check connection tool in the UI. _ **S3 Bucket Name** _ See
[this](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) to
create an S3 bucket. _ **S3 Bucket Path** _ Subdirectory under the above bucket to sync the data
into. _ **S3 Bucket Region** _ See
[here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions)
for all region codes. _ **S3 Path Format** _ Additional string format on how to store data under
S3 Bucket Path. Default value is `${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}_${EPOCH}_`.
_ **S3 Endpoint** _ Leave empty if using AWS S3, fill in S3 URL if using Minio S3.

- **S3 Filename pattern** \* The pattern allows you to set the file-name format for the S3
staging file(s), next placeholders combinations are currently supported: `{date}`,
`{date:yyyy_MM}`, `{timestamp}`, `{timestamp:millis}`, `{timestamp:micros}`, `{part_number}`,
`{sync_id}`, `{format_extension}`. Please, don't use empty space and not supportable
placeholders, as they won't recognized.
Added:

4. Configure fields:
- **Access Key Id**
- See [this](https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys) on how to generate an access key.
    - See [this](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html) on how to create an instance profile.
    - We recommend creating an Airbyte-specific user. This user will require [read and write permissions](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference_policies_examples_s3_rw-bucket.html) to objects in the staging bucket; a boto3 sketch of one such read/write policy follows this list.
    - If the Access Key and Secret Access Key are not provided, authentication will rely either on the Role ARN (using STS Assume Role) or on the instance profile.
- **Secret Access Key**
    - The secret key corresponding to the Access Key ID above.
- Make sure your S3 bucket is accessible from the machine running Airbyte.
- This depends on your networking setup.
    - The AWS S3 documentation provides a tutorial on how to properly configure access to your bucket [here](https://docs.aws.amazon.com/AmazonS3/latest/userguide/access-control-overview.html).
- If you use instance profile authentication, make sure the role has permission to read/write on the
bucket.
    - The easiest way to verify that Airbyte can connect to your S3 bucket is via the check connection tool in the UI; a standalone boto3 access check is also sketched after this list.
- **S3 Bucket Name**
- See [this](https://docs.aws.amazon.com/AmazonS3/latest/userguide/create-bucket-overview.html) to create an S3 bucket.
- **S3 Bucket Path**
- Subdirectory under the above bucket to sync the data
into.
- **S3 Bucket Region**
- See
[here](https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions)
for all region codes.
- **S3 Path Format**
    - An additional format string controlling how data is stored under the S3 Bucket Path. The default value is `${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}_${EPOCH}_`; an illustrative expansion of these placeholders follows this list.
- **S3 Endpoint**
- Leave empty if using AWS S3, fill in S3 URL if using Minio S3.
- **S3 Filename pattern**
    - The pattern sets the file-name format for the S3 staging file(s). The following placeholders are currently supported: `{date}`, `{date:yyyy_MM}`, `{timestamp}`, `{timestamp:millis}`, `{timestamp:micros}`, `{part_number}`, `{sync_id}`, `{format_extension}`.
    - Do not use spaces or unsupported placeholders, as they will not be recognized.
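
The read/write bucket policy from the linked AWS example can also be created programmatically. Below is a minimal boto3 sketch; the bucket name, policy name, and the exact action list are illustrative assumptions, so check the linked AWS example and your connector's requirements before relying on it.

```python
# Sketch: create a read/write policy for the staging bucket with boto3.
# Bucket name, policy name, and the action list are illustrative assumptions;
# verify the exact permissions against the linked AWS example policy.
import json

import boto3

bucket = "my-airbyte-staging-bucket"  # placeholder

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {   # list the bucket itself
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {   # read/write the objects inside it
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="airbyte-s3-destination-rw",  # placeholder name
    PolicyDocument=json.dumps(policy_document),
)
```

Attach the resulting policy to the Airbyte-specific user (or to the role used for STS Assume Role / the instance profile), whichever authentication option you configured above.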
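
If you prefer to probe bucket access directly from the machine running Airbyte, rather than only through the UI's check connection tool, a sketch along these lines can help. The credentials, region, bucket, prefix, and the optional MinIO endpoint are placeholders; omitting the static keys makes boto3 fall back to its default credential chain, which includes the instance profile.

```python
# Sketch: verify that this machine can reach and write to the staging bucket.
# Bucket, prefix, region, credentials, and the MinIO endpoint are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    region_name="us-east-1",
    aws_access_key_id="AKIA...",    # omit both keys to use boto3's default
    aws_secret_access_key="...",    # credential chain (e.g. instance profile)
    # endpoint_url="https://minio.example.com:9000",  # only for MinIO / S3-compatible storage
)

bucket, prefix = "my-airbyte-staging-bucket", "airbyte/raw"

s3.head_bucket(Bucket=bucket)  # bucket exists and is reachable

# Round-trip a tiny object to confirm write and delete permissions on the path.
key = f"{prefix}/_airbyte_access_check"
s3.put_object(Bucket=bucket, Key=key, Body=b"ok")
s3.delete_object(Bucket=bucket, Key=key)
print("bucket is reachable and writable")
```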
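
For readers new to the `${...}` placeholders, the snippet below expands the default S3 Path Format for a hypothetical `public.users` stream. It is purely illustrative (the connector builds the real object keys), but it shows the shape of the resulting prefix.

```python
# Illustrative expansion of the default S3 Path Format for a hypothetical
# stream; treat it only as a reading aid for the ${...} placeholders.
from datetime import datetime, timezone
from string import Template

path_format = "${NAMESPACE}/${STREAM_NAME}/${YEAR}_${MONTH}_${DAY}_${EPOCH}_"
sync_time = datetime(2024, 10, 23, tzinfo=timezone.utc)  # hypothetical sync time

prefix = Template(path_format).substitute(
    NAMESPACE="public",        # hypothetical source namespace
    STREAM_NAME="users",       # hypothetical stream name
    YEAR=f"{sync_time.year:04d}",
    MONTH=f"{sync_time.month:02d}",
    DAY=f"{sync_time.day:02d}",
    EPOCH=int(sync_time.timestamp()),
)
print(prefix)  # public/users/2024_10_23_1729641600_
```

Combined with the S3 Bucket Path and a filename built from the `{date}`/`{timestamp}`-style pattern above, the final object key might look roughly like `airbyte/raw/public/users/2024_10_23_1729641600_<filename>`; the exact layout depends on the connector version and the chosen format settings.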
<!-- /env:oss -->

6. Click `Set up destination`.
