The Snowplow S3 Loader consumes records from an Amazon Kinesis stream and writes them to S3.
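For orientation, here is a minimal sketch of that consume-and-write flow using the AWS SDK for Java v2. It is illustrative only: the real loader adds buffering, checkpointing, compression and error handling, and the stream name, shard ID, bucket and key below are placeholders.

```scala
// Illustrative only: poll one batch from a Kinesis shard and upload the raw
// payloads as a single S3 object. All names are placeholders.
import software.amazon.awssdk.core.sync.RequestBody
import software.amazon.awssdk.services.kinesis.KinesisClient
import software.amazon.awssdk.services.kinesis.model.{GetRecordsRequest, GetShardIteratorRequest, ShardIteratorType}
import software.amazon.awssdk.services.s3.S3Client
import software.amazon.awssdk.services.s3.model.PutObjectRequest

import scala.jdk.CollectionConverters._

object LoaderSketch {
  def main(args: Array[String]): Unit = {
    val kinesis = KinesisClient.create()
    val s3      = S3Client.create()

    // Fetch one batch of records from a single shard (placeholder names).
    val iterator = kinesis.getShardIterator(
      GetShardIteratorRequest.builder()
        .streamName("enriched-good")
        .shardId("shardId-000000000000")
        .shardIteratorType(ShardIteratorType.TRIM_HORIZON)
        .build()
    ).shardIterator()

    val records = kinesis.getRecords(
      GetRecordsRequest.builder().shardIterator(iterator).limit(500).build()
    ).records().asScala

    // Naively concatenate the raw payloads (no framing or compression here)
    // and upload them as a single S3 object.
    val payload = records.map(_.data().asByteArray()).foldLeft(Array.emptyByteArray)(_ ++ _)
    s3.putObject(
      PutObjectRequest.builder().bucket("my-archive-bucket").key("raw/batch-0001").build(),
      RequestBody.fromBytes(payload)
    )
  }
}
```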
Two file formats are supported, each described below:
- LZO (records framed with Elephant Bird, splittable via .lzo.index files)
- GZip (newline-separated UTF-8 records)
LZO: The records are treated as raw byte arrays. Elephant Bird's BinaryBlockWriter
class is used to serialize them as a Protocol Buffers array (so it is clear where one record ends and the next begins) before compressing them.
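To see why this framing matters: once records are concatenated into one file, something must mark where each record ends. The sketch below shows the general length-prefix idea only; it is not Elephant Bird's exact on-disk block layout.

```scala
// Illustrative framing only: length-prefix each record so boundaries can be
// recovered when reading the bytes back. Not Elephant Bird's exact format.
import java.io.{ByteArrayInputStream, ByteArrayOutputStream, DataInputStream, DataOutputStream}

object Framing {
  def writeFramed(records: Seq[Array[Byte]]): Array[Byte] = {
    val buf = new ByteArrayOutputStream()
    val out = new DataOutputStream(buf)
    records.foreach { r =>
      out.writeInt(r.length) // 4-byte big-endian length prefix
      out.write(r)
    }
    out.flush()
    buf.toByteArray
  }

  def readFramed(bytes: Array[Byte]): Seq[Array[Byte]] = {
    val in      = new DataInputStream(new ByteArrayInputStream(bytes))
    val builder = Seq.newBuilder[Array[Byte]]
    while (in.available() > 0) {
      val rec = new Array[Byte](in.readInt())
      in.readFully(rec)
      builder += rec
    }
    builder.result()
  }
}
```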
The compression process generates both compressed .lzo files and small .lzo.index files (splittable LZO). Each index file contains the byte offsets of the LZO blocks in the corresponding compressed file, meaning that the blocks can be processed in parallel.
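To illustrate how the index enables parallelism, the sketch below lists block offsets, assuming the hadoop-lzo index layout of consecutive big-endian 64-bit offsets; the file name is a placeholder.

```scala
// Sketch: read an .lzo.index file, assumed to be a plain sequence of
// big-endian 64-bit byte offsets, one per compressed LZO block. Each offset
// is a point where an independent worker can start decompressing.
import java.io.{DataInputStream, EOFException, FileInputStream}

object LzoIndexReader {
  def blockOffsets(indexPath: String): Vector[Long] = {
    val in      = new DataInputStream(new FileInputStream(indexPath))
    val offsets = Vector.newBuilder[Long]
    try {
      while (true) offsets += in.readLong() // EOFException signals the end of the index
    } catch {
      case _: EOFException => () // reached the end of the file
    } finally in.close()
    offsets.result()
  }
}

// Example: one work unit per block offset (placeholder file name).
// LzoIndexReader.blockOffsets("events.lzo.index").foreach(println)
```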
GZip: The records are treated as byte arrays containing UTF-8 encoded strings (whether CSV, JSON or TSV). Newlines are used to separate the records written to a file. This format can be used with the Snowplow Kinesis Enriched stream, among other streams.
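A minimal sketch of producing this kind of output with the standard java.util.zip.GZIPOutputStream; the record contents and file path are illustrative.

```scala
// Write UTF-8 records separated by newlines and compress the result with
// GZIP, so any gzip-aware tool can read the output back line by line.
import java.io.FileOutputStream
import java.nio.charset.StandardCharsets
import java.util.zip.GZIPOutputStream

object GzipSink {
  def write(records: Seq[String], path: String): Unit = {
    val out = new GZIPOutputStream(new FileOutputStream(path))
    try {
      records.foreach { r =>
        out.write(r.getBytes(StandardCharsets.UTF_8))
        out.write('\n') // newline separates records
      }
    } finally out.close()
  }
}

// Example (illustrative records and path):
// GzipSink.write(Seq("""{"event":"page_view"}""", """{"event":"link_click"}"""), "events.gz")
```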
Snowplow S3 Loader is copyright 2014-2023 Snowplow Analytics Ltd.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this software except in compliance with the License.
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.